Article ID: iaor20173557
Journal: Organization Science
Volume: 28
Issue: 4
Pages: 729–743
Publication Date: Aug 2017
Authors: Nick Oliver, Thomas Calvard, Kristina Potocnik
Keywords: organization, behaviour, information, transportation: air
Organizations, particularly those for which safety and reliability are crucial, develop routines to protect themselves from failure. Yet even highly reliable organizations are not immune to disaster, and prolonged periods of safe operation are punctuated by occasional catastrophes. Scholars of safety science label this the ‘paradox of almost totally safe systems,’ noting that systems that are very safe under normal conditions may be vulnerable under unusual ones. In this paper, we explain, develop, and apply the concept of ‘organizational limits’ to this puzzle through an analysis of the loss of Air France Flight 447. We show that an initial, relatively minor limit violation set in train a cascade of human and technological limit violations, with catastrophic consequences. Focusing on cockpit automation, we argue that the same measures that make a system safe and predictable may introduce restrictions on cognition which, over time, inhibit or erode the disturbance-handling capability of the actors involved. We also note limits to cognition in system design processes that make it difficult to foresee complex interactions. We discuss the implications of our findings for predictability and control in contexts beyond aviation, and we suggest ways in which these problems might be addressed.