Organizational Learning at NASA: The Challenger and Columbia Accidents

  • Julianne G. Mahler &
  • Maureen Hogan Casamayou
Georgetown University Press: 2009. 256 pp. $29.95. ISBN: 978-1-5890-1266-0

Mark Twain is often credited with observing that “History doesn't repeat itself, but it does rhyme.” Julianne Mahler, a political scientist at George Mason University in Fairfax, Virginia, makes a case for this observation in her examination of NASA's organizational responses to the losses of the space shuttles Challenger in 1986 and Columbia in 2003. She asks what NASA learned from the first accident, how that learning changed the agency, and whether those changes were still in effect when the second accident occurred. Given the high-risk nature of human space flight, was NASA a 'learning organization'?

Organizational Learning at NASA doesn't add to the extensive official record. But the book is a clear and insightful discussion of the factors that contributed to both accidents.

Mahler analyses four factors: information-processing structures, relations with contractors, political and budgetary pressures, and organizational culture, including rivalries between NASA field centres. Each was cited as a contributing cause of the shuttle failures, acting through defects in mission management, safety monitoring and responses to schedule and budget pressures.

The Challenger tragedy forced NASA to change its organizational culture. Credit: NASA

The book contrasts prescriptions from two organizational theories: 'normal accident' and 'high reliability'. The former states that some accidents are unavoidable in complex systems that are tightly coupled, such as nuclear power stations. Unexpected interactions among such a system's components can escalate a small problem into a major failure more quickly than human operators can respond. A tension therefore arises between centralized control, needed to ensure safe operations, and decentralized authority, needed to generate creative solutions to unexpected problems. These two goals conflict; yet society requires that complex systems such as air-traffic control operate safely. With great effort this can be achieved and centralization and decentralization can coexist; organizations that manage it are called high-reliability organizations.

High-reliability organizations seem ideal for managing a space mission. Their organization is hierarchical and centralized for routine decision-making. But when systems are stressed, other modes come into play. Cooperative relations emerge between front-line practitioners and senior officials; ranks fade as everyone focuses on fixing the problem and ensuring the mission's success. The organization responds to emergency situations that threaten major system failure with pre-planned procedures that are rehearsed and updated. Thus high-reliability organizations are highly self-conscious about learning, in order to respond to unexpected, non-routine events.

The Columbia Accident Investigation Board recommended that NASA adopt elements of high-reliability theory, although it observed that “neither High Reliability Theory nor Normal Accident Theory is entirely appropriate”. But the histories of Challenger and Columbia show that NASA has not fully integrated learning into its organization. After Challenger, there was a greater willingness to communicate problems up the management chain and clear lines of authority and accountability were created. But they blurred over the years as a result of budgetary and organizational turmoil.

Only minor errors of interpretation or omission creep into Mahler's analysis. For example, the congressional direction to use commercial software — when no suitable product existed — isn't mentioned in her discussion of NASA's difficulty in implementing a financial reporting system for project managers. This omission could lead readers to conclude that the difficulties were due solely to NASA's shortcomings. But externally imposed requirements also shaped the decisions made.

Budget and schedule pressures came from Congress and the White House. And political pressures from congressional supporters influenced the creation of specialist 'lead centres', rivalries between which may have hindered communications. Thus it is not always obvious what lessons are being taught and by whom. If an organization is struggling to survive or to do too much with too little, the attention required for mission success may falter. As Mahler says, “NASA did learn in some cases, at some times, about some things.” Unfortunately, in spaceflight, especially human spaceflight, that isn't good enough.