The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA, Enlarged Edition

Length: 947 pages (15 hours)
Released: Jan 4, 2016
ISBN: 9780226346960
Format: Book

Description

When the Space Shuttle Challenger exploded on January 28, 1986, millions of Americans became bound together in a single, historic moment. Many still vividly remember exactly where they were and what they were doing when they heard about the tragedy. Diane Vaughan recreates the steps leading up to that fateful decision, contradicting conventional interpretations to prove that what occurred at NASA was not skullduggery or misconduct but a disastrous mistake.
Why did NASA managers, who not only had all the information prior to the launch but also were warned against it, decide to proceed? In retelling how the decision unfolded through the eyes of the managers and the engineers, Vaughan uncovers an incremental descent into poor judgment, supported by a culture of high-risk technology. She reveals how and why NASA insiders, when repeatedly faced with evidence that something was wrong, normalized the deviance so that it became acceptable to them. In a new preface, Vaughan reveals the ramifications for this book and for her when a similar decision-making process brought down NASA's Space Shuttle Columbia in 2003.

Book Preview

The Challenger Launch Decision - Diane Vaughan

The University of Chicago Press, Chicago 60637

The University of Chicago Press, Ltd., London

© 1996, 2016 by The University of Chicago

All rights reserved. Published 2016.

Printed in the United States of America

25 24 23 22 21 20 19 18 17 16 1 2 3 4 5

ISBN-13: 978-0-226-34682-3 (paper)

ISBN-13: 978-0-226-34696-0 (e-book)

DOI: 10.7208/chicago/9780226346960.001.0001

Library of Congress Cataloging-in-Publication Data

Vaughan, Diane, author.

The Challenger launch decision : risky technology, culture, and deviance at NASA / Diane Vaughan ; with a new preface.

pages cm

Includes bibliographical references and index.

ISBN 978-0-226-34682-3 (pbk. : alk. paper) — ISBN 978-0-226-34696-0 (ebook) 1. Challenger (Spacecraft)—Accidents. 2. United States. National Aeronautics and Space Administration—Management. 3. Aerospace industries—United States. 4. Organizational behavior—Case studies. 5. Decision making—Case studies. I. Title.

TL867.C467 2016

363.12'465—dc23

2015021066

This paper meets the requirements of ANSI/NISO Z39.48-1992 (Permanence of Paper).

The Challenger Launch Decision

Risky Technology, Culture, and Deviance at NASA

With a New Preface

Diane Vaughan

The University of Chicago Press

Chicago and London

For

Katherine, Zachary, and Sara Vaughan

Kristen, Lindsey, and Melissa Mortensen

Sophie and Cameron Nicoll

Contents

List of Figures and Tables

Preface to the 2016 Edition

Preface

One

THE EVE OF THE LAUNCH

Two

LEARNING CULTURE, REVISING HISTORY

Three

RISK, WORK GROUP CULTURE, AND THE NORMALIZATION OF DEVIANCE

Four

THE NORMALIZATION OF DEVIANCE, 1981-84

Five

THE NORMALIZATION OF DEVIANCE, 1985

Six

THE CULTURE OF PRODUCTION

Seven

STRUCTURAL SECRECY

Eight

THE EVE OF THE LAUNCH REVISITED

Nine

CONFORMITY AND TRAGEDY

Ten

LESSONS LEARNED

Appendix A

COST/SAFETY TRADE-OFFS? SCRAPPING THE ESCAPE ROCKETS AND THE SRB CONTRACT AWARD DECISION

Appendix B

SUPPORTING CHARTS AND DOCUMENTS

Appendix C

ON THEORY ELABORATION, ORGANIZATIONS, AND HISTORICAL ETHNOGRAPHY

Acknowledgments

Notes

Bibliography

Index

Figures and Tables

All figures are reproductions of documents appearing in Report to the President by the Presidential Commission on the Space Shuttle Challenger Accident (Washington, D.C.: Government Printing Office, 1986).

Figures

1. Space Shuttle System

2. Solid Rocket Booster

3. Booster Field Joint

4. Flight Readiness Review and Shuttle Program Management Structure

5. Joint Rotation

6. STS 41-G Flight Readiness Review Charts

7. Marshall Request for Briefing after STS 51-C

8. STS 51-E Flight Readiness Review Charts

9. STS 51-F Flight Readiness Review Chart

10. August 19, 1985, NASA Headquarters Briefing Chart

11. January 27, 1986, Teleconference Charts

12. Final Thiokol Launch Recommendation

13. Postaccident Temperature Analysis

B1. Final Teleconference Participants

B2. Relevant Organization Charts: NASA and Morton Thiokol

B3. Critical Items List Entry, Solid Rocket Booster, Criticality 1R

B4. Critical Items List Entry, Solid Rocket Booster, Criticality 1

B5. Postaccident Trend Analysis: Anomalies, Leak Check Pressure, and Joint Temperature

B6. Engineering Memoranda Written after O-ring Concern Escalates

Tables

1. Shuttle Launch Delays

A1. Contractor Proposals for Solid Rocket Motor

Preface to the 2016 Edition

When you put something into print, you never know how far your ideas will travel or what the response will be. The 1996 publication of this book took me on an unexpected odyssey, away from my academic world into public sociology and into the exotic realms of politics, investigative secrecy, rocket science, and ultimately, policy sociology. For me, public sociology has always been about teaching new audiences, but from the beginning of this experience, my vision of how organizations work was changed by what I learned from others—first from reporters and e-mail correspondents, then from the Columbia Accident Investigation Board and its staff. However, the most important lessons I learned were to come from my encounters with NASA.

On January 28, 1986, I was one of the millions who heard the news and then watched, riveted, as over and over on TV NASA’s Space Shuttle Challenger rose from the launch pad, then turned, and at the words "Challenger, go at throttle up," exploded, pieces raining down into the ocean. Lost were seven NASA astronauts, one of them Christa McAuliffe, the Teacher in Space. The collective shock continued as the Presidential Commission investigating the accident began its public hearings. They identified the technical cause as a failure of the rubber O-rings to seal the shuttle’s Solid Rocket Booster joints. But the NASA organization also had failed.

In an urgent teleconference on the eve of the Challenger launch, the engineers who worked closely on the Solid Rocket Boosters had protested against launching, given the unprecedented cold overnight temperatures predicted for Florida, but managers overrode their protests and proceeded, with disastrous consequences. More startling still, the investigation revealed the shuttle had been flying with damage to the O-rings since the first shuttle mission in 1981. The Commission’s official report blamed NASA’s managers’ flawed decision making: on the eve of the launch, they had succumbed to production pressure and launched, dismissing the concerns of their engineers and violating rules about passing information about the teleconference up the hierarchy in order to keep to the launch schedule.

Based on additional Commission findings, I began this research with the hypothesis that the Challenger accident was not caused by amorally calculating individual managers, as the report suggested, but was an example of organizational misconduct. All the classic signs were there: production pressures, risk taking and violation of rules in pursuit of organizational goals, and regulatory failure. Initially, I envisioned a short article. However, once I got beyond the readily available information and deep into the supplemental volumes of the Commission’s report, I found information that contradicted most of my starting assumptions about what had happened. No rules had been violated on the eve of the launch. Instead managers had conformed to NASA’s rules and procedures. More incredible, in the years preceding Challenger, despite worsening O-ring damage, managers and engineers alike repeatedly assessed the O-ring damage as an acceptable risk in engineering documents. How was this possible? As the complexity of what happened materialized, the short article I imagined at the outset was reimagined as a chapter in a book and then grew into a book in its own right.

In 1992, after six years analyzing the history of launch decision making in the years before the Challenger launch, I had found no evidence of rule violation and misconduct by individuals. Instead, the key to accepting risk in the past was what I called the normalization of deviance: having accepted anomalous O-ring performance once, precedent became the basis for accepting it again and again. The normalization of deviance was a product of NASA’s organizational system: the connection between environment, organization, and individual interpretation, meaning, and action. At this point, I turned to events on the eve of the Challenger launch. I reconstructed the sequence of events that night based on the National Archives interviews with all participants, not just the few who appeared before the Commission. I was astonished to discover that the patterns of the past repeated: the same factors that had NASA flying with flaws on the O-ring since 1981 were reproduced on the eve of the launch. The decision was not explained by amoral, calculating managers who violated rules in pursuit of organizational goals, but was a mistake based on conformity—conformity to cultural beliefs, organizational rules and norms, and NASA’s bureaucratic, political, and technical culture.

In June 1995, I submitted the final manuscript to my publisher. I was worried. It was five years after my expected completion date. Had I published five years earlier, my analysis would have been wrong. But now, with my analysis well documented and solid, the Challenger accident was not the major question of our time. Moreover, the text was full of scholarship and technical information, and at 500 pages, the book was so long that I thought no one would read it. Worse, if they did, I worried no one would believe me: my analysis contradicted much of the published scholarship, including the report of the Presidential Commission.

Then my intrepid publisher decided to get the book out in six months, in time for the tenth anniversary of the accident. Ironically noting they were making a decision to launch under production pressure—six months from manuscript to books-in-stores is very unusual in scholarly publishing—they set the publication date for January 28, 1996. On schedule, in November 1995, the bound page proofs landed on the desks of 400 members of the media, reminding them of the anniversary to come. Many, I later learned, had covered the accident. It had struck a chord. From November through the first weeks of January, I was busy with interviews and tapings that would be ready for release on the anniversary and week following.

On the Tuesday before, Malcolm Gladwell published "Blowup" in the New Yorker, featuring the book. On Sunday, the publication date, the book was reviewed in forty major papers across the country and two in England. Space Center city papers published special sections, referencing it. The New York Times ran a six-column front-page story on NASA and Challenger, mentioning the book in the lead paragraphs. Incredible. I was swept up in the public response. I heard from engineers and people in many different kinds of organizations who recognized the analogies between what happened at NASA and the situations at their organizations. "NASA is us," some wrote. Struggling to keep up with my teaching and other academic responsibilities, I was busy for months with speaking engagements: initially, corporations and government organizations concerned about risk and preventing catastrophic outcomes; then later, academic institutions. I heard nothing from NASA.

Writing the final pages of this book in 1995, I had emphasized that mistakes are inevitable in complex organizational systems, like NASA’s, due to the intricate connections between actions of powerful actors in an organization’s environment, their effect on the organization—its structure and culture—and the consequent effect on how decisions are made. In the last paragraph, I predicted another accident because I saw that NASA’s supportive postaccident environment was gone. Actions of White House, Congress, and NASA leaders had recreated the space agency’s environment of scarcity, reproducing production pressures and the very organizational conditions that set the Challenger accident in motion. But I had not fully imagined another accident and its immediate and historical import.

COLUMBIA AND THE CAIB

Seventeen years and 88 launches after Challenger, it happened again. At 9:00 a.m. on February 1, 2003, NASA’s Space Shuttle Columbia disintegrated over Texas during reentry into the earth’s atmosphere, its mission completed. Again, seven astronauts lost their lives. Like the rest of the country, I had gone on to other things. I again experienced shock and dismay, grieving more than before because I knew the organization and its people from my archival work, and I knew some NASA and contractor personnel well from personal interviews and continuing conversations during the years of my research. I understood the emotional cost of the first accident and the effort to rebuild the system. This second loss had to be of staggering impact. What had happened?

Information was soon forthcoming from NASA’s internal investigation. On the day of the accident, in a press conference, NASA revealed that cameras at the launch site had shown a large piece of foam insulation breaking off the shuttle’s external fuel tank and hitting the leading edge of the Orbiter’s wing. Unsure of the foam piece’s size and the amount of damage to the protective heat tiles, NASA speculated that the enormous heat during reentry into the earth’s atmosphere might have burned a hole in the wing and destroyed the Orbiter. Even as I watched the developing news story, I began receiving phone and e-mail inquiries from journalists about the accident.

The next day, the Space Shuttle Program Manager spoke at the daily press conference to update the public on the investigation. He carried with him a piece of foam about the size of a small suitcase, approximating the size of the one that hit the wing. He explained that NASA knew that foam debris had repeatedly hit the Orbiter at shuttle launches over the years. Each time, the protective heat tiles were damaged or lost, so they had to be replaced before the next launch. It was a maintenance problem, nothing serious. He dismissed it as a probable cause of Columbia’s demise, stating, "We were comfortable with it." Both his words and his matter-of-fact manner were stunning: a long problem history, repeatedly flying with flaws, accepting risk as routine—this looked like the normalization of deviance to me. Reporters who had covered Challenger and were familiar with my book also saw the parallels, got in touch, and wrote stories that suggested the similarities between Challenger and Columbia.

I began having conversations with a new generation of journalists, teaching Challenger and its sociological explanation. Soon, concepts from the book—organizational culture, missed signals, the normalization of deviance—appeared in print, without reference to their origin. The advent and widespread adoption of e-mail since Challenger made continuing conversations with a wider public possible. I was not solely teaching others; I was learning from them. I was getting breaking news before it broke from reporters, who consulted with me about the interpretation of what they were finding and stories in progress. New information also came from many of my e-mail correspondents: current and former NASA engineers and quality assurance personnel, scientists, space buffs, risk management professionals, readers of the book, and employees from all kinds of organizations, who, like those who wrote after Challenger, worried that "NASA is us."

NASA initiated an accident contingency plan, created after Challenger, establishing the Columbia Accident Investigation Board (CAIB). The contingency plan automatically designated that in the event of an accident, an investigation would be conducted by people who at that time occupied government positions that gave them expertise in accident investigation and safety. These people included scientists and military experts representing the Air Force, Department of Transportation, NASA Ames Research Center, Federal Aviation Administration, US Navy, and US Space Command. The CAIB mobilized quickly in the weeks after the accident, taking up residence near Johnson Space Center in Houston. NASA appointed Admiral Harold Gehman (retired), who was experienced in accident investigations and a known government critic, as CAIB Chair. In response to controversy about the Board members’ interdependence with government, Admiral Gehman and the Board added five members, among them Sally Ride, widely known as the first American woman in space and a member of the Presidential Commission that investigated Challenger.

The Board’s activities were fully covered by the media. Dividing up the work, its members were searching for the technical cause of the failure and why NASA had been flying for years with known flaws. These were the same questions raised in the investigation after Challenger, except this time the flaw was foam debris hits to the Orbiter. They soon uncovered another major similarity with Challenger: engineers’ concerns were not taken seriously by their superiors. NASA engineers saw the films of the debris hit and, worried about the unusual size of the foam, requested satellite photos of the still-orbiting shuttle in order to clarify more precisely where the foam had hit and the amount of damage to the Orbiter. But their requests were denied by the Director of the Mission Management Team, the unit responsible for managing postlaunch operations and the landing.

As these parallels between the two accidents continued to surface, I remained a regular contact for the press. The CAIB began televised public hearings in Houston. Near the end of March, I was invited to testify at a two-hour session on April 23. My presentation would be followed by questions from the Board, which could be a great opportunity to teach about how things go wrong in organizations. On the other hand, I had never been involved in a government investigation. Allowing as much time for panic as the three-week deadline for preparation permitted, I hurried to assemble and analyze the data I had gathered from reporters, newspapers, science/technology magazines, and accounts from engineers and scientists. My testimony would compare the causes of the Challenger and Columbia accidents. I would argue that the organizational causes of Challenger had never been fixed and had brought down Columbia. In CAIB press conferences, Admiral Gehman seemed to validate my findings and provided new evidence supporting them.

My Houston visit began at the CAIB offices the day before my testimony. Security measures included an escort for the day—a political science PhD on the research staff of one of the Board members, who took me to the 9:00 a.m. meeting of the Board and staff, led by Admiral Gehman. The daily stand-up briefing, a military ritual, was key to coordination. Each person updated the group on recent developments in their area of expertise. The Admiral (not in military dress) was a master at democratic practice, giving Board and staff comments equal attention. Sitting with staff in chairs around the long conference table, I was introduced to the group, which included military and civilian experts. After the briefing, my escort confided that the Admiral had read my book early in the investigation, saw the analogies with Challenger, and, persuaded of the sociological perspective, was using it to guide their analysis because it fit their data.

In the afternoon, I met with the two Board groups working on the organizational causes. Present were Major General Ken Hess, chief of safety, US Air Force, who headed the organizations group, and astronaut and physicist Dr. Sally Ride, in charge of the group that was examining both the history of decisions about the foam debris hits and the sequence of events and decision making by the Mission Management Team while Columbia was in orbit. In a sleeves-rolled-up working session, we discussed the significance and interpretation of some of their groups’ recent discoveries, including a comparison of similarities and differences between Columbia and Challenger, which both they and I had been tracking. The missing link was that so far they had no evidence about how production pressures might have figured into the postlaunch decision making by NASA’s Mission Management Team.

My next meeting was with the Board, where I would make an informal presentation before them, in advance of the formal one in the public arena the next day. Although I stuck to my prepared presentation, the setting and discussion were both informal, their questions out of curiosity and interest rather than adversarial critique. They clearly got the sociological analysis, so the questions were aimed at comparing my analysis with their own evidence and conclusions. But in addition, they were concerned about what the similar organizational causes for the accidents meant for preventing another accident because they would be responsible for policy recommendations. To demonstrate, I suggested examples of changes that would target the organizational causes they were finding. At the end, we all adjourned for a relaxed dinner at a small restaurant nearby. I returned to my hotel, working late to review my testimony, without adding any of the investigation’s secrets that had been disclosed to me.

The day had defied my expectations of an adversarial encounter, but the next day revived them. The public hearings were held in the ballroom of my hotel. I peeked in at the morning’s proceedings as I was on my way to the green room. Visually dramatic, the ballroom was dominated by a large commemorative representation of the Columbia mission logo, worn as a patch on the Columbia astronauts’ space suits and centered on a wall of navy velvet draperies. The spatial arrangement for the action announced an interrogation. On opposite sides of the room were two long tables covered with floor-length white cloths, distant and situated at angles to one another. At the left table were two witnesses; on the right, questioning them, were seven members of the CAIB, the Admiral and two others in full military dress. The questioning area was brilliantly lit for TV. In the shadows the audience sat in metal folding chairs, the first three rows packed with about forty journalists, the rest empty save for a few members of the public. On the side were technicians operating banks of equipment for lighting and sound. The hearings were being broadcast live on NASA TV and video streamed to media outlets across the country and on the Internet.

Hands suddenly cold and clammy, I arrived at the green room just as the morning session concluded. The Board members stood eating sandwiches while they relayed the events of the morning to me. When the lunch break ended, so did our collegial relationship: we lined up, walked through the velvet curtains, took our seats on opposite sides, and became part of the ongoing public spectacle. I made the same presentation I made the day before. Arguing that the social causes of Challenger had not been fixed and had caused Columbia, I explained that both accidents were the result of an organizational system failure. Then I laid out the parts of the system and the connections between them, beginning with Challenger.

Translating the book into lay language for the public, I nonetheless retained the key concepts—culture, structure, missed signals, the normalization of deviance—in an example-based demonstration of how that system worked, first with Challenger, then Columbia. For each in turn, I showed the connections between the parts of the system: how NASA’s institutional environment—historic political and budgetary decisions—affected NASA’s structure and culture, which in turn affected decisions about O-ring erosion and foam debris damage. I gave examples from my research to demonstrate the parallels but also pointed out the differences as I went along. I concluded by stressing that changing personnel and fixing the technology would not be enough; preventing another accident would require specifically connecting strategies for change with the organizational causes of the agency’s problems. The Board’s questions were more difficult than the ones they had asked the day before, but appropriate for NASA personnel watching the hearings. Exhibiting their painstakingly acquired understanding of how NASA operated, many questioners probed for possible solutions to NASA’s organizational problems.

The next day my testimony was summarized in major newspapers and distributed by wire services. In the concluding moments of my testimony, a Board member asked if NASA had contacted me after the publication of my book. I answered that they had never called, a minor point of my testimony that the press in attendance jumped on as a punch line for their stories. In a press conference, NASA Administrator Sean O’Keefe was queried repeatedly about my testimony. "Book sales must be up," he muttered. Back in Boston a few days later, I received a phone call from a NASA associate administrator. This was my first direct contact with anyone from NASA Headquarters. He was congenial but insistent: I was wrong. There were no similarities between the causes of the two accidents. Many changes had been made post-Challenger, and he knew because he was the administrator in charge of them. He described them in detail, reinforcing his point. I mainly listened, taken aback by the strength of his convictions. No similarities? How could he not see them now, after the fact?

CONSTRUCTING THE REPORT

My relationship with the CAIB continued in Boston, based on an agreement that I would give feedback on draft chapters addressing the organizational causes. When the Board and staff moved from Houston to Washington, DC, in June to continue the investigation and write the report, I was asked to join them for a few days. I continued my role as consultant on data interpretation and gave feedback on chapter drafts. Already threaded through the chapters were concepts from Challenger applied to Columbia: organizational structure and culture; weak, routine, and missed signals; the normalization of deviance. Including these organizational factors was a first for accident investigation reports. This was not the traditional human factors analysis that combined individual failure with technical failure. Instead, the CAIB report introduced an expanded causal model that gave equal prominence to the technical causes (part 1) and the social causes (part 2) of the accident. The part 2 editor, a doctoral student from the Kennedy School of Government who would play a crucial role in the structure and editing of the report, explained the sequence of events that led to this outcome.

While reading my book in February, Admiral Gehman became convinced that a large part of the report should address the social causes of the accident, even before the final five Board members were appointed. While making an outline together, he and the editor constructed and organized the part 2 chapters to conform to Challenger’s three-part causal model, which then was approved by the Board. In sequence, the part 2 chapters were NASA’s political-economic history (environment), its decision-making processes (individual meaning and choice), and its organizational structure and culture. Elated by this news, I was nonetheless worried that the chapter titles would be confusing and undermine the usefulness of the Board’s report. The Board chose titles to emphasize that the causal factors were not all of equal significance. Respectively, the titles were "Beyond the Proximate Cause" (history/environment), "The Accident’s Underlying Cause" (individual decision making), and "Factors that Contributed to the Loss" (organization). They believed the proximate cause—NASA’s historic political, economic, and social circumstances—was the least important; the underlying cause—the decision-making chapter—the most important. My concern was that readers, especially NASA and Congress, would not be able to decipher the meaning of these fine distinctions in causal importance. Moreover, distinguishing them as of lesser or greater causal importance would limit the Board’s policy recommendations, missing the crucial importance of external political factors to internal culture and structure, and thus decisions. For Challenger, the three were interrelated and each equally significant, as I felt the evidence convincingly showed they were with Columbia.

Keenly aware of my marginal status but concerned about the report’s reception and encouraged by the Admiral’s democratic process and by my boss, Ken Hess, I walked down the hall to the Admiral’s small office with a revised outline that gave the chapters substantive titles and equal responsibility in the causal sequence. The Admiral was interested and endorsed the new titles, but he disagreed about giving equal weight to the history chapter. "History is a scene-setter, not a cause," he said. I explained my stance, giving examples from both Columbia and Challenger data. He remained unconvinced but open to discussing it further. Encouraged, I suggested I write a short document that compared the two accidents, showing how historic conditions in NASA’s environment had indeed had a causal effect. It would be a writing experiment to show him how history worked on organizations, with no attendant obligation on his part to use it. He was interested. At the end of June, I retreated to Boston to write. Three weeks later, I sent a draft tentatively titled "History as Cause: Columbia and Challenger." The Admiral was so convinced that he accepted it as a chapter in the report, along with its implications for the Board’s expanded causal model and its policy recommendations.

When I returned to the CAIB, it was in a permanent capacity as a staff member. I had never imagined or sought this. My role as academic consultant had incrementally expanded to full-time participation in the ongoing creation of the report. Thus Ken had proposed putting me on salary for the rest of the summer until the report was complete. I was caught up in the ongoing mystery and collective mission, so I was happy to be part of the continuing excitement. The change was not without its complications for my public sociology, however. No longer an independent observer, I had moved into the domain of policy sociology. I had formally taken sides, a change from the independent stance of professional sociology I held while writing the Challenger book. Consequently, I cut back my contacts with the press, except for matters that did not compromise the Board’s ongoing work. I was told that as a scholar, I could speak on any issue, as long as I did not disclose privileged information or appear to be representing the Board. I recognized that my continued relationship with the press was a good thing, from the Board’s point of view, because I consistently reinforced their sociological message and could speak at times that they could not. However, I had been having running conversations with the various members of the press since February and felt torn between loyalties.

The report had to be published August 26 in order to be read by Congress before it reconvened after Labor Day. Every day the political significance of the investigation was salient in the CAIB offices. Members of the Board were meeting with Congress to protect certain interviews from release to them; the media were making every effort to find out what was going on; security was increased, and each week we were given new instructions to protect against leaks. Internally, the off-site staff investigators returned and were writing. I was integrating their new evidence into my chapter and integrating sociological themes into theirs. A breakthrough came when one of Sally Ride’s staff returned with solid evidence that meeting the next scheduled launch date had affected the Mission Management Team Director’s decision not to get satellite images of the orbiting shuttle. The Director was concerned that rerouting the Orbiter toward a satellite would take time, thus delaying the next shuttle launch, so on those grounds and the belief that the foam debris was a maintenance problem only, she decided against it.

On August 6, the Board met to review the penultimate chapter drafts of part 2, Why the Accident Occurred. To open the section, they composed an Organizational Cause Statement, unprecedented in accident investigation history. A rigorous review of draft chapters followed, including mine. As time grew shorter, the work days grew longer. On August 17, the report had to go to the printer. Disagreements about the final organization of part 2 were still ongoing. On August 21, an exhausted but excited group of editors and staff flew with the completed manuscript in a NASA jet to Seattle, where the initial 500 copies of the report were printed and packaged and returned with the group to Washington, DC, on August 25, the day before the established date of release. The Sunday New York Times ran an article featuring the Admiral, who announced that the report would give equal weight to the technical causes and the social causes of the accident, simultaneously revealing me as the author of chapter 8, "History as Cause: Columbia and Challenger."

The next day at 12:30 p.m., the Board held a press conference that was broadcast live on all major networks, NASA TV, and the Internet. The Admiral stated the Board’s findings: the Columbia disaster, like Challenger, was caused by a failure of NASA’s organizational system. He stressed that unless the Board’s recommendations for technical, institutional, organizational, and cultural changes were implemented, an organizational system failure was likely to repeat. The next day, the language of sociology was embedded in the news. Space Center city papers published entire sections about the social causes; the New York Times excerpted the part 2 chapters. The Board next explained their expanded causal model to the US House Committee on Science. In his prepared remarks, the Admiral read the report’s Organizational Cause statement, saying, "We believe that these factors are just as much to blame as the foam." Next the Board briefed a joint session of the House and Senate. Then the members dispersed to national and international forums, explaining their expanded causal model in response to requests that had been made far in advance.

I, too, traveled to convey the Board’s message, but to an unexpected audience. I was invited to NASA, courtesy of the press. On the afternoon of the report release, NASA Administrator Sean O’Keefe held a press conference in which he announced that NASA would fully comply with the Board’s recommendations for change, which targeted both the technical and organizational causes of the accident. Because they had not had a chance to study the report yet, he limited his remarks, saying that he would have more to say later. Then he asked for questions. The first question was, "Have you read Diane Vaughan’s book yet?" O’Keefe responded, "Yes, we’re all reading it. Some people from here have contacted her, the first several months ago." Apparently he was referring to the associate administrator who phoned me in April. An Associated Press reporter, who interviewed me in Houston after my testimony, called to verify. When she asked if I had heard from NASA, I told her of my single contact with the associate administrator. She followed up by calling him. He agreed about the content of our April conversation and told her he had been wrong. Within hours NASA called, inviting me to dinner at headquarters to talk with top officials. Sticking relentlessly to her target, the reporter headed her AP wire story "NASA Finally Looks to Sociologist."

NASA REVISITED

The dinner was at NASA Headquarters in Washington. Meeting me at the elevator was the very associate administrator who had phoned me in April. Welcoming me warmly, he immediately began talking about how he had been wrong and why he thought as he did at the time. Words came tumbling from him. He was so apologetic and eager to convey the wrongness of his logic that I was convinced he was sincere. We walked to the recently renamed Columbia Café. A small informal dining room, it had been renamed in honor of the Columbia crew and its mission. Mounted on the walls were historic photos of other failed missions and the astronauts who had died attempting them. It was hard to imagine it as a place for a festive occasion, but the space was a fitting backdrop for the seriousness of the conversation that was to unfold that day.

Six round tables, each seating four, were waiting. I met people whose names were familiar from the investigation; a few talked to me about their history with either one or both shuttle accidents. I was assigned to a table with two former astronauts, one now NASA’s chief scientist, the other director of safety. The empty seat was reserved for Sean O’Keefe, who had been delayed at the White House. I was already seated when a Board member whom I did not know well showed up, invited because he had missed the Board’s briefing of the report at NASA Headquarters on the day of its release. He carried prepared comments. I noted that he had been asked to speak in advance and I had not, but I was seated at O’Keefe’s table (places marked by place cards) and he was not. I wondered if he was confused about this arrangement that contradicted the Board hierarchy; I was not. I had been elevated in status by virtue of press leverage. O’Keefe arrived near the end of dinner. Remaining seated, he welcomed us both, following with spontaneous and moving comments on the impact of the accident on NASA, both on individuals and the organization. Explaining that the Columbia Café was created as a reminder of past failures, he vowed that NASA would never repeat them.

Then the Board member was invited to speak. He stood, following his notes but speaking freely and, as O’Keefe had, from the heart. His comments were in keeping with his history: a technically skilled industry leader responsible for development of heavy equipment for nuclear submarines and aircraft carriers who had served on major panels evaluating NASA’s International Space Station and the Hubble Telescope. What he was expressing was anger. He harshly condemned NASA for perpetuating the problems that had caused Challenger and for what the Board had called a "broken safety culture." O’Keefe responded with an equally strong statement about how they were already processing the report and considering how to implement it, pushing back with "Let us get up off the mat!" Then the group began a discussion of the CAIB recommendations for change that would alter the social causes of the accident. O’Keefe whispered to me that I could comment as well but should not feel obligated, because I had not been asked in advance to prepare comments.

What struck me about the conversations before, during, and after dinner was that NASA didn’t know how to implement the Board’s recommendations for structural and cultural change, especially the latter. First, these were all scientists and engineers, and culture is a fuzzy concept to them. Second, they thought they had a solid safety culture. After all, many successful missions had been launched between Challenger and Columbia, and the safety culture had apparently worked well for them. Now there is an accident and suddenly the safety culture is broken? How were they to know whether a change would undo something that had been effective, or whether the change would create new problems? These were good, logical questions. Moreover, the practical implications were overwhelming for them, as they would be for most executives facing a major change mandated by outsiders. In addition, they were still grieving for the loss of the crew, the mission, and the public revelations that, once again, they had been fatally wrong. And finally, production pressure was alive and well, this time imposed by the CAIB’s report: they could not launch again until both the technical and organizational changes had been successfully implemented.

When it was my turn to speak, I assured them that only they knew NASA well enough to understand how it worked, so the best I could do would be to give examples, based on the report chapters. I began explaining the principles behind the Board’s findings, identifying necessary organizational changes indicated by each of the social cause chapters and giving examples that targeted the causes revealed in each. With frustration, they talked about the difficulty of implementing the Board’s most important structural change within the existing organizational structure. The Board’s goal was to prevent operations mandates from eroding the safety culture. The recommendation was to separate operations (and thus schedule, economy, and production) from engineering decisions and safety responsibilities. As it stood, the bean counters on the operations side of the house were responsible for final budget decisions for the engineering side of the house. The CAIB recommendation was to separate the two structures and the budget function, giving engineering its own budget with an engineer having the final say-so.

Although this change would produce the cultural effect the Board endorsed, NASA personnel were going to have to reorganize many other parts of their complex organizational system to do it. Considering the cost and complexity of such a change, two administrators asked why it was not sufficient just to change the personnel who were responsible for flawed decisions. These two administrators were struggling with the very problems that confront most organizations when it comes to change: thinking in terms of individual causes rather than organizational system causes, and grappling with how to change a complex system in which change in one part has consequences for another. Recognizing that the most my dinner comments could accomplish would be to get them thinking in that moment in organizational terms, I later typed my comments for distribution with a reading list of key texts on organization theory, risk, and disaster. Reflecting my changing course, the reporters who just two weeks earlier had populated my incoming e-mail were replaced by NASA personnel.

In October, I was invited to the annual NASA 40 Top Leaders Conference. The leaders were the key personnel from all NASA Centers. Neither they nor the NASA administrators attending were known to me. After dinner, a NASA administrator spoke. He advised everyone to read the report carefully because not everything in it was true, and said that NASA lawyers were studying it to decide what it would mean to fully comply as opposed to legally comply with the Board’s recommendations. He followed with a major pep talk. He began by quoting from widely read business books about motivation and change, moved on to some exciting NASA scientific innovations underway, and then discussed the progress toward meeting the technical changes required by the CAIB in order to meet the criteria and deadline for the event now known as Return to Flight.

Contradicting O’Keefe’s public statement on full compliance, his talk pointed toward cultural persistence, not change, on the part of the agency leadership. His reference to business texts was troubling: one of the changes in NASA at the inception of the Shuttle Program was the conversion of the original pure technical culture to one that operated more like a business, with production pressures shaping technical concerns. Moreover, at the same time that he was conveying that NASA had an exciting future, he also reinforced that they were in a hurry to qualify for the Return to Flight in order to get there.

I spoke next, beginning with an analysis of the causes of Challenger because some of those present had not been with NASA in 1986 or during the years of rebuilding that followed. Making my point about the importance of connecting strategies for control with the organizational causes of a problem, I explained how the Presidential Commission’s recommendations failed to address the full range of the social causes, and how, by giving central attention to individual management failures, the Commission had inadvertently laid the groundwork for the second accident. Following with a brief summary of the causes of Columbia, I repeated my points about changing organizational systems that I had presented at headquarters the month before. An engaged exchange extended the session two hours longer than planned, stopping only when it ran into the scheduled dinner. Quite a few people brought copies of the book with them. They were caught up in the possibility that the normalization of deviance could happen again, proceeding incrementally as before, unnoticed by them, and leading to another accident.

We discussed how to attend to the problem of weak, mixed, and routine signals that had contributed to the normalization of deviance in both accidents. However, I stressed that the problem wasn’t going to be solved just by creating better systems and people to detect anomalies and clarify signals. Those changes were crucial, but protecting against the normalization of deviance also called for changes at the institutional and organizational levels. These more challenging changes had to originate in actions from White House, Congress, and NASA leadership. One person, struggling with the culture connection, said, "But we had a strong safety culture that management supported. There were signs everywhere to remind us of the importance of safety." I responded that posting signs reminding people of safety could not compete with the signals sent by NASA administrators that meeting the schedule was a priority. I suggested the importance of internal research on culture, guided by outsiders who brought to their analysis a stranger’s point of view—organizational behavior specialists, sociologists, and cultural analysts—in order to monitor how culture operates instead of waiting until something bad happens and then trying to understand its contribution in retrospect.

As dinner concluded that night, I was asked to meet with a small group to talk further. Six people were there. They wanted to see what I thought about various ideas they had for change. With background training in science and engineering and NASA experience, they knew the agency well. I was struck by their quick grasp of organizational analysis. Not only had they absorbed the idea of connecting strategies for change with the social causes of NASA’s problems, but they could apply them based on knowledge and experience that only insiders have. For example, one person in the group, a former astronaut, brought up the crucial contribution to the tragic outcome made by the Mission Management Team Director, whose job was to supervise the mission after launch through its landing. The Mission Management Team was designed as a decentralized operation, so in an emergency, information and input would come from all relevant parties, regardless of rank.

However, the Mission Management Team Director had operated hierarchically, ignoring the input of engineers and listening instead to a single person at a higher level. She also disregarded engineers’ independent requests to the Department of Defense for imagery because the requests had not gone through normal bureaucratic channels. The former astronaut saw this as an organizational problem, not a matter of individual personality. He observed that the position preceding the Mission Management Team directorship in the career track was Launch Director, where decision-making power was highly centralized. His proposed solution was that before assuming the Mission Management Team position, all Launch Directors undergo extensive retraining in decentralized decision making, to be repeated periodically once they were in the position.

The next day, the Space Shuttle Program representatives and those from the International Space Station Program met in separate sessions to discuss their ideas for change. I went with the Space Shuttle people. In this less public setting, they openly expressed their frustration and anger with the administrators above them. After opening the session, the Space Shuttle Program Manager leading the discussion protested that their experience was identical to that of NASA engineers at the time of Challenger and again for Columbia: the Shuttle Program leaders were repeatedly stifled by the refusal of those above them in the hierarchy to take seriously engineering concerns about safety.

Speaking about budget requests for hardware, he said, "We can’t just make a request, justify it with evidence, and receive it. We have to go back again and again and justify, justify, justify." This was exactly the cultural problem originating from production needs driving engineering decisions that the CAIB was trying to solve with its structural solution. At meals and breaks throughout the day, I met people dedicated to doing everything they could to prevent a third accident. I saw that they had caught on to making change guided by organizational principles. However, I also saw what they were up against. My two days there showed the reproduction of the social causes of Challenger and Columbia, even at this key transitional moment when the agency was trying to change.

Although O’Keefe was publicly adamant about complying with the CAIB recommendations, the administrator who spoke the night before conveyed a different message to insiders. One of the participants captured the contradiction when she said, "Am I supposed to tell my people that we are going to comply, or that we are going to legally comply?" The CAIB report recommendations stated clearly that leaders—NASA, the White House, Congress—make choices that create culture; those same leaders are responsible for changing it. However, there were many indications that NASA leaders were taking actions that would perpetuate the institutional conditions that contributed to the two accidents, even as their forty top leaders were embracing change, seeking to alter the inner workings of the agency to promote better decision making and safety.

Before the conference ended, someone from headquarters explained my invitation by saying, "We thought you were a NASA critic, so at the headquarters dinner we didn’t know what to expect, but we saw that you were a social analyst." He asked me if I would help them. I was intrigued by the opportunity to finally study NASA from the inside, but the events of the weekend left me skeptical about the success of my efforts. I responded that I would not, due to a conflict of interest. It appeared that the CAIB would participate in evaluating NASA changes prior to the Return to Flight, and if so, I would be involved. Privately, I wondered if my presence at NASA would be for symbolic reasons only, and foresaw negative consequences for my work and for myself if NASA top administrators decided to legally comply as opposed to fully comply.

LESSONS LEARNED

I went to NASA Headquarters and to the leadership conference believing these visits were a unique opportunity to teach organizational principles to a new audience, and I came away sure that I had learned more from NASA than I had taught. For the first time, I was able to witness the obstacles to changing an organization as its people were confronting them. Every lesson was crucial to my working knowledge of how organizations make change (or not), the challenges they face, and the limitations and possibilities of our efforts. First, external factors and powerful actors continued to drive the decision making of NASA top administrators, thus perpetuating the institutional cultural beliefs and organizational mandates that contributed to the normalization of deviance for both accidents. Second, I saw how changing organizational structure in order to change the culture had the unintended consequence of changing related structures, creating a new system that was not going to operate smoothly in the beginning, and perhaps disrupting parts of the organization that had been operating well.

Third, when instructed by official investigators to make mandated change, NASA leaders took the moment to make many changes they saw as important, in addition to those mandated. This happened post-Challenger (as I learned in my April phone conversation with a NASA administrator) and recurred post-Columbia at NASA Headquarters and at the conference. When implemented, many factors would be changing at once, their ramifications affecting many parts of the system, thus making it hard to evaluate the effect of the mandated changes independently of the others. Moreover, the changes were going to make the system more complex, and as research shows, increasing system complexity increases the possibility of failure. Changing culture was perhaps the greatest challenge for them because culture is invisible, embedded in organizational structure and hierarchy, rules, routines, and informal relations, and is enacted in taken-for-granted ways in everyday work.

But not all lessons were so pessimistic and dark. I was excited by the potential for change that showed in the insights and anger voiced by the forty top leaders. Not only did they see how their own positions were affected by the same social factors that caused the accidents (e.g., the lack of deference to technical expertise), but also they were able to use organizational principles to spot ways to change the organization so that crucial decision-making errors would not repeat. Witnessing them in action, I saw what public sociology and teaching other audiences can accomplish. Equally important, I saw our own limitations more clearly: as organization specialists, we can pass on organizational principles and teach how to implement change, but we do not know the organization as insiders do. Those who know it intimately can see applications that we, as outsiders, cannot see.

Clearly, organization specialists have an important role to play. We offer a set of tools—an organizational paradigm, principles, theories, and concepts—that take people beyond the traditional individual failure model, instead directing them toward the organizational sources of their problems. In particular, our concepts become buzzwords that resonate with people’s own experiences, helping them make sense of the world. I learned this early on, when organizational concepts from this book—structural secrecy; culture; mixed, weak, and routine signals; the normalization of deviance; organizational system failure—traveled from the book to other audiences, ultimately reaching people in other organizations, who recognized their situation and got in touch. It may be that the only measurable impact of my public sociology was on the Board and staff of the CAIB, as they returned to teaching, research, and accident investigation with a new awareness of the social. But I believe it did not stop there, based on my conversations with journalists, many NASA personnel, and e-mail correspondents who saw the analogies, got the concepts and theory, and took them further, applying them to other professionally and personally relevant situations.

The passing years have shown that the normalization of deviance is not a problem specific to NASA. In the aftermath of some harmful outcome, journalists, researchers, and official investigations have revealed the identical pattern: a long incubation period with early warning signs that were either misinterpreted, ignored, or missed completely. These missed signals didn’t stand alone, however. They were joined with other factors implicated in NASA’s accidents: competition for scarce resources, structural secrecy, regulatory failure, politics, organizational and technological complexity, hierarchy, reduced staffing, cultural beliefs, etc. Nationally, the normalization of deviance has been identified with organizational system failures resulting in harmful outcomes in policing, health care, schools, prisons, foster care, and social work.

Moreover, we have seen it in historic organizational system failures that resulted in massive social and human costs. The September 11 Commission report examined the agencies responsible for national security, finding that early warnings they received about terrorist attacks on this country were ignored because of structural secrecy, politics, and a cultural belief, based on the past, that hijackings were not dangerous and were likely to occur overseas, not here. In 2003, the torture and abuse of prisoners at Abu Ghraib during the Iraq War had become normal and taken for granted by prison guards, but support for the abuse was traced to the upper echelons of the US government, which had authorized so-called enhanced interrogation methods. Unsurprisingly, torture and abuse were found to be normal in many US military prisons in the Middle East. Finally, analyses of the 2008 financial crisis centered on complex investment technologies that were innovative, deviant, and not well understood, but that were widely adopted in the industry because of competitive pressures and so became normal, acceptable risks. Aided and abetted by individual secrecy, structural secrecy, and technological and organizational complexity, the regulatory failure was nearly total. With these risks thus normalized, no one in the industry saw the crisis and the failure of banking institutions coming.

In the conclusion of this book, I wrote that change connecting strategies of control to the organizational causes of some harmful outcome can reduce the probability that it will recur, but that we can never prevent a repeat because of the recalcitrance of systemic causal factors. Nonetheless, reducing the probability of a recurrence is an important goal. Organization specialists can help. We understand the effects of complex structures. On the face of it, the problem of the normalization of deviance may appear easy to resolve. Adding extra detection devices that identify anomalies, oversight by disinterested specialists, classification systems that refine existing categories to separate the more serious from the less serious risks more clearly—all these are important and essential, but these and external regulatory efforts have limited effectiveness, as this book shows. The key problem to resolve is the interpretation of information and how the larger organization’s cultural imperatives shape our interpretive abilities. Research can help people in organizations stay in touch with their organizational culture, because it affects workers at all levels. We can never totally resolve the problem of complexity, but we have to be sensitive to our organizations and how they work. While many of us work in complex organizations, we don’t realize how much the organizations we inhabit completely inhabit us. This is as true for the powerful actors at the top of the organization, responsible for creating culture, as it is for the people in the tiers below them who carry out their directives and do the everyday work.

New York City

March 31, 2015

Preface

Some events are experienced by great numbers of people, diverse in interest, age, race, ethnicity, lifestyle and life chances, gender, language, and place, who temporarily become bound together by a historic moment. The January 28, 1986, Space Shuttle Challenger disaster was such a moment. Collectively, the country grieved, and not for the first time. Many still vividly remember—and will quickly confess, when the subject comes up—exactly where they were, what they were doing, and how they felt when they heard about the tragedy. The initial shock was perpetuated by the television replays of the Challenger’s final seconds, by the anguished faces of the astronauts’ families and other onlookers huddled in disbelief on bleachers at the launch site, by the news analyses, and then by the official investigation of the Presidential Commission.

Primarily, the disaster is remembered as a technical failure. The fault lay in the rubberlike O-rings. The primary O-ring and its backup, the secondary O-ring, were designed to seal a tiny gap created by pressure at ignition in the joints of the Solid Rocket Boosters. However, O-ring resiliency was impaired by the unprecedented cold temperature that prevailed the morning of the launch. Upon ignition, hot propellant gases impinged on the O-rings, creating a flame that penetrated first the aft field joint of the right Solid Rocket Booster, then the External Tank containing liquid hydrogen and oxygen. The image retained by the American public is a billowing cloud of smoke, from which emerged two white fingers tracing the diverging paths of the Solid Rocket Boosters across the sky.

Technology was not the only culprit, however. The NASA organization was implicated. The Presidential Commission created to investigate the disaster revealed that the O-ring problem had a well-documented history at the space agency. Earliest documentation appeared in 1977—nearly four years before the first shuttle flight in 1981. Moreover, the Commission learned of a midnight-hour teleconference on the eve of the Challenger launch between NASA and Morton Thiokol in Utah, the contractor responsible for building the Solid Rocket Boosters. Worried Thiokol engineers argued against launching on the grounds that the O-rings were a threat to flight safety. NASA managers decided to proceed.

Given this history of O-ring problems and a protest by engineers, why did NASA officials go forward with the launch? As the Commission, the House Committee on Science and Technology, whistle-blowers, and journalists probed and published accounts of the events at NASA leading up to the tragedy, they created a documentary record that became the basis for the historically accepted explanation of this historic event: production pressures and managerial wrongdoing. Published accounts agreed unanimously that NASA had been experiencing economic strain since the inception of the Space Shuttle Program. Economic difficulties had focused the attention of NASA decision makers on the launch schedule, which became the means to scarce resources for the space agency. The die was cast for managerial wrongdoing as the explanation of the Challenger launch when the Presidential Commission identified a flawed decision-making process at NASA. The Commission found that NASA middle managers had routinely violated safety rules requiring that information about O-ring problems be passed up the launch decision chain to top technical decision makers, with the result that critical information was not forwarded up the hierarchy.

But a piece of the puzzle has always been missing. The Commission’s discovery explained why top administrators failed to act, thereby publicly exonerating them. The more significant question, however, was why the NASA managers, who not only had all the information on the eve of the launch but also were warned against it, decided to proceed. This question was never directly asked and therefore never answered. In this vacuum, the conventional explanation was born and thrived: economic strain on the organization together with safety rule violations suggested that production pressures caused managers to suppress information about O-ring hazards, knowingly violating safety regulations in order to stick to the launch schedule.

No one has forgotten the astronauts, the incident, or the shape of those billowing clouds that recorded the final seconds of the Challenger’s flight. Nonetheless, the loss of the Challenger has receded into history, as the unfolding present, urgently demanding attention, replaces the past. The details of the investigations, the flaws in the NASA organization that the Presidential Commission and the House Committee illuminated, and the questions about the NASA managers who, informed of the O-ring problem, decided to proceed with the launch have dimmed. Yet for the generations who witnessed it, the tragedy is remembered as a technical failure to which the NASA organization contributed. The details may be forgotten, but for many, the idea of wrongdoing by NASA middle managers lingers.

This book contradicts conventional interpretations of the Challenger launch decision. It presents both a revisionist history and a sociological explanation. To achieve both these ends, the core of the book is bracketed by two versions of the eve-of-launch teleconference. Chapter 1 begins with a rendering that is drawn from—and typifies—conventional accounts, followed by a tracing of the discoveries that cast managerial wrongdoing as the historically accepted explanation. Then in Chapter 8, the Chapter 1 account is repeated, juxtaposed against another, more detailed telling that contradicts many

You have reached the end of this preview.

Reviews

What others think of The Challenger Launch Decision

3.5
4 ratings / 2 reviews

Reader reviews

  • (5/5)
    Institutions Create and Condone Risk
    The Space Shuttle Challenger exploded on January 28, 1986. To millions of viewers, it is a moment they will never forget. Official inquiries into the accident placed the blame with a “frozen, brittle O-ring.” In this book, Diane Vaughan, a Boston College professor of sociology, does not stop there. In what I think is a brilliant piece of research, she traces the threads of the disaster’s roots to the fabric of NASA’s institutional life and culture. NASA saw itself competing for scarce resources. This fostered a culture that accepted risk-taking and corner-cutting as norms that shaped decision-making. Small, seemingly harmless modifications to technical and procedural standards propelled the space agency toward the disaster. No specific rules were broken, yet well-intentioned people produced great harm. Vaughan often resorts to an academic writing style, yet there is no confusion about her conclusion. “The explanation of the Challenger launch is a story of how people who worked together developed patterns that blinded them to the consequences of their actions,” wrote Dr. Vaughan. “It is not only about the development of norms but about the incremental expansion of normative boundaries: how small changes--new behaviors that were slight deviations from the normal course of events--gradually became the norm, providing a basis for accepting additional deviance. No rules were violated; there was no intent to do harm. Yet harm was done. Astronauts died.” For project and risk managers, this book offers a rare warning of the hazards of working in structured and institutionalized environments.
  • (5/5)
    This is the definitive book about the Challenger disaster. It's long and involved, but it will change your mind about what caused the disaster.