Columbia disaster board faults NASA communications, culture

The report cites a 'cultural fence' that impaired the flow of information

WASHINGTON -- Human space flight is an inherently risky and dangerous business, but NASA's reliance on informal communications to manage space shuttle operations -- coupled with the agency's insular culture -- turned risk and danger into disaster for the Columbia.

That's the conclusion of a long-awaited final report by the Columbia Accident Investigation Board (CAIB) released today. The board, established shortly after the Columbia disintegrated during re-entry on Feb. 1 and chaired by Hal Gehman, a retired Navy admiral, concluded that "deficiencies in communication ... were a foundation for the Columbia accident." It further found that NASA's overconfident and inflexible culture, as well as management failures, "had as much to do with this accident as the foam."

The report paints a picture of a massive bureaucracy that relied on informal e-mail communications to manage the in-flight analysis of the damage caused to Columbia's left wing by a piece of insulating foam that broke loose during liftoff. This led to a series of discussions that took place in a vacuum, with little or no cross-organizational communication and often no feedback from senior managers contacted by low-level engineers with concerns about the shuttle's safety, according to the report.

"There are two steps to changing the culture," said Gehman. "It takes both reorganization and leadership. All levels of leadership are going to have to look for these [bad] traits, such as stifling communication and stomping on engineers, and they're going to have to drive it out."

"Over time, a pattern of ineffective communication has resulted, leaving risks improperly defined, problems unreported and concerns unexpressed," the report said. "The question is, why?"

In its attempt to answer that question, the board discovered that the corporate culture surrounding the space shuttle program is too insular, that there are deficiencies in problem and waiver-tracking systems, and that the exchange of communication across the agency hierarchy is limited.

A major element in NASA's management and decision-making failures was its inability to integrate critical safety information and analysis, the report said. "The agency's lack of a centralized clearinghouse for integration and safety further hindered safe operations. In the Board's opinion, the Shuttle Integration and Shuttle Safety, Reliability, and Quality Assurance Offices do not fully integrate information on behalf of the Shuttle Program."

And while NASA does have an automated system in place to track so-called critical items related to safety, it's "extremely cumbersome and difficult to use at any level," the report said. As a result, the system, which contains a list of more than 5,000 "critical items" and more than 3,200 safety "waivers," often goes unused.

Columbia Accident Investigation Board Chairman Hal Gehman

Credit: CAIB Photo by Rick Stiles 2003

"The Lessons Learned Information System database is a much simpler system to use, and it can assist with hazard identification and risk assessment," the board concluded. "However, personnel familiar with the Lessons Learned Information System indicate that design engineers and mission assurance personnel use it only on an ad hoc basis, thereby limiting its utility."

The board also made clear that it isn't the first commission to note such deficiencies. Numerous reports, including a General Accounting Office report published in 2001, highlighted "fundamental weaknesses in the collection and sharing of lessons learned" by program and project managers. That GAO report also found that "the existing workforce was stretched thin to the point where many areas critical to shuttle safety, such as mechanical engineering, computer systems and software assurance engineering, were not sufficiently staffed by qualified workers."

The CAIB report also questions whether a more efficient and interactive form of communications and information-sharing would have made a difference, given NASA's dysfunctional corporate culture.

Between Jan. 27 and Jan. 31, "phone and e-mail exchanges, primarily between NASA engineers at Langley and Johnson, illustrate another symptom of the cultural fence that impairs open communications between mission managers and working engineers," according to the report. "These exchanges and the reaction to them indicated that during the evaluation of a mission contingency, the Mission Management Team failed to disseminate information to all system and technology experts who could be consulted. These engineers -- who understood their systems and related technology -- saw the potential for a problem on landing and ran it down in case the unthinkable occurred. But their concerns never reached the managers on the Mission Management Team that had operational control over Columbia."

Responding to questions about the board's findings that NASA has routinely placed cost and schedule concerns ahead of safety, Air Force Major Gen. John Barry, director of Plans and Programs at the Air Force Materiel Command and a CAIB member, said it's imperative that NASA do what other organizations in the military and in the private sector have done: separate systems requirements functions and engineering from the operations managers.

"If the program is competing [against] cost and schedule, and they still own the requirements and waiver authority [for potential safety violations], you will sometimes find that you will compromise the waiver and the safety for cost and schedule," said Barry. "By separating [those functions], you put a check and balance in the system."

Copyright © 2003 IDG Communications, Inc.
