News Release

'Hindsight bias' could hide real lessons of Columbia accident report, expert says

Peer-Reviewed Publication

Ohio State University

COLUMBUS, Ohio – A psychological effect known as "hindsight bias" might cause people to misinterpret the conclusions of the Columbia Accident Investigation Board (CAIB), according to an Ohio State University researcher who helped the board during its probe.

The report makes important recommendations that could help not only NASA but also other organizations, according to David Woods, professor in the Institute for Ergonomics and co-director of the Cognitive Systems Engineering Lab at Ohio State. Woods is also a professor in the Department of Industrial, Welding, and Systems Engineering.

Woods provided technical input on decision-making, organizational factors, and hindsight bias to the CAIB during its investigation. His research on the factors that contribute to human error is referenced in Chapter 7 of the report.

Woods gave an example of the dangers of hindsight bias, citing a passage from the CAIB report. "Often the first question people ask about the decision-making leading up to the Columbia accident is, 'why did NASA continue flying the Shuttle with a known problem…?'" he said.

The "known problem" refers to the dangers of debris damaging the Shuttle wing during takeoff, which the board has identified as the physical cause of the accident.

But as soon as the question is posed this way, readers risk oversimplifying the situation and discounting the uncertainties people faced before the outcome was known, Woods said.

"Because it is difficult for readers to disregard what is commonly called '20/20 hindsight,' they can misinterpret the report and label NASA as a bad organization. As a result, the same difficulties that led to the Columbia accident could go unrecognized in other organizations, too," he added.

Just as the CAIB worked hard to overcome hindsight bias and uncover the organizational factors behind the accident, Woods believes readers of the report will need to overcome their own hindsight bias.

The accident's lessons apply to any organization that must balance safety risks against pressure for efficiency. The key difficulty, according to Woods, is this: the time when organizations most need to uncover safety risks is precisely when they can least afford to divert resources to do so -- when they are under heavy production pressure.

In the case of NASA, the organization is pressured to fly the Shuttle on a certain schedule.

Woods noted that helping organizations maintain high levels of safety despite production pressure is the focus of a newly emerging research area known as resilience engineering.

"We can't change the past, despite the tragedy. But the future is open to us. Will the next accident report again describe how safety defenses eroded over time in the face of production pressure? If we recognize that the CAIB's analysis applies to all organizations, not just NASA, we can learn how to balance acute pressures for efficiency with a chronic need for high safety so that this pattern doesn't recur," he said.

With nearly 25 years of experience diagnosing the factors behind human error, Woods has conducted extensive research on how people interact with computers to make decisions in high-risk environments. His work has won awards for improving the safety of automated cockpits. In addition, Woods is a member of the National Research Council committee that is helping NASA and the Federal Aviation Administration plan the country's next-generation air transportation systems.

###

