any other thing that is intrinsically unpredictable or stochastic—is not necessarily risky.
Risk can result from careless management decisions even in informed, certain situations. Risk can be technological, political, or financial. Macauley noted that in his presentation the previous day, Steidle had commented that sustaining a long-term program through successive presidential administrations, Congresses, and budget cycles might be the greatest risk facing human space exploration.
Macauley also noted that neither uncertainty nor risk is necessarily bad and that to some degree, both can be managed. In some cases, uncertainty can be reduced through additional research and development, as we learn by doing. Risk can be managed in a variety of ways, including private insurance for activities that take place in the private sector. In the case of government programs, however, the government typically self-insures, so the public bears the risk.
Macauley observed that risks that are voluntarily undertaken are usually perceived differently from risks that are involuntary. Low-probability, high-consequence risk is usually perceived differently from high-probability, low-consequence risk. Loss of life is typically seen as a greater risk than loss of property, even if the property loss involves billions of dollars.
Mazur commented that deaths associated with spaceflight, if they are highly publicized, become symbolic and therefore have a much greater effect on public policy than a body count from a probabilistic risk assessment. He said that deaths of aerospace industry workers, even of astronauts, have little impact on public sentiment if they occur outside the public eye. But even one highly visible death in flight can greatly affect a program, causing long delays, cancellation, or even an increase in funding if that seems to offer a solution. He stated that probabilistic risk assessment goes out the window when astronaut deaths make the headlines. Mazur noted that there were many more news stories covering the Challenger and Columbia accidents than the Apollo fire. As a result, NASA was able to investigate the Apollo accident in-house and quickly resume the program. Because of increased media attention and the attendant public anguish, independent commissions were set up to investigate the two shuttle accidents, and these produced far longer delays in the program and severe criticism of the agency.
He pointed out that a sociological model of accident events differs from an engineering model and reveals aspects of the accidents that influence media coverage and public perception. According to Mazur, the engineering model focuses on the proximate causes of a disaster (e.g., frozen O-rings, broken foam) and their precursors (e.g., rushed launch schedules, NASA’s culture of risk). If engineers and risk analysts consider media coverage at all, they treat it peripherally, focusing on matters such as the fairness and accuracy of the reporting. A sociological model treats an accident as a social event, the public reaction to which is greatly affected by the quantity and tone of news coverage. Just as the physical accident has precursors, journalistic coverage is also shaped by important factors that precede or accompany the accident.
That a teacher was on Challenger, and that the accident itself was captured by the camera—the photo of the explosion is now iconic—reinforced the news coverage and contributed directly to President Reagan’s decision to form an investigative committee independent of NASA. (Mazur suggested how different the accident would have been, as a social event, had it occurred 1 minute later, out of camera range.) Because the Columbia accident came after September 11, 2001, and had an Israeli astronaut on board