
Hazards: Technology and Fairness (1986)

Chapter: LESSONS FOR INDUSTRY

Utility officials did not publicly declare an emergency until 2 hours and 50 minutes into the accident. About 10 minutes later, local and state authorities were notified. The utility tried to contact the Nuclear Regulatory Commission (NRC) 20 minutes later at 7:10 a.m., but the NRC switchboard did not open until 7:45 a.m., at which time the NRC regional office was made aware of what was happening and established an open telephone line to the control room. By 8:00 a.m. the operations center at NRC headquarters was in operation (U.S. Nuclear Regulatory Commission, 1980a, p. 15).

Reviews of the accident indicate that when the block valve was closed, the loss of water from the system stopped, and the water level in the core began to rise slowly over the next half hour. During the first few minutes after the coolant pump had been turned on, some additional water was forced into the core. Although a few feet of core remained uncovered, most of the core heating probably was over at that time and the core significantly quenched (U.S. Nuclear Regulatory Commission, 1980b, Vol. 2, part 2, p. 507). The cleanup has already lasted for more than six years and will eventually cost more than a billion dollars.

The accident at Three Mile Island was caused by a combination of hard-to-handle machinery, poorly trained or incompetent operators, and a regulatory process that lulled management into neglecting its own responsibilities. Congress, the public, the regulators, and the nuclear power industry reacted. Some overreacted, but there were also positive actions. The impact of the accident seemed to have a decay time constant—that is, the time required for the impact to diminish by half—of about two years. This period is probably related to the decay time of about one year for congressional and press interest.

LESSONS FOR INDUSTRY

The following lessons from Three Mile Island are especially pertinent for industry:

1. Responsibility for operation belongs to management, not to any federal or state regulators. Regulators try to establish an envelope for safe operation, but it is management's responsibility to stay within that envelope, even to reduce the envelope if they believe it prudent. If the envelope is exceeded, management must try to return within it as rapidly as is safely possible.

The nuclear industry and its regulators had never directly addressed the issue of who had fundamental responsibility for the safety of Three Mile Island. All reviews of the accident identified this as a major regulatory defect and a cause of confusion. The President's Commission on the Accident at Three Mile Island, chaired by John G. Kemeny, concluded: "Responsibility and accountability for safe power plant operations, including the management of a plant during an accident, should be placed on the licensee [the utility] in all circumstances" (President's Commission on the Accident at Three Mile Island, 1979, p. 64). The NRC stressed this regulatory philosophy in denying a General Public Utilities Corporation claim after the TMI accident. At that time, I elaborated on the Commission's decision: "Within the regulatory framework flowing from the Atomic Energy Act and other applicable statutes, the regulated industry (i.e., the licensee, the vendor, and the architect engineer) bears the primary responsibility for protecting the general public from the health, safety, and environmental risks posed by the generation of electricity from nuclear power" (Ahearne, 1981).

2. Probabilistic risk assessment (PRA) can be used. This technique was pioneered in the aerospace industry. The WASH 1400 report, prepared under the direction of Norman Rasmussen (U.S. Nuclear Regulatory Commission, 1975), extended PRA to the nuclear industry. In a little-noticed but major point, the report of the NRC Risk Assessment Review Group, chaired by Harold W. Lewis (U.S. Nuclear Regulatory Commission, 1978), although soundly criticizing the executive summary of WASH 1400, also stated that PRA is a good technique but that it had not been used. A check within the Nuclear Regulatory Commission showed that the regulatory staff did not understand and mistrusted PRA, so they did not use it. PRA finally has been accepted by the nuclear industry, but only after the NRC funded a joint study by the American Nuclear Society and the Institute of Electrical and Electronics Engineers to develop practical methods of using PRA in nuclear plant analysis. All hazardous industries can profitably use PRA. Even though absolute values of risk may not be calculable, the insights into potential problems will be invaluable (a simple fault-tree calculation of the kind PRA rests on is sketched below, after lesson 3).

3. The competence of the operating crew is critical. Many technologies are dangerous—airplanes, nuclear power plants, many chemical processes. All involve humans in their operation, and many have the potential of causing great financial damage to the operating company, as well as large societal damage. The Three Mile Island accident nearly bankrupted Metropolitan Edison and its parent company. Since operator error contributed to that accident, one lesson is to have better operators. This was a major finding of several TMI accident reviews (see, for example, U.S. Nuclear Regulatory Commission, 1980b, p. 103) and has led the NRC to publish additional requirements for the training and licensing of nuclear power plant operators.

The most frustrating discussions I participated in at the Nuclear Regulatory Commission were those on operator qualifications. In general, utilities were unwilling to significantly upgrade operator requirements. To do so would require higher salaries for operators, and the concept of operators getting higher salaries than many management personnel seems anathema to
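Lesson 2 argues that the value of PRA lies in the insights it yields rather than in absolute risk numbers. The sketch below, in Python, illustrates the kind of arithmetic a PRA fault tree involves: basic-event failure probabilities are combined through AND and OR gates to estimate the probability of an undesired top event. The event names and probabilities are hypothetical, invented only to show the structure of the calculation; they are not drawn from WASH 1400 or from any analysis of Three Mile Island.

```python
# A minimal, hypothetical fault-tree calculation in the spirit of PRA.
# All event names and probabilities here are invented for illustration.

def p_and(*probs):
    """Probability that all independent events occur (AND gate)."""
    result = 1.0
    for p in probs:
        result *= p
    return result

def p_or(*probs):
    """Probability that at least one independent event occurs (OR gate)."""
    none_occur = 1.0
    for p in probs:
        none_occur *= (1.0 - p)
    return 1.0 - none_occur

# Hypothetical per-demand failure probabilities for the basic events.
relief_valve_sticks_open = 1e-2   # valve fails to reseat
backup_pump_fails        = 3e-3   # makeup pump unavailable on demand
level_indicator_fails    = 1e-3   # instrumentation gives a misleading reading
operator_misdiagnosis    = 5e-2   # crew misreads plant state and withholds makeup

# Hypothetical tree: makeup is unavailable if the pump fails OR if a failed
# indicator AND a misdiagnosis together keep the crew from starting it.
makeup_unavailable = p_or(backup_pump_fails,
                          p_and(level_indicator_fails, operator_misdiagnosis))

# Top event: the core is uncovered only if the valve sticks open AND makeup fails.
core_uncovered = p_and(relief_valve_sticks_open, makeup_unavailable)

print(f"P(makeup unavailable) = {makeup_unavailable:.2e}")
print(f"P(core uncovered)     = {core_uncovered:.2e}")
```

Even with invented numbers, a calculation of this shape shows where PRA's insight comes from: the terms that dominate the top-event probability point directly at the equipment and operator actions most worth improving, whether or not the absolute value is believed.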

"In the burgeoning literature on technological hazards, this volume is one of the best," states Choice in a three-part approach, it addresses the moral, scientific, social, and commercial questions inherent in hazards management. Part I discusses how best to regulate hazards arising from chronic, low-level exposures and from low-probability events when science is unable to assign causes or estimate consequences of such hazards; Part II examines fairness in the distribution of risks and benefits of potentially hazardous technologies; and Part III presents practical lessons and cautions about managing hazardous technologies. Together, the three sections put hazard management into perspective, providing a broad spectrum of views and information.
