Three Mile Island and Bhopal: Lessons Learned and Not Learned

JOHN F. AHEARNE

The Three Mile Island Nuclear Station (TMI) consists of an 840-megawatt (MW) reactor (Unit 1) and a 960-MW reactor (Unit 2) located about 10 miles southeast of Harrisburg, Pennsylvania, on an island in the Susquehanna River. At about 4:00 a.m., March 28, 1979, the main feedwater pumps connected with one of the two Unit 2 steam generators shut down, causing an automatic and almost simultaneous shutdown of the Unit 2 turbine. What caused the shutdown is not definitely known, but it may have been a pressure perturbation in the system caused by maintenance work being done at that time.

With the feedwater flow stopped, the steam generators stopped removing heat from the primary system, the closed system of pressurized water that carries heat from the reactor to the steam generators and then returns to the reactor. As the heat in the primary system rose, the water pressure rose, causing a pressurizer relief valve to open and the reactor to shut down automatically. This reactor "scram" took place eight seconds after the original pump stopped. With the reactor scrammed, the nuclear fissioning in the reactor core stopped, but a very large amount of decay heat remained (equivalent to a 55-MW reactor). This decay heat cannot be turned off, and sufficient water and pressure must be maintained after shutdown to cool the reactor.
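The size of that decay heat can be checked with a standard rule-of-thumb approximation (the Way-Wigner formula; this sketch is mine, not the chapter's, and the one-third thermal efficiency assumed below is a typical figure rather than a plant-specific one). For a core operated at thermal power $P_0$ for a long period $T$ and then shut down for $t$ seconds,

\[
\frac{P(t)}{P_0} \approx 0.066\left[\,t^{-0.2} - (t+T)^{-0.2}\,\right].
\]

For $T \gg t$ the second term is negligible. A few minutes after the scram ($t \approx 200$ s), the ratio is about $0.066 \times 200^{-0.2} \approx 0.023$. Applied to Unit 2's roughly 2,900 MW of thermal output (960 MW electric at about one-third efficiency), that is several tens of megawatts of heat, consistent with the 55-MW figure quoted above.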

During these first few seconds, the equipment acted according to design, and the automatic responses to the interruption of heat transfer were normal. After the reactor scrammed and the relief valve lifted, however, the primary water pressure fell to a level at which the relief valve was supposed to close, but it stuck open. Because the relief valve was open, the pressure in the primary system continued to fall. When the pressure had dropped to about 75 percent of normal, an emergency core cooling system (ECCS) came on automatically and injected cool water under high pressure into the reactor. Believing that the relief valve was closed, and seeing the water level in the pressurizer rise as the ECCS water was injected, the operators in the reactor room feared that the pressurizer would fill up with water and the system would lose its normal behavior. They did not really understand what went on in a reactor, and they did not understand what the pressurizer valve did. So, consistent with their understanding and training, they shut off one ECCS pump and throttled back the other. The water in the core then boiled.

The pressurizer relief valve stayed open for about 2 hours and 20 minutes. In addition, early in the accident, the operators were also letting water out because they believed there was too much water going in. More than 30,000 gallons of water escaped from the reactor vessel. Eventually the upper portion of the core became uncovered. There were sharp increases of temperature, heavy damage to fuel, and release of radioactive fission products. The extent of the damage would be uncertain until the reactor was examined, but preliminary measurements indicated extreme damage, since much of the upper fuel assembly was rubble on the bottom of the reactor vessel.

The operators had several indications of unusual events. About eight minutes into the accident, a sump pump that removes overflow in the bottom of the containment building came on. Three minutes later the sump overflowed. About 20 minutes later, the operators turned off the sump pump, but they had not yet reached the conclusion that water was escaping from the reactor. (Over 8,000 gallons of water had been pumped out of the sump by this time.) About 1 hour and 15 minutes into the accident, the four big pumps that push the water through the reactor began to vibrate, because they were now pushing a water-steam mixture rather than water. The operators shut down two pumps. About half an hour later they shut down the other two. At this stage, there was no water going into the reactor core. By 6:00 a.m., radiation alarms showed that radioactive gas was in the containment building (U.S. Nuclear Regulatory Commission, 1980a, pp. 11-15).

The operators during this period, including the Superintendent of Technical Support, had been on duty for one to two hours and had not diagnosed what was happening. A shift supervisor arrived two hours into the accident and within 20 minutes concluded that the pressurizer relief valve was stuck open. He closed a block valve, thus stopping the loss of water, nearly 2 hours and 30 minutes into the accident. Twenty-five minutes later, a reactor coolant pump was started again (U.S. Nuclear Regulatory Commission, 1980b, Vol. 1, p. 19).

Utility officials did not publicly declare an emergency until 2 hours and 50 minutes into the accident. About 10 minutes later, local and state authorities were notified. The utility tried to contact the Nuclear Regulatory Commission (NRC) 20 minutes later, at 7:10 a.m., but the NRC switchboard did not open until 7:45 a.m., at which time the NRC regional office was made aware of what was happening and established an open telephone line to the control room. By 8:00 a.m. the operations center at NRC headquarters was in operation (U.S. Nuclear Regulatory Commission, 1980a, p. 15).

Reviews of the accident indicate that when the block valve was closed, the loss of water from the system stopped, and the water level in the core began to rise slowly over the next half hour. During the first few minutes after the coolant pump had been turned on, some additional water was forced into the core. Although a few feet of core remained uncovered, most of the core heating probably was over at that time and the core significantly quenched (U.S. Nuclear Regulatory Commission, 1980b, Vol. 2, Part 2, p. 507). The cleanup has already lasted for more than six years and will eventually cost more than a billion dollars.

The accident at Three Mile Island was caused by a combination of hard-to-handle machinery, poorly trained or incompetent operators, and a regulatory process that lulled management into neglecting its own responsibilities. Congress, the public, the regulators, and the nuclear power industry reacted. Some overreacted, but there were also positive actions. The impact of the accident seemed to have a decay time constant (that is, the time required for the impact to diminish by half) of about two years. This period is probably related to the decay time of about one year for congressional and press interest.

LESSONS FOR INDUSTRY

The following lessons from Three Mile Island are especially pertinent for industry:

1. Responsibility for operation belongs to management, not to any federal or state regulators. Regulators try to establish an envelope for safe operation, but it is management's responsibility to stay within that envelope, and even to reduce the envelope if they believe it prudent. If the envelope is exceeded, management must try to return within it as rapidly as is safely possible.

The nuclear industry and its regulators had never directly addressed the issue of who had fundamental responsibility for the safety of Three Mile Island. All reviews of the accident identified this as a major regulatory defect and a cause of confusion. The President's Commission on the Accident at Three Mile Island, chaired by John G. Kemeny, concluded:

"Responsibility and accountability for safe power plant operations, including the management of a plant during an accident, should be placed on the licensee [the utility] in all circumstances" (President's Commission on the Accident at Three Mile Island, 1979, p. 64).

The NRC stressed this regulatory philosophy in denying a General Public Utilities Corporation claim after the TMI accident. At that time, I elaborated on the Commission's decision: "Within the regulatory framework flowing from the Atomic Energy Act and other applicable statutes, the regulated industry (i.e., the licensee, the vendor, and the architect engineer) bears the primary responsibility for protecting the general public from the health, safety, and environmental risks posed by the generation of electricity from nuclear power" (Ahearne, 1981).

2. Probabilistic risk assessment (PRA) can be used. This technique was pioneered in the aerospace industry. The WASH-1400 report, prepared under the direction of Norman Rasmussen (U.S. Nuclear Regulatory Commission, 1975), extended PRA to the nuclear industry. In a little-noticed but major point, the report of the NRC Risk Assessment Review Group, chaired by Harold W. Lewis (U.S. Nuclear Regulatory Commission, 1978), although soundly criticizing the executive summary of WASH-1400, also stated that PRA is a good technique but that it had not been used. A check within the Nuclear Regulatory Commission showed that the regulatory staff did not understand and mistrusted PRA, so they did not use it. PRA finally has been accepted by the nuclear industry, but only after the NRC funded a joint study by the American Nuclear Society and the Institute of Electrical and Electronics Engineers to develop practical methods of using PRA in nuclear plant analysis. All hazardous industries can profitably use PRA. Even though absolute values of risk may not be calculable, the insights into potential problems will be invaluable.
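To make the technique concrete, a minimal sketch follows (in Python). Everything in it is illustrative: the event names, the structure of the tree, and especially the failure probabilities are invented for this example and are not taken from WASH-1400, the chapter, or any real plant analysis. A PRA combines basic-event probabilities through the AND/OR gates of a fault tree to estimate the probability of an undesired top event.

```python
# Minimal fault-tree sketch. All probabilities are invented for
# illustration; they come from no real plant study.
from functools import reduce

def or_gate(*probs):
    """Probability that at least one of several independent events occurs."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

def and_gate(*probs):
    """Probability that all of several independent events occur."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

# Hypothetical per-demand failure probabilities (illustrative only).
P_FEEDWATER_LOSS = 3e-1   # loss of main feedwater in a given year
P_RELIEF_STUCK   = 1e-2   # relief valve sticks open on demand
P_OPERATOR_ERROR = 1e-2   # crew misdiagnoses the plant state
P_ECCS_HW_FAIL   = 1e-3   # ECCS hardware fails to inject

# Core damage requires the initiating event AND a stuck relief valve AND
# failure of makeup cooling, where makeup cooling fails if the hardware
# fails OR the crew defeats it.
p_makeup_fails = or_gate(P_ECCS_HW_FAIL, P_OPERATOR_ERROR)
p_core_damage = and_gate(P_FEEDWATER_LOSS, P_RELIEF_STUCK, p_makeup_fails)

print(f"P(makeup cooling fails on demand) = {p_makeup_fails:.2e}")
print(f"P(core damage per year)           = {p_core_damage:.2e}")
```

Even in this toy tree the characteristic PRA insight appears: with the numbers assumed here, the operator-error term dominates the hardware term in the makeup-cooling branch, pointing to training rather than equipment as the leverage point. That is the kind of relative insight that remains valuable even when, as noted above, absolute values of risk may not be calculable.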

3. The competence of the operating crew is critical. Many technologies are dangerous: airplanes, nuclear power plants, many chemical processes. All involve humans in their operation, and many have the potential of causing great financial damage to the operating company, as well as large societal damage. The Three Mile Island accident nearly bankrupted Metropolitan Edison and its parent company.

Since operator error contributed to that accident, one lesson is to have better operators. This was a major finding of several TMI accident reviews (see, for example, U.S. Nuclear Regulatory Commission, 1980b, p. 103) and has led the NRC to publish additional requirements for the training and licensing of nuclear power plant operators.

The most frustrating discussions I participated in at the Nuclear Regulatory Commission were those on operator qualifications. In general, utilities were unwilling to significantly upgrade operator requirements. To do so would require higher salaries for operators, and the concept of operators earning higher salaries than many management personnel seems anathema to the utility industry. However, the concept has had a long history of acceptance in the airline industry.

4. Hazardous technologies require highly competent managers. Admiral Rickover had a justified reputation as a superb manager of highly complex technical operations. His criteria for a good manager include the following (Rickover, 1979):

"A person doing a job, any job, must feel that he owns it and that he will remain on that job indefinitely."

"Along with ownership comes the need for acceptance of full responsibility for the work.... Unless the one person truly responsible can be identified when something goes wrong, then no one has been really responsible." I note that high-risk industries seem to structure themselves so that there is no one person responsible.

"If the boss is not concerned about details, his subordinates also will not consider them important.... It is hard, monotonous, and onerous to pay attention to details; most managers would rather focus on lofty policy matters. But when the details are ignored, the project fails."

"Establish simple and direct means to find out what is going on in detail in [your] area.... Most managers avoid keeping up with details; instead, they create 'management information systems.'"

"Resist the natural human inclination to hope things will work out, despite evidence or doubt to the contrary.... Face the facts."

Rickover also has said, and I agree (Ahearne, 1983, p. 382), that technical training is necessary to manage any technical operation competently.

5. Beware the growing influence of lawyers and the courts. According to Harvard President Bok, "Since laws seem deceptively potent and cheap, they multiply quickly. Though most of them may be plausible in isolation, they are often confusing and burdensome in the aggregate, at least to those who have to take them seriously.... For established institutions, in particular, the typical result is a stifling burden of regulations, delays, and legal uncertainties that inhibit progress and allow unscrupulous parties to misuse the law to harass and manipulate their victims" (Bok, 1983, p. 12).

Hazardous technology industries can see the outlines of their future in auto accident awards, in car manufacturers settling out of court for large sums, and in medical malpractice suits. The chemical industry, for example, has some hard days ahead with respect to lawsuits (see Huber, in this volume).

LESSONS FOR REGULATORS

The following are lessons from Three Mile Island for regulators:

1. Prepare for accidents. Washington, D.C., is a fishbowl. When an accident occurs, pressures on federal regulators are enormous: pressures from the news media and from Congress. During an accident is not the time to decide what to do in an accident. Regulators should be prepared even for low-probability events if the consequences can be severe.

When the TMI accident occurred, the NRC had a two-room "emergency center" that was primarily a place where key staff could gather. The NRC had no detailed information readily available on the plant or on the surrounding area. Its only link to the plant was by telephone, a link that was often noisy. The NRC had no prearranged plans about who should be at the plant or how to deal with the plant crew, and only sketchy plans for dealing with state and other federal officials. Since then the NRC and related state and federal agencies (particularly the Federal Emergency Management Agency) have made major changes in preparing for nuclear plant emergencies. For example, utilities and surrounding local and state governments are required to exercise their emergency plans at least once every two years. (It is the refusal of the local county government to do so that has kept the Shoreham plant on Long Island from being licensed for operation.) Regulators in EPA and other agencies involved with chemical and biological products should consider now the accidents to which they may be called upon to respond, and plan how to react.

2. Share the problems. Regulators may believe they have no allies. Certainly they should not expect anyone to give them the benefit of the doubt. But the public can actually be a strong source of support and will sometimes even support unpopular actions, if those actions are carefully explained and the public is involved in the decisions. This public involvement offers the best chance of significantly improving regulatory policy. An excellent example was when EPA Administrator William Ruckelshaus directly involved the citizens of Tacoma, Washington, in EPA decisions regarding a smelter that released arsenic into the ambient air (Ruckelshaus, 1985, p. 33).

Regulators should push industry to accept more responsibility. The regulators cannot operate plants. Plant operators must recognize that they are ultimately responsible for safe operation. Regulators also should work with the Congress (at times a seemingly impossible task) to explain their agency's regulatory philosophy and to get the Congress either to endorse that philosophy or to modify it. Of course, the regulators must be able to describe their regulatory philosophy clearly.

3. Recognize the courts' influence. The Nuclear Regulatory Commission prepares decisions knowing they will be under almost immediate attack in the D.C. Circuit Court of Appeals. The commissioners themselves also are liable to lawsuits. After the NRC allowed venting of the containment building several months following the Three Mile Island accident, suit was filed against NRC commissioners as individuals for causing harm to the populace around TMI. This case is still in the courts. If we lose, the effects may be paralyzing. The EPA administrator may have had this case pointed out to him as he considered whether to allow the release of bioengineered products for agricultural use. The approval of bioengineered products is a responsibility shared among the National Institutes of Health (NIH), EPA, the Food and Drug Administration (FDA), and the U.S. Department of Agriculture (USDA) (Sun, 1985a, 1985b, 1985c), and the D.C. Court has recently ruled that NIH must conduct environmental assessments before giving approval for biotechnology experiments (Sun, 1985b). The court suit against the NRC commissioners may become a major factor in any administrator's decision.

4. Educate the public about risk. This is a shared responsibility of regulators, industry, and the Congress. Unless risk, and comparative risk in particular, can be explained to and accepted by the public, hazardous technology will continue to be under attack, as will the regulators and other public officials. The Ruckelshaus example cited above illustrates a positive approach to meeting this responsibility.

The Three Mile Island accident acquainted many people with the concept that the risk of an event is the probability of the event times the consequences. Nevertheless, this concept remains foreign to most of the public, even though life is a continuous series of risk judgments. Sometimes we take steps to reduce the probability; sometimes, the consequences. For example, getting faulty brakes fixed before driving your car in rush hour traffic reduces the probability of an undesirable event; wearing a seat belt reduces some consequences.

Some people argue that for high-consequence events, the risk should be the probability times a weighted consequence, where the weighting factor increases in relation to the seriousness of the consequence. Apparently this is the way the public comprehends risk. It would appear that much of the press and many of the more vocal members of Congress do also.
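In symbols (a sketch of the two notions just described; the power-law weighting is one common assumed form, since the text does not specify one), the ordinary and weighted definitions are

\[
R = p\,C \qquad \text{versus} \qquad R_w = p\,C^{\alpha}, \quad \alpha > 1.
\]

Under $R$, one accident with probability $10^{-6}$ that harms $10^6$ people scores the same as a million independent $10^{-6}$ accidents that each harm one person. Under $R_w$ with, say, $\alpha = 1.2$, the single large accident scores roughly 16 times higher, capturing the aversion to large single events described above.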

THE PUBLIC

The U.S. public is more willing than that of other countries to actively oppose the locating or operating of nuclear power plants, although this phenomenon may be spreading. Such opposition is one area of public participation in government where apathy does not prevail. The chemical industry and other industries that can be described as hazardous should be concerned. They might try to learn from hearings on nuclear power plants what causes public fear and opposition. The chemical industry should recognize that the growing concern about waste disposal is leading the public to become concerned about the generators of that waste. Disposal is greatly simplified if the waste is drastically reduced or eliminated. At least the nuclear industry produces a product, electricity, that is used locally, although this direct link has not noticeably improved the acceptance of nuclear power plants in the United States. In the chemical industry, even such a direct link to the local consumer is absent.

TRANSNATIONAL CORPORATIONS

In their U.S. operations, transnational corporations are like any other industry. They should estimate financial risks to themselves and environmental risks to society. They should be realistic, face the facts as Rickover said, and then introduce management practices and plant modifications to reduce any large risks. This is not only their responsibility to society; it is also good financial practice.

For overseas operations, these corporations should recognize the technological gaps between their knowledge and that of the countries in which they plan to operate. They should not take advantage of those gaps to reduce or dispense with any of the steps that they would take in the United States. Instead, they have a greater responsibility: where the technological base is weak, transnationals should insist on controlling design, construction, and operation. If they cannot, then the transnational corporation should not go into that country.

SCIENTISTS AND ENGINEERS

Scientists and engineers have had a large impact on the public's confusion about risks. Much of that impact has not been good.

Alvin Weinberg (in this volume) quoted from a paper by Clark (1980), "Witches, Floods and Wonder Drugs." In that paper, Clark quotes Harvey Brooks and Granger Morgan. Brooks (1975) wrote, "Scientists inexperienced in the political arena, and flattered by the unaccustomed attentions of men of power, are often inveigled into stating their conclusions with a confidence not warranted by the evidence, and . . . not subject to the same sort of prompt corrective processes that they would be if confined within the scientific community." And Morgan (1978) wrote: "Good policy analysis recognizes that physical truth may be poorly or incompletely known. Its objective is to evaluate, order, and structure incomplete knowledge so as to allow decisions to be made with as complete an understanding as possible of the current state of knowledge, its limitations, and its implications."

Unfortunately, many scientists and engineers do succumb, as Brooks says, and are not as careful as Morgan says they should be. These individuals should decide whether they want to be professional witnesses or advocates. Certainly, once you have analyzed the options, it is very easy to convince yourself that you know "the right answer." But if you want to argue that the right answer is, for example, that nuclear power is good or is bad, you probably have gone beyond your area of expertise and have become a public policy advocate. You have the right to do so, but you should be aware that even if you try to retain your cloak of technical objectivity, the public will not accept it. Instead, the public may transfer its skepticism of you to other scientists and engineers, which then opens the arena to those who knowingly play upon ignorance. And, as Clark (1980, p. 11) pointed out, this leads to "society's attitudes toward risks such as cancer and nuclear reactors [being] not readily distinguishable from its earlier fears of the evil eye."

REFERENCES

Ahearne, J. F. 1981. Concurring Views. U.S. Nuclear Regulatory Commission Memorandum and Order in the Matter of Federal Tort Claim of General Public Utilities Corporation et al., June 8, 1981.

Ahearne, J. F. 1983. Prospects for the U.S. nuclear regulatory industry. Annual Review of Energy 8:355-384.

Bok, D. C. 1983. The President's Report, 1981-82. Harvard University, March.

Brooks, H. 1975. Expertise and politics: Problems and tensions. Proceedings of the American Philosophical Society 119:259.

Clark, W. C. 1980. Witches, floods, and wonder drugs: Historical perspectives on risk management. Paper R-22, Institute of Resource Ecology, University of British Columbia, January.

Morgan, M. G. 1978. Bad science and good policy analysis. Science 201:971.

President's Commission on the Accident at Three Mile Island. 1979. The Need for Change: The Legacy of TMI. Washington, D.C.: U.S. Government Printing Office.

Rickover, H. G. 1979. Management in government. Management, September:16-19.

Ruckelshaus, W. D. 1985. Risk, science, and democracy. Issues in Science and Technology 1:19-38.

Sun, M. 1985a. Regulatory structure for biotechnology proposed. Science 227:274.

Sun, M. 1985b. Rifkin and NIH win in court ruling. Science 227:1321.

Sun, M. 1985c. Biotech policy draws flood of comments. Science 228:1296.

U.S. Nuclear Regulatory Commission. 1975. Reactor Safety Study: An Assessment of Accident Risks in U.S. Commercial Nuclear Plants. WASH-1400, NUREG-75/014. Washington, D.C.

U.S. Nuclear Regulatory Commission. 1978. Risk Assessment Review Group Report to the U.S. Nuclear Regulatory Commission. NUREG/CR-0400. Washington, D.C.

U.S. Nuclear Regulatory Commission. 1980a. 1979 Annual Report. Washington, D.C.

U.S. Nuclear Regulatory Commission. 1980b. Three Mile Island: A Report to the Commissioners and to the Public. NUREG/CR-1250. Washington, D.C.