Suggested Citation: "Risk and Democracy." National Academy of Engineering. 1980. Outlook for Nuclear Power: Presentations at the Technical Session of the Annual Meeting--November 1, 1979, Washington, D.C. Washington, DC: The National Academies Press. doi: 10.17226/18568.
Risk and Democracy

DAVID L. BAZELON*

I would like to discuss with you the role of the courts in regulating risks generated by modern science and technology. I think our role is important, but often misunderstood. And the judicial perspective has significant consequences for engineers and other experts who contribute to public decisions about risks, such as licensing a nuclear power plant. We are among the many professions who have some rethinking to do.

This is an unprecedented era of technological promise and peril. With mobility come staggering auto accidents, plane crashes, traffic jams, and air pollution. And with the miracles of energy come the risks of coal mining accidents, nuclear reactor accidents, and even atomic terrorism.

Nobody is satisfied with existing regulation of risks. For each regulation, some claim it is too lax, while others claim it is too strict. We all hear the current call for "deregulation." But the Three Mile Island review commissions highlight the need for more effective regulation. The District of Columbia Circuit Court's caseload now involves challenges to federal administrative action relating to matters on the frontiers of technology. What level of exposure to known carcinogens is safe for industrial workers? Shall we ban the Concorde SST, Red Dye Number 2, or saccharin? How can society manage radioactive wastes from nuclear reactors?

Let me tell you first that the courts cannot and do not answer such questions, even when posed as challenges to administrative actions. None of us knows enough to resolve issues on the frontiers of nuclear physics, toxicology, and other specialties informing the NRC, EPA, or FDA. Courts also lack the political mandate to make the critical value choices that ultimately are reserved for the public. These decisions must be made by elected representatives or public servants legally accountable to Congress and the people.
*David L. Bazelon is Senior Circuit Judge, United States Court of Appeals for the District of Columbia.

If the courts do not resolve technical disputes or value conflicts about technological changes, what are the courts' roles? Of course, there are individual nuances and shifting historical trends, but, in brief, the judicial responsibility is to assure that an agency's decisionmaking is thorough and within the bounds of reason. The agency's decisional record must disclose the evidence heard and policies considered. This will permit quality checks through effective peer review, legislative oversight, and public education.

Only if the decisionmakers disclose assumptions, doubts, and points of controversy can experts in universities, government, and industry evaluate the technical bases of the administrative action. Only then can they scrutinize the agency's factual determinations, bring new data to light, or challenge faulty assumptions. Full disclosure of the reasons for a decision is also essential to legislative and public review. Congress and ultimately the people must make the critical value decisions about such questions as what level of radiation emissions can be accepted in the face of incomplete medical knowledge. So disclosure is essential to permit politically legitimate oversight of agencies' implicit value choices.

Courts stand outside both expert and political debate. They can help to ensure that a complete and orderly administrative record is discovered. Courts can guarantee that all relevant information was considered and addressed. Further, courts can accustom decisionmakers to the discipline of explaining their actions. Finally, courts can assure that all persons affected had an opportunity to participate in the decision.

I had always thought that scientists and engineers understood this judicial function. But in recent weeks I have been surprised to find that this is news to many. Perhaps the advantages gained through the judicial tasks are also not widely known, although they benefit everyone, including decisionmakers themselves. For if the decisionmaking process is open and candid, it can expose gaps, stimulate the search for better information, and reduce the risk that important information will be overlooked or ignored.
An open process can inspire more confidence in those who are affected. Above all, an open process protects the credibility of decisionmakers from claims that they are covering up incompetence, ignorance, or damaging information.

What consequence does this all have for you, who serve as leaders or advisors in industry or government? Part of the disclosure requirement I have described falls on the agency decisionmakers that Congress made responsible for licensing nuclear power plants, approving waste disposal plants, and the like. Yet there is an equally important implication for your role. If your advice and plans are to provide adequate support when, for example, the NRC approves an operating license application, you too should disclose your assumptions and doubts, as well as the risk levels you estimate. Unless you explain the basis for your engineering judgments, the agency record to be reviewed by the court, and ultimately by your peers and the public, simply will not do the job.

Understandably, many believe that complete disclosure of risks is unwise. I have heard experts say that they would consider not disclosing risks that, in their view, are insignificant in order to avoid the danger of needlessly alarming the public. It may well be that popular fears about risks from atomic energy are irrational. Public fears about nuclear plant meltdowns may in fact be disproportionate to the seriousness of the threat, when discounted by its probability. A sense of the public's irrationality may have led the Information Director of the French Atomic Energy Commission to observe that publication of precautions against risks "frequently has little other effect than to heighten [public] feelings of insecurity." He concluded that "there is nothing to be gained" through public debates on particular nuclear power controversies.

Many of you here may agree with this sentiment. But I believe that this view is unacceptable in our country. It is also unrealistic when it comes to nuclear power. Nondisclosure does not eliminate public fears. Indeed, it can exacerbate them. The fact is, the public is already afraid. Loss of public confidence is cited by the Kemeny Commission as one of the worst problems with the nuclear power industry and its regulators. Alvin Weinberg, a founding father of the Nuclear Age, I think rightly warns that nuclear power will be rejected politically not because people "will actually be hurt," but because "people will be scared out of their wits."

In other ages, and other cultures, the decisions of a wise man, or shaman, would resolve all doubts. But so long as we remain a democracy, the judgment of the people will prevail. And as Thomas Jefferson said, "if we think them not enlightened enough to exercise their control with a wholesome discretion, the remedy is not to take it from them, but to inform their discretion." The genius of our system is its checks on centers of accumulated power. For this system to survive, experts must disclose their knowledge about promises and perils from technological advances. Special knowledge will undoubtedly, and rightly, give experts an important voice in political value choices. But to protect themselves, and the country, experts cannot, and should not, arrogate the decisions to themselves. Public confidence, I submit, is possible only if experts accept the difficult tasks of explaining what they know and do not know, and how they balance risks and benefits.
This message may be somewhat unfamiliar to engineers who have more experience with decisionmaking in the private sector. After all, your concerns traditionally have been to develop effective applications of scientific advances, as cheaply and as safely as possible. But today, the consequences of your judgments are of unprecedented magnitude and major public concern. Strictly private decisionmaking is no longer possible. Instead, value judgments and technical decisions deserve and require peer and public review.

Consider the selection of safety systems at a nuclear power plant. Making a plant "as safe as possible" may call for redundant safety systems and multiple fail-safe strategies to shut down the plant at the first sign of malfunction. Yet safety features of this kind are costly to install and even more expensive to employ. I am told that somebody decided that safety could be purchased for a lesser price at Three Mile Island. Perhaps the safety protections there were in fact adequate. Perhaps the crisis was generated "only" by the press. But the danger came far closer than anyone had predicted, and public fears were understandably aroused. The crisis mentality might have been avoided had the public been better informed about the trade-offs behind the safety design. Implicit in that design are value judgments that may be hidden unless deliberately exposed to view.

This is the case with cost-benefit analysis in general. It calls for controversial quantitative valuations of human life and health. It also too often presumes to compare the incomparables. How do we compare low-level, long-term radiation exposure with the benefits of nuclear power? Perhaps most troubling for our purposes, a cost-benefit calculus framed for private decisionmaking may significantly depart from the demands of public decisionmaking. A private firm is likely to consider only privately borne costs and call the rest "externalities." If a public decisionmaker relies solely on that private cost-benefit analysis, the entire range of costs and risks may not be revealed to all and sundry.

I do not know if it is true, but it is said that engineers may have disincentives to disclose design defects to their private employers. A defect identified means a new cost to the manufacturer. It may even cause the loss of a contracting bid. The drive to produce the cheapest design in the shortest possible time may eliminate needed safety checks. The DC-10 is perhaps the most notorious recent example of private competitive pressures shortchanging safety. Public pressures can also push hardware faster and farther than it is ready to go. Witness the current experience with the space shuttle, whose designers kept costs down by eliminating component testing but are now back at the drawing board.

I do not mean to imply bad faith or incompetence. I just mean to point out that time and profit pressures may interfere with the caution crucial to public safety. The Kemeny Commission concluded that we have a mind-set problem. Infrequent accidents have produced optimism and confidence. But however infrequent, the magnitude of possible harm demands an independent and vigilant concern for safety. And only full disclosure can assure that a particular mind-set does not preclude external safety checks. The need for disclosure may call for a change in a basic engineering approach.
Countless innovations have been perfected privately by engineers through trial and error. But the blowups of experimental railroad boilers of yesteryear never posed the magnitude of public risk now present if a 747 plane crashes or a nuclear reactor malfunctions. With public consequences of this sort, an engineering assessment of general theoretical feasibility, if relied upon, may not be enough to instill public confidence. Moreover, an agency does not have the leeway to conclude that an unresolved issue can be worked out later, if the statute demands adequate evidence now.

Consider the problem of nuclear waste disposal. Many engineers believe that the solution is within reach, in theory. It has taken the industry a long time to take the problem seriously, even though it has been the public's major concern about nuclear power for years. This problem came to my attention in a case in our court, Natural Resources Defense Council v. Nuclear Regulatory Commission. I became concerned because the NRC had relied exclusively on vague assurances by agency personnel that nuclear waste disposal problems as yet unsolved would be solved. Our court reversed the agency's decision in order to permit a fuller inquiry. My objection was not founded on any disagreement with the conclusion that nuclear waste disposal can be managed. Nor did I criticize the NRC for failing to develop fool-proof solutions to the problem. What I found unacceptable was the almost cavalier treatment of the issue by the agency, and its apparent refusal to come to grips with the limits of its knowledge. The commission gave no serious response to criticisms brought to its attention. No technical oversight within the agency was demonstrated, and no peer review by the expert community at large was possible.

In this case, perhaps better known under the name of Vermont Yankee, the Supreme Court unanimously rejected our decision. That Court concluded that we had imposed on the agency procedures not required by law. Nevertheless, the Court returned the case for us to determine whether the record supported the substantive conclusions of the NRC. In so doing, the Court reaffirmed the fundamental requirement of full disclosure on the record. This includes thorough exploration of uncertainties, even if engineering practice would otherwise leave a problem alone until it demanded practical solution.

I was heartened by a thoughtful letter I recently received on this subject from a professor of nuclear engineering at a midwestern university. He wrote that the value system of the engineer includes acceptance of "an uncertain level of risk" because his decisions must be quick to be cost-effective. He said that compared to other risks associated with nuclear power, the waste disposal problem is "minute" to the engineer. Yet this professor acknowledged that others view the level of risk from a different set of values. For example, some seem to feel that any risk is too much. He concluded, and I quote:

    I believe that now the technical community is learning that their value system and that of the public [do] not coincide, and sometimes [do] not even seem to overlap. I also believe that it has been the courts that have mostly impressed this on them.
When public values are called into play by engineering decisions, disclosure of known risks and unresolved problems is the only course that will protect public decisionmaking.

I have been told about a final engineering trait that poses problems for public decisionmaking. That is the profession's general aversion to taking public stands on safety issues. This is not only a problem for engineers. A prominent professor of medicine recently criticized his profession for its silence throughout the Three Mile Island incident. No one in the medical profession corrected the media story that the radiation leaks were no worse than those from a single X-ray shot per person. Apparently, this view neglects the more serious cumulative effect of the leaks. I certainly do not know enough to judge the severity of the health risk. But erroneous palliatives will not diminish whatever risk there was. In fact, some are now charging that better medical precautions should have been mobilized to counteract whatever danger the radiation posed. In addition, the mental stress from uncertainty is perhaps the most serious health effect from the Three Mile Island incident, according to the Kemeny Commission. The medical profession's failure to take a leadership role must in part be blamed on both counts.

Engineers may be particularly reluctant to speak out about indeterminate risks because they would rather be silent than misstate the risks. But engineers must realize that decisions will be improved, and public understanding enhanced, if experts reveal exactly what they do know. Industry disincentives may, however, contribute to engineers' reluctance to "go public." I do not need to remind this group of the Bay Area Rapid Transit engineers who were fired after their safety concerns about the system's automatic train control became public. But I do not believe that fear of reprisals causes the engineering profession's reticence. A more dominant problem is that loyalties to employers and other concerns can cause us to ignore broader public needs.

The engineering profession's duty to the public is acknowledged in its ethical canons. But I do not believe that duty has been dealt with adequately. The Code of Engineering Ethics, approved by the Engineering Council for Professional Development in 1974, calls upon engineers to advance the profession by "serving with fidelity the public, their employers, and clients." However admirable a sentiment, this principle provides no structure to direct the engineer who notes a divergence between public and private interests. A number of engineering societies have adopted what looks to be a more instructive guidepost, as part of a statement on "employment guidelines." This statement directs the professional employee to withhold plans that do not meet accepted professional standards and to present clearly the consequences to be expected if that professional judgment is not followed. Adm. Hyman Rickover, the father of the nuclear submarine, put a similar view quite succinctly. He very recently urged all in the nuclear field to "face the facts and brutally make needed changes, despite significant costs and schedule delays." None of this is easy.
The costs and delays from brutal honesty and reevaluation will make your life harder, as they make life more difficult for a great many other professionals. Disclosure may scare people. It may scare the public to hear, as the Kemeny Commission has reported, that engineers have not designed sufficient safety checks for many foreseeable human errors in operating nuclear power plants. But nondisclosure violates a partnership with the public that engineers have entered by ushering in a new day in technological capabilities.

If technological progress is to coexist with democracy, I believe that its creators must rethink their methods and their communication with the public. At the same time, judges, regulators, and other participants in public decisionmaking must reexamine our roles against the backdrop of the ever-evolving technological landscape. However difficult, we must criticize ourselves to avoid "hardening of the arteries" in our professional conduct and moral sensibilities. We need self-regulation, not just governmental regulation, to harness newfound tools for human ends.
