
Hazards: Technology and Fairness (1986)

Chapter: The De Minimis Principle

Suggested Citation:"The De Minimis Principle." National Academy of Engineering. 1986. Hazards: Technology and Fairness. Washington, DC: The National Academies Press. doi: 10.17226/650.

SCIENCE AND ITS LIMITS: THE REGULATOR'S DILEMMA

Because such reactors embody the principle of inherent safety, their adoption would avoid much of the controversy over reactor safety, the Price-Anderson Act, repetition of the Three Mile Island accident, and so forth. In short, such a technical fix enables one largely to ignore the uncertainties in any prediction of core-melt probabilities. The idea of incorporating inherent or passive safety into the design of chemical plants had been proposed, unbeknownst to the nuclear community, by Trevor A. Kletz (1984) of the Loughborough University of Technology in England in 1974, shortly after the disaster at the Flixborough cyclohexane plant, which killed 28 people. One of the main consequences of the Bhopal disaster may well be the incorporation of inherent safety into new chemical plants, which is, again, a way of finessing uncertainty in predicting failure probabilities.

The De Minimis Principle

A perfect technical fix, such as a totally safe reactor or a crash-proof car, is usually not available, at least not at affordable cost. Some low level of exposure to materials that are toxic at high levels is inevitable, even though we can never accurately establish the risk of such exposures. One way of dealing with this situation is to invoke the principle of de minimis.
This principle, as Howard Adler and I argued in 1978, holds that for insults that occur naturally, to which the biosphere has always been exposed and presumably has adapted, one should not worry about additional man-made exposure as long as the man-made exposure is small compared with the natural exposure (Adler and Weinberg, 1978). The basic idea is that the natural level of a ubiquitous exposure (like cosmic radiation), if it is deleterious, cannot have been very deleterious, since in spite of its ubiquity the human race has survived. Moreover, we concede that we do not know, and can never know, what the residual effect of natural exposure really is. An additional exposure that is small compared with the natural background ought to be acceptable; at the very least, its deleterious effect, if any, can never be determined. Adler suggested that for radiation, whose natural background is well known, one might choose as a de minimis level the standard deviation of the natural background, which is about 20 percent of the mean background—that is, about 20 millirems per year. This value has been used as the Environmental Protection Agency's standard for exposure to the entire radiochemical fuel cycle.

We know more about the natural incidence and biological effects of radiation than we do about any other agent. It would be natural, therefore, to use the standard established for radiation as a standard for other agents. This approach has been taken by Westermark (1980) of Sweden, who has suggested that for naturally occurring carcinogens such as arsenic, chromium, and beryllium, one might choose a de minimis level of, say, 10 percent of the natural background.

Clearly, a de minimis level will always be somewhat arbitrary. Nevertheless, unless such a level is established, we shall forever be involved in fruitless arguments, the only beneficiaries of which will be the toxic tort lawyers. Could the principle of de minimis be applied in litigation in much the same way it might be applied to regulation—that is, if the exposure is below de minimis, then the blame is intrinsically unprovable and cannot be litigated? The legal de minimis level might be set higher than the regulatory de minimis; for example, the legal de minimis for radiation might be the natural background itself (since the BEIR-III report concedes that there is no way of knowing whether or not such levels are deleterious). The regulatory de minimis could justifiably be lower, simply on grounds of erring on the side of safety. One approach would be to concede that there is some level of exposure that is "beyond demonstrable effect" (BDE). This defines a "trans-scientific" threshold. A de minimis level might then be established at some fraction, say one-tenth, of this BDE level.
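The two background-based rules above reduce to simple arithmetic. The following sketch is illustrative only (the function names are hypothetical, not from the chapter); it encodes Adler's rule (de minimis = one standard deviation of the natural background, taken as roughly 20 percent of the mean) and Westermark's rule (a fixed fraction, say 10 percent, of the natural background).

```python
def de_minimis_adler(mean_background_mrem: float,
                     std_fraction: float = 0.20) -> float:
    """Adler's rule: de minimis = one standard deviation of the natural
    background, here approximated as ~20 percent of the mean background."""
    return std_fraction * mean_background_mrem


def de_minimis_westermark(natural_background: float,
                          fraction: float = 0.10) -> float:
    """Westermark's rule for naturally occurring carcinogens: a fixed
    fraction (say 10 percent) of the natural background level."""
    return fraction * natural_background


# A natural radiation background of roughly 100 millirems per year gives
# Adler's de minimis of about 20 millirems per year, the value the chapter
# associates with the EPA fuel-cycle standard.
print(de_minimis_adler(100.0))
print(de_minimis_westermark(100.0))
```

The same functions apply to any agent once a natural background level is known, which is the chapter's point about extending the radiation standard to other agents.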
For example, if we take the previously quoted value of 100 millirems per year of low-LET (linear energy transfer) radiation as the BDE level for somatic effects, then a de minimis level for low-LET radiation might be set at 10 millirems per year. Of course, such a procedure would evoke much controversy over what the BDE level is, or whether 10 is an ample safety factor. The example demonstrates, however, that at least in the case of low-level radiation, a scientific committee was able to agree on a BDE level. The safety factor of 10 cannot be adjudicated on scientific grounds. The most one can say is that tradition often supports a safety factor of 10—for example, the old standard for public exposure (500 millirems per year) was set at one-tenth of the tolerance level for workers (5,000 millirems per year).

Can a principle of de minimis be applied to accidents? The idea is that accidents that are sufficiently rare might be regarded as belonging in the same category as acts of God, and compensated accordingly. We already recognize that natural disasters should be compensated by society as a whole. One can argue that an accident whose occurrence requires an exceedingly unlikely sequence of untoward events might also be regarded as an act of God. Thus, the Price-Anderson Act (42 U.S.C. 2210) might be modified so that, quite explicitly, accidents whose consequences exceeded a certain level, and whose probability as estimated by probabilistic risk assessment (PRA) would be less than, say, 10⁻⁹ per year, would be treated as acts of God. Compensation in excess of the amount stipulated in the revised act would be the responsibility of Congress. The cutoff for compensation, or for probabilities, would be negotiable.
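The decision logic sketched in these paragraphs can be made concrete. The snippet below is a hedged illustration, not a statutory or regulatory rule: the constants are the chapter's example figures (a BDE of 100 millirems per year, a traditional safety factor of 10, and a 10⁻⁹ per year probability cutoff), and all names are invented for this sketch.

```python
# Example figures from the discussion above; all values are illustrative.
BDE_MREM_PER_YEAR = 100.0   # "beyond demonstrable effect" level, low-LET radiation
SAFETY_FACTOR = 10.0        # the traditional, non-scientific factor of 10
DE_MINIMIS_MREM = BDE_MREM_PER_YEAR / SAFETY_FACTOR  # 10 mrem/yr

ACT_OF_GOD_PROB_PER_YEAR = 1e-9  # example PRA probability cutoff


def below_de_minimis(exposure_mrem_per_year: float) -> bool:
    """Regulatory reading: exposures below the de minimis level would not
    be regulated; in the litigation reading, blame below this level is
    intrinsically unprovable."""
    return exposure_mrem_per_year < DE_MINIMIS_MREM


def treat_as_act_of_god(annual_probability: float) -> bool:
    """Accidents rarer than the cutoff would be compensated by society as
    a whole (here, per the suggested Price-Anderson modification) rather
    than attributed to a responsible party."""
    return annual_probability < ACT_OF_GOD_PROB_PER_YEAR
```

For instance, `below_de_minimis(5.0)` is true while `below_de_minimis(20.0)` is false; a core-melt sequence estimated by PRA at 10⁻¹⁰ per year would fall on the act-of-God side of the cutoff.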
