Mathematical Models Used for Site Closure Decisions
Shlomo P. Neuman and Benjamin Ross
The U.S. Department of Energy (DOE) faces difficult decisions concerning the disposition or closure of sites contaminated with radioactive, toxic, and hazardous materials. Given current knowledge and technology, it is neither economically nor technically feasible to release all DOE sites for unrestricted use in the foreseeable future. It will therefore be necessary to keep many sites under some form of control well into the future. The DOE considers long-term stewardship to encompass all activities required to maintain an adequate level of protection of human health and the environment from hazards posed by nuclear and chemical materials, waste, and residual contamination remaining after cleanup is completed. As part of its decision process the DOE will need to assess the consequences of alternative remediation, restoration, control, and/or release scenarios at each site. In particular, it will need to assess potential risks and hazards posed to human health and the ecology by contaminants that remain at a site following remediation and restoration, regardless of whether the site is released or remains under DOE control. This includes assessing the long-term performance of engineered barriers to contaminant migration at the site.
Assessments of the hazards posed by sites where contamination will remain into the distant future are known as risk assessments or performance assessments. The term performance assessment usually refers to evaluation of the extent to which an engineered system satisfies predetermined design or performance criteria. In the context of contaminated sites, the system of concern usually includes both engineered and natural components, and performance criteria relate both to the design of engineered remedies and to human and ecological safety measures. Such safety measures may (but need not) be cast in the form of risk criteria; when they are, one speaks of risk assessment. Any risk or performance assessment uses mathematical models, usually but not always implemented on computers, which describe the processes that operate at the site. The models rely, however, on information about the site, including its physical properties and the pathways of human exposure to contamination. This information determines what parts of the mathematical models are deemed relevant and what parameter values and forcing terms (e.g., source terms, initial and boundary conditions) are input as data.
PRESENTLY AVAILABLE MODELS
Two kinds of models are typically used to predict the behavior of a site contaminated with radioactivity or toxic chemicals:
a hydrologic transport model that predicts how dissolved contamination will be transported in groundwater; and
a “risk” model that computes the transfer of contaminants through different portions of the surface environment, the exposure of humans to contaminants in the environment, and the resulting health effects.
These models can be supplemented with a variety of other models, such as:
for radioactive contamination, a direct exposure model that computes the dose to humans from radiation emitted by contamination in the ground (this pathway does not exist for chemical contaminants);
a leaching model that describes how contamination passes from the solid phase into the aqueous phase;
a vadose zone model that describes how contamination moves downward from the point of disposal toward the water table, or upward with some contaminants such as radon;
an air dispersion model that describes the transport of dust that blows off contaminated soil, or of gaseous contaminants such as radon; and
an ecological risk model that evaluates how the contamination affects ecosystems.
Most commonly, models used at DOE remediation sites involve direct exposure, hydrologic transport, and risk. Modeling can be done with separate models for each part of the problem, or one model can handle the entire problem. Prominent among the multimedia or multiple-pathway risk assessment models that try to carry out all steps in risk assessment in a single model are RESRAD (Yu et al., 1993; Chen et al., 1991; Chen et al., 1995), MMSOILS (U.S. Environmental Protection Agency, 1996; Chen et al., 1995), and the Multimedia Environmental Pollutant Assessment System (MEPAS) (Buck et al., 1995; Buck et al., 1997; Chen et al., 1995; Doctor et al., 1990; Streile et al., 1996; Whelan et al., 1996).
Direct Exposure to Radiation
Waste units where direct exposure to radiation is the major hazard are frequently modeled with the RESRAD code. This model was initially developed to implement DOE's Residual Radioactive Material Guidelines and the U.S. Nuclear Regulatory Commission (USNRC) procedures for assessing site decommissioning. It was subsequently expanded by incorporating a risk model and simple models of hydrologic transport and leaching. RESRAD is used heavily in DOE decision-making. For example, it features prominently in a recent DOE document (U.S. Department of Energy, 1996), which addresses remedial designs and remedial actions for high-priority waste sites in the 100 Area of Hanford. The same document is expected to form the basis for remedial actions across the 100 Area liquid waste disposal sites, with the intention of revising it for future remedial actions. RESRAD has also been used extensively in decision-making about cleanup of areas in Nevada that were contaminated with plutonium by testing of nuclear weapons. Because it comprehensively implements the DOE and USNRC guidelines and has been thoroughly tested, RESRAD is a reliable tool for solving direct exposure problems. DOE's reliance on RESRAD at sites where the major hazard is ground shine or inhalation of radioactive dust (also addressed by the DOE and USNRC guidelines that RESRAD implements) is quite appropriate. However, as discussed below, the other subunits of RESRAD cannot be relied upon in the same way.
Risk Models
So-called “risk” models actually carry out only a part of the computations that go into a risk assessment. These models identify pathways of exposure and calculate human intake, dose, and detriment. They generally take the concentrations of contamination in soil and surface waters as an input; these quantities must be measured, calculated by a separate model, or, in the case of an integrated performance assessment model, calculated by a submodel.
Essentially, the risk models implement the Risk Assessment Guidance for Superfund, which combines a linear “box model” of ecosystem transfers with coefficients published by the U.S. Environmental Protection Agency (EPA) that give the harm per unit of chemical contaminant ingested by a human being. The coefficients for carcinogens are based on a linear no-threshold model of detriment; for non-carcinogens, there is assumed to be a threshold below which no harm occurs. For radionuclides, risk coefficients are derived from human exposure data and are published by the International Commission on Radiological Protection (ICRP) and the National Council on Radiation Protection and Measurements (NCRP).
Ecosystem transfers are usually modeled as a linear system. “Default” values for the transfer coefficients that define this linear system have also been published by EPA and USNRC. There are so many of these transfer coefficients that it is impossible to measure them all, so use of the default values is essential, but these values will not always be correct. The proper practice is to adopt the default values for pathways of little importance, but to take care to base transfer coefficients for the dominant exposure pathways on site-specific information.
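The linear chain described above, from a soil concentration through transfer coefficients to intake and then to risk, can be sketched in a few lines. Every numerical value below is a hypothetical placeholder, not an EPA or USNRC default; the structure of the calculation, not the numbers, is the point.

```python
# Sketch of the linear "box model" risk calculation (illustrative values only).

def media_concentrations(soil_conc, transfer):
    """Propagate a soil concentration (mg/kg) through linear transfer
    coefficients to other environmental media."""
    return {medium: soil_conc * tc for medium, tc in transfer.items()}

def daily_intake(conc, ingestion_rate, body_weight=70.0):
    """Chronic daily intake (mg/kg-day) from one exposure pathway."""
    return conc * ingestion_rate / body_weight

# Hypothetical transfer coefficients: soil -> produce, soil -> dust in air
transfer = {"produce_mg_per_kg": 0.05, "dust_mg_per_m3": 1e-7}
conc = media_concentrations(soil_conc=100.0, transfer=transfer)

# Carcinogen: linear no-threshold, risk = intake x slope factor
intake = daily_intake(conc["produce_mg_per_kg"], ingestion_rate=0.2)  # kg/day
slope_factor = 1.5e-2            # (mg/kg-day)^-1, illustrative
cancer_risk = intake * slope_factor

# Non-carcinogen: hazard quotient = intake / reference dose (threshold model)
reference_dose = 3e-4            # mg/kg-day, illustrative
hazard_quotient = intake / reference_dose

print(f"cancer risk ~ {cancer_risk:.1e}, hazard quotient ~ {hazard_quotient:.2f}")
```

A spreadsheet implements exactly the same multiplications, which is why the two approaches are so often interchangeable in practice.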
Risk modeling can be carried out either with a computer program or on a spreadsheet. At non-DOE contamination sites, spreadsheet analysis is more common. A recent study commissioned by DOE (Regens et al., 1999) has evaluated RESRAD, MMSOILS and MEPAS and compared them with the spreadsheet approach. It found that the computer models had little or no practical advantage over spreadsheets in usability and efficiency.
Groundwater Transport Models
It is now widely recognized that the subsurface is a complex, multiscale, spatially variable natural environment that can never be fully characterized. Hence the results of even the most thorough site characterization and monitoring efforts are ambiguous and uncertain. To address uncertainties, it has become common to analyze hydrogeologic data statistically and to treat flow and transport stochastically. The most common and straightforward method of stochastic flow and transport analysis involves repeated simulations by means of detailed numerical models in which the material properties (such as permeability and porosity) and forcing terms (sources and boundary conditions) vary randomly from one simulation to another. Permeability and porosity are known to be spatially auto- and cross-correlated on a variety of scales. By taking account of such correlations, and forcing the random variables to conform to measurements, one obtains conditional Monte Carlo solutions to the stochastic flow and transport problems. Averaging these solutions yields optimum unbiased predictors of system behavior under uncertainty; calculating their variance yields a measure of predictive uncertainty.
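A minimal sketch of the conditional Monte Carlo procedure just described: log-permeability in a 1-D column is treated as a correlated random field, conditioned on a single measurement, and sampled repeatedly, and the ensemble mean and variance of the resulting travel time serve as the predictor and the uncertainty measure. The grid, covariance model, and all parameter values are illustrative assumptions, and a real application would use a full 3-D flow and transport model in place of the series-flow shortcut.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D vertical column of n cells; exponential correlation for ln K
n, dz, corr_len = 50, 0.5, 2.0               # cells, cell size (m), corr. length (m)
z = np.arange(n) * dz
C = np.exp(-np.abs(z[:, None] - z[None, :]) / corr_len)   # correlation matrix
mean_lnK, sigma_lnK = np.log(1e-6), 1.0      # prior mean/std of ln K (K in m/s)

# Condition the field on one measurement: ln K is known at cell 0
obs_idx, obs_val = 0, np.log(5e-7)
cond_mean = mean_lnK + C[:, obs_idx] * (obs_val - mean_lnK)
cond_cov = sigma_lnK**2 * (C - np.outer(C[:, obs_idx], C[obs_idx, :]))

# Conditional Monte Carlo: every realization honors the measurement
n_real = 500
lnK = rng.multivariate_normal(cond_mean, cond_cov, size=n_real,
                              check_valid="ignore")
K = np.exp(lnK)

# Advective travel time through the column for each realization (layers in series)
porosity, head_grad = 0.3, 0.01
K_eff = n / np.sum(1.0 / K, axis=1)          # harmonic mean of layer K
travel_time = (n * dz) * porosity / (K_eff * head_grad)   # seconds

# Ensemble mean = unbiased predictor; ensemble variance = predictive uncertainty
print(f"mean travel time {travel_time.mean():.2e} s, std {travel_time.std():.2e} s")
```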
Vadose Zone Models
Virtually all existing multimedia risk assessment models view fluid flow and radionuclide transport in the vadose zone as moving vertically downward at a uniform and steady rate. Though many recognize that this conceptual model is oversimplified, it is often defended as being conservative, in that mathematical models based upon it overpredict contaminant concentrations at receptor locations and associated risks. Vadose zone modeling is important in arid and semiarid environments where unsaturated soil conditions may prevail to considerable depths, as at the Hanford Site in Washington, the Idaho National Engineering and Environmental Laboratory, and the Nevada Test Site (including Yucca Mountain, the site currently being evaluated as a potential geological repository for high-level wastes and commercial spent nuclear fuel). It is much less important in moderate and humid environments where the vadose zone tends to be shallow and hydrologic variables can be monitored effectively, with relative ease, at and below the water table.
The transfer of contaminants from the immobile soil phase to groundwater is generally modeled with relatively simple analytical expressions. The choice among these expressions depends on the physical and chemical form of the contamination. Radioactive wastes are generally solids. Two commonly used models are the “leach-limited” and “solubility-limited” models. In the “leach-limited” model, the radionuclides are considered to be incorporated into a solid matrix (crystalline or non-crystalline) that releases minor impurities into groundwater as it alters or dissolves. All radioactive species in the matrix are released in proportionate amounts. In the “solubility-limited” model, the concentration of each radioelement in groundwater is equal to or less than its solubility.
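Both release models reduce to simple algebra. The sketch below, with entirely hypothetical inventories, solubilities, and rates, computes a leach-limited (congruent) release and a solubility-limited release for each radionuclide; taking the smaller of the two, a common way of combining the constraints (not stated in the text), gives the governing rate.

```python
# Hypothetical inventories (mol) and elemental solubilities; illustrative only
inventory = {"Tc-99": 2.0, "U-238": 50.0}
solubility = {"Tc-99": 1e-3, "U-238": 1e-8}    # mol per m^3 of groundwater

matrix_alteration_rate = 1e-5      # fraction of the solid matrix altered per year
water_flux = 0.05                  # m^3 of groundwater contacting waste per year

# Leach-limited: every species is released congruently as the matrix alters
leach_release = {nuc: matrix_alteration_rate * m for nuc, m in inventory.items()}

# Solubility-limited: release capped at elemental solubility times water flux
solubility_release = {nuc: solubility[nuc] * water_flux for nuc in inventory}

# The smaller rate for each nuclide combines the two constraints
for nuc in inventory:
    governing = min(leach_release[nuc], solubility_release[nuc])
    print(f"{nuc}: {governing:.1e} mol/yr")
```

With these placeholder numbers the soluble nuclide is leach-limited while the sparingly soluble one is solubility-limited, which is the kind of distinction the choice of model is meant to capture.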
Similarly there are two alternative models for dissolution of organics. If a contaminant is adsorbed to soil particles, the concentration in groundwater will be proportional to the concentration in the soil. On the other hand, if a separate non-aqueous phase liquid (NAPL) is present, the concentration in groundwater in direct contact with the NAPL will be equal to the compound's “effective solubility.” The effective solubility is, approximately, the product of the solubility of the pure compound in water multiplied by the fraction of the NAPL that the compound constitutes. When, as is usual, the NAPL is present in disconnected zones of residual contamination, there will be dilution due to the fact that only some of the water that passes through the source will come into direct contact with the NAPL.
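The effective-solubility relation and the contact-fraction dilution described above amount to two multiplications. In the sketch below the mole fraction and contact fraction are hypothetical; the pure-compound solubility is of the order reported for trichloroethylene.

```python
# Effective solubility of one compound dissolving out of a NAPL mixture.
# Mole fraction and contact fraction are hypothetical illustration values.

def effective_solubility(pure_solubility, mole_fraction):
    """Raoult's-law-style estimate: pure aqueous solubility (mg/L)
    scaled by the compound's fraction of the NAPL."""
    return pure_solubility * mole_fraction

s_eff = effective_solubility(pure_solubility=1100.0, mole_fraction=0.10)

# Residual NAPL in disconnected zones: only part of the through-flowing
# water contacts the NAPL, diluting the source-averaged concentration
contact_fraction = 0.2
source_conc = s_eff * contact_fraction

print(f"effective solubility {s_eff:.0f} mg/L, "
      f"source-averaged concentration {source_conc:.0f} mg/L")
```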
CONSTRAINTS AND LIMITATIONS OF MODELS
The models used in support of site decisions are necessarily imperfect reflections of the real environment. Some major limitations of currently available models are described in this section.
Risk Assessment Models
The 1998 Consortium for Environmental Risk Evaluation (CERE) study (Regens et al., 1999) examined the applicability of multimedia risk assessment models to real DOE sites through two case studies using actual data. One site examined was a solid waste storage area at Oak Ridge National Laboratory (ORNL) in Tennessee with trenches containing alpha-contaminated low-level waste, remote-handled transuranic wastes deposited in concrete casks and combination (wood/metal) boxes, and a small number of steel drums. The other case study concerned Operable Unit 2 at the Rocky Flats Environmental Technology Site (RFETS) in Colorado, which contains drums of radioactive-contaminated oils and solvents, plutonium-239 contaminated soils, liquid chemical waste in disposal trenches, and an inactive Reactive Metal Destruction site. The model comparison indicated that 1) the exposure and risk assessment frameworks in all three models follow DOE, EPA, and USNRC guidelines; 2) major differences between the models stem from their differing objectives, and where the models' capabilities overlap, the remaining differences arise from the formulation of their transport components; 3) the models yield results that differ by up to three or four orders of magnitude; 4) the models are in many ways similar to traditional spreadsheet analyses; and 5) the primary benefit of the screening-level risk assessment process is to identify chemicals and pathways that make the largest contributions to overall risk.
Spreadsheets (or paper-and-pencil calculations) are much more flexible than computer models. For example, all existing multimedia models consider a single source for each surface water pathway. At large DOE sites like ORNL and the Savannah River Site (SRS) in South Carolina, contaminant loading to surface water bodies is likely to involve creeks and rivers intersecting several contaminant plumes at various locations, and surface runoff from multiple sources may impact a single stream at several points. Other calculations that existing models cannot handle include the combination of stream flows and contaminant loadings as tributary creeks flow into larger creeks and streams, sediment uptake, and contaminant decay processes.
Another great disadvantage of the computer models is that the assumptions (where things most commonly go wrong) are buried in the computer code. This creates a strong presumption in favor of default assumptions, which can easily go wrong. For example, if local populations engage in subsistence fishing, the default value will underestimate fish consumption and lead the modeler to overlook an important pathway due to bioaccumulation in fish. If vacation homes have water intake pipes that lie on the bed of a lake, default assumptions about mixing in the lake will cause the homeowners' exposure to groundwater that discharges into the lake to be underestimated. The way to uncover such mistakes is to have the widest possible review and criticism. Review by local community members, who are often more familiar with the realities of a site than outside experts, is especially valuable. The effect of using a computer program rather than a spreadsheet (or paper-and-pencil calculation) to do the risk assessment is that the assumptions that most need review are hidden where they are least accessible.
In general, CERE found that the advantages of multimedia models are not as great as anticipated. Differing objectives and lack of transparency make model application difficult; application to real situations may require considerable ingenuity and expertise on the part of the user. CERE also observed that multimedia models do not provide absolute estimates of risk, but rather conditional estimates based on multiple assumptions about source term, environmental settings, transport characteristics, exposure scenarios, toxicity, and other variables. While the magnitudes of the risk estimates produced by multimedia models differ, the models do tend to agree on the most significant contaminants and the most important pathways of exposure. These observations would be equally applicable to spreadsheets and other methods of risk assessment that do not rely on computer programs.
Groundwater Transport Models
In principle, the Monte Carlo method of uncertainty analysis should be easy to implement in conjunction with a risk assessment methodology of the kind just described. The only potential obstacle to such implementation is the large amount of computer time that may be required to repeat detailed hydrologic model simulations many times so as to generate a meaningful statistical sample of equally likely random flow and transport solutions. The large amount of computer time required by conditional Monte Carlo simulations conducted by means of detailed, state-of-the-art hydrologic models is often cited as a justification either for forgoing such simulations completely (and with them, the opportunity to quantify prediction uncertainty) or for using highly simplified models. It is the consensus of many hydrologists that, given the critical importance of groundwater flow and transport models in assessing risks and hazards from subsurface contamination, it is better to run a small number of simulations with detailed models that incorporate the known physics and geology of the sites than a large number of simulations with oversimplified models that may disregard crucial information.
Deterministic models are unable to account for uncertainties in input data and therefore yield outputs (such as contaminant concentrations, exposure doses, and risks) of unknown reliability. Without providing quantitative information about the uncertainty (hence reliability) of its outputs, a model cannot be used to assess 1) the worth of additional data through site characterization, 2) the reliability of a proposed environmental monitoring system, or 3) the uncertainty associated with predicted site performance measures (such as future contaminant concentrations, doses, and risks). Hence, uncertainty analysis must be an integral part of any future performance or risk assessment effort by the DOE. When the main uncertainties are quantifiable, the simplest way to accomplish this is to operate the corresponding models in a conditional Monte Carlo mode as described earlier.
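Once an ensemble of Monte Carlo outputs exists, the uncertainty measures listed above are straightforward to extract. The sketch below uses a synthetic lognormal ensemble as a stand-in for model output; the regulatory limit and all distribution parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for Monte Carlo output: 2,000 equally likely predictions
# of peak contaminant concentration at a receptor well (ug/L)
ensemble = rng.lognormal(mean=np.log(4.0), sigma=0.8, size=2000)

predictor = ensemble.mean()                  # unbiased predictor of concentration
variance = ensemble.var()                    # quantitative measure of uncertainty
lo, hi = np.percentile(ensemble, [5, 95])    # 90% prediction interval

limit = 20.0                                 # hypothetical regulatory limit, ug/L
p_exceed = (ensemble > limit).mean()         # fraction of realizations over limit

print(f"predictor {predictor:.1f} ug/L, 90% interval [{lo:.1f}, {hi:.1f}] ug/L, "
      f"P(exceedance) = {p_exceed:.3f}")
```

A probability of exceedance of this kind is precisely the quantity a deterministic run cannot supply, and it is also the natural input to data-worth and monitoring-design questions.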
But some major sources of uncertainty are difficult to quantify. A model may reflect an inaccurate conceptual-mathematical representation of site hydrology and subsurface transport processes. For example, long-established conceptual and mathematical models of groundwater flow have come into question at the Nevada Test Site. Groundwater in southern Nevada flows long distances, often passing through several topographic basins between recharge and discharge. For many years, models of this system were based on a conceptual framework originally established by Winograd and Thordarson (1975), who were unable to determine the northern boundary of the flow system because they had very few data north of the test site. Maps in their report ended at 38°N latitude, well beyond the limit of their data. Subsequent studies that used isotopic variations to infer the origin of groundwater considered only recharge areas within the boundaries of the Winograd and Thordarson maps. Water found in parts of the test site with low concentrations of oxygen-18 and carbon-14 was interpreted as water that had recharged in the pluvial conditions of the late Pleistocene, about 10,000 years ago (Claassen, 1985). This conclusion implied that groundwater moves very slowly in the test site area. Recently, Davisson et al. (1999) proposed a new interpretation in which most water with low oxygen-18 concentrations originated in recharge areas north of the area studied by Winograd and Thordarson. The low carbon-14 content of this water is explained in this view by isotopic exchange with carbonate rocks. This interpretation suggests much greater speeds of groundwater movement. Whichever interpretation of the southern Nevada flow system ultimately turns out to be correct, this story illustrates how a concept initially introduced as an unverified simplification can become embedded in scientific thinking as an unexamined assumption that greatly influences conclusions.
As another example, actinides such as plutonium and americium are strongly adsorbed or insoluble in laboratory experiments, and most computer models assume that actinides only move when they are in a dissolved state. The assumption of thermodynamic equilibrium between dissolved and adsorbed phases implies that actinides move very slowly in the subsurface. But sorption on colloidal particles of clay, silica, or organic material may significantly enhance their mobility. Two wells completed in the vicinity of the TYBO underground nuclear test site on Pahute Mesa, at the Nevada Test Site, were sampled as they were pumped. The sampling revealed the presence of plutonium, in association with colloids, at significant concentrations in well ER-20-5 #1, 278 m west of TYBO at a depth of 860 m, and at very small concentrations in a deeper aquifer penetrated by well ER-20-5 #3, 30 m south of #1 at a depth of 1,309 m (Kersting et al., 1999).
Vadose Zone Models
To better understand fluid flow and contaminant transport processes in the vadose zone, one must recognize that unsaturated soils and rocks form part of a complex three-dimensional, multiphase, heterogeneous, and anisotropic hydrogeologic system. This system does not constitute a perfect sequence of horizontal layers with homogeneous properties as would be needed for flow and transport to be uniform in the vertical direction. If it did, flow and transport rates would be controlled by the least permeable layer and would therefore be correspondingly low.
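The claim that a perfectly stratified column is controlled by its least permeable layer follows from the series-flow formula: the effective vertical conductivity of equal-thickness layers is their harmonic mean, which is dominated by the smallest value. A short illustration with hypothetical conductivities:

```python
import numpy as np

# Equal-thickness horizontal layers in series; one clay-like layer
K = np.array([1e-4, 1e-4, 1e-8, 1e-4, 1e-4])   # hydraulic conductivity, m/s

# Effective vertical conductivity is the harmonic mean of the layer values
K_eff = K.size / np.sum(1.0 / K)
print(f"K_eff = {K_eff:.2e} m/s")              # close to n times the minimum K

# Without the clay layer the column conducts four orders of magnitude faster
K_sand = K[K > 1e-8]
K_eff_no_clay = K_sand.size / np.sum(1.0 / K_sand)
print(f"without clay layer: {K_eff_no_clay:.2e} m/s")
```

Real heterogeneity lets flow bypass such a layer laterally, which is exactly why the stratified idealization can badly underestimate downward fluxes.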
In reality, unsaturated medium properties vary spatially in a complex manner, which often allows fluids and contaminants to move around low-permeability obstacles much faster than would be possible in the perfectly stratified case. Preferential flow through high-permeability channels, the formation of unstable fingers, and development of fractures can further enhance the rate of contaminant migration from a source in the vadose zone to the water table. Preferential flow and fingering have been widely documented in laboratory and field studies, demonstrated numerically, and explained theoretically (Chen et al., 1995). Birkholzer and Tsang (1997) have shown numerically that solutes in randomly heterogeneous unsaturated soils migrate rapidly along narrow channels, which are random and vary dynamically with the flow regime. Ignoring these and other phenomena such as the intermittence of infiltration, by assuming that flow is perfectly uniform and vertical as is done in existing multimedia models, renders these models nonconservative in that they underestimate (rather than overestimate, as claimed erroneously by their adherents) contaminant mass flow rates through the vadose zone. Another complicating factor that needs to be considered at more humid sites such as ORNL and SRS is the possibility that contaminants could seep laterally through the soils in a shallow unsaturated zone and into small surface depressions as has been observed in the field, and explained theoretically, by Zaslavsky and Sinai (1981). In addition, flow and contaminant transport in the vadose zone are not always directed downward toward the water table.
A panel of four experts concluded that characterization of the vadose zone is an essential step toward understanding contamination of the groundwater, assessing the resulting health risks, and defining the concomitant groundwater monitoring program needed to verify the risk assessments (Conaway et al., 1997). The panel concluded that reliable computer models of groundwater contamination could not be developed without reliable data on the transport of contaminants within the vadose zone. As that subject is poorly understood, previous and ongoing computer modeling efforts are inadequate and based on unrealistic and sometimes optimistic assumptions that render their output entirely unreliable.
Downward migration from the Hanford Site tanks provides a strong warning about the dangers of oversimplifying the vadose zone (U.S. Department of Energy, 1998). Because DOE had assumed that wastes would move slowly, if at all, through the vadose zone, it never issued a comprehensive plan to assess vadose zone conditions at Hanford and funded few studies of flow or transport through it. Experts have repeatedly advised DOE that its concept of vadose zone hydrology was potentially flawed, but the expert advice remained unheeded for a long time.
Beginning in 1994, DOE's Grand Junction Office, using technology developed to detect uranium ore deposits, performed tests in about 800 boreholes in the single-shell tank farms at Hanford. The tests were intended to provide baseline information about the distribution of certain radioactive wastes, but they also enabled the team to identify radioactive substances at considerable depths in the vadose zone. The team found indications of possible new leaks in some tank farms and deep contamination by some radionuclides in several farms. Cesium was discovered at a depth of 125 ft below one single-shell tank farm, and just above the water table under another tank farm. After deepening the well near the first farm, DOE found cesium at a depth of 142 ft and technetium at a depth of 177 ft (Rust Geotech, 1996). A study by the Los Alamos National Laboratory has shown leaks at one farm to be three to six times greater than previously reported. A January 1998 report by Pacific Northwest National Laboratory has shown that wastes from one farm have reached groundwater.
In December 1997, the DOE announced publicly that highly radioactive wastes from previously leaking underground storage tanks had migrated all the way down to groundwater. DOE now acknowledges that there are significant uncertainties and data gaps in its understanding of the inventory, distribution, and movement of contaminants in the vadose zone at Hanford. Yet the agency is only now starting to develop a comprehensive strategy for investigating the vadose zone (U.S. Department of Energy, 1998, 1999).
To properly model leaching of a contaminant into groundwater, one must select a model that corresponds to the physical and chemical state of the contaminant. Order-of-magnitude errors can result if this is done incorrectly. The dissolution of radioactive wastes will be underestimated if the mineral that is assumed to limit solubility does not precipitate, either because of kinetic constraints or because the oxidation state of the element has not been correctly identified. Organic contaminant dissolution is frequently modeled by an adsorption-based equation. When NAPL is present, the adsorption-based equations may greatly overestimate the dissolution rate. Johnson et al. (1990) observe that the NAPL model is almost always better for hydrocarbon spills, and comment that the frequent use of the adsorption equation in modeling is “due to its mathematical characteristics, rather than any model validation. . . .”
Compatibility of Models with Measurements
It is essential to ensure that models are consistent with field measurements of environmental variables. This is particularly important in using multimedia models, whose input and output variables frequently are not directly observable. In order to assess potential risks and hazards from residual contaminants under various cleanup and land/water use scenarios, one should ideally have detailed information about their nature, quantity and location; the manner and rate at which they could be mobilized to migrate toward human and/or ecological receptors; the pathways and rates of their migration; their concentrations at receptor locations; associated doses to receptors; and their effects on receptor health. In reality, information about current site conditions is limited and so is the ability of models to predict future conditions at most sites. These limitations stem from the fact that environmental and bioecological processes, which control contaminant behavior and its health effects at most sites, are extremely complex and therefore exceedingly difficult to describe.
The simplified multimedia models described above often have hidden built-in assumptions that will lead to errors at sites where they do not apply. Because such a model can neither be applied directly to real data nor confirmed experimentally, it is difficult to apply correctly and nearly impossible to evaluate. Use of multimedia models should be confined to problems where the multimedia models incorporate a state-of-the-art submodel (such as direct exposure to gamma radiation in RESRAD) or where assumptions about non-measurable variables are imposed by regulatory fiat (such as the cancer risk factors determined by EPA).
The principle of parsimony should be used to differentiate between alternative operational models. This principle states that among all operational models that one can use to explain a given set of experimental data, one should select the model that is conceptually least complex and involves the smallest number of unknown (fitting) parameters. (This principle is a form of the scientific and philosophical rule known as Occam's razor, which holds that the simplest of competing theories should be preferred to the more complex.) When the database is limited and/or of poor quality, one has little justification for selecting an elaborate model with numerous parameters. Instead, one should prefer a simpler model with fewer parameters that nevertheless adequately reflects the underlying hydrogeologic structure of the system and the corresponding flow and transport regime. An inadequate model structure (conceptualization) is far more detrimental to a model's predictive ability than is a suboptimal set of parameter values.
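Parsimony can be made quantitative with an information criterion such as AICc (not mentioned in the text, but one common formalization), which rewards goodness of fit while penalizing the number of fitted parameters; with sparse data the penalty overwhelms the marginal improvement in fit from an elaborate model. A sketch with synthetic data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Eight sparse, noisy "observations" generated from a simple linear trend
x = np.linspace(0.0, 10.0, 8)
y = 2.0 + 0.5 * x + rng.normal(scale=0.3, size=x.size)

def aicc(y, y_fit, k):
    """Small-sample corrected Akaike criterion for a least-squares fit
    with k fitted parameters (one common convention; Gaussian errors)."""
    n = y.size
    rss = np.sum((y - y_fit) ** 2)
    return n * np.log(rss / n) + 2 * k + 2 * k * (k + 1) / (n - k - 1)

scores = {}
for degree in (1, 5):                  # 2-parameter model vs 6-parameter model
    coeffs = np.polyfit(x, y, degree)
    scores[degree] = aicc(y, np.polyval(coeffs, x), k=degree + 1)

# The elaborate model fits the 8 points more closely but is penalized far
# more heavily; the simple trend is preferred (lower AICc)
best = min(scores, key=scores.get)
print("preferred polynomial degree:", best)
```

The same trade-off applies to hydrogeologic models: extra parameters always reduce the fitting residual, but with a limited database they mostly fit the noise.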
Risk, Values, and Decision-Making
Decisions that balance risk against cost and other values are among the hardest choices that public officials are called on to make. The difficulties of measuring and communicating risk compound the difficulties created by the need to balance incommensurate values held by different individuals and even within the same individual.
When day-to-day decisions are made about known present-day exposures to chemicals or radiation, the difficulties of doing a risk assessment are frequently avoided by relying on exposure guidelines. The difficult balancing of risk, cost, and uncertainty has already been done by the regulatory agency that set the guidelines. However, decisions about site disposition involve future risks, where not only is the effect on human health of an exposure uncertain, but it is impossible to know whether the exposure will even occur. Thus, while regulatory guidelines can be very useful for making decisions, especially where a conservative analysis predicts exposures below present-day limits, they cannot solve all problems.
The complexities of risk assessment have been the theme of a series of National Research Council reports, including Risk Assessment in the Federal Government: Managing the Process (National Research Council, 1983) and Science and Judgment in Risk Assessment (National Research Council, 1994). A recent report entitled Understanding Risk: Informing Decisions in a Democratic Society (National Research Council, 1996) directly addressed the question of how risk assessments can be made useful in public decision-making. This report concludes that “risk characterization should be a decision-driven activity, directed toward informing choices and solving problems.” The report emphasizes the need for risk characterization to consider the values and interests of all interested and affected parties. It describes risk characterization not as a purely technical analysis, but as:
the outcome of an analytic-deliberative process. Its success depends critically on systematic analysis that is appropriate to the problem, responds to the needs of interested and affected parties, and treats uncertainties of importance to the decision problem in a comprehensible way. Success also depends on deliberations that formulate the decision problem, guide analysis to improve decision participants' understanding, seek the meaning of analytic findings and uncertainties, and improve the ability of interested and affected parties to participate effectively in the risk decision process. The process must have an appropriately diverse participation or representation of the spectrum of interested and affected parties, of decision-makers, and of specialists in risk analysis at each step.
The imperfections of risk assessment as a tool for predicting the long-term behavior of wastes at these sites make this recommendation particularly relevant for decision-making about site disposition. Because calculations of long-term risk necessarily rely on unverifiable assumptions about the future behavior of people and institutions, it is essential that the assumptions made in the analysis be widely understood by, and acceptable to, the parties involved in the decision.
There has been a tendency by the DOE and some other agencies to rely excessively on models in the context of waste disposal and site contamination issues. Models have been used repeatedly to “demonstrate” that a potential waste disposal site or remedial option complies with regulations and is therefore “safe.” More often than not, the ability of models to provide such safety assurances has been taken for granted without a serious attempt to validate them against site data. This is especially true of one-dimensional “multimedia” or “multiple-pathway” dose and/or risk assessment models such as RESRAD, MMSOILS, MEPAS, and DandD (Beyeler et al., 1998; Gallegos et al., 1998). These models are based on a limited menu of highly simplified conceptual models, are often used (for screening as well as more advanced investigative purposes) with generic parameters and inputs rather than site-specific data, and are virtually never compared against actual site conditions. The same is true, albeit to a lesser extent, of more complex two- and three-dimensional subsurface flow and contaminant transport models that incorporate various details of site geology. The tendency has been to rely on models at the expense of detailed site investigations, site monitoring, and field experimentation. In fact, models have often been used to “demonstrate” that additional site or experimental data would be of little value to a project. The reasons for this state of affairs are easily identified as regulatory and budgetary pressures.
It is often tempting to “demonstrate” by means of a model that a given waste disposal or remedial option is safe, or that additional site data would be of little value, by basing the model on assumptions, parameters, and inputs that favor a predetermined outcome. A common example of such bias is the assignment of lower permeabilities to a groundwater flow model than the available data justify. It is likewise tempting to lend a model an appearance of credibility by basing it on a single system conceptualization and subjecting it to sensitivity and uncertainty analyses in which parameters and input variables are constrained to vary within narrower ranges than the available information warrants. Such practices are common and ultimately detract from the credibility of the agencies that employ them.
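The effect of such range-narrowing can be demonstrated numerically. In this hedged sketch (all parameter values, ranges, and geometry are invented for illustration), sampling hydraulic conductivity over an artificially restricted low-permeability range both lengthens the predicted travel times and shrinks their apparent spread:

```python
import numpy as np

# Hypothetical illustration: restricting a parameter to a narrower range
# than the data support can make a predicted outcome look both more
# favorable and more certain than it really is. The parameter here is
# log10 hydraulic conductivity (m/day); the outcome is advective travel
# time to a downgradient receptor. All numbers are invented.

rng = np.random.default_rng(1)
n = 20_000
gradient, porosity, distance = 0.01, 0.3, 100.0     # assumed constants

def travel_time_years(log10_K):
    """Advective travel time (yr) over `distance` for K = 10**log10_K m/day."""
    velocity = 10.0 ** log10_K * gradient / porosity   # m/day
    return distance / velocity / 365.25

wide = travel_time_years(rng.uniform(-3.0, -1.0, n))    # data-supported range
narrow = travel_time_years(rng.uniform(-3.0, -2.5, n))  # biased toward low K

for name, tt in (("wide", wide), ("narrow", narrow)):
    print(f"{name:6s}: 5th-95th percentile travel time "
          f"{np.percentile(tt, 5):,.0f}-{np.percentile(tt, 95):,.0f} yr")
# The biased range yields uniformly long travel times and a tighter band,
# making the site appear safer and better characterized than the data allow.
```

Running the sampling both ways, as here, is itself a useful audit: if conclusions change when parameter ranges are widened to their data-supported limits, the original analysis was not robust.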
Models are appropriate, often essential, tools for risk assessment and decision-making concerning cleanup and management of contaminated, or potentially contaminated, sites. However, it is inappropriate to use models as “black boxes” without tailoring them to site conditions and basing them firmly on site data. Neither disregard of models nor overreliance on them is desirable.
The environment constitutes a complex system that can be described neither with perfect accuracy nor with complete certainty. It is imperative that uncertainties in system conceptualization and model parameters and inputs be properly assessed and translated into corresponding uncertainties in risk and decisions concerning risk management. The quantification of uncertainties requires a statistically meaningful amount of quality site data. Where sufficient site data are not obtainable, uncertainty must be assessed through a rigorous critical review and sensitivity analyses.
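One way to carry conceptualization uncertainty into the risk estimate, sketched below with invented model names, weights, and dose statistics, is to average predictions over several alternative conceptual models rather than committing to a single "preferred" one:

```python
import numpy as np

# Hypothetical sketch (all names and numbers invented): uncertainty in the
# system conceptualization itself is propagated into the risk estimate by
# weighting predictions from several alternative conceptual models.

rng = np.random.default_rng(7)
n = 50_000

# Each alternative conceptual model yields its own predicted peak-dose
# distribution (lognormal, mrem/yr), conditioned on the same site data.
models = {
    "uniform-aquifer":  {"weight": 0.5, "median": 2.0, "gsd": 1.8},
    "layered-aquifer":  {"weight": 0.3, "median": 5.0, "gsd": 2.2},
    "fracture-pathway": {"weight": 0.2, "median": 12.0, "gsd": 3.0},
}

names = list(models)
weights = np.array([models[m]["weight"] for m in names])

# Sample a conceptual model according to its weight, then a dose from it.
picks = rng.choice(len(names), size=n, p=weights)
dose = np.empty(n)
for i, m in enumerate(names):
    sel = picks == i
    dose[sel] = rng.lognormal(np.log(models[m]["median"]),
                              np.log(models[m]["gsd"]), sel.sum())

print(f"median predicted dose: {np.median(dose):.1f} mrem/yr")
print(f"95th percentile dose:  {np.percentile(dose, 95):.1f} mrem/yr")
# The mixture is broader, with a heavier upper tail, than any single
# model's distribution, making conceptual uncertainty visible in the risk.
```

The weights themselves are judgments that should be defended openly; the point of the exercise is that the resulting risk distribution no longer hides the choice of conceptual model.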
Models and their applications must be transparent to avoid hidden assumptions. Model results must not be accepted blindly because hidden assumptions are easily manipulated to achieve desired outcomes.
Decisions concerning site disposition and risk management should account explicitly and realistically for lack of information and uncertainty.
The monitoring of site conditions and contamination is an imperfect art. It is important that uncertainty associated with monitoring results be assessed a priori and factored explicitly into site remedial design and post-closure management.
Where effective and affordable science and technology are not readily available for site characterization, remediation, monitoring, and analyses, the DOE should initiate and pursue vigorously a suitable research and development program. The goals of this program should be both short- and long-term. The program should engage a broad array of talents and specialties from government, industry, and academia in order to maintain a proper balance between disciplines and basic as well as applied research.
Beyeler, W.E., T.J. Brown, W.A. Hareland, S. Conrad, N. Olague, D. Brosseau, E. Kalimina, D.P. Gallegos, and P.A. Davis. 1998 (January 30). Review of Parameter Data for the NUREG/CR-5512 Residual Farmer Scenario and Probability Distributions for the DandD Parameter Analysis. Letter Report for NRC Project JCN W6227, U.S. Nuclear Regulatory Commission, Washington, D.C.
Birkholzer, J., and C-F. Tsang. 1997 (October 1). Solute channeling in unsaturated heterogeneous porous media. Water Resources Research 33(10):2221-2238.
Buck, J.W., G. Whelan, J.G. Droppo, Jr., D.L. Strenge, K.L. Castleton, J.P. McDonald, C. Sato, and G.P. Streile. 1995. Multimedia Environmental Pollutant Assessment System (MEPAS) Application Guidance. Pacific Northwest National Laboratory PNL-10395, Richland, Wash.
Buck, J.W., D.L. Strenge, B.L. Hoopes, J.P. McDonald, K.J. Castleton, M.A. Pelton, and G.M. Gelston. 1997. Description of Multimedia Environmental Pollutant Assessment System (MEPAS) Version 3.2 Modification for the Nuclear Regulatory Commission. U.S. Nuclear Regulatory Commission NUREG/CR-6566, Washington, D.C.
Chen, J.-J., C. Yu, and A.J. Zielen. 1991. RESRAD Parameter Sensitivity Analysis. Argonne National Laboratory ANL/EAIS-3, Argonne, Ill.
Chen, J.-J., J.G. Droppo, E.R. Failace, E.K. Gnanapragasam, R. Johns, G. Laniak, C. Lew, W. Mills, L. Owens, D.L. Strenge, J.F. Sutherland, C.C. Travis, G. Whelan, and C. Yu. 1995. Benchmarking Analysis of Three Multimedia Models: RESRAD, MMSOILS, and MEPAS. U.S. Department of Energy DOE/ORO-2033, Washington, D.C.
Chen, G., M. Taniguchi, and S.P. Neuman. 1995 (May). An Overview of Instability and Fingering During Immiscible Fluid Flow in Porous and Fractured Media. Report NUREG/CR-6308, prepared for U.S. Nuclear Regulatory Commission, Washington, D.C.
Claassen, H.C. 1985. Sources and Mechanisms of Recharge for Ground Water in the West-Central Amargosa Desert, Nevada: A Geochemical Interpretation. U.S. Geological Survey Professional Paper 712-F, Washington, D.C. 31 pp.
Conaway, J.G., R.J. Luxmoore, J.M. Matuszek, and R.O. Patt. 1997 (April). Tank Waste Remediation System Vadose Zone Contamination Issue: Independent Expert Panel Status Report. DOE/RL-97-49 Rev. 0, Richland, Wash.
Davisson, M.L., D.K. Smith, and T.P. Rose. 1999. Isotope hydrology of southern Nevada groundwater: Stable isotopes and radiocarbon. Water Resources Research 35(1):279.
Doctor, P.G., T.B. Miley, and C.E. Cowan. 1990. Multimedia Environmental Pollutant Assessment System (MEPAS) Sensitivity Analysis of Computer Codes. Pacific Northwest Laboratory PNL-7296, Richland, Wash.
Gallegos, D.P., T.J. Brown, P.A. Davis, and C. Daily. 1998. Use of DandD for Dose Assessment Under NRC's Radiological Criteria for License Termination Rule, p. 13-27. In T.J. Nicholson and J.D. Parrott [ed.] Proceedings of the Workshop on Review of Dose Modeling Methods for Demonstration of Compliance with the Radiological Criteria for License Termination. U.S. Nuclear Regulatory Commission NUREG/CP-0163, Washington, D.C.
Johnson, P.C., M.B. Hertz, and D.L. Byers. 1990. Estimates for hydrocarbon vapor emissions resulting from service station remediations and buried gasoline-contaminated soils, p. 295-326. In P.T. Kostecki and E. J. Calabrese [ed.] Petroleum Contaminated Soils, Vol. 3, Lewis Publishers, Chelsea, Mich.
Kersting, A.B., D.W. Efurd, D.L. Finnegan, D.J. Rokop, D.K. Smith, and J.L. Thompson. 1999 (January 7). Migration of plutonium in ground water at the Nevada Test Site. Nature 397:56-59.
National Research Council. 1983. Risk Assessment in the Federal Government: Managing the Process. Committee on the Institutional Means for Assessment of Risks to Public Health, National Academy Press, Washington, D.C. 191 pp.
National Research Council. 1994. Science and Judgment in Risk Assessment. Committee on Risk Assessment of Hazardous Air Pollutants, National Academy Press, Washington, D.C.
National Research Council. 1996. Understanding Risk: Informing Decisions in a Democratic Society. Committee on Risk Characterization, National Academy Press, Washington, D.C. 249 pp.
Regens, J.L., C. Travis, K.R. Obenshain, C. Whipple, J.T. Gunter, V. Miller, D. Hoel, G. Chieruzzi, M. Clauberg, and P.D. Wills. 1999. Multimedia Modeling and Risk Assessment. Medical University of South Carolina Press, Columbia, S.C.
Rust Geotech. 1996. Vadose Zone Monitoring Project at the Hanford Tank Farms: Tank Summary Data Report for Tank SX-112. Grand Junction Projects Office Report GJ-HAN-14, Tank SX-112, Grand Junction, Colo.
Streile, G.P., K.D. Shields, J.L. Stroh, L.M. Bagaasen, G. Whelan, J.P. McDonald, J.G. Droppo, and J.W. Buck. 1996. The Multimedia Environmental Pollutant Assessment System (MEPAS): Source-Term Release Formulations. Pacific Northwest National Laboratory PNNL-11248, Richland, Wash.
U.S. Department of Energy. 1996 (June). Remedial Design Report/Remedial Action Work Plan for the 100 Area. Richland Office DOE/RL-96-17, Richland, Wash.
U.S. Department of Energy. 1998 (December 17). Groundwater/Vadose Zone Integration Project Specifications. Richland Operations Office DOE/RL-98-48, Draft C, Richland, Wash.
U.S. Department of Energy. 1999 (June). Groundwater/Vadose Zone Integration Project: Volume I-Summary Description; Volume II-Science and Technology Summary and Description; and Volume III-Background Information and State of Knowledge. Richland Operations Office DOE/RL-98-48, Rev. 0, Richland, Wash.
U.S. Environmental Protection Agency. 1996. MMSOILS Model. Washington, D.C.
Whelan, G., J.P. McDonald, and C. Sato. 1996. Multimedia Environmental Pollutant Assessment System (MEPAS): Groundwater Pathway Formulations. Pacific Northwest National Laboratory PNNL-10907, Richland, Wash.
Winograd, I.J., and W. Thordarson. 1975. Hydrogeologic and Hydrochemical Framework, South-Central Great Basin, Nevada-California, with Special Reference to the Nevada Test Site. U.S. Geological Survey Professional Paper 712-C, Washington, D.C. 125 pp.
Yu, C., A.J. Zielen, J-J. Cheng, Y.C. Yuan, L.G. Jones, D.J. LePoire, Y.Y. Wang, C.O. Loureiro, E.K. Gnanapragasam, E. Faillace, A. Wallo III, W.A. Williams, and H. Peterson. 1993. Manual for Implementing Residual Radioactive Material Guidelines Using RESRAD Version 5.0. Argonne National Laboratory ANL/EAD/LD-2, Argonne, Ill.
Zaslavsky, D., and G. Sinai. 1981. Surface hydrology: I-V. Journal of the Hydraulics Division, American Society of Civil Engineers 107(HY1):1-93.