Future Modeling Issues
Modeling will continue to have a central role in future environmental regulatory activities because models sit at the nexus of science and policy (Gilman 2005). Critical for this endeavor will be how models incorporate the ever-increasing volumes of observations of natural and human processes and environmental impacts. Vast new measurement programs, in fields ranging from genomics to earth observation systems and at scales from the nano to the global, pose significant opportunities and challenges for modeling. Although observations alone can influence policy, it is the analysis of this information with models that will allow the full importance of these measurement programs to be realized.
Environmental regulatory modeling also will be greatly influenced by new scientific understandings and enhanced modeling technologies. The potential to incorporate greater understanding of environmental processes, such as the creation of airborne particulate matter from gaseous precursors and the physiological and pharmacokinetic absorption, disposition, metabolism, and excretion of a chemical in the body, is already offering great improvements to modeling capabilities. However, the new information and capabilities come at a time of increasing demand for greater scrutiny of regulatory activities by stakeholders and the public. Thus, improving environmental regulatory modeling does not necessarily imply using the most complex models. New modeling technologies, including developing modular modeling codes or user-friendly
programming languages, also can improve modeling transparency and can better match complexity needs to computational tools.
EXPANSION OF MEASUREMENT SYSTEMS
The relationship of models to measurements has been a critical issue throughout the history of modeling. The rapid increase of information about environmental processes, human-environment interactions, and human and environmental impacts brings new challenges to this relationship in the future. The spectrum of new information that will be available to the environmental regulatory process is vast and beyond the scope of this report. Two examples are discussed to indicate the diverse sources of information that have the potential to be available to modeling.
One end of the spectrum could be considered the genomics revolution, which has enabled the analysis of all the genes in a cell at the DNA, mRNA, protein, or metabolite level (NRC 2006b). These tools can be used to better understand the susceptibility of individuals or subpopulations to chemicals, as well as their responses to chemicals (toxicogenomics). For example, genomics tools provide a means to examine changes in gene expression and to examine how these indicators might be used to understand human health impacts (EPA 2004g). Although the capability to understand the potential for toxicants to affect human genes has been present for many years, the innovation of high-throughput testing technologies has profoundly expanded the capability to measure genomic changes (NRC 2006b). The dramatically increasing amounts of information from genomic technologies have spawned a new science called informatics to enable orderly analysis of vast data sets. At this time, informatics includes a wide variety of statistical and other computational models at the “research” level rather than the “regulatory” level. However, substantially more sophisticated computational toxicology methods, including the use of computational models of biological systems and phenomena, will be needed to link genomics data to quantitative estimates of human health risks before the full potential of this information can be realized (NRC 2006b).
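To make the informatics idea concrete, the following is a minimal sketch of the kind of statistical screen applied to high-throughput expression data. All gene names, expression values, and the 2-fold threshold are hypothetical choices for illustration, not drawn from any real toxicogenomics study.

```python
import math

# Hypothetical expression levels (arbitrary units) for a handful of genes,
# measured in control and chemically exposed samples. Real genomic data
# sets span tens of thousands of genes across many replicates.
control = {"geneA": 100.0, "geneB": 50.0, "geneC": 80.0}
exposed = {"geneA": 400.0, "geneB": 55.0, "geneC": 10.0}

def log2_fold_change(exposed_level, control_level):
    """Standard log2 ratio used to screen for differential expression."""
    return math.log2(exposed_level / control_level)

def flag_changed_genes(control, exposed, threshold=1.0):
    """Return genes whose expression shifts by more than 2-fold
    (|log2 ratio| > 1) between the two conditions."""
    flagged = {}
    for gene in control:
        lfc = log2_fold_change(exposed[gene], control[gene])
        if abs(lfc) > threshold:
            flagged[gene] = lfc
    return flagged

changed = flag_changed_genes(control, exposed)
# geneA is up-regulated 4-fold (log2 ratio = 2.0); geneC is down-regulated
# 8-fold (log2 ratio = -3.0); geneB's small change falls below threshold.
```

A real pipeline would add replicate handling, significance testing, and multiple-comparison correction before any result could inform a risk estimate, which is precisely the gap between “research” and “regulatory” use noted above.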
Another end of the spectrum of measurement systems that will influence regulatory modeling is the rapid increase in data from environmental satellites and weather observation systems (Foley 2005). The information from these systems provides a truly global climate observation system as well as highly resolved spatial and temporal observations of meteorological phenomena (Bates 2004). Such measurements may help to discern information on climatic variability, water resources, ecosystem changes, air pollution episodes, and a wide array of other possible applications. Although the sheer volume of data creates unprecedented challenges for data-handling operations, a more fundamental challenge is the scientific use of this information (Kahn 1997).
IMPROVEMENTS IN MODEL METHODS AND TECHNOLOGIES
As with the new measurement systems that are potentially available, a wide range of modeling approaches and technologies is increasingly being applied in the environmental regulatory setting. Again, the spectrum of possible technologies and methods is vast and beyond the scope of this report. The committee discusses two areas as examples: integrated environmental modeling approaches and user-friendly modeling technologies.
One area is the increasing development of integrated modeling approaches. A major difference between “today’s” approach and “tomorrow’s” approach may be that high-quality models can enable an assessor to describe computationally with reasonable accuracy the relationships depicted in Figure 2-1 in Chapter 2—from source emissions and human activities that give rise to these emissions to adverse outcomes. The continuum from sources to human health responses in the human health risk assessment paradigm is described in many sources (e.g., NRC 1983, 1994; Lioy 1990) and demonstrated in the approach taken by the National Research Council committee in developing research priorities for airborne particulate matter (NRC 1998, 1999b, 2001d, 2004c). Recent advances in modeling tools have greatly enhanced the capabilities to perform computationally intensive multiscale source-to-dose and exposure assessment for a wide range of environmental contaminants (Foley et al. 2003). For example, Georgopoulos et al. (2005a,b) described an integrated source-to-dose modeling framework for assessing population exposures to fine particulate matter, ozone, and air toxics that links emissions, meteorological, air quality, exposure, and dosimetry models. The use of integrated modeling approaches for the environment is not confined to the human health risk assessment field.
Other examples of such integrated environmental modeling approaches that are emerging can be found in the following fields:
Watershed modeling—The BASINS modeling framework includes watershed nutrient loading and transport models and instream water quality models that operate with a geographical information system (EPA 2001d).
Risk assessment—The TRIM.FaTE model is a multimedia compartmental model to help assess multimedia chemical fate, transport, and exposure and risk of pollutants in the ambient environment (Efroymson and Murphy 2001; EPA 2003g).
Hazardous waste risk assessment—The multimedia, multipathway, and multireceptor exposure and risk assessment (3MRA) model can assess potential human and ecological health risks using transport, fate, exposure, and toxicity (EPA 2003h).
Global change—Models in this field link energy-economic models to environmental models (e.g., Rotmans 1990; Holmes and Ellis 1999), and others link air quality, weather, and climate (Jacobson 2001; Liao et al. 2003, 2004).
These integrated modeling frameworks are typically written in a modular form, as discussed in Chapter 3, which allows users to easily add or remove parts of the model to tailor individual applications to the problem at hand. Software platforms, such as the framework for risk analysis in multimedia environmental systems (FRAMES), are often used to link models and databases under one integrated system. Typically, a user interface facilitates such development.
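The modular linking described above can be sketched in a few lines of code. This is not the design of FRAMES or any real framework; it is a minimal illustration, with invented component names and illustrative numeric factors, of the general pattern in which independent model components share a common interface so the framework can chain, add, or remove them.

```python
from typing import Callable, Dict, List

# Each component is a function that reads and updates a shared state
# dictionary; the framework simply runs components in sequence. All
# names, factors, and values below are hypothetical.
State = Dict[str, float]
Component = Callable[[State], State]

def emissions_model(state: State) -> State:
    # Convert an activity level into an emission rate (illustrative factor).
    state["emissions"] = state["activity"] * 0.5
    return state

def air_quality_model(state: State) -> State:
    # Translate emissions into an ambient concentration (illustrative).
    state["concentration"] = state["emissions"] / state["mixing_volume"]
    return state

def exposure_model(state: State) -> State:
    # Scale ambient concentration by the fraction of time spent outdoors.
    state["exposure"] = state["concentration"] * state["outdoor_fraction"]
    return state

def run_framework(components: List[Component], state: State) -> State:
    """Link the component models: the output of one feeds the next."""
    for component in components:
        state = component(state)
    return state

result = run_framework(
    [emissions_model, air_quality_model, exposure_model],
    {"activity": 100.0, "mixing_volume": 10.0, "outdoor_fraction": 0.2},
)
```

Because each component touches only the shared state, a user can drop in a more detailed air quality model, or remove the exposure step entirely, without rewriting the rest of the chain—the tailoring-to-the-problem property the text attributes to modular frameworks.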
However, ever-larger and more sophisticated models do not necessarily make better regulatory tools. Clarke (2004) and Perciasepe (2005) raise the possibility that pursuing larger and more sophisticated models makes them less and less amenable to evaluation and more impenetrable to the public and decision makers.
Other modeling technologies have attempted to improve transparency and build a stronger bridge to the public and decision makers through user-friendly graphic simulation software. One approach is to use object-oriented programming languages that allow individual components of a model to be visually and mathematically linked in a user environment that displays how the different elements of a model interrelate and that allows users to easily modify the relationships among components. One use of this approach has been in conflict resolution over water resources. Known as shared-vision modeling, it involves the joint development of a single model or modeling framework by a diverse group of stakeholders involved in a water resources issue, facilitated by object-oriented programming software (Lund and Palmer 1997). This approach has been recommended by the Institute for Water Resources as a way to bridge the gap between specialized water models and the human decision process (Werick 1995).
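The object-oriented idea behind this approach can be illustrated with a toy water-resources model. The sketch below is purely hypothetical—the reservoir, the release rules, and every number are invented—but it shows the key property: stakeholders can change the relationship between components (here, the release rule) without touching the components themselves.

```python
# A model component: a reservoir whose behavior depends on an editable
# link (the release rule) supplied from outside. All values are
# illustrative, not from any real water-resources model.
class Reservoir:
    def __init__(self, storage, release_rule):
        self.storage = storage
        self.release_rule = release_rule  # the stakeholder-editable link

    def step(self, inflow):
        """Advance one time step: add inflow, then release water."""
        self.storage += inflow
        release = min(self.release_rule(self.storage), self.storage)
        self.storage -= release
        return release

# Two candidate release rules a stakeholder group might compare.
def fixed_release(storage):
    return 20.0

def proportional_release(storage):
    return 0.1 * storage

res = Reservoir(storage=100.0, release_rule=proportional_release)
releases = [res.step(inflow=30.0) for _ in range(3)]
# Swapping proportional_release for fixed_release changes the system's
# behavior with no change to the Reservoir component itself.
```

In a graphical shared-vision environment, that swap is a visual operation rather than a code edit, which is what lets nonmodeler stakeholders participate directly in building and testing the model.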
CHANGES IN PERSPECTIVES ON MODEL USE IN REGULATORY DECISION MAKING
The use of models in the regulatory process in the future also may be affected by changing perspectives among decision makers on the most effective way to use them. Two general approaches are weight-of-evidence and adaptive management strategies. The weight-of-evidence approach has been in use since before the National Research Council’s original “Red Book” on risk assessment practices (NRC 1983), although definitions and methods for carrying out weight-of-evidence analyses vary (Weed 2005). However, all definitions in the modeling setting recognize that models cannot be used to define a precise “bright line,” for example, between attainment and nonattainment of ambient environmental standards. Dolwick (2005) described how the regional air quality modeling community evolved from using models to determine in an absolute sense whether a location’s emissions reduction plans will result in attainment of the National Ambient Air Quality Standards (NAAQS) to using models in a weight-of-evidence approach as the primary element in a suite of tools that includes emissions and air quality monitoring. EPA (2006n) described the agency’s guidance on implementing the weight-of-evidence approach for ozone, fine particulate matter, and regional haze standards. The Air Quality Management (AQM) Work Group, composed of stakeholders from state and local governments and some industry and nonprofit organizations, endorsed the weight-of-evidence approach as a way to reduce reliance on modeling data as the centerpiece of air quality attainment demonstrations and to increase the use of monitoring data and analyses of those data (AQM Work Group 2005). Although the weight-of-evidence approach appropriately recognizes that models are not “truth generators,” it must be used in an unbiased manner so that, for example, it is no more likely to be used to relax regulatory requirements than to strengthen them, even when modeling uncertainties cut both ways (NRC 2004a).
Adaptive strategies recognize the importance of improving environmental management strategies as new measurements and modeling analyses become available. Although providing a single definition for terms such as “adaptive management” and “adaptive implementation” suffers from the same problem as defining weight of evidence, some environmental regulatory activities clearly follow an adaptive approach in which management strategies are later modified on the basis of new modeling, measurements, and research. For example, the Clean Air Act calls for the NAAQS for each criteria pollutant to be reviewed periodically to consider recent scientific findings. The objective of this review is to decide whether the current NAAQS for that pollutant should be revised. Although the process of reviewing and implementing changes in the standards is cumbersome and has not kept up with the 5-year review cycle mandated in the legislation, the history of the Clean Air Act has seen important revisions to air quality standards as a result of these reviews. Another example is the cleanup of large mining megasites, where the amount and wide distribution of contaminated materials preclude complete remediation with the traditional cleanup approaches envisioned under the Superfund Act. EPA recognizes that many contaminated mining megasites will require operation and maintenance in perpetuity (EPA 2004h). Under conditions where remediation is a long-term process involving many separate projects, some of which cannot be specified at the outset, the agency is forced into an adaptive approach requiring periodic progress reviews and adjustments to unsuccessful remedies. An NRC report focusing on mine-related contamination in the Coeur d’Alene River Basin mining megasite recommended that EPA establish a rigorous adaptive management process for such sites, with well-defined performance milestones, monitoring strategies, and evaluation criteria (NRC 2005a; Gustavson et al. 
2007). A final example of an adaptive strategy in environmental regulatory activities is the California Air Resources Board (CARB) process for periodic review and revision, if necessary, of California motor-vehicle emissions standards (NRC 2005c). Because of the far-reaching and long-term nature of the California standards, CARB committed to a biennial review of its motor-vehicle emissions standards program to monitor manufacturer compliance plans, to identify any problems with the feasibility of its demanding program, and to modify the standards if deemed necessary (e.g., CARB 1994, 2000b). This process resulted in modifications to California’s standards, most notably to its zero-emission vehicle mandate (CARB 2004).
Models have a prominent future in the environmental decision-making process because their value clearly outweighs their inherent imperfections. The use of environmental regulatory models in the future will have to deal effectively with vastly increasing amounts of data, improvements in modeling methods and technologies, and changing perspectives on how best to use the results of models in the regulatory process. The imperfect nature of modeling means that models will always have the potential for improvement through the integration of new scientific understandings and data sources. However, no advances in science, no matter how great, will ever make it possible to build a scientifically complete model or prove that a given model is correct in all respects. In addition, a more complete model is not necessarily a better one for the purposes of policy making. A good model is one that achieves the right balance between simplicity and complexity to address the question at hand.
The history of environmental analysis has focused on the primary need to understand the impacts of humans on the environment and to assess potential strategies to mitigate adverse impacts. This was the objective of Man and Nature (Marsh 1864) over 150 years ago—to describe “the character and approximately the extent of the changes produced by human actions in the physical conditions of the globe…” and to “suggest the possibility and the importance of the restoration of disturbed harmonies.” Although the extent of the impacts and the models used to analyze impacts and develop responses look quite different today, these fundamental objectives remain the same. However, the successful use of new discoveries concerning environmental and human interactions depends on a holistic approach to generating data and interpreting the meaning of such data. Computational models will continue to provide linkages for interpretation, but as science gets more complex, it can easily become more isolated from nonscientists, whose distrust of science might increase. Ultimately, this can seriously damage the scientific endeavor. Thus, it is incumbent on both scientists and nonscientists to develop a strong communication bridge. Scientists need to find ways to express their findings to nonscientists. Nonscientists also have an obliga-