CHARACTERIZING UNCERTAINTY

PETER STONE

Peter Stone, from the Massachusetts Institute of Technology, explained that in order to provide useful information about climate change given all the uncertainties, you have to use probabilistic approaches. In the MIT model, PDFs are developed for the key uncertain elements of the climate system, including sensitivity (Str), rate of ocean heat uptake, and aerosol radiative forcing. In addition, the model includes the major economic uncertainties that affect CO2 emissions, including changes in labor productivity (which implicitly includes population growth), efficiency of energy use, and the cost of non-carbon technologies. These inputs are propagated through the model and yield an output in the form of a PDF for global mean temperature change and other climate variables of interest. With this approach, they can show how the potential distribution of climatic changes would differ for various policy responses.

The main advantage of the model is its broad scope (that is, its inclusion of economic variables along with critical physical variables of the oceans and atmosphere). The trade-off is that it is a two-dimensional zonal-mean model, and thus cannot represent longitudinal details. Also, the lack of a third dimension makes it challenging to simulate the two-dimensional character of the circulation properly.

The value of this type of analysis is that it provides a measure of how much you reduce the high-risk outcomes under various policy response scenarios (i.e., future emissions). They find that the different policy response options do not lead to significant changes in the most probable outcome, but large effects are seen in the tail of the distribution (Webster et al., 2003). Thus, understanding the probability of high-risk outcomes may depend strongly on constraining uncertainties in the various input functions.

A Monte Carlo approach is used to define the PDF of the outcomes (such as the increase in global mean surface temperature by 2100), and how well the limits of the distribution can be resolved is determined by the number of runs carried out. More than 250 model runs are required to define the 5-95 percent confidence limits accurately. To constrain the tail of the distribution further (i.e., to obtain a 99 percent confidence limit), one must weigh whether the additional runs needed are worth the resources required.
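The run-count trade-off can be illustrated with a toy Monte Carlo experiment. This is not the MIT model: the response function and the input distributions below are invented placeholders, chosen only to show how sampled input PDFs propagate through a model to an output PDF whose 5-95 percent limits are read off from the ensemble.

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_temperature_2100(sensitivity, ocean_uptake, aerosol_forcing):
    """Illustrative (not the MIT model) response: warming by 2100 for a
    given equilibrium sensitivity (deg C per CO2 doubling), ocean heat
    uptake rate, and aerosol forcing (W/m^2, negative = cooling)."""
    ghg_forcing = 6.0  # assumed greenhouse gas forcing by 2100, W/m^2
    f2x = 3.7          # forcing for doubled CO2, W/m^2
    equilibrium = sensitivity * (ghg_forcing + aerosol_forcing) / f2x
    realized = 1.0 / (1.0 + ocean_uptake)  # less warming realized if uptake is high
    return equilibrium * realized

n_runs = 250
# Input PDFs (shapes and parameters are placeholders, not the published values)
sens = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=n_runs)
uptake = rng.uniform(0.2, 1.0, size=n_runs)
aerosol = rng.normal(-1.0, 0.4, size=n_runs)

# Propagate the sampled inputs and read confidence limits off the ensemble
warming = toy_temperature_2100(sens, uptake, aerosol)
lo, med, hi = np.percentile(warming, [5, 50, 95])
print(f"5-95 percent range: {lo:.1f} to {hi:.1f} C (median {med:.1f} C)")
```

With only 250 runs, each 5 percent tail is estimated from roughly a dozen samples, which is why pushing out to a 99 percent limit demands many more runs.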

Discussion

Leggett: Some people feel that defining the distribution tail is an important goal, but can we use this information, even if it is obtainable? Can we really assign any meaningful difference among the 98th, 99th, and 99.99th percentiles of a distribution?

Wigley: His simple model allows an essentially unlimited number of runs, but the problem is that the high-end tail of the output distribution is strongly affected by small changes in the input values. This presents an inherent uncertainty.

Stone: Since uncertainty increases the farther you go into the future, part of the problem may be our insistent focus on projecting to the year 2100. The uncertainties in many critical variables are much smaller if one focuses on shorter time spans.

Mahlman: However, the use of shorter time spans omits critical information on how global warming plays out over its natural time scales (many centuries).

Prather: It would be useful to look at the range and distribution of potential impacts, not just a global mean temperature change. Even a relatively low climate sensitivity value may yield some significant impacts (for instance, high-latitude warming that leads to melting of the Greenland ice sheet).



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




MICHAEL SCHLESINGER

Michael Schlesinger, from the University of Illinois, discussed the major uncertainties that affect estimates of climate sensitivity obtained by simulating the record of observed hemispheric-mean near-surface air temperatures using a simple climate-ocean model. One of the largest uncertainties is in radiative forcing. With a simple energy balance climate/upwelling diffusion-ocean model, he and Natalia Andronova carried out a series of simulations using various combinations of radiative forcing agents (including long-lived greenhouse gases [GHGs], tropospheric ozone, volcanoes, solar irradiance changes, and anthropogenic sulfate aerosols), and determined which values for sensitivity (Seq) and sulfate radiative forcing in a reference year (1990) gave the best reproduction of the observed temperature record. They found that the sensitivity value obtained from the model is highly dependent upon which forcings are included (Schlesinger and Ramankutty, 1992; Andronova and Schlesinger, 2000, 2001).

The climate sensitivity needed to reproduce the observed changes in near-surface temperature from the middle of the nineteenth century to the present is inversely proportional to the magnitude of the radiative forcing at the top of the atmosphere (or tropopause). Thus, if the radiative forcing is increased (for example, by including putative changes in solar irradiance), the climate sensitivity needed to reproduce the observed changes in near-surface temperature decreases by about 50 percent (see Schlesinger and Ramankutty, 1992; Andronova and Schlesinger, 2001). However, we do not know whether the solar irradiance changed as has been reconstructed from indirect evidence. This uncertainty in the radiative forcing, not only by the sun but also by volcanoes and anthropogenic aerosols, contributes significant uncertainty to the inferred climate sensitivity.

As the model simulation proceeds through the decades of the last century, the researchers determine what sensitivity value is needed to reproduce the observational record, given the radiative forcing that occurred up to that time. If the sulfate radiative forcing is known, one can home in on the true sensitivity value by learning over time. In addition, if they can estimate (rather than prescribe) sulfate radiative forcing, then they can continue this learning process into the future. Thus, the uncertainty in climate sensitivity due to climate noise can be reduced by learning over time, that is, by performing future estimations using longer observational records.

These studies utilized two different temperature records (Folland et al., 2001; Jones and Moberg, 2003) that differed in their southern hemisphere values, and thus in the interhemispheric temperature difference. As a result, the estimated sulfate radiative forcing, and thus the estimated sensitivity value, differed significantly depending upon which data record was used. Thus, if the radiative forcing by aerosols cannot be learned exogenously, but only endogenously from the observed temperature changes, then the uncertainty in southern hemisphere temperature changes must be reduced. With better aerosol radiative forcing observations, one could prescribe the forcing in the model, rather than the current approach of estimating the forcing from the interhemispheric temperature difference. As a result, one could use global mean temperature values as a constraint, and thus diminish the uncertainties related to climatic noise.

Discussion

Leggett: Another problem is that the model considers only sulfate; the uncertainties related to other types of aerosols have not even been accounted for.

Penner: Note that the interhemispheric temperature differences would be amplified if the radiative forcing associated with biomass aerosols were included in the analysis. Neglecting this influence means that the uncertainty in southern hemisphere temperature changes is even more important.

Schlesinger: The effect of other aerosol types was indirectly considered in one model scenario in which there was no aerosol forcing. This case represented the possible cancellation of the negative radiative forcing from sulfate aerosol by the (putative) positive radiative forcing from carbonaceous aerosol.
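The inverse relation between assumed forcing and inferred sensitivity can be shown with a back-of-the-envelope equilibrium calculation. This is a sketch, not Schlesinger and Andronova's energy balance/upwelling diffusion-ocean model: it ignores ocean heat uptake entirely, and all of the forcing and warming values are illustrative placeholders.

```python
# In equilibrium, observed warming dT, total forcing F, and sensitivity S
# (warming per CO2 doubling) are related by dT = S * F / F2X, so the
# sensitivity needed to reproduce dT is S = dT * F2X / F.
F2X = 3.7      # forcing for doubled CO2, W/m^2
DT_OBS = 0.8   # assumed observed warming since the mid-1800s, deg C

def inferred_sensitivity(total_forcing_wm2):
    """Sensitivity needed to reproduce DT_OBS given a total forcing."""
    return DT_OBS * F2X / total_forcing_wm2

# Two forcing combinations (values are placeholders, not published estimates):
ghg_plus_aerosol = 2.4 - 0.9          # GHG forcing minus sulfate cooling, W/m^2
with_solar = ghg_plus_aerosol + 0.4   # add a putative solar irradiance increase

s_low_forcing = inferred_sensitivity(ghg_plus_aerosol)
s_high_forcing = inferred_sensitivity(with_solar)
# Larger assumed total forcing -> smaller inferred sensitivity
```

The same observed warming is consistent with a lower sensitivity once extra positive forcing is assumed, which is why the inferred value depends so strongly on which forcings are included.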

MICHAEL MANN

Michael Mann, from the University of Virginia, explained that a potential strategy to constrain the PDF of climate sensitivity, and particularly to address the dilemmas raised by previous speakers (i.e., our inability to quantify the radiative forcing from anthropogenic aerosols), is to make use of information from the paleo-record. This is a period in which anthropogenic forcing effects are not a factor, and one can make use of the relationships between natural forcings and temperature variations on longer time scales. The trade-off is that the temperature reconstructions have a much greater uncertainty than direct observations from recent decades (Figure 6, p. 32). Comparisons of proxy-reconstructed northern hemisphere mean temperature with model-based estimates of climate-forced variability over the past millennium imply a sensitivity value in the range of 1.5 to 4.5°C. The University of Bern (Switzerland) coupled climate-carbon cycle model, furthermore, allows prediction of natural variations in CO2 as a response to large-scale surface temperature changes, albeit with substantial uncertainty. Using this model, researchers were able to reproduce both reconstructed surface temperatures and observed pre-anthropogenic natural variations in CO2 concentration (as derived from ice core records) within this same climate sensitivity range.

When interpreting historical temperature reconstructions, it is important to recognize that many proxy records are subject to geographical and/or seasonal biases. For instance, some temperature reconstructions emphasize continental, extratropical, summer temperatures. Some climatic responses to forcings are regional in nature, and this too affects how one interprets proxy records. For example, some people think that the northern hemisphere temperature reconstruction by Esper and colleagues (Esper, 2002) may reflect an exaggerated hemispheric surface temperature response to volcanic activity, due to the restricted extratropical geographical locations and limited (summer) seasonal sensitivity of the proxy data used. Likewise, proxy reconstructions that target integrated annual mean temperature tend to mask seasonally specific anomalies that could provide potential dynamical insights. Spatial and seasonal variability in the response to forcing underscores the importance of taking regional and seasonal sampling into account in estimating global sensitivity.

Discussion

Prather: I am worried about using the paleo-temperature record as an indicator of sensitivity (Seq) because it depends on the state of the atmosphere, and today's atmospheric composition and radiative forcing are much different than in past eras.

Mahlman: Also, the quality of the paleodata may be inadequate to provide clear values of Seq.

RICHARD MOSS

Richard Moss, from the U.S. Global Change Research Program (USGCRP) and the Climate Change Science Program (CCSP), noted that policy makers understand the concept of global climate sensitivity in a narrower way than does the scientific community. They may be familiar with the standard definition of global average equilibrium surface temperature change due to a doubling of CO2, but they usually do not understand that the other definitions of climate sensitivity (see Box 1) depend on other aspects of the climate system, such as the rate of increase of greenhouse gases and the rate of oceanic heat uptake. From the policy maker's perspective, the global sensitivity estimate is a useful way to cut out a lot of details that they do not understand, but this gives a false impression that the problem can be reduced to a single number.
Scientists must think carefully about alternate ways to communicate climate response, and the resulting impacts, to the policy community. Sensitivity is a useful tool for communicating, but it should not be viewed as a "holy grail" or as the best way to communicate in a simple and understandable fashion.

Moss described his work with Steve Schneider to persuade scientists involved in the IPCC/TAR to use a more systematic approach in defining uncertainties (see Box 2). They had mixed experience in getting the different IPCC working groups to adhere to these principles. Working Group I (science) followed the proposed

framework in general, but substituted its own terminology in some cases. Working Group III (mitigation) disregarded the idea entirely.

BOX 2
SUMMARY OF STEPS RECOMMENDED FOR ASSESSING UNCERTAINTY IN THE IPCC/TAR

1. Identify the most important factors and uncertainties that are likely to affect the conclusions. Also specify which important factors or variables are being treated exogenously or fixed.

2. Document ranges and distributions in the literature, including sources of information about the key causes of uncertainty. It is important to consider the types of evidence available to support a finding (e.g., distinguish findings that are well established through observations and tested theory from those that are not so well established).

3. Given the nature of the uncertainties and the state of science, make an initial determination of the appropriate level of precision. Is the state of science such that only qualitative estimates are possible, or is quantification possible, and if so, to how many significant digits?

4. Quantitatively or qualitatively characterize the distribution of values that a parameter, variable, or outcome may take. First identify the end points of the range and/or any high-consequence, low-probability outcomes. Specify what portion of the range is included in the estimate (e.g., this is a 90 percent confidence interval) and what the range is based on. Then provide an assessment of the general shape of the distribution and its central tendency, if appropriate.

5. Using the terms described below, rate and describe the state of scientific information on which the conclusions and/or estimates (i.e., from step 4) are based.

6. Prepare a "traceable account" of how the estimates were constructed that describes the reasons for adopting a particular probability distribution, including important lines of evidence used, standards of evidence applied, approaches to combining or reconciling multiple lines of evidence, and critical uncertainties.

7. OPTIONAL: Use formal probabilistic frameworks for assessing expert judgment as appropriate.

SOURCE: Moss and Schneider, 2000.

So how should we go forward in characterizing uncertainty related to climate sensitivity? In terms of research needs, there is value in the multiplicity of approaches being used by the research community. Note that the leadership of the next IPCC assessment intends to carefully examine recent developments in our scientific understanding of the sensitivity (Seq and Str) parameters. In terms of communication needs, we have to think about the concept of multiple metrics. What are we conveying to policy makers about the seriousness of the problem and the progress we are making? How does a reduction in uncertainty have the potential to affect the decision-making process? Moss would like to see the scientific community be more systematic in describing uncertainties and levels of confidence. Using a Bayesian updating process is one important way to make the products more useful.
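The Bayesian updating Moss refers to can be sketched in a few lines: a prior PDF over climate sensitivity is multiplied by the likelihood of each new observation and renormalized. The grid, prior shape, observed warming values, and the simple equilibrium relation dT = S * F / F2X used for the likelihood are all illustrative assumptions, not values from any assessment.

```python
import numpy as np

# Discretized PDF over candidate sensitivities (deg C per CO2 doubling).
# Prior shape and parameters are illustrative placeholders.
s_grid = np.linspace(0.5, 10.0, 200)
prior = np.exp(-0.5 * ((np.log(s_grid) - np.log(3.0)) / 0.5) ** 2)
prior /= prior.sum()

def update(pdf, observed_dt, forcing, sigma=0.2, f2x=3.7):
    """Bayes' rule on the grid: multiply the current PDF by the likelihood
    of an observed warming, assuming dT = S * forcing / f2x with Gaussian
    observational noise (std dev sigma), then renormalize."""
    predicted = s_grid * forcing / f2x
    likelihood = np.exp(-0.5 * ((observed_dt - predicted) / sigma) ** 2)
    posterior = pdf * likelihood
    return posterior / posterior.sum()

# Two successive (invented) observations stand in for a lengthening record
posterior = update(prior, observed_dt=0.8, forcing=1.5)
posterior = update(posterior, observed_dt=0.9, forcing=1.7)

def p95(pdf):
    """Sensitivity value at the 95th percentile of a gridded PDF."""
    return s_grid[np.searchsorted(np.cumsum(pdf), 0.95)]
```

Each update concentrates probability mass, so the upper percentile of the sensitivity PDF tightens as longer observational records accumulate; this is the "learning over time" described in the Schlesinger section.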