Regardless, it is still useful for policy makers to get a sense of the kinds of changes that could occur, even if these are not precise projections. The above questions highlight a key insight from this workshop: scientists' ability to communicate the nuances of climate change science to policy makers and decision makers lags behind the efficiency with which they communicate among themselves. This suggests the value of continued discussion of this subject, with an emphasis on how best to communicate global warming science to policy makers and decision makers.

TOM WIGLEY

Tom Wigley addressed several topics related to communicating about climate change and sensitivity.

Characterizing Uncertainty in the Spatial Patterns of Climate Change. Climate sensitivity is the primary metric for characterizing the magnitude of climate change, but it can be misinterpreted and does not address the impacts of climate change directly. For impact analysis, we need the spatial details of change. Currently, the best way to get a handle on uncertainty in the spatial patterns of climate change is to compare the results of different models. The MAGICC/SCENGEN system allows one to compare the results of different AOGCMs and assess the intermodel S/N (signal-to-noise) ratio, defined as the average signal across all of the models divided by the intermodel standard deviation. For annual mean temperature projections, S/N is high for most of the world, meaning that the models agree well; for precipitation projections, S/N is low for most of the world (less than 1 everywhere except the high latitudes), indicating a much greater level of uncertainty. Note that in some cases the inferred noise level may reflect realization-to-realization variability within any one model as well as differences between models, which demonstrates the need to intercompare multiple-member ensembles of climate model runs.
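To make the S/N definition concrete, the following minimal sketch computes an intermodel signal-to-noise field from a stack of model projections on a common grid. The array shapes and synthetic data are illustrative assumptions, not output from MAGICC/SCENGEN itself.

```python
import numpy as np

def intermodel_snr(projections):
    """Intermodel signal-to-noise ratio at each grid cell.

    projections : ndarray, shape (n_models, n_lat, n_lon)
        Projected change (e.g., annual mean temperature or
        precipitation change) from each AOGCM, regridded to a
        common grid. Illustrative input, not actual model output.
    """
    signal = projections.mean(axis=0)        # average signal across models
    noise = projections.std(axis=0, ddof=1)  # intermodel standard deviation
    return signal / noise

# Toy example: 10 hypothetical models on a coarse 36x72 grid.
rng = np.random.default_rng(0)
fake_models = 2.0 + 0.5 * rng.standard_normal((10, 36, 72))
snr = intermodel_snr(fake_models)
print("fraction of grid with |S/N| > 1:", (np.abs(snr) > 1).mean())
```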

Characterizing Progress in Climate Modeling Capabilities. It appears to some people that little progress has been made, because current uncertainties in sensitivity estimates are similar to those of a few decades ago. In fact, however, there has been substantial progress in many important aspects of climate modeling (e.g., see Covey et al., 2003). This progress was illustrated with a comparative study of 16 models, showing how well the annual mean precipitation patterns projected by the various models agree with observations. The best models have ~80 percent of their variance in common with observations, while the worst have ~40 percent. Over the past 10 years, all of the models have improved dramatically in their ability to simulate present-day precipitation patterns. The Hadley model now explains twice as much of the observed variability as it did when first created, and the worst models in the analysis now perform better than Hadley and other leading models did a decade ago.
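One plausible reading of "variance in common" is the squared spatial correlation between the modeled and observed patterns; the sketch below computes that quantity for synthetic fields. The metric choice is an assumption rather than the study's documented method, and area weighting of grid cells is omitted for brevity.

```python
import numpy as np

def variance_in_common(model_field, obs_field):
    """Squared spatial correlation between a modeled and an observed
    pattern -- one plausible reading of 'variance in common.'
    Both inputs are arrays of grid-cell values (e.g., annual mean
    precipitation) on the same grid; area weighting is omitted."""
    r = np.corrcoef(model_field.ravel(), obs_field.ravel())[0, 1]
    return r ** 2

# Toy example with synthetic fields: a 'good' model shares most of
# the observed pattern; a 'poor' one shares less.
rng = np.random.default_rng(1)
obs = rng.standard_normal(1000)
good = obs + 0.5 * rng.standard_normal(1000)
poor = obs + 1.2 * rng.standard_normal(1000)
print(f"good model: {variance_in_common(good, obs):.2f}")  # ~0.8
print(f"poor model: {variance_in_common(poor, obs):.2f}")  # ~0.4
```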

New Approaches for Reducing Uncertainty. Some researchers are developing ways to use probabilistic information to circumvent uncertainties in sensitivity and provide useful input to policy. For instance, Myles Allen's group uses an approach in which a large set of models is calibrated against past observations and only the best of these models are then used to make future projections. (It is still unclear, however, whether such approaches yield the most credible projections.) Similarly, Wigley tried an approach of evaluating various PDFs for key input values, determining which cases give twentieth-century changes that lie within the observed range, and using only those cases to produce a probabilistic projection of global mean temperature change. (For example, it was found that the combination of a low aerosol radiative forcing with a low sensitivity could not fit the observations and thus could be discarded.) These studies found that the output was not strongly dependent on the input sensitivity value, which indicates that it is very difficult to derive a PDF for sensitivity directly from observations.
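A schematic of the keep-only-consistent-cases logic might look like the following. The priors, the crude equilibrium emulator, and the observed range are all illustrative stand-ins for the transient calculations Wigley actually performed with a model such as MAGICC.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000
F2X = 3.7  # W/m^2 forcing for doubled CO2 (standard value)

# Priors over the key uncertain inputs (illustrative PDFs, not
# Wigley's actual choices).
sensitivity = rng.lognormal(mean=np.log(3.0), sigma=0.4, size=N)  # deg C per 2xCO2
aerosol_f = rng.uniform(-2.0, 0.0, size=N)                        # W/m^2

# Crude equilibrium emulator of 20th-century warming:
# deltaT ~ S * F_net / F_2x, with GHG forcing ~2.5 W/m^2. A real
# analysis would use a transient climate model.
ghg_f = 2.5
sim_20c = sensitivity * (ghg_f + aerosol_f) / F2X

# Keep only the cases whose simulated 20th-century change lies
# within an assumed observed range.
obs_lo, obs_hi = 0.4, 0.8  # deg C, illustrative bounds
ok = (sim_20c >= obs_lo) & (sim_20c <= obs_hi)

# Probabilistic projection from the surviving cases only.
future_f = 6.0  # W/m^2, an illustrative future forcing level
proj = sensitivity[ok] * future_f / F2X
print(f"kept {ok.mean():.1%} of cases; projected warming 5-95%: "
      f"{np.percentile(proj, 5):.1f}-{np.percentile(proj, 95):.1f} deg C")
```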

Determining CO2 Stabilization Targets. Probabilistic approaches can also be used to address a question of central interest to policy makers: What is a “dangerous level of interference” in the climate system? With a simple climate model, Wigley input PDFs for sensitivity, for non-CO2 forcing, and for a desired global warming limit and then generated a PDF for the resulting CO2 concentration stabilization target.
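Using the standard equilibrium relations deltaT = S * F / F_2x and F_CO2 = 5.35 ln(C/C0), this calculation can be sketched as a Monte Carlo exercise. All of the distributions and the warming-limit PDF below are illustrative assumptions, not Wigley's actual inputs.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000
F2X, C0 = 3.7, 278.0  # 2xCO2 forcing (W/m^2); preindustrial CO2 (ppm)

# Input PDFs (all illustrative).
sens = rng.lognormal(np.log(3.0), 0.4, N)    # climate sensitivity, deg C
f_nonco2 = rng.normal(0.5, 0.3, N)           # non-CO2 forcing, W/m^2
t_limit = rng.triangular(1.5, 2.0, 2.5, N)   # desired warming limit, deg C

# Equilibrium warming T = S * F_total / F_2x gives the CO2 forcing
# consistent with the limit; then invert F_CO2 = 5.35 * ln(C/C0).
f_co2_allowed = t_limit / sens * F2X - f_nonco2
c_target = C0 * np.exp(f_co2_allowed / 5.35)

print(f"CO2 stabilization target, 5/50/95 percentiles: "
      f"{np.percentile(c_target, 5):.0f} / "
      f"{np.percentile(c_target, 50):.0f} / "
      f"{np.percentile(c_target, 95):.0f} ppm")
```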

Spatial Scaling. A question of great interest to researchers is whether one can use a scaling method to translate global mean values into spatial patterns of climate change, without having to rely on climate models. Spatial scaling is an attempt to decouple those components of change that are and are not related to sensitivity; that is, the global mean change (which is the sensitivity-dependent term) is separated from the patterns of change per unit of global mean warming (normalized patterns).
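A minimal sketch of this idea, assuming a synthetic GCM run: the normalized pattern p(x) is estimated by regressing each grid cell's change on the global mean change, and a future field is then obtained by multiplying p(x) by a global mean warming from, say, a simple model. The function names and toy data are hypothetical.

```python
import numpy as np

def normalized_pattern(local_change, global_mean_change):
    """Estimate the change-per-degree pattern p(x) by regressing each
    grid cell's change on the global mean change across time.

    local_change       : ndarray (n_times, n_cells), GCM anomalies
    global_mean_change : ndarray (n_times,), global mean anomalies
    """
    g = global_mean_change
    # Least-squares slope through the origin at each grid cell.
    return (local_change * g[:, None]).sum(axis=0) / (g ** 2).sum()

def scaled_projection(pattern, new_global_mean):
    """Pattern-scaled field: a global mean change (e.g., from a simple
    climate model) times the normalized pattern."""
    return new_global_mean * pattern

# Toy demonstration with a synthetic GCM run.
rng = np.random.default_rng(4)
true_pattern = rng.uniform(0.5, 2.0, size=500)  # deg C local per deg C global
g = np.linspace(0.1, 3.0, 30)                   # global mean warming over time
local = g[:, None] * true_pattern + 0.1 * rng.standard_normal((30, 500))

p = normalized_pattern(local, g)
field = scaled_projection(p, new_global_mean=2.4)  # hypothetical 2.4 deg C
print("max error in recovered pattern:", np.abs(p - true_pattern).max())
```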


