Climate Forecasts as Innovations and the Concept of Decision Support
To define a clear and viable role for the Sectoral Applications Research Program (SARP), it is necessary to clarify the range of needs implied by the National Oceanic and Atmospheric Administration’s (NOAA) climate mission and to place SARP in the context of a division of responsibilities among NOAA’s activities. According to its official program description, the SARP objective is “to systematically build an interdisciplinary and expressly applicable knowledge base and mechanism for the creation, dissemination and exchange of climate-related research findings critical for understanding and addressing resource management challenges in vital social and economic sectors (e.g., coastal, water resources, agriculture, health, etc.).”1 The capacity of NOAA and SARP to meet this objective depends on the adequacy of its science, including the full representation and utilization of social science (Anderson et al., 2003).2
In this chapter we first make explicit an important aspect of climate information that often goes unaddressed: that useful climate information is a type of innovation that must pass through a process of adoption by individuals and organizations that can be slow and difficult. We then discuss the concept of “decision support,” which is critical to understanding and pursuing NOAA’s climate mission.
1. Description available: http://www.climate.noaa.gov/index.jsp?pg=./cpo_pa/cpo_pa_index.jsp&pa=sarp&sub=1 [accessed April 2007].
2. See also the human dimensions web page of the National Centers for Coastal Ocean Science, http://www.nccos.noaa.gov/human/welcome.html [accessed April 2007].
IMPROVED CLIMATE FORECASTS AS AN INNOVATION
Climate scientists have only recently become skillful at forecasting climatic events months in advance—a scientific advance that is fundamentally new in human history.3 Thus, only recently has it become possible for decision makers to achieve better outcomes by using information in climate forecasts than by ignoring them. Until recently, the best predictive rule available to water managers, land use planners, emergency managers, and many other decision makers for timescales beyond the reach of weather forecasting was to expect the future distribution of climate-driven events to mirror the past distribution for the same location and time of year.
It is important to emphasize that climate forecasting skill is still quite limited. It also varies considerably depending on lead time, geographic scale, location (e.g., the El Niño-Southern Oscillation [ENSO] effect is stronger in some places than others), time of year (for ENSO, the signal varies seasonally), phase of a climatic variation (e.g., forecasts are more skillful during the El Niño phase than during other parts of the ENSO cycle), the climate variable being predicted (e.g., temperature forecasts are typically more skillful than precipitation forecasts), and other factors. When forecasts combine climate information with information on other processes, such as the hydrological information in stream flow forecasts, the climate information may or may not increase forecasting skill. According to one recent assessment: “climate model forecasts presently suffer from a general lack of skill, [but] there may be locations, times of year and conditions (e.g., during El Niño or La Niña) for which they improve hydrological forecasts” relative to forecasts that do not use the climate information (Wood et al., 2005).
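Forecast skill in this sense is conventionally measured against the climatological baseline described above. As an illustration only (not SARP or NOAA methodology, and with invented numbers), a common measure for probabilistic forecasts is the Brier skill score, which compares a forecast's squared error with that of always issuing the climatological frequency:

```python
# Illustrative Brier skill score computation (all numbers invented).

def brier_score(probs, outcomes):
    """Mean squared error of probabilistic forecasts (0 is perfect)."""
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

def brier_skill_score(forecast_probs, climatology_prob, outcomes):
    """Skill relative to always forecasting the climatological frequency."""
    bs_forecast = brier_score(forecast_probs, outcomes)
    bs_climatology = brier_score([climatology_prob] * len(outcomes), outcomes)
    return 1.0 - bs_forecast / bs_climatology

# Hypothetical record: did seasonal precipitation exceed the median? (1 = yes)
outcomes = [1, 0, 1, 1, 0, 0, 1, 0]
forecast = [0.7, 0.3, 0.6, 0.8, 0.4, 0.2, 0.7, 0.3]  # issued probabilities
bss = brier_skill_score(forecast, 0.5, outcomes)  # climatological frequency 0.5
```

A score above zero means the forecast outperforms climatology; a score at or below zero means a user would do as well or better relying on the historical distribution alone, which is why skill must be assessed separately for each location, season, and variable.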
From the standpoint of potential climate forecast users, the current skill level of climate forecasts and related forecast products (e.g., hydrological forecasts that include climate) may or may not be sufficient to enable better decisions. Moreover, the same forecast may provide information that supports better decisions for some users but not others in the same geographic region: for example, hydropower producers may be able to use skillful forecasts of average conditions across a whole watershed,
while dryland farmers in the watershed may need finer spatial resolution for the forecast to be useful. Such imperfections and variations in the usefulness of forecast information, combined with the fact that the skill levels of climate forecasts can be expected to change over time at an uneven rate, create significant uncertainty for decision makers who are considering how a climate forecast should affect their actions. In addition, the usefulness of climate information depends on the decision-making contexts in which it is used. We discuss this issue further below, in the context of multilevel, multi-actor governance.
To make optimal use of climate forecast information, a user must have an appropriate understanding of the level of skill involved in relation to the decision at hand. However, the degree of forecasting skill in relation to practical decisions remains highly contested. Thus, decision makers will continue to act under great uncertainty when it comes to future climate. They can err in two directions: by ignoring useful forecast information and by giving greater credence to uncertain forecasts than their skill level warrants.
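The trade-off between these two errors can be made concrete with a simple cost/loss calculation. The sketch below uses entirely hypothetical costs and probabilities to show how a decision maker might weigh acting on a forecast against relying on climatology:

```python
# Cost/loss sketch of the two ways to err (all numbers hypothetical).

PROTECT_COST = 30.0  # cost of taking protective action
EVENT_LOSS = 100.0   # loss if the event occurs and no action was taken

def expected_cost(p_event, protect):
    """Expected cost of a protect/do-nothing decision at event probability p."""
    return PROTECT_COST if protect else p_event * EVENT_LOSS

climatology_p = 0.2  # historical frequency of the damaging event
forecast_p = 0.45    # forecast probability for the coming season

# Error 1: ignoring a skillful forecast. If the forecast is right, doing
# nothing carries an expected cost of about 45, versus 30 for protecting.
cost_of_ignoring = expected_cost(forecast_p, protect=False)

# Error 2: over-crediting an unskillful forecast. If the true probability
# is still the climatological 0.2, protecting costs 30 versus an expected
# cost of about 20 for doing nothing.
cost_of_overcrediting = expected_cost(climatology_p, protect=True)
```

Under these invented numbers, either error is costly, and which one a decision maker risks depends entirely on whether the forecast's stated probability can be trusted, which is precisely the contested question.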
The potential adoption of climate forecast information by decision makers occurs in the context of their existing ways of using information. Over many decades, decision makers whose well-being is affected by weather and climate developed routines for gathering and using information, including weather forecasts and other information collected by scientific methods, but not including climate forecast information or input from climate scientists. For example, a decade ago, seasonal stream flow forecasts in much of the West were made after the January 1 snow surveys because, other than historical averages, snowpack was the only quantity considered relevant. In the fall, reservoir managers set weekly reservoir levels by following a set of “rule curves” derived only from historical experience. They did not take into account, and were not allowed to take into account (because the rule curves were mandated), information about the state of ENSO, even though that information was beginning to yield skillful forecasts of precipitation on a seasonal-to-interannual timescale.
In an iterative process of several years’ duration, scientists showed water management agencies how the statistics of stream flow were shifted according to the phase of ENSO and how this phase could be predicted with some skill by late summer. Then the agencies began to consider ENSO predictions during the fall to see what they forecast and to conduct retrospective studies. Eventually, some agencies developed confidence in ENSO forecasts, and some are managing reservoirs differently in the fall as a result (for example, by generating more hydropower in years when seasonal forecasts suggest higher than average stream flow). Such management changes have the potential to generate an average of $150 million per year more hydropower with little or no loss of reliability for
other management objectives (Hamlet et al., 2002). Yet, even as skill levels of ENSO-related forecasting products have improved, resistance to their use in water resources management has persisted (e.g., O’Connor et al., 1999, 2005; Rayner et al., 2005; Yarnal et al., 2006).
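The phase-conditioned reasoning in this example can be sketched in a few lines. The data below are invented for illustration; real applications rest on long gauge records and formal hydrological models:

```python
from statistics import mean

# Hypothetical winter stream flow volumes (arbitrary units), each labeled
# with the ENSO phase observed the preceding fall.
records = [
    ("la_nina", 130), ("neutral", 100), ("el_nino", 80),
    ("la_nina", 145), ("neutral", 95),  ("el_nino", 70),
    ("la_nina", 120), ("neutral", 110), ("el_nino", 85),
]

def flows_by_phase(records):
    """Group historical flows by the ENSO phase that preceded them."""
    grouped = {}
    for phase, flow in records:
        grouped.setdefault(phase, []).append(flow)
    return grouped

unconditional_mean = mean(flow for _, flow in records)
conditional_means = {p: mean(f) for p, f in flows_by_phase(records).items()}
# With a La Niña under way in the fall, the conditioned expectation
# (about 132 here) exceeds the unconditional one (about 104), which
# could justify releasing more water for hydropower generation.
```

The point of the sketch is the shift in reasoning: instead of planning around a single historical average, the manager conditions the expected distribution on a predictor, the ENSO phase, that is observable months in advance.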
Many factors determine whether or not potentially beneficial information is used. Some of these relate to characteristics of new information as received and perceived by potential users. The key factors have sometimes been described in terms of salience, credibility, and legitimacy (e.g., Clark and Majone, 1985; Ravetz, 1971; Cash et al., 2003). Salience is closely related to decision relevance, which has long been recognized to affect the use of information (e.g., National Research Council, 1989, 2002b). It is also related to the issue of getting potential users’ attention; thus, information is more likely to be used if it is sent through multiple communication channels and if it comes from known and trusted sources, including personal communication (see, e.g., Hovland et al., 1953; McGuire, 1983; National Research Council, 1984, 1989, 2002b; Mileti and Peek, 2002). Some of these attributes of information sources also increase the credibility and legitimacy of the information from a user’s perspective. Credibility, which relates to the idea of competence, is also signaled by attributes of the information itself: for example, information is more credible if it recognizes and treats multiple perspectives in an even-handed way (e.g., Cash et al., 2003). These general principles have been found to apply across many areas of communication, including the use of climate-related information (see, e.g., Jacobs et al., 2006; McNie, 2007; Mitchell et al., 2006; National Research Council, 2005b, 2006, 2007c; Social Learning Group, 2001a, 2001b).
It is worth emphasizing that credibility of the information to scientists, which is the focus of major efforts to improve the skill of climate forecasts and related decision support products, does not necessarily confer credibility with potential users. There is little evidence that making information more credible scientifically, by itself, makes it more likely to be used. Research on scientific communication strongly suggests that, as important as the scientific validity of information is to the quality of decisions based on it, other attributes of the information are more influential in determining whether it is used.
Whether information is used also depends on characteristics of the users. For example, the concept of path dependency (e.g., Pierson, 2000) describes organizations or processes that are shaped by users’ histories and are therefore conservative and resistant to information that would change past practice. Conservatism may arise from legal or regulatory constraints or from established routines for seeking and interpreting information, which become entrenched and serve as barriers to information from outside sources. It may arise from a lack of capacity (e.g.,
scientific expertise, skills with geographic information systems [GIS]) in the organization to translate and use information produced outside the organization. Past practice also builds expectations among constituencies about being served in the ways they have come to depend on. Certain interests served by existing ways of doing things may perceive that change will harm their privileged positions, even if new practices mean increased benefits to the overall social welfare. Organizations also resist changes that make them vulnerable to political blame. Some organizations may not innovate until political cover is provided by leaders who embrace change and are willing to take the brunt of criticism for things that inevitably go wrong when something new is tried.
Thought about in another way, getting climate information used, and used appropriately, depends on transcending boundaries in knowledge-action systems (Gieryn, 1995; Guston, 2001; Jasanoff, 1987), such as between scientific and policy communities, between disciplines, between organizations and jurisdictions, and so forth. Such barriers may serve important functions in both science and government, but they can also impede communication, collaboration, understanding of complex systems, and coordinated action.
In short, using climate forecast information requires change that is not accomplished automatically or easily. Even if the United States is entering a new era of climate consciousness, it will take time for organizations to modify their routines of information gathering and decision making and establish new lines of communication—in this case, with scientists who can provide useful climate-related information. It will take time and effort to hire new kinds of people in organizations that need new and more appropriate expertise—for example, in climate science—or to construct new networks to convey information in or among organizations. Signs of increasing development of such expertise and networks are beginning to appear, particularly in private-sector organizations. One challenge for SARP is to find ways of pushing forward this process of multifaceted change in sectors and constituencies in which it is not yet occurring. To make progress, social science insights about processes of communication and innovation will be as important as improved skill in climate forecasting.
In sum, when a qualitatively new and potentially valuable kind of scientific information becomes available, many of the beneficiaries are unlikely to take advantage of it until they can modify their information-gathering and decision-making routines. It is necessary to find new information sources, either within the decision-making organization (e.g., by hiring people with different expertise, such as climate scientists), by using existing networks, or by building new networks. New networks will require support, but if they produce benefits, those who benefit may
well be willing to pay for them. New networks may also require that new organizations be created or that existing organizations take on new functions, acting as links between information sources and users—what have been called boundary organizations (Agrawala et al., 2001; Cash, 2001; Guston, 2001). All of this change takes time and requires overcoming barriers, building confidence, and improving communication among potential producers and consumers of the new information.
Social science research on diffusion of innovation, organizational change, and related topics can offer some insights about how the process can be advanced. In Chapter 3, we discuss this research in the context of a key factor on which the success of NOAA’s climate mission depends: the need for innovation in decision routines and social networks so that decision makers can take into account climate-related information they did not previously consider.
THE CONCEPT OF DECISION SUPPORT
The term “decision support” is prominent in discussions of the value of federal research efforts under the U.S. Climate Change Science Program (CCSP). CCSP documents assert that such research produces value to society because it supports better decisions. However, the term “decision support” has no precise definition in CCSP documents, and officials in the program offer a wide variety of definitions. At some risk of oversimplification, we distinguish two quite distinct definitions that appear to be in common use in the program. Available research on the use of scientific information is much more supportive of one view than the other.
In one view, decision support is a matter of translating the products of climate science into forms useful for decision makers and disseminating the translated products. This view presumes that climate scientists know which products will be useful to which decision makers and that the potential users will make appropriate use of decision-relevant information once it is made available to them. Adherents of this view typically emphasize the importance of developing what are called “decision support tools,” such as models, maps, and other technical products intended to be relevant to certain classes of decisions. They believe that once these tools are created and disseminated, the task of decision support is essentially complete. This approach can be called a translation model.
In the other view, decision support is defined more broadly as creating conditions that foster the appropriate use of information. Creating decision support tools may be part of the task, but it is not all, and it may not be the most important part. This view presumes that scientists do not automatically know what information they could provide that decision makers would find useful, that decision makers do not necessarily know
how they could use climate information, and that decision-relevant information is not necessarily put to optimal use when it is received. In this view, the primary objective of decision support activities is to establish communication between climate information producers and users that ensures that the information produced addresses users’ decision needs and gets to them in useful ways. Such communication is believed to change both the kinds of information that is produced and the ways it is used (e.g., National Research Council, 1989, 1996b, 1999, 2005a, 2006). This objective is sometimes described as reconciling supply and demand with respect to decision-relevant climate information (McNie, 2007; Sarewitz and Pielke, 2007) or as the coproduction of science and policy (Lemos and Morehouse, 2005).
This approach often requires efforts to strengthen the demand side (Cash et al., 2003). Success in achieving decision support objectives depends critically on effective and use-oriented two-way communication between the producers and users of climate information (Jacobs et al., 2005, 2006; Lemos and Morehouse, 2005; National Research Council, 1999b, 2006). Such communication can help bridge gaps between what is produced and what is likely to be used, thus ensuring that scientists produce products that are recognized by the users, and not just the producers, as useful. Effective use-oriented two-way communication can increase users’ understanding of how they could use climate information and enable them to ask questions about information that is uncertain or in dispute. In this broader view, a collaborative approach to decision support centered on such communication will yield much greater and more appropriate use of climate information than will come from the translation model. It may also result in decision-relevant information that would not otherwise have been produced because scientists may not have understood completely what kinds of information would be most useful to the target decision makers. In the broader definition of decision support, the central issue is not the development of new tools or other products, but of social systems or networks that get decision-relevant climate information produced and used. In some instances, the most valuable form of decision support may come from a conversation, not a mathematical model.
The notion of conversation as a decision support tool brings into focus the importance of human relationships and networks in information utilization. The uptake of information is highly dependent on the extent to which the source is trusted and considered reliable by the recipient (e.g., Hovland et al., 1953; McGuire, 1983; Rosa and Clark, 1999; Mitchell et al., 2006). Trust is ordinarily the product of familiarity and repeated interactions. Decision support can involve the continual working and reworking of relationships.
Research on the use of scientific information consistently supports
the second, broader view of decision support. Research on “science utilization,” focused largely on government agencies as users, indicates that decision-relevant scientific information is not necessarily used, even when it is made available to those who can benefit from it. Nor is it used even when, as with government officials, it is their responsibility to make decisions on the basis of the best available information (e.g., Sabatier, 1978; Weiss and Bucuvalas, 1980; Freudenburg, 1989; Landry et al., 2003; Romsdahl, 2005). These conclusions emerge from research on environmental communication (e.g., McKenzie-Mohr and Smith, 1999; Schultz, 2002), disaster communication (e.g., Mileti, 1999; National Research Council, 2006), public health communication (e.g., Valente and Schuster, 2002), sustainable development (Cash et al., 2003; van Kerkhoff and Lebel, 2006), global environmental assessment (Mitchell et al., 2006), and risk perception and communication (e.g., National Research Council, 1989, 1996b; Fischhoff, 1989; Slovic, 2000). They are also beginning to emerge from studies of the use of climate information, including that produced by NOAA (e.g., Anderson et al., 2003; Jacobs et al., 2005; McNie et al., 2007; National Research Council, 2005b).
As noted above, key characteristics of information that is used are that, from the user’s standpoint, it is salient (e.g., decision relevant and timely), credible (perceived as accurate, valid, and of high quality), and legitimate (uninfluenced by pressures or other sources of bias) (e.g., Social Learning Group, 2001a, 2001b; Cash et al., 2003; National Research Council, 2005b). Among the factors found to be important in determining whether such information is produced and used are the existence of good communication links, trust, and respect between scientists and users; the fit between the information and users’ decision routines; the strength of communication networks among the information users; and the potential for decisions to be challenged and to be adapted.
Work on risk (e.g., National Research Council, 1996b) emphasizes that broad involvement of “interested and affected parties” in framing scientific questions helps ensure that the science produced is useful (“getting the right science”). Such involvement also helps ensure that formal decision support tools that make sense to the scientists are explicit about any simplifying assumptions that may be in dispute among the users. Science-driven climate forecasting efforts, in particular, have been criticized as inaccessible and unhelpful to their potential beneficiaries, especially highly vulnerable populations, and as drains on resources that could be applied to more effective ways to reduce climate vulnerability (Lemos and Dilling, 2007).
Evidence from multiple fields demonstrates that scientific information that is intended to be useful for practical decisions often goes unused and that a key reason is inadequate prior communication between the
producers and users of the information. Considering this evidence, we adopt a broad definition of “decision support” that emphasizes communication and recommend that SARP and NOAA do the same. This understanding of the decision support task has influenced our thinking about how NOAA should work to achieve its climate mission and, within that, how SARP should set its research priorities for improving decision support systems. When decision support activities are conducted without adequate communication between information producers and users, many of the decision support products that are developed are seriously underutilized. The following examples illustrate the importance of adequate communication with intended users in decision support activities.
As noted in a previous study (National Research Council, 1999b:2), “a climate forecast is useful to a recipient only if the outcome variables it skillfully predicts are relevant and the forecast is timely in relation to actions the recipient can take to improve outcomes.” An example of an interesting scientific result that was useless in an operational context was the prediction (Hamlet and Lettenmaier, 1999) that global warming would reduce summer stream flow volumes in the Columbia River. Water managers were concerned about the implications but were unable to apply the finding to their own situations because the statement was not sufficiently clear, quantitative, or geographically specific. Models in use by water resource management agencies typically required monthly flow data for some time span (50 years or more) at specific stream gauges. Feedback from users made it possible to produce information they could use. The Climate Impacts Group (the RISA for the Pacific Northwest) developed an online resource in which the hydrologic model output was available at selected stream gauges chosen by key stakeholders, both for much of the 20th century and for future climate scenarios (Snover et al., 2003). Once these data were available in this useful format, the Northwest Power and Conservation Council could use them in their existing models to estimate climate-related impacts on Northwest hydropower production and revenue (see Northwest Power and Conservation Council, 2005).
A related example concerns the Federal Emergency Management Agency’s HAZUS model, a multihazard damage estimation model consisting of separate modules to address different hazard regimes (Federal Emergency Management Agency, 2006). HAZUS was developed as a tool to analyze information recorded in GIS to produce estimates of damages caused by natural disasters such as floods, hurricane winds, and earthquakes. For example, the flood hazard analysis module uses such characteristics as frequency, discharge, and ground elevation to estimate flood depth, flood elevation, and flow velocity and thereby calculate estimates of physical damage and economic loss from a flood event. Many decision makers for whose benefit the HAZUS model was developed have com-
plained that the model demands too much detailed data—with significant up-front investment of time and financial resources—to get useful output. For example, the state of Hawaii’s initial investment of $8 million to develop data for the seismic model has yielded considerable dividends, but the high cost may not be bearable by many other users. Closer collaboration between the creators of HAZUS and users might have led to a model that many more state and local users would have found useful.
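To make the flood module's logic concrete, the sketch below performs the same kind of computation in drastically simplified form; the depth-damage curve and all numbers are hypothetical and are not HAZUS's actual data or algorithms:

```python
# Toy flood-loss estimate in the spirit of a flood hazard module; the
# depth-damage curve and all numbers are hypothetical, not HAZUS data.

def flood_depth(flood_elevation_ft, ground_elevation_ft):
    """Water depth at a site: flood surface elevation minus ground elevation."""
    return max(0.0, flood_elevation_ft - ground_elevation_ft)

def damage_fraction(depth_ft):
    """Hypothetical depth-damage curve: fraction of structure value lost,
    linearly interpolated between (depth, fraction) anchor points."""
    curve = [(0.0, 0.0), (1.0, 0.15), (4.0, 0.45), (8.0, 0.75)]
    if depth_ft >= curve[-1][0]:
        return curve[-1][1]
    for (d0, f0), (d1, f1) in zip(curve, curve[1:]):
        if depth_ft <= d1:
            return f0 + (f1 - f0) * (depth_ft - d0) / (d1 - d0)
    return 0.0

def estimated_loss(flood_elev_ft, ground_elev_ft, structure_value):
    """Dollar loss estimate for one structure in one flood event."""
    depth = flood_depth(flood_elev_ft, ground_elev_ft)
    return damage_fraction(depth) * structure_value

loss = estimated_loss(105.0, 102.5, 200_000)  # 2.5 ft of water at the site
```

Even this toy version hints at the data burden users complained about: every term (flood elevation, ground elevation, structure inventory, depth-damage curves) must be supplied at fine spatial resolution before the model produces any usable output.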