6

Communication of Uncertainty

Communication of uncertainty is an important component of the broader practice of human health risk communication. As discussed by Stirling (2010), conveying the uncertainty in the science related to the decision is crucial not only so that decision makers will understand the range of evidence on which to base a decision, but also because it can make the influences of “deep intractabilities of uncertainty, the perils of group dynamics or the perturbing effect of power … more rigorously explicit and democratically accountable” (p. 1031).

The U.S. Environmental Protection Agency (EPA) requested guidance on communicating uncertainty to ensure the appropriate use of risk information and to enhance the understanding of uncertainty among the users of risk information, such as risk managers (that is, decision makers), journalists, and citizens. Although, as discussed in the previous chapters, a number of factors play a role in EPA’s decisions, most of the research the committee identified on the communication of environmental decisions focuses on communication of the uncertainty in estimates of human health risk, and those uncertainties are the focus of this chapter.

This chapter begins with background information on the communication of those risks. It then discusses the advantages and disadvantages of different formats for the presentation of uncertainty and the considerations that go into determining a communication strategy, such as the purpose of the communication, the stage in the decision-making process, the decision context, the type of the uncertainty, and the characteristics of the audience. The relevant audience characteristics discussed here are the audience’s level of technical expertise, personal and group biases, and social trust. In response to its charge, when discussing audience characteristics, the committee paid special attention to communicating with the media.



COMMUNICATION OF UNCERTAINTY IN RISK ESTIMATES

The science of risk communication and the idea of what is good and appropriate risk communication have evolved over the past decades (Fischhoff, 1995; Leiss, 1996). For example, Fischhoff (1995) described the first three stages in that evolution in terms of how communicators think about the process: “All we have to do is get the numbers right,” “All we have to do is tell them the numbers,” and “All we have to do is explain what we mean by the numbers” (p. 138). With the realization that those factors alone would not lead stakeholders to accept decisions about risks, Fischhoff writes, communication experts changed strategies to include the ideas that “All we have to do is show them that they’ve accepted similar risks in the past,” “All we have to do is show them that it’s a good deal for them,” and “All we have to do is treat them nice” (p. 138). Those approaches eventually evolved to include the current strategy, “All we have to do is make them partners” (p. 138). That most recent strategy includes both the two-way communication and the stakeholder engagement that today are considered hallmarks of good risk communication. In the context of EPA’s decisions, stakeholders include the decision makers at the agency, the industries potentially affected by a regulatory decision, and individuals or groups affected by the decision, including local community members for local issues or all the public for issues of national significance.

Improving Risk Communication (NRC, 1989) emphasized the importance of such two-way communication for agencies such as EPA, defining risk communication as “an interactive process of exchange of information and opinion among individuals, groups, and institutions. It involves multiple messages about the nature of risk and other messages, not strictly about risk, that express concerns, opinions, or reactions to risk messages or to legal and institutional arrangements for risk management” (p. 21). That report noted that risk estimates always have inherent uncertainties and that scientists often disagree about the appropriate estimates of risk. It recommended not minimizing the existence of uncertainty, disclosing scientific disagreements, and communicating “some indication of the level of confidence of estimates and the significance of scientific uncertainty” (p. 170). Other reports have also emphasized the need for engagement of stakeholders in the decision-making process (NRC, 1996, 2009).

Documenting the type and magnitude of uncertainty in a decision is not only important at the time of the decision, as discussed by Bazelon (1974), but it is also important when a decision might be revisited or evaluated in the future.

The committee agrees with many of those concepts discussed and recommended in Improving Risk Communication (NRC, 1989). Many of those concepts have been incorporated into EPA guidance documents on risk communication (see, for example, Covello and Allen, 1988; EPA, 2004, 2007). In 2007, for example, EPA’s Office of Research and Development published Risk Communication in Action (EPA, 2007), which describes the basic concepts of successful risk communication, taking into account differences in values and risk perception, and includes instructions on how best to engage with and present risk information to the public. It is not clear, however, to what extent that and other documents—which are not agency-wide policies—are considered or implemented in the risk communication practices of different EPA programs and offices. Other National Research Council (NRC) reports (1996, 2008) have expressed concern that stakeholders have not been adequately involved in EPA decision making, suggesting that two-way risk communication, including communication surrounding uncertainty, may in some instances be inadequate.

The extent to which uncertainty is described and discussed varies among EPA’s decision documents. Chapter 2 discusses EPA’s decisions and supporting documentation around arsenic in drinking water, the Clean Air Interstate Rule (CAIR), and methylmercury, including the uncertainty analyses that EPA conducted and presented for those regulatory decisions. Those examples indicate that EPA does sometimes conduct numerous uncertainty analyses and present those analyses in its documents. Such analyses, however, are often presented in appendixes, and the ranges of potential outcomes are not necessarily presented in the summaries and summary tables. The committee also noted that the uncertainty analyses in those documents focus almost exclusively on the uncertainty in estimates related to human health risks and benefits. Krupnick et al. (2006) reviewed four of EPA’s regulatory impact analyses for air pollution regulations, including CAIR and the Clean Air Mercury Rule. They concluded that although the documents “indicate increased use of uncertainty analysis,” the EPA’s regulatory impact analyses “do not adequately represent uncertainties around ‘best estimates,’ do not incorporate uncertainties into primary analyses, include limited uncertainty and sensitivity analyses, and make little attempt to present the results of these analyses in comprehensive way” (p. 7).

To successfully communicate uncertainty, EPA programs and offices need to develop communication plans that include identification of stakeholder values, perceptions, concerns, and information needs related to the decisions to be made and to the uncertainties to be evaluated. As discussed in Chapter 5, the development of those plans should be initiated in the problem-formulation phase of decision making, and it should continue during the assessment and management phases.

PRESENTATION OF UNCERTAINTY

The most widely used formal language of uncertainty in risk estimates is probability, which is itself a form of uncertainty information (Morgan, 2009). As Spiegelhalter et al. (2011) stated, however, “probabilities are notoriously difficult to communicate effectively to lay audiences.” Probabilistic information, and the uncertainties associated with those probabilities, can be communicated using numeric, verbal, or graphic formats, and consideration should be given to which approach is most appropriate. In a recent review, Spiegelhalter et al. (2011) pointed out that the available research in this area for the most part is limited to small studies, often on students or self-selected samples. That lack of large, randomized experiments remains years after Bostrom and Löfstedt (2003) “concluded that risk communication was ‘still more art than science, relying as it often does in practice on good intuition rather than well-researched principles’” (Spiegelhalter et al., 2011, p. 1399). As discussed later in this chapter, the most appropriate approach to communicating uncertainty depends on the circumstances (Fagerlin et al., 2007; Nelson et al., 2009; Spiegelhalter et al., 2011; Visschers et al., 2009). Lipkus (2007) summarized the general strengths and weaknesses of each of the different approaches for conveying probabilistic information, based on a comprehensive literature review and consultation with risk communication experts (see Box 6-1). The committee discusses relevant findings from this research below.

Regardless of the format in which the uncertainty is presented, it is important to bound the uncertainty and to describe the effect it might have on a regulatory decision. Presenting the results of analyses such as the sensitivity analyses and scenarios discussed in Chapter 5 is one way to provide some boundaries on the effects of those uncertainties and to educate stakeholders about how those uncertainties might affect a decision. It is important to note that the existence of weaknesses does not necessarily indicate that a given method should not be used, but rather that those weaknesses should be considered and adjusted for when developing a communication strategy.

Numeric Presentation of Uncertainty

In general, numeric presentations of probabilistic information—such as presenting information in terms of percentages and frequencies—can lead to more accurate perceptions of risk than verbal and graphic formats (Budescu et al., 2009). Unlike graphic and verbal presentations, numeric information can be put into tables in order to communicate a large amount of information in a single presentation. For example, Table 6-1, created by EPA, compares the expected reduction in nonfatal heart attacks in several age groups from two different strategies for attaining national ambient air quality standards, including the 95 percent confidence interval for all values.

Percentage and frequency formats have been found to be more effective than other formats (such as stating that there is a 1-in-X chance of an occurrence) for some circumstances because they more readily allow readers to conduct mathematical operations, such as comparisons, on risk probabilities (Cuite et al., 2008). Other research, however, has found that probabilistic reasoning improves and that the influence of cognitive biases (see further discussion below) decreases when information is presented in the form of natural frequencies (for example, 30/1,000) rather than as proportions and single-event probabilities (for example, 3 percent) (see Brase et al., 1998; Gigerenzer, 2002; Gigerenzer and Edwards, 2003; Hoffrage et al., 2000; Kramer and Gigerenzer, 2005). Hoffrage et al. (2000) tested physicians’ ability to calculate the predictive value of a screening test for colorectal cancer when information was presented in terms of probabilities—a task that required combining multiple probabilities. Only 1 out of 24 physician participants correctly calculated the false-positive rate when provided data as percentages. In contrast, when the data were provided as fractions (for example, 30/10,000 people), 16 out of 24 of the physicians correctly calculated the false-positive rate (Hoffrage et al., 2000).

Among the disadvantages of numeric presentations are that they are only useful if the people the agency is communicating with are capable of interpreting the numeric information presented and that they may not hold people’s attention as well as verbal and graphic presentations (Krupnick et al., 2006; Lipkus, 2007). The appropriateness of such presentations will depend on with whom EPA is communicating. For example, numeric presentations might be more appropriate for EPA decision makers and stakeholders with technical backgrounds than for stakeholders with less technical backgrounds.

As discussed by Peters (2008), decision making is part deliberative (that is, analytical or reason-based) and part affective (that is, intuitive or based on emotional feelings), and using a combination of both approaches is important to good decision making (Damasio, 1994; Slovic and Peters, 2006). The numerical ability, or numeracy, of people varies, however, and this numeracy plays a role in the interpretation of numerical data and in judgments and decisions (Peters et al., 2006). People with higher numeracy are more likely to “retrieve and use appropriate numerical principles and transform numbers presented in one frame to a different frame” (p. 412), and they “tend to draw more affective meaning from probabilities and numerical comparisons than the less numerate [people] do” (pp. 412–413). Both laypeople’s and scientists’ judgments about risks are often influenced by affective feelings, but the format in which risk data are presented can affect the interpretation of results to a greater extent in people with low numeracy. For example, low-numeracy individuals perceive risk to be higher when given information about risk in frequency formats than when given the information in percentage formats (Peters et al., 2011). Presenting information in a manner that facilitates understanding is therefore important to people understanding the risks and making decisions using both deliberative and affective approaches (Peters, 2008).
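To illustrate why natural frequencies can simplify the kind of calculation described above, the minimal sketch below computes the same screening-test quantity twice, once from single-event probabilities (which requires combining probabilities via Bayes’ rule) and once from counts of people. The prevalence, sensitivity, and false-positive values are hypothetical and are not the figures used by Hoffrage et al. (2000).

```python
# Minimal sketch (hypothetical numbers, not those of Hoffrage et al., 2000)
# showing the same screening-test calculation in two formats.

def ppv_from_probabilities(prevalence, sensitivity, false_positive_rate):
    """Chance that a positive result is a true positive, computed from
    single-event probabilities by combining them via Bayes' rule."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * false_positive_rate
    return true_positives / (true_positives + false_positives)

def ppv_from_frequencies(diseased_positives, healthy_positives):
    """The same quantity from natural frequencies: just two counts of people."""
    return diseased_positives / (diseased_positives + healthy_positives)

# Probability format: 0.3% prevalence, 50% sensitivity, 3% false-positive rate.
print(f"{ppv_from_probabilities(0.003, 0.50, 0.03):.1%}")   # about 4.8%

# Natural-frequency format: of 10,000 people, 30 have the disease and 15 of
# them test positive; roughly 300 of the 9,970 healthy people also test positive.
print(f"{ppv_from_frequencies(15, 300):.1%}")                # about 4.8%
```

The frequency version reduces to a single division of two counts, which is consistent with the finding cited above that frequency formats improve probabilistic reasoning.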

BOX 6-1
Strengths and Weaknesses of Numeric, Verbal, and Visual Communication of Risk

Numeric communication of risk (e.g., percentages, frequencies)

Strengths
• Is precise and leads to more accurate perceptions of risk than the use of probability phrases and graphical displays
• Conveys aura of scientific credibility
• Can be converted from one metric to another (e.g., 10% = 1 out of 10)
• Can be verified for accuracy (assuming enough observations)
• Can be computed using algorithms, often based on epidemiological and/or clinical data, to provide a summary score

Weaknesses
• Lacks sensitivity for adequately tapping into and expressing gut-level reactions and intuitions
• People have problems understanding and applying mathematical concepts (level of numeracy)
• Algorithms used to derive numbers may be incorrect, untestable, or result in wide confidence intervals that may affect public trust

Verbal communication of risk (e.g., unlikely, possible, almost certain)

Strengths
• Allows for fluidity in communication (is easy and natural to use)
• Expresses the level, source, and imprecision of uncertainty; encourages one to think of reasons why an event will or will not occur (i.e., directionality)
• Unlike numbers, may better capture a person’s emotions and intuitions

Weaknesses
• Especially if the goal is to achieve precision in risk estimates, variability in interpretation may be a problem (e.g., likely may be interpreted as a 60% chance by one person and as an 80% chance by another)

Visual (graphic) communication of risk (e.g., pie charts, scatter plots, line graphs)

Strengths
• Ability to summarize a great deal of data and show patterns in the data that would go undetected using other methods
• Useful for priming automatic mathematic operations (e.g., subtraction in comparing the difference in height between two bars of a histogram)
• Is able to attract and hold people’s attention because it displays data in concrete, visual terms
• May be especially useful to help with visualization of part-to-whole relationships

Weaknesses
• Data patterns may discourage people from attending to details (e.g., numbers)
• Poorly designed or complex graphs may not be well understood, and some individuals may lack the skills or educational resources to learn how to use and interpret graphs
• Graphics can sometimes be challenging to prepare or require specialized technical programs
• The design of graphics can mislead by calling attention to certain elements and away from others

NOTE: The strengths and weaknesses will vary depending on the stage of the decision, the purpose of the communication, and the audience.
SOURCE: Adapted from Lipkus, 2007.
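As a concrete illustration of the conversion point made in Box 6-1 (for example, 10% = 1 out of 10), the short sketch below expresses a single hypothetical risk value in three numeric formats discussed in this section; the helper functions and the example value are illustrative assumptions, not part of any EPA guidance.

```python
# Illustrative sketch: one hypothetical risk value expressed as a percentage,
# a natural frequency, and a "1 in X" statement, since the choice of numeric
# format can affect how readers interpret the same probability.

def as_percentage(p: float) -> str:
    return f"{p:.1%}"

def as_natural_frequency(p: float, population: int = 10_000) -> str:
    return f"{round(p * population)} in {population:,}"

def as_one_in_x(p: float) -> str:
    return f"1 in {round(1 / p):,}"

risk = 0.003  # hypothetical probability
print(as_percentage(risk), "|", as_natural_frequency(risk), "|", as_one_in_x(risk))
# 0.3% | 30 in 10,000 | 1 in 333
```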

TABLE 6-1 Estimated Reduction in Nonfatal Acute Myocardial Infarctions Associated with Illustrative Attainment Strategies for the Revised and More Stringent Alternative PM NAAQS in 2020

Age Interval    15/35 Attainment Strategy    14/35 Attainment Strategy
18–24           1 (1–2)                      4 (2–6)
25–34           8 (4–12)                     26 (13–40)
35–44           170 (84–250)                 280 (140–430)
45–54           520 (260–790)                930 (460–1,400)
55–64           1,300 (630–1,900)            2,100 (1,100–3,200)
65–74           1,500 (770–2,300)            2,600 (1,300–3,900)
75–84           980 (490–1,500)              1,800 (900–2,800)
85+             520 (260–780)                940 (460–1,400)
Total           5,000 (2,500–7,500)          8,700 (4,300–13,000)

NOTE: PM NAAQS = Particulate Matter National Ambient Air Quality Standards. Values are reductions in incidence, with 95 percent confidence intervals in parentheses.
SOURCE: Modified from EPA, 2008.

Verbal Presentations of Uncertainty

Verbal presentations of risk—for example, messages containing words such as “likely” or “unlikely”—can be used as calibrations of numeric risk. Such representations may do a better job of capturing people’s attention than numeric presentations, and they are also effective for portraying directionality. People are typically familiar with verbal expressions of risk from everyday language (for example, the phrase “It will likely rain tomorrow”), and for some people such presentations may be more user friendly than quantitative portrayals.

Furthermore, as discussed by Kloprogge et al. (2007), verbal expressions of uncertainty can be better adapted to the level of understanding of an individual or group than can numeric and graphic presentations.

A major weakness of verbal or linguistic presentations of risk is that studies have shown that the probabilities attributed to words such as “likely” or “very likely” vary among individuals and can even vary for a single individual depending on the scenario being presented (see Wallsten and Budescu, 1995; Wallsten et al., 1986). For example, as discussed by Morgan (2003), in a study that asked members of the executive committee of EPA’s Science Advisory Board about the probabilities attached to the words “likely” and “unlikely” in the context of carcinogenicity, “the minimum probability associated with the word ‘likely’ spans four orders of magnitude, the maximum probability associated with ‘not likely’ spans more than five orders of magnitude,” and there was an overlap between the ranges of probabilities associated with the two words (Morgan, 1998, p. 48). That variation can raise a variety of issues when consistency in the interpretation of a health risk is one of the goals of a communication. However, Erev and Cohen (1990) suggested that such vague verbal presentations of information might lead to a consideration of a wider variety of actions within a group, which could be beneficial to the overall group.

Qualitative descriptions of probability—that is, those that include a description or definition for a category of certainty—are sometimes used instead of such subjective calibrations as “very likely” or “unlikely,” which are open to individual interpretation. The third assessment report of the Intergovernmental Panel on Climate Change (IPCC), published in 2001 (IPCC, 2001), made extensive use of a qualitative table proposed by Moss and Schneider (2000) as well as of more quantitative likelihood scales (see Table 6-2); these presentations were also used in slightly modified forms in the fourth assessment report published in 2007 (IPCC, 2007). The International Agency for Research on Cancer (IARC) also uses defined categories to classify evidence. For example, IARC classifies the relevant evidence of carcinogenicity from human studies for a given chemical as limited evidence of carcinogenicity when “[a] positive association has been observed between exposure to the agent and cancer for which a causal interpretation is considered by the Working Group to be credible, but chance, bias or confounding could not be ruled out with reasonable confidence” (IARC, 2006, p. 19). Such presentations, which provide a description of the state of the science in a given field, can help policy makers with decisions when definitive findings are still pending.

Such use of qualitative likelihood presentations, however, has not been problem-free. Recent research suggests that people may interpret the IPCC qualitative presentations with less precision than intended (Budescu et al., 2009) and that estimates for negatively worded probabilities, such as “very unlikely,” may be interpreted with greater variability than probability estimates that are positively worded (Smithson et al., 2011). The use of double negatives was especially confounding (Smithson et al., 2011). Budescu et al. (2009) also found that there is interindividual variability in the interpretation of the IPCC categories for certainty, and they recommended using “both verbal terms and numerical values to communicate uncertainty” and adjusting “the width of the numerical ranges to match the uncertainty of the target events” (p. 306). They also recommended describing events as precisely as possible—for example, avoiding the use of subjective terms such as “large”—and specifying the various sources of uncertainty and outlining their type and magnitude.

TABLE 6-2 Supplemental Qualitative Table Used by the Intergovernmental Panel on Climate Change to Describe Its Confidence in Conclusions and Results

                                      Amount of Evidence (observations, model output, theory, etc.)
Level of Agreement/Consensus          LOW                              HIGH
HIGH                                  Established but Incomplete       Well Established
LOW                                   Speculative                      Competing Explanations

NOTE: Key to qualitative “state of knowledge” descriptors:
Well Established: Models incorporate known processes, observations largely consistent with models for important variables, or multiple lines of evidence support the finding.
Established but Incomplete: Models incorporate most known processes, although some parameterizations may not be well tested; observations are somewhat consistent with theoretical or model results but incomplete; current empirical estimates are well founded, but the possibility of changes in governing processes over time is considerable; or only one or a few lines of evidence support the finding.
Competing Explanations: Different model representations account for different aspects of observations or evidence, or incorporate different aspects of key processes, leading to competing explanations.
Speculative: Conceptually plausible ideas that have not received much attention in the literature or that are laced with difficult-to-reduce uncertainties or have few available observational tests.
SOURCE: Moss and Schneider, 2000.
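One way to act on the recommendation of Budescu et al. (2009) to pair verbal terms with explicit numeric ranges is sketched below. The term-to-range mapping is an illustrative assumption, loosely patterned on the IPCC likelihood scale; it is not an EPA or IPCC standard.

```python
# Illustrative sketch of pairing verbal likelihood terms with explicit numeric
# probability ranges, as Budescu et al. (2009) recommend. The ranges are
# assumptions for demonstration, loosely patterned on the IPCC likelihood
# scale; they are not an agency standard. Terms are listed most specific first.

LIKELIHOOD_TERMS = [
    ("virtually certain",       0.99, 1.00),
    ("very likely",             0.90, 1.00),
    ("likely",                  0.66, 1.00),
    ("about as likely as not",  0.33, 0.66),
    ("very unlikely",           0.00, 0.10),
    ("unlikely",                0.00, 0.33),
]

def describe(probability: float) -> str:
    """Return a verbal term together with its numeric range."""
    for term, low, high in LIKELIHOOD_TERMS:
        if low <= probability <= high:
            return f"{term} ({low:.0%} to {high:.0%} probability)"
    return f"{probability:.0%} probability"

print(describe(0.95))  # very likely (90% to 100% probability)
print(describe(0.05))  # very unlikely (0% to 10% probability)
```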

Graphical Presentation of Uncertainty

Graphical displays of probabilistic information—such as bar charts, pie charts, and line graphs—can summarize more information than other presentations, can capture and hold people’s attention, and can show patterns and whole-to-part relationships (Budescu et al., 1988; Spiegelhalter et al., 2011). Furthermore, uncertainties about the outcomes of an analysis can also be depicted using graphical displays, such as bar charts, pie charts, probability density functions (which show the probability of a given value—for example, the probability that there will be 12 inches of snow), cumulative density functions (which show the probability of a value less than or equal to a given value—for example, 12 inches of snow or less), and box-and-whisker plots. There is some evidence that graphic displays of uncertainty can help convey uncertainty to people with low numeracy (Peters et al., 2007). A few studies have explored how well different graphical displays of quantitative uncertainty can convey information and have analyzed the effects of different graphical displays on decision making (Bostrom et al., 2008; Visschers and Siegrist, 2008).

Ibrekk and Morgan (1987) compared nine graphical displays by seeing how well each of them communicated univariate uncertainty to 49 well-educated semitechnical and nontechnical people (see Figure 6-1). Participants were asked to estimate the mean, the probability that a value that occurs will be greater than some stated value (that is, x > a), and the probability that the value that occurs will fall within a stated interval (that is, b > x > a). They were first asked to make those estimates without an explanation of how to use or interpret the displays, and then they were asked again after receiving detailed nontechnical explanations. Participants were most accurate with their estimates when they had been shown graphics that explicitly marked the location of the mean (displays 1 and 8), contained the answers to questions about x > a and b > x > a (displays 2 and 9), or provided the 95 percent confidence interval (display 1). In making judgments about best estimates using displays of probability density, subjects tended to select the mode rather than the mean unless the mean was marked. Subjects reported being most familiar with the bar chart and pie chart (displays 2 and 3, respectively), but there was no relationship between familiarity with a display and how sure subjects were of their responses. The researchers also found that participants with some working knowledge of probability and statistics did not perform significantly better in interpreting the displays than participants without such knowledge. One implication of this research is that it will be important to include nontechnical people and people with knowledge of probability, such as EPA decision makers, in research on the communication of uncertainty.
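A minimal sketch of one such display, a probability density plot that explicitly marks the mean and shades a 95 percent interval (two features the Ibrekk and Morgan results favor), is shown below. It assumes matplotlib and scipy are available; the lognormal example distribution and axis labels are arbitrary stand-ins for a real risk estimate, not EPA data.

```python
# Minimal sketch: a density display that marks the mean and shades the central
# 95% interval, features that helped readers in Ibrekk and Morgan (1987).
# The lognormal distribution here is an arbitrary illustration, not EPA data.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

estimate = stats.lognorm(s=0.5, scale=1000)    # hypothetical risk estimate
x = np.linspace(estimate.ppf(0.001), estimate.ppf(0.999), 500)

fig, ax = plt.subplots()
ax.plot(x, estimate.pdf(x), label="probability density")
ax.axvline(estimate.mean(), linestyle="--",
           label=f"mean = {estimate.mean():,.0f}")
low, high = estimate.ppf(0.025), estimate.ppf(0.975)
ax.axvspan(low, high, alpha=0.2,
           label=f"95% interval ({low:,.0f} to {high:,.0f})")
ax.set_xlabel("estimated cases avoided per year")
ax.set_ylabel("density")
ax.legend()
plt.show()
```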

… possibility that such biases influence the interpretation and presentation of scientific evidence.

Considerations for Communicating with Journalists

The statement of task asks the committee if there are specific communication techniques that could improve understanding of uncertainty among journalists. This is an important question, as most members of the public get their information about risks from the media. Journalists and the media help to identify conflicts about risk, and they can be channels of information during the resolution of those conflicts (NRC, 1989). Journalists generally do care about news accuracy and objectivity (NRC, 1989; Sandman, 1986) and about balance in the representation of opinions, but they vary widely in their backgrounds, technical expertise, and ability to accurately report and explain environmental decisions. Even those who cover environmental policy making will not necessarily be familiar with the details of risk assessment and its inherent uncertainties, making it challenging to convey the rationale for decisions based, in part, on those assessments.

Uncertainty is not unique to reporting on environmental health risks, of course. Studies of how the U.S. news media handle uncertainty in science in general have found that journalists tend to make science appear more certain and solid than it is (see Fahnestock, 1986; Singer and Endreny, 1993; Weiss et al., 1988). In a quantitative content analysis, for example, Singer and Endreny (1993) found that the media tended to minimize uncertainties of the risks associated with natural and manmade hazards. The issue of which factors might contribute to this tendency to minimize uncertainties has not yet been studied, but the tendency could be related to journalists’ understanding of uncertain information versus their incentive to develop attention-grabbing stories that omit or downplay uncertainties. It should be expected that journalists, just like most other people, will tend to interpret risk messages based on their existing beliefs. The reporting of risk and uncertainty information in the media will be influenced accordingly.

Because journalists and the media are a major avenue for framing risk information and its inherent uncertainty, efforts are needed to ensure that they are well informed of what is known about risks and risk-management options, including the sources and magnitude of uncertainty and its implications; a particularly useful approach would be to provide journalists with short, concise summaries about those implications. Although such summaries can be a challenge to develop, it can be done. For example, as discussed in Chapter 2, the summary of the regulatory impact analysis for the CAIR (EPA, 2005) contains a summary discussion of the uncertainty analysis. Those who are most familiar with the risks and uncertainties should provide the perspective that the journalists seek and should recognize the limitations and constraints of the media. Although little research has been carried out on the best means of providing journalists with such a perspective, providing agency personnel with training on how to communicate effectively with media representatives about uncertainties may prove helpful to journalists, as might providing journalists with access to the agency officials who were involved in the decision making. Providing the media with summaries of the uncertainties in the risk assessment and risk management in a variety of formats may also help ensure that the uncertainties are conveyed accurately.

Social Trust

An important concept related to stakeholder values and perceptions is social trust. Trust has long been considered of central importance to risk management and communication (Earle, 2010; Earle et al., 2007; Kasperson et al., 1992; Löfstedt, 2009; Renn and Levine, 1991). Slovic (1993) noted an inverse relationship between the level of trust in decision makers and the public’s concern about or perception of a risk—that is, the lower the trust, the higher the perception of risk. The importance of organizational reputation is not unique to EPA; in Reputation and Power: Organizational Image and Pharmaceutical Regulation at the FDA, Carpenter (2010) emphasized the importance that the U.S. Food and Drug Administration’s reputation plays in its regulatory authority.

Frewer and Salter (2012) point out that beliefs about the underlying causes of trust or distrust and about the best approaches for increasing trust have changed over the past few decades. In contrast to the old idea that increasing knowledge will increase trust, Frewer et al. (1996) found that certain inherent aspects of the source of information—such as having a good track record, being truthful, having a history of being concerned with public welfare, and being seen as knowledgeable—lead to increased trust. Similarly, Peters et al. (1997) found that the source of the information being seen as having “knowledge and expertise, honesty and openness, and concern and care” was an important contributor to trust (p. 10). In a study looking at attitudes toward genetically modified foods, however, Frewer et al. (2003) found that neither the information itself nor the strategy for communicating the risks had much effect on people’s attitudes toward genetically modified foods; in this case, people’s attitudes toward genetically modified foods tended to determine their level of trust in the source of information, rather than the trust in the source determining their attitudes toward the foods. It is important to remember, however, that there are reasons to communicate uncertainties beyond the potential to increase social trust (Stirling, 2010).

Earle (2010) reviewed the distinction between trust, which is about relationships between people, and confidence, which concerns a relationship between people and objects, and the role of both in social trust. As Earle (2010) pointed out, although some people believe that decisions should be made on the basis of data or numbers (Baron, 1998; Bazerman et al., 2002; Sunstein, 2005), any “confidence-based approach presupposes a relation of trust” (p. 570). For decisions concerning hazards of high moral importance, that trust does not necessarily exist (Earle, 2010).

As discussed by Fischhoff (1995) and by Leiss (1996), at earlier stages in the evolution of risk communication sciences it was thought that public education via increased communication would lead to an increased understanding of the concept of risk and, subsequently, to increased trust. Furthermore, some research had indicated that a decrease in public confidence in regulatory agencies and scientific institutions—and their motives—led to decreased trust (Frewer and Salter, 2002; Pew Research Center for the People and the Press, 2010). Given those observations, it was thought that increasing transparency would be one way to increase trust. As discussed by Frewer and Salter (2012), however, there is limited evidence that transparency actually does increase trust, although there is evidence that a lack of transparency can lead to increased distrust (Frewer et al., 1996). As highlighted in the discussion of the committee’s framework for decision making in Chapter 5, all aspects of the decision-making process, including the more technical risk assessment process, require value judgments. Thus engaging the public and policy makers, in addition to scientists, in the process of health risk assessments not only improves the assessment, but can also increase both trust in the process and communications about health risks by allowing the perspectives of all stakeholders to inform the assessment. Frewer and Salter (2002) described the communications by the United Kingdom’s regulatory agencies related to the bovine spongiform encephalopathy (BSE) outbreak in the mid-1990s as an example of the consequences of inadequate public participation in the decision process. Communications about the outbreak and the outbreak response did not address many of the concerns of the public and led to public outrage about the response.

Concerning the communication of uncertainties in risks, Frewer and Salter (2012) pointed out that distrust in risk assessments will increase when uncertainties are not included in the discussion of the assessments. Although some researchers noted, for the BSE outbreak in the United Kingdom, an apparent view by government officials “that the public [is] unable to conceptualize uncertainty” (Frewer et al., 2002, p. 363), research on risks related to food safety indicates a preference by the public to be informed of uncertainties in risk information (Frewer et al., 2002) and finds that not discussing uncertainties “increases public distrust in institutional activities designed to manage risk” (Frewer and Salter, 2012, p. 153).

Although there is insufficient information to develop guidelines or best practices for communicating the uncertainty and variability in health risk estimates (Frewer and Salter, 2012), there is evidence that the public can differentiate between different types and sources of uncertainty (see below for further discussion). As discussed by Kloprogge et al. (2007), it is possible to communicate to the public various aspects of uncertainty information, such as how uncertainty was dealt with in the analysis as well as the implications of uncertainties and what can or cannot be done about uncertainties.

The need for a communication plan is increased when there are—or are expected to be—more uncertainties associated with a decision-making process, because there are likely to be more challenges in communicating with stakeholders. Research demonstrates a heightened interest by the public in evaluating the credibility of information sources when they perceive uncertainty (Brashers, 2001; Halfacre et al., 2000; van den Bos, 2001), and studies also indicate that the public is more likely to challenge the reliability and adequacy of risk estimates and be less accepting of reassurances in the presence of uncertainty (Kroll-Smith and Couch, 1991; Rich et al., 1995). Concerns about procedural fairness and trust appear to be even more salient when there is scientific uncertainty (NRC, 2008), and risk communication can serve to facilitate stakeholder trust (Conchie and Burns, 2008; Heath et al., 1998; Peters et al., 1997).

KEY FINDINGS

• Although communication is often thought of in terms of communication to an audience, two-way conversations about risks and uncertainties throughout the decision-making process are key to informed environmental decisions that are acceptable to stakeholders. Not only will such communication inform the public and others about decisions, but it will also help to ensure that the decisions take the concerns of various stakeholders into consideration, and to build social trust and broader acceptance of decisions.

RECOMMENDATION 8.1
U.S. Environmental Protection Agency senior managers should be transparent in communicating the basis of the agency’s decisions, including the extent to which uncertainty may have influenced decisions.

• There is no definitive research that can serve as a basis for uniform recommendations as to the best approaches to communicating uncertainty information with all stakeholders. Each situation will likely require a unique communication strategy, determined on a case-by-case basis, and in each case it may require research to determine the most appropriate approach. Communicating decisions made in the presence of deep uncertainty is particularly challenging.

RECOMMENDATION 8.2
U.S. Environmental Protection Agency decision documents and communications to the public should include a discussion of which uncertainties are and are not reducible in the near term. The implications of each for policy making should be provided in other communication documents when it might be useful for readers.

• The best presentation style will depend on the audience and their needs. When communicating with decision makers, for example, because of the problem of variability in the interpretation of verbal presentations, such presentations should be accompanied by a numeric representation. When communicating with individuals with limited numeracy or with a variety of stakeholders, providing numeric presentations of uncertainty may be insufficient. Often a combination of numeric, verbal, and graphic displays of uncertainty information may be the best option. In general, however, the most appropriate communication strategy for uncertainty depends on
  o the decision context;
  o the purpose of the communication;
  o the type of uncertainty; and
  o the characteristics of the audience, including the level of technical expertise, personal and group biases, and the level of social trust.

RECOMMENDATION 9.1
The U.S. Environmental Protection Agency, alone or in collaboration with other relevant agencies, should fund or conduct research on communication of uncertainties for different types of decisions and to different audiences, develop a compilation of best practices, and systematically evaluate its communications.

• Little research has been conducted on communicating the uncertainty associated with technological or economic factors that play a role in environmental decisions, or other influences on decisions that are less readily quantified, such as social factors (for example, environmental justice) and the political context.

RECOMMENDATION 9.2
As part of an initiative evaluating uncertainties in public sentiment and communication, U.S. Environmental Protection Agency senior managers should assess agency expertise in the social and behavioral sciences (for example, communication, decision analysis, and economics), and ensure it is adequate to implement the recommendations in this report.

REFERENCES

Baron, J. 1998. Judgment misguided: Intuition and error in public decision making. New York: Oxford University Press.
Bazelon, D. L. 1974. The perils of wizardry. American Journal of Psychiatry 131(12):1317–1322.
Bazerman, M. H., J. Baron, and K. Shonk. 2002. “You can’t enlarge the pie”: The psychology of ineffective government. Cambridge, MA: Basic Books.
Bier, V. M. 2001a. On the state of the art: Risk communication to decision-makers. Reliability Engineering and System Safety 71:151–157.
———. 2001b. On the state of the art: Risk communication to the public. Reliability Engineering and System Safety 71:139–150.
Blanchard, D., J. Erblich, G. H. Montgomery, and D. H. Bovbjerg. 2002. Read all about it: The over-representation of breast cancer in popular magazines. Preventive Medicine 35(4):343–348.
Boduroglu, A., and P. Shah. 2009. Effects of spatial configurations on visual change detection: An account of bias changes. Memory and Cognition 37(8):1120–1131.
Bostrom, A., and R. E. Löfstedt. 2003. Communicating risk: Wireless and hardwired. Risk Analysis 23(2):241–248.
Bostrom, A., L. Anselin, and J. Farris. 2008. Visualizing seismic risk and uncertainty. Annals of the New York Academy of Sciences 1128(1):29–40.
Brase, G. L., L. Cosmides, and J. Tooby. 1998. Individuation, counting, and statistical inference: The role of frequency and whole-object representations in judgment under uncertainty. Journal of Experimental Psychology: General 127(1):3–21.
Brashers, D. E. 2001. Communication and uncertainty management. Journal of Communication 51(3):477–497.
Budescu, D. V., S. Weinberg, and T. S. Wallsten. 1988. Decisions based on numerically and verbally expressed uncertainties. Journal of Experimental Psychology: Human Perception and Performance 14(2):281–294.
Budescu, D. V., S. Broomell, and H. H. Por. 2009. Improving communication of uncertainty in the reports of the Intergovernmental Panel on Climate Change. Psychological Science 20(3):299–308.
Carpenter, D. P. 2010. Reputation and power: Organizational image and pharmaceutical regulation at the FDA. Princeton, NJ: Princeton University Press.
Conchie, S. M., and C. Burns. 2008. Trust and risk communication in high-risk organizations: A test of principles from social risk research. Risk Analysis 28(1):141–149.
Covello, V. T., and F. W. Allen. 1988. Seven cardinal rules of risk communication. Washington, DC: EPA.
Cuite, C. L., N. D. Weinstein, K. Emmons, and G. Colditz. 2008. A test of numeric formats for communicating risk probabilities. Medical Decision Making 28(3):377–384.
Damasio, A. R. 1994. Descartes’ error: Emotion, reason, and the human brain. New York: Putnam.

Daradkeh, M., A. E. McKinnon, and C. Churcher. 2010. Visualisation tools for exploring the uncertainty–risk relationship in the decision-making process: A preliminary empirical evaluation. User Interfaces 106:42–51.
Earle, T. C. 2010. Trust in risk management: A model-based review of empirical research. Risk Analysis 30(4):541–574.
Earle, T. C., M. Siegrist, and H. Gutscher. 2007. Trust, risk perception and the TCC model of cooperation. In Trust in cooperative risk management: Uncertainty and scepticism in the public mind, edited by M. Siegrist, T. C. Earle, and H. Gutscher. London: Earthscan. Pp. 1–49.
EPA (U.S. Environmental Protection Agency). 2004. An examination of EPA risk assessment principles and practices. Washington, DC: Office of the Science Advisor, Environmental Protection Agency.
———. 2005. Regulatory impact analysis for the final Clean Air Interstate Rule. Washington, DC: EPA, Office of Air and Radiation.
———. 2007. Risk communication in action: The risk communication workbook. Washington, DC: EPA.
———. 2008. Clean Air Interstate Rule. http://www.epa.gov/cair (accessed September 10, 2008).
Erev, I., and B. L. Cohen. 1990. Verbal versus numerical probabilities: Efficiency, biases, and the preference paradox. Organizational Behavior and Human Decision Processes 45(1):1–18.
Fagerlin, A., P. A. Ubel, D. M. Smith, and B. J. Zikmund-Fisher. 2007. Making numbers matter: Present and future research in risk communication. American Journal of Health Behavior 31(Suppl 1):S47–S56.
Fahnestock, J. 1986. Accommodating science: The rhetorical life of scientific facts. Written Communication 3(3):275–296.
Fischhoff, B. 1995. Risk perception and communication unplugged: Twenty years of process. Risk Analysis 15(2):137–145.
Fischhoff, B., N. T. Brewer, and J. S. Downs, eds. 2011. Communicating risks and benefits: An evidence-based user’s guide. Washington, DC: Food and Drug Administration, U.S. Department of Health and Human Services.
Frewer, L., and B. Salter. 2002. Public attitudes, scientific advice and the politics of regulatory policy: The case of BSE. Science and Public Policy 29(2):137–145.
———. 2012. Societal trust in risk analysis: Implications for the interface of risk assessment and risk management. In Trust in risk management: Uncertainty and skepticism in the public mind, edited by M. Siegrist, T. C. Earle, and H. Gutscher. London: Earthscan. Pp. 143–158.
Frewer, L. J., C. Howard, D. Hedderley, and R. Shepherd. 1996. What determines trust in information about food-related risks? Underlying psychological constructs. Risk Analysis 16(4):473–486.
Frewer, L., S. Miles, M. Brennan, S. Kuznesof, M. Ness, and C. Ritson. 2002. Public preferences for informed choice under conditions of risk uncertainty. Public Understanding of Science 11:363–372.
Frewer, L. J., J. Scholderer, and L. Bredahl. 2003. Communicating about the risks and benefits of genetically modified foods: The mediating role of trust. Risk Analysis 23(6):1117–1133.
Gigerenzer, G. 2002. Calculated risks: How to know when numbers deceive you. New York: Simon and Schuster.
Gigerenzer, G., and A. Edwards. 2003. Simple tools for understanding risks: From innumeracy to insight. British Medical Journal 327(7417):741–744.

Halfacre, A. C., A. R. Matheny, and W. A. Rosenbaum. 2000. Regulating contested local hazards: Is constructive dialogue possible among participants in community risk management? Policy Studies Journal 28(3):648–667.
Heath, R. L., S. Seshadri, and J. Lee. 1998. Risk communication: A two-community analysis of proximity, dread, trust, involvement, uncertainty, openness/accessibility, and knowledge on support/opposition toward chemical companies. Journal of Public Relations Research 10(1):35–56.
Hoffrage, U., S. Lindsey, R. Hertwig, and G. Gigerenzer. 2000. Medicine: Communicating statistical information. Science 290(5500):2261–2262.
IARC (International Agency for Research on Cancer). 2006. IARC monographs on the evaluation of carcinogenic risks to humans. Lyon, France: WHO.
Ibrekk, H., and M. G. Morgan. 1987. Graphical communication of uncertain quantities to nontechnical people. Risk Analysis 7(4):519–529.
IPCC (Intergovernmental Panel on Climate Change). 2001. IPCC Third Assessment Report: Climate change 2001 (TAR). http://www.grida.no/publications/other/ipcc_tar (accessed January 3, 2013).
———. 2007. IPCC Fourth Assessment Report: Climate change 2007 (AR4). http://www.grida.no/publications/other/ipcc_tar (accessed January 3, 2013).
Kasperson, R. E., D. Golding, and S. Tuler. 1992. Social distrust as a factor in siting hazardous facilities and communicating risks. Journal of Social Issues 48(4):161–187.
Kerr, N. L., and R. S. Tindale. 2004. Group performance and decision making. Annual Review of Psychology 55:623–655.
Kloprogge, K., J. van der Sluijs, and A. Wardekker. 2007. Uncertainty communication: Issues and good practice. Version 2.0. http://www.nusap.net/downloads/reports/uncertainty_communication.pdf (accessed September 15, 2008).
Kramer, W., and G. Gigerenzer. 2005. How to confuse with statistics or: The use and misuse of conditional probabilities. Statistical Science 20(3):223–230.
Kroll-Smith, J. S., and S. R. Couch. 1991. As if exposure to toxins were not enough: The social and cultural system as a secondary stressor. Environmental Health Perspectives 95:61–66.
Krupnick, A., R. Morgenstern, M. Batz, P. Nelson, D. Burtraw, J. Shih, and M. McWilliams. 2006. Not a sure thing: Making regulatory choices under uncertainty. Washington, DC: Resources for the Future.
Leiss, W. 1996. Three phases in the evolution of risk communication practice. Annals of the American Academy of Political and Social Science 545:85–94.
Lipkus, I. M. 2007. Numeric, verbal, and visual formats of conveying health risks: Suggested best practices and future recommendations. Medical Decision Making 27(5):696–713.
Löfstedt, R. 2009. Risk management in post trust societies. London: Earthscan.
Moore, D. A., and D. M. Cain. 2007. Overconfidence and underconfidence: When and why people underestimate (and overestimate) the competition. Organizational Behavior and Human Decision Processes 103(2):197–213.
Morgan, M. G. 1998. Uncertainty analysis in risk assessment. Human and Ecological Risk Assessment 4(1):25–39.
———. 2003. Characterizing and dealing with uncertainty: Insights from the integrated assessment of climate change. Integrated Assessment 4(1):46–55.
———. 2009. Best practice approaches for characterizing, communicating and incorporating scientific uncertainty in climate decision making. Washington, DC: National Oceanic and Atmospheric Administration.
Morgan, M. G., and M. Henrion. 1990. Uncertainty: A guide to dealing with uncertainty in quantitative risk and policy analysis. New York: Cambridge University Press.

Moss, R. H., and S. H. Schneider. 2000. Recommendations to lead authors for more consistent assessment and reporting. In Guidance papers on the cross cutting issues of the Third Assessment Report of the IPCC, edited by R. Pachauri, T. Taniguchi, and K. Tanaka. Geneva, Switzerland: IPCC. Pp. 33–51.
NCI (National Cancer Institute). 2011. Making data talk: A workbook. Bethesda, MD: National Cancer Institute.
Nelson, D. E., B. W. Hesse, and R. T. Croyle. 2009. Making data talk: Communicating public health data to the public, policy makers, and the press. New York: Oxford University Press.
NRC (National Research Council). 1989. Improving risk communication. Washington, DC: National Academy Press.
———. 1996. Understanding risk: Informing decisions in a democratic society. Washington, DC: National Academy Press.
———. 2008. Public participation in environmental assessment and decision making. Washington, DC: The National Academies Press.
———. 2009. Science and decisions: Advancing risk assessment. Washington, DC: The National Academies Press.
Peters, E. 2008. Numeracy and the perception and communication of risk. Annals of the New York Academy of Sciences 1128(1):1–7.
Peters, E., D. Västfjäll, P. Slovic, C. Mertz, K. Mazzocco, and S. Dickert. 2006. Numeracy and decision making. Psychological Science 17(5):407–413.
Peters, E., J. Hibbard, P. Slovic, and N. Dieckmann. 2007. Numeracy skill and the communication, comprehension, and use of risk–benefit information. Health Affairs 26(3):741–748.
Peters, E., P. S. Hart, and L. Fraenkel. 2011. Informing patients: The influence of numeracy, framing, and format of side effect information on risk perceptions. Medical Decision Making 31(3):432–436.
Peters, R., V. Covello, and D. McCallum. 1997. The determinants of trust and credibility in environmental risk communication: An empirical study. Risk Analysis 17(1):43–54.
Pew Research Center for the People and the Press. 2010. Distrust, discontent, anger and partisan rancor: The people and their government. Washington, DC: Pew Research Center for the People and the Press.
Renn, O. 1999. A model for an analytic–deliberative process in risk management. Environmental Science and Technology 33(18):3049–3055.
———. 2004. The challenge of integrating deliberation and expertise: Participation and discourse in risk management. In Risk analysis and society: An interdisciplinary characterization of the field, edited by T. McDaniels and M. J. Small. Cambridge: Cambridge University Press.
Renn, O., and D. Levine. 1991. Credibility and trust in risk communication. In Communicating risks to the public, edited by R. E. Kasperson and P. J. M. Stallen. Netherlands: Kluwer Academic Publishers.
Reynolds, B., and W. S. Matthew. 2005. Crisis and emergency risk communication as an integrative model. Journal of Health Communication 10(1):43–55.
Rich, R. C., M. Edelstein, W. K. Hallman, and A. H. Wandersman. 1995. Citizen participation and empowerment: The case of local environmental hazards. American Journal of Community Psychology 23(5):657–676.
Russo, J. E., V. H. Medvec, and M. G. Meloy. 1996. The distortion of information during decisions. Organizational Behavior and Human Decision Processes 66(1):102–110.
Sandman, P. M. 1986. Explaining environmental risk. Washington, DC: EPA.
Shah, P., and E. G. Freedman. 2009. Bar and line graph comprehension: An interaction of top-down and bottom-up processes. Cognitive Science Society 3:560–578.

Shah, P., G. M. Maylin, and M. Hegarty. 1999. Graphs as aids to knowledge construction: Signaling techniques for guiding the process of graph comprehension. Journal of Educational Psychology 4:690–702.
Singer, E., and P. M. Endreny. 1993. Reporting on risk: How the mass media portray accidents, diseases, disasters, and other hazards. New York: Russell Sage Foundation.
Slovic, P. 1993. Perceived risk, trust, and democracy. Risk Analysis 13(6):675–682.
———. 2000. The perception of risk. London; Sterling, VA: Earthscan Publications.
Slovic, P., and J. Monahan. 1995. Probability, danger, and coercion: A study of risk perception and decision making in mental health law. Law and Human Behavior 19:49–65.
Slovic, P., and E. Peters. 2006. Risk perception and affect. Current Directions in Psychological Science 15(6):322–325.
Slovic, P., B. Fischhoff, and S. Lichtenstein. 1979. Rating the risks. Environment 21(3):14–20.
———. 1981. Perceived risk: Psychological factors and social implications. In The assessment and perception of risk: A discussion, edited by F. Warner and D. H. Slater. London: Royal Society. Pp. 17–34.
———. 1982. Facts versus fears: Understanding perceived risk. In Judgment under uncertainty: Heuristics and biases, edited by D. Kahneman, P. Slovic, and A. Tversky. Cambridge: Cambridge University Press. Pp. 463–492.
Slovic, P., J. Monahan, and D. G. MacGregor. 2000. Violence risk assessment and risk communication: The effects of using actual cases, providing instruction, and employing probability versus frequency formats. Law and Human Behavior 24(3):271–296.
Smithson, M., D. V. Budescu, S. Broomell, and H. H. Por. 2011. Never say “not”: Impact of negative wording in probability phrases on imprecise probability judgments. Paper presented at the Seventh International Symposium on Imprecise Probability: Theories and Applications, July 25–28, Innsbruck, Austria.
Spiegelhalter, D., M. Pearson, and I. Short. 2011. Visualizing uncertainty about the future. Science 333(6048):1393–1400.
Stirling, A. 2010. Keep it complex. Nature 468(7327):1029–1031.
Stone, E. R., J. F. Yates, and A. M. Parker. 1997. Effects of numerical and graphical displays on professed risk-taking behavior. Journal of Experimental Psychology: Applied 3(4):243–256.
Stone, E. R., W. R. Sieck, B. E. Bull, J. F. Yates, S. C. Parks, and C. J. Rush. 2003. Foreground:background salience: Explaining the effects of graphical displays on risk avoidance. Organizational Behavior and Human Decision Processes 90(1):19–36.
Sunstein, C. R. 2005. Moral heuristics. Behavioral and Brain Sciences 28(4):531–541.
Thompson, K. M., and D. L. Bloom. 2000. Communication of risk assessment information to risk managers. Journal of Risk Research 3(4):333–352.
Tversky, A., and D. Kahneman. 1974. Judgment under uncertainty: Heuristics and biases. Science 185(4157):1124–1131.
van den Bos, K. 2001. Uncertainty management: The influence of uncertainty salience on reactions to perceived procedural fairness. Journal of Personality and Social Psychology 80(6):931–941.
Visschers, V. H. M., and M. Siegrist. 2008. Exploring the triangular relationship between trust, affect, and risk perception: A review of the literature. Risk Management 10(3):156–167.
Visschers, V. H. M., R. M. Meertens, W. W. F. Passchier, and N. N. K. De Vries. 2009. Probability information in risk communication: A review of the research literature. Risk Analysis 29(2):267–287.
Wallsten, T. S., and D. V. Budescu. 1995. A review of human linguistic probability processing: General principles and empirical evidence. Knowledge Engineering Review 10(1):43–62.

Wallsten, T. S., D. V. Budescu, A. Rapoport, R. Zwick, and B. Forsyth. 1986. Measuring the vague meanings of probability terms. Journal of Experimental Psychology: General 115(4):348–365.
Wardekker, J. A., J. P. van der Sluijs, P. H. M. Janssen, P. Kloprogge, and A. C. Petersen. 2008. Uncertainty communication in environmental assessments: Views from the Dutch science-policy interface. Environmental Science and Policy 11(7):627–641.
Weiss, C. H., E. Singer, and P. M. Endreny. 1988. Reporting of social science in the national media. New York: Russell Sage Foundation.