women overestimate their risk of having breast cancer: they believe that their risk of breast cancer is higher than their risk of cardiovascular disease, even though cardiovascular disease affects and kills more women than breast cancer does (Blanchard et al., 2002).

A second bias that can influence the communication of health risks and their uncertainties is confirmation bias. Confirmation bias is the filtering of new information to fit previously formed views: the tendency to accept as reliable new information that supports existing views and to dismiss as unreliable or erroneous, and thus filter out, information that contradicts them (Russo et al., 1996). People may ignore or dismiss uncertainty information if it contradicts their current beliefs (Kloprogge et al., 2007). Evidence indicates that probability judgments are subject to confirmation bias (Smithson et al., 2011). Communicators of risk information, therefore, should be aware that people’s preexisting views about a risk, particularly when those views are very strong, may be difficult to change even with what some would consider to be “convincing” evidence that carries little uncertainty.

A third bias is confidence bias. People tend to be overconfident about the judgments they make on the basis of heuristics. When people judge how well they know an uncertain quantity, they may set the range of their uncertainty too narrowly (Morgan, 2009). Research by Moore and Cain (2007) supports the notion that people may be over- or underconfident in their judgments, depending on the circumstances. This tendency, referred to as the overconfidence bias, appears to stem from a psychological insensitivity to questioning the assumptions on which judgments are based (Slovic et al., 1979, 1981).

Group Biases

The literature on public participation emphasizes the importance of interaction among stakeholders as a way of minimizing the cognitive biases that shape how people react to risk information (see Renn, 1999, 2004). Kerr and Tindale (2004), for example, caution that the more homogeneous a group is with respect to knowledge and preferences, the more strongly that knowledge and those preferences will affect the group’s decision. Uncertainty can be either amplified or downplayed, depending on a group’s biases toward the evidence.

Assessment and explicit acknowledgment of the biases of the people with whom the agency is communicating may be critical to successful communication. People may be more willing to listen to new information and other points of view after their own concerns have been acknowledged and validated (Bier, 2001a).

EPA’s scientists and technical staff are themselves not immune to these biases. Awareness of the possible biases within EPA, and of when they occur, would be a first step toward identifying those biases and decreasing their influence.


