Science communication is more complex than simply translating the jargon of science into language the public understands. Its complexity stems from the diversity and interconnectedness of its many elements, including the goals for communicating, the content being conveyed, the format in which it is presented, and the individuals and organizations involved. People approach science communication from their own starting points—a combination of expectations, knowledge, skills, beliefs, and values that are in turn shaped by broader social, political, and economic influences. Organizations and institutions involved in science communication add their own concerns and influences. Moreover, the communication landscape is changing dramatically in ways that offer unprecedented opportunities to communicate and connect with others but also pose many challenges, a topic addressed in detail in Chapter 4. A primary undertaking for those studying the science of science communication is to identify the key factors and best practices for effective science communication that anticipates and responds appropriately to this complexity. The issues discussed in this chapter apply to communicating science on almost any topic, regardless of whether the science is involved in a public controversy related to a contentious issue. The chapter that follows focuses in greater depth on the issues that matter in communicating science that relates in particular to topics that are contentious in the public sphere.
Few people outside the scientific community consume scientific information regularly (Boczkowski and Mitchelstein, 2013), although they encounter and benefit from science often in their everyday lives. Many people profess interest in science news, yet only 16 percent of the public say they follow news about science and technology “very closely” (Mitchell et al., 2016), a percentage that has remained below 18 percent since 2000 (National Science Board, 2014). Some people may encounter science as adults, and seek to make sense of it, only when it becomes important to a decision they must make as individuals or in the context of institutions in which they have a role—for example, as consumers, patients, parents, voters, or policy makers.
Making sense of scientific information is not easy. Consumers, for example, are faced with parsing complex and contradictory claims about the risks and benefits of fat, salt, added sugar, and genetically modified organisms (GMOs) in food. They must decide whether to agree with science-based advice about avoiding obesity, to listen to those who say the causes of obesity are not yet well understood, or to ignore science-based debates altogether. Likewise, patients must make choices about treatments and drugs—a task that often requires judging among contradictory claims about what “science says” and wrestling with inevitable uncertainties about the aftermath of any decision they make. Parents must choose whether to accept medically sound advice about vaccines. And policy makers must make decisions based on imperfect information and forecasts that invariably entail some uncertainties. Those decisions often relate to issues—such as environmental regulation, the risks and benefits of new technologies, and food health and safety—that are deeply rooted in science.
Because the decisions they need to make will vary, individuals and groups will vary in what they need from science communication. They also will differ in the knowledge and skills, ways of interpreting information, and other characteristics that influence how they are likely to respond to scientific information. Communicators must therefore be responsive both to people’s needs for scientific information and to their ways of understanding, perceiving, and using science to make decisions. Adding to this complexity is that people’s needs and opinions can change as their engagement with science increases. Thus, an effective science communication strategy will be iterative and adaptable. In particular, it will evolve over time based on lessons learned about what is and is not working, as well as shifting needs and opportunities.
Although some goals of science communication can be achieved through one-way transmission of the information to an intended audience (as discussed later in this chapter), other goals are best achieved by the dialogue that occurs through formal public engagement.1 Such goals as generating excitement, sharing information needed for a decision, and finding common ground among diverse stakeholders all lend themselves to public engagement as a communication strategy. In addition, public engagement can be an important way to learn about the concerns, questions, and needs of the audience(s) to which the information is targeted. A recent report of the National Academies of Sciences, Engineering, and Medicine (2016b) that includes a recommendation on public engagement for the emerging technology of gene editing provides a useful example of reasons for using such an approach. As gene editing science advances, it spurs many questions about the science and its applications, as well as the ethics and governance of its use.
More generally, public engagement offers opportunities to facilitate transparency and informed consent among stakeholders and for each stakeholder to both learn from and teach others involved in the debate. An essential component of mutual teaching and learning is the opportunity to clarify one’s beliefs and understanding, revise one’s opinions, gain insight into the thinking of others, and articulate values amid uncertainty about the societal implications of a decision. A key benefit of such processes is building and maintaining trust through a fair, open, and transparent process. When scientists are transparent about any conflicts of interest, sources of funding, or important affiliations related to their work, public views of their integrity can be enhanced (National Academies of Sciences, Engineering, and Medicine, 2016b). When dealing with morally charged issues, however, the outcome may matter more to people than the fairness of the process (Earle and Siegrist, 2008). In these circumstances, simply ensuring a fair process may not be sufficient to foster trust and cooperation.
Engagement models increasingly are used in risk communication as an important way to address questions and conflicts related to the ethical, legal, and social issues that arise around science (e.g., National Academies of Sciences, Engineering, and Medicine, 2016b). It is also argued that public participation is part of the role of an engaged citizenry in a democracy.2 In the particular context of environmental decision making, research examining diverse traditions of public engagement—for example, on issues of Environmental Protection Agency (EPA) negotiated rulemaking, watershed management, climate change assessment, forest management, land use conflicts, and the cleanup of toxic waste sites—reveals practices associated with successful efforts. Among these are the use of processes that are inclusive, collaborative from the point of problem formulation, transparent, and characterized by good-faith communication (National Research Council, 2008). It should be noted, however, that most, although not all, of the available research focuses on public engagement efforts at the local to regional levels (National Research Council, 2008). Research is needed to determine effective structures and processes for engaging the public at larger scales, including the potential for online public engagement (Davies and Gangadharan, 2009). This need will become even more acute as nations cope with such problems as climate change that are global in scope (Corner and Pidgeon, 2010; Payne et al., 2015).

1 “Public engagement” can be defined as “seeking and facilitating the sharing and exchange of knowledge, perspectives, and preferences between or among groups who often have differences in expertise, power, and values” (National Academies of Sciences, Engineering, and Medicine, 2016b, p. 131). The specific terminology used for such activities may vary—some use the term “public participation” or “public deliberation,” for example—but the concept remains the same.
Public engagement is difficult to do well. The process of bringing together many stakeholders and publics is a challenging one that requires substantial preparation and support. Some critics suggest that low levels of knowledge about and attention to science, problematic group dynamics, and low levels of participation can make the process unproductive or even counterproductive in some cases (Binder et al., 2012; Merkle, 1996; Scheufele, 2012). Given that public participation may be especially useful when science is involved in controversy, Chapter 3 details principles of public engagement and additional needs for research.
Engaging the public may not appear to be essential in some situations, but there are both ethical and practical reasons, and in some cases legal mandates, for carrying out public participation (National Research Council, 2008). In science-related controversies, all three reasons for public engagement may be in play, making it important for research to attend to the design of engagement and deliberation processes involving science in a wide range of circumstances (see Chapter 3).
The nature of the information or the state of the science itself can pose a challenge for communication. Science is expected to yield information that is useful to society—if not immediately, then eventually through a self-correcting and cumulative process. Science offers a unique, rule-governed method for producing reliable knowledge about the world. However, scientific findings often represent work in progress, are applicable only to particular contexts or populations, or are unsettled about questions to which the public wants clear answers. This section focuses on the uncertainty associated with scientific information and the challenges it poses for science communication.
Uncertainty has a number of sources that affect how it might be communicated. Researchers have developed ways to classify the types and sources of uncertainty.3 Some sources are inherent to science and cut across all scientific disciplines, while others, such as the use of certain estimation models or methods of measurement, are specific to particular scientific fields. Uncertainty may relate to the complex nature of scientific information or to how people process such information. It can arise when predicting future outcomes using probabilistic evidence (risk) or when deciding about the degree to which scientific evidence applies to a particular context or has personal significance. Continued research can resolve some uncertainty by providing additional evidence, but in other cases, and especially with complex problems involving science, uncertainty will persist. Despite inherent uncertainties, science may inform decisions, but how to interpret the relevance of the information may be negotiated among consumers of scientific information, such as individual decision makers or groups of stakeholders (Dietz, 2013a).
As a rule, people dislike uncertainty and try to avoid ambiguity (Fischhoff and Kadvany, 2011; Kahneman, 2011). When faced with decisions, they often will choose the least vague alternative even when more vague alternatives have a better expected payoff (Ellsberg, 1961; Wallsten, 1990). Scientists, including medical professionals, may be reluctant to discuss the uncertainties of their work as well (Fischhoff, 2012; Politi et al., 2007). It is tempting, then, to avoid talking about uncertainty when communicating science, but this may be a mistake. Some audiences know that uncertainty exists and say they want to be informed about how certain scientific findings are (Frewer and Salter, 2007). It also is possible that failing to discuss uncertainty conveys a false sense of certainty that can mislead audiences and undermine trust if findings later change.
3 For example, one typology identifies five main sources of uncertainty based on a review of the literature (Politi et al., 2007): (1) uncertainty about future outcomes (often called “risk” and operationalized as probabilities); (2) ambiguity, or uncertainty about the strength or validity of evidence about risks; (3) uncertainty about the personal significance of particular risks (e.g., their severity or timing); (4) uncertainty arising from the complexity of risk information (e.g., the multiplicity of risks and benefits or their instability over time); and (5) uncertainty resulting from ignorance. For another example of classifications of uncertainty, see Han et al. (2011b). For further discussion of scientific uncertainty as part of decision making, see Institute of Medicine (2013, 2014) and National Research Council (2006).
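The preference for less vague alternatives noted above can be made concrete with a small expected-value calculation in the spirit of the Ellsberg (1961) two-urn choice; the $100 payout and urn setup below are illustrative assumptions, not data from that study:

```python
# Illustrative Ellsberg-style choice: bet $100 on drawing a red ball.
# Urn A: known mix of 50 red / 50 black balls.
# Urn B: 100 balls in an unknown red/black mix; with no information,
# a uniform prior over compositions implies a 50% chance of red.

payout = 100.0

# Urn A: the probability of red is known exactly.
p_red_known = 0.5
ev_known = p_red_known * payout

# Urn B: average probability of red over all equally likely compositions.
compositions = range(101)  # 0..100 red balls
p_red_ambiguous = sum(k / 100 for k in compositions) / len(compositions)
ev_ambiguous = p_red_ambiguous * payout

print(f"Known urn EV:     ${ev_known:.2f}")
print(f"Ambiguous urn EV: ${ev_ambiguous:.2f}")
# The expected payoffs are identical, yet most people prefer the known
# urn -- ambiguity aversion, not expected value, drives the choice.
```

The calculation shows that avoiding the vaguer option cannot be explained by the payoffs themselves, which is why the behavior is attributed to ambiguity aversion.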
Reactions to the various types of uncertainty differ among individuals depending on their characteristics and values (Peters, 2012a; Politi et al., 2007). In the absence of clear explanation, people may attribute uncertainty to a variety of incorrect sources (e.g., Dieckmann et al., 2015). Some, for example, may attribute uncertainty to poor science (Freudenburg et al., 2008; Johnson and Slovic, 1995). In some cases, communicating uncertainty can diminish perceived scientific authority (Funtowicz and Ravetz, 1992; see also National Research Council, 2014; Rosa et al., 2013). And for some patients in medical settings, receiving information about uncertainty can make them feel less satisfied with a decision (Han et al., 2011a; Politi et al., 2011). On the other hand, the communication of uncertainty in some contexts promotes a sense of transparency that can foster trust (Johnson and Slovic, 1995). Clear information about uncertainty also can be helpful to decision makers weighing risk (Fischhoff and Davis, 2014; Joslyn and LeClerc, 2012; Joslyn et al., 2013). These findings suggest that audiences vary in their desire for and responses to information about uncertainty.

One form of uncertainty concerns conflicting evidence or confusion about how much evidence supports different points of view. Scientists themselves may be uncertain and disagree about how to interpret scientific evidence. Such real or perceived conflicts can be quite disconcerting to audiences (Politi et al., 2011). The weight of evidence, or the degree to which facts and causal explanations regarding a particular issue are well established, may be insufficient to support a conclusion in cases in which the science is emergent, uncertain, or contested.
Examples include the causes and impacts of obesity, the relative merits of technologies for responding to climate change (such as carbon capture and storage and solar radiation management), the social and health impacts of “vaping,” and the academic consequences of introducing more market-like policies in education (such as vouchers or charter schools). In addition, while the science may be relatively certain in one context, its application to another context, particularly to a complex local problem such as the contamination of a water supply or the implications of climate change, may increase scientific uncertainty.
With some issues, the weight of evidence leads to broad agreement within the scientific community. Such issues include the human contribution to climate change, the health benefits of vaccines, and the validity of evolutionary theory. In these cases, varying numbers of people remain unsure or unconvinced about the weight of evidence behind the assertions of the scientific community. In the case of climate change, one study found that communication that conveys a high degree of scientific consensus on an issue can increase people’s acknowledgment of that consensus (van der Linden et al., 2015), although it is unclear whether accepting scientific consensus influences people’s attitudes or beliefs about an issue (Kahan, 2016) or support for particular policies (Kahan, 2016; McCright et al., 2013a). Further, some evidence indicates that political views influence people’s perceptions about scientific consensus on climate change (e.g., McCright et al., 2016). More remains to be learned about how audiences vary in their response to information about consensus on various issues.
National Academies reports have examined how to represent the level of agreement about scientific findings in the scientific community and the approaches that may be effective in different contexts.4 The Intergovernmental Panel on Climate Change (IPCC) employed labels to characterize confidence in scientific claims using the type, amount, quality, and consistency of the data, in addition to presenting information with statistics and probabilistic terms. However, people disagree widely about what verbal probability terms mean (Teigen and Brun, 2003). Thus, attempts to legislate such translations, such as that made by the IPCC, have not succeeded (Budescu et al., 2014; Teigen, 2014). The inclusion of numerical probabilities alongside verbal probabilities appears to result in more accurate interpretations than use of either alone (Budescu et al., 2012, 2014; Sinayev et al., 2015). Ultimately, the effectiveness of communications about uncertainty depends on the decision context (Fischhoff and Davis, 2014; Savelli and Joslyn, 2013; see also the discussion in Zikmund-Fisher, 2013) and the clarity of the communication format (Institute of Medicine, 2014; Spiegelhalter et al., 2011; Stephens et al., 2012; Taylor et al., 2015).
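As an illustration of pairing verbal and numerical probabilities, the sketch below uses the calibrated likelihood scale from the IPCC's published uncertainty guidance; the `dual_format` helper is a hypothetical example of dual presentation, not an official IPCC tool:

```python
# Verbal probability terms paired with numeric ranges, following the
# IPCC calibrated likelihood scale from its uncertainty guidance note.
# The formatting function is an illustrative sketch only.

IPCC_LIKELIHOOD = {          # term: (lower %, upper %)
    "virtually certain":      (99, 100),
    "very likely":            (90, 100),
    "likely":                 (66, 100),
    "about as likely as not": (33, 66),
    "unlikely":               (0, 33),
    "very unlikely":          (0, 10),
    "exceptionally unlikely": (0, 1),
}

def dual_format(term: str) -> str:
    """Render a verbal term alongside its numeric range, since dual
    presentation is interpreted more accurately than either alone."""
    lo, hi = IPCC_LIKELIHOOD[term]
    return f"{term} ({lo}-{hi}% probability)"

print(dual_format("very likely"))
# -> very likely (90-100% probability)
```

Presenting both forms together reflects the finding cited above that numerical probabilities alongside verbal ones yield more accurate interpretations than either alone.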
Research is needed to identify best practices for communicating the uncertainties of science in ways that convey the weight of evidence and speak to the particular questions people may have about specific sources of scientific uncertainty surrounding an issue. Formats are needed for communicating uncertainty to individuals or groups to support the use of scientific information as part of decision making, as are approaches that can be used to communicate on a large scale.
A responsive orientation to science communication means that the needs, abilities, perspectives, and constraints of the audiences are considered in the approach taken to communicating. This section addresses how three key aspects of audiences affect science communication: (1) prior knowledge of science, (2) ability to understand numeric information, and (3) ways of interpreting new information. Together, these factors help explain why the same information from science can be understood and perceived so differently among different individuals. While the influences discussed here are evident for many people in the contexts within which they have been studied, further study is needed to determine the importance of each of these influences to communicating science to specific audiences in particular contexts to achieve specific goals. Research also is needed to determine their particular importance when a societal issue is contentious or when the science itself is controversial.
A long-standing question among science communicators is whether people have sufficient understanding and skills, such as science literacy, to make sense of science communication and to express their views as informed citizens on issues involving science. According to a recent National Academies report on the topic, science literacy can be broadly defined as having “some level of familiarity with the enterprise and practice of science” (National Academies of Sciences, Engineering, and Medicine, 2016c). A central theme of that report is that science literacy can be considered a characteristic not only of individuals, but also of communities and societies.
Knowledge levels among the general public, if measured as simple recall of scientific facts, have remained fairly high over time (Scheufele, 2013). At the same time, knowledge of scientific methods and thinking appears to be less widespread. Only one in four Americans (26 percent) in 2014 could explain “what it means to study something scientifically,” and only half of Americans (53 percent) had a correct understanding of randomized controlled experiments (National Science Board, 2016a).
For most people, formal science education ends in high school. Yet science continues to evolve, producing new information, discoveries, and technologies. Therefore, science education is seen as a lifelong process of learning that occurs across settings (Dierking and Falk, 2016). Most studies show that science literacy is strongly associated with level of education (National Science Board, 2016a), and only one-third (32.5 percent) of Americans over age 25 hold a bachelor’s or advanced degree (Ryan and Bauman, 2016). According to the National Science Board (2016b), moreover, as of 2013 only about 21.1 million adults in the United States had attained a bachelor’s or higher-level degree in a field of study involving science or engineering—about 9 percent of the 226.4 million Americans over age 21 (U.S. Census Bureau, 2013). Therefore, only about 1 in 10 American adults have had formal education as scientists or engineers.
The deficit model would predict that the more knowledge one has about science and the way it works, the more positive one’s attitudes toward science will be and the more consistent one’s decisions will be with scientific evidence. Yet a growing body of research on the links between amount of scientific knowledge and attitudes toward science underscores that this is not a simple and direct relationship. Rather, a person’s characteristics, background, values and beliefs, and cues from mass media shape the linkage between general scientific knowledge and attitudes (Brossard et al., 2005; Ho et al., 2008; Scheufele and Lewenstein, 2005). Once these factors are taken into account, the relationship between knowledge and attitudes across studies is either weakly positive, nonexistent, or even negative (Allum et al., 2008; National Academies of Sciences, Engineering, and Medicine, 2016c) (see the more detailed discussion of this point in Chapter 3). These findings point to the importance of testing individual messages carefully before they are used. Moreover, knowledge is not a prerequisite for holding an opinion on a topic, which may explain in part the lack of relationship between knowledge and attitudes. Surveys show, for example, that a vast majority of people favor labeling GMOs in food, while many fewer people demonstrate knowledge about them (McFadden and Lusk, 2016).
Numerical information and concepts are often an important part of communicating scientific information, including scientific uncertainty. The understanding and use of mathematical concepts and numeric information often is referred to as “numeracy.” Many people, however, have difficulty understanding the quantitative and probabilistic information that frequently is the language of science. Problems with numeracy frequently affect even scientists outside of their areas of expertise. Thus when communication strategies need to convey quantities, rates, and probabilities, careful attention is required to the substantial body of research on how people process, and commonly misunderstand, such information, as well as to the best available tools for presenting it, as described later in this chapter.
Communication that takes people’s challenges with numeracy into account is more effective at improving understanding of the information presented and, in the health context, even at improving health outcomes (e.g., Peters et al., 2007). A large research literature on health decisions suggests the following steps for presenting numeric information so that it is understood and used effectively (see Institute of Medicine, 2014): (1) providing such information (i.e., not avoiding numbers); (2) reducing the cognitive effort required from the patient or consumer and requiring fewer inferences; (3) explaining what the numbers mean, particularly when the numeric information is unfamiliar to the audience; and (4) drawing attention to important information. Research outside the health domain is needed to determine how numeric information, such as uncertainty, can be presented in a way that facilitates comprehension and use of the information across individuals, including those with lower numeracy. (A few systematic studies in this area do exist—for example, in finance [Soll et al., 2012] and in environmental domains [Hart, 2013; Markowitz et al., 2013; Myers et al., 2015].)
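The four presentation steps above can be sketched in code; the drug name, risk figures, and `present_risk` helper below are invented for illustration and are not drawn from the Institute of Medicine report:

```python
# A sketch applying four steps for presenting numeric risk information
# to a hypothetical medication side effect. All names and rates are
# invented for illustration.

def present_risk(name, events, denominator, baseline_events):
    lines = []
    # Step 1: provide the numbers rather than avoiding them.
    lines.append(f"{events} out of {denominator} people taking {name} "
                 f"experience the side effect.")
    # Step 2: reduce cognitive effort -- do the comparison for the
    # reader instead of leaving the inference to them.
    extra = events - baseline_events
    lines.append(f"That is {extra} more per {denominator} than among "
                 f"people not taking it ({baseline_events} per {denominator}).")
    # Step 3: explain what the numbers mean in familiar terms.
    pct = 100 * events / denominator
    lines.append(f"In other words, about {pct:.0f} in 100 people are affected.")
    # Step 4: draw attention to the most decision-relevant information.
    lines.append(">> Most people taking the drug do NOT experience the side effect.")
    return "\n".join(lines)

print(present_risk("Drug X", 30, 1000, 10))
```

Each appended line maps to one numbered step, making explicit how a communicator might move from raw counts to a message that requires fewer inferences from the audience.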
As noted earlier, beyond knowledge and skills for interpreting scientific information, people hold a variety of beliefs, values, and ways of understanding the world that shape their interpretations of new information. These predispositions are discussed below; a more detailed discussion of the roles of beliefs and values is included in Chapter 3 because of the central role they play in communicating science related to contentious societal issues.
One approach to understanding variation in how people interpret scientific information focuses on their “mental models”—the sets of beliefs they hold to explain how the world works (Bruine de Bruin and Bostrom, 2013). People who are deeply versed in the methods, theories, and facts of a particular scientific discipline use mental models quite different from those of nonexperts (Chowdhury et al., 2012). Nonexperts tend to apply multiple and often idiosyncratic explanations for a phenomenon. They also rely on metaphors and analogies to draw inferences (Bostrom, 2008) and often focus on the less relevant aspects of a problem or phenomenon (Chi et al., 1981; Downs et al., 2008).
People apply these ideas to their interpretations of new information, and their prior mental models tend to be resistant to change even when they encounter information to the contrary. Understanding people’s mental models can help communicators identify gaps in what people know, as well as misinformation or misconceptions that affect how they make sense of an issue. With these insights, communicators could, for example, design approaches (such as framing, discussed later in this chapter) for making information more accessible to people, who could then use it in making decisions (Nisbet, 2014; Scheufele and Scheufele, 2010). The role of beliefs in science communication when science is involved in public controversy is discussed further in Chapter 3.
At times communicators may expect people to evaluate scientific uncertainty and other evidence based on full knowledge and understanding of the information they receive, but that simply is not how the human mind appears to work when confronted with complexity. Instead, when trying to interpret scientific uncertainty and other complex information, people reduce mental effort by using a variety of heuristics or mental shortcuts to evaluate the evidence (Tversky and Kahneman, 1974). These shortcuts are usually adaptive, allowing people to decide quickly and efficiently how likely or how dangerous something is. Trust in, or deference to, scientific authority is an example of a reasonable shortcut people use for forming opinions or attitudes about science (Brossard and Nisbet, 2007). Yet shortcuts can bias interpretations of science, especially scientific uncertainty (Bruine de Bruin et al., 2007). For example, they influence understandings, memories, and reactions so that people pay more attention to, or weight more heavily, information that is consistent with their preexisting feelings about a subject. People also have a propensity to perceive that information they encounter frequently is more true or important than information they have encountered less often, even when that inference is incorrect (Fazio et al., 2015).
The use of emotions, in particular, is a critical mental shortcut in understanding and reacting to scientific information (Slovic et al., 2004). Like other mental shortcuts, emotions generally are helpful because they quickly inform perceptions of risks and benefits (e.g., “If I feel bad about a hazard, then it must be high in risk and low in benefit” [Finucane et al., 2000]). Emotional reactions motivate people to act and guide them to think more about important risk information in ways that would not occur without those emotions (Evans et al., 2015). But these same adaptive mental shortcuts also can bias reactions to science. As a result of these shortcuts, for instance, people’s initial emotional reactions to new information can persist, shaping and limiting how they respond to subsequent information (Bruine de Bruin and Wong-Parodi, 2014). These shortcuts also may cause people generally to attend more to negative than to positive information (Shaffer and Zikmund-Fisher, 2012).
Furthermore, people with lower numeracy are more likely to rely on these heuristics when engaging in complex judgments and decisions such as those that involve science, and especially scientific uncertainty (Peters et al., 2006; Sinayev and Peters, 2015). They also rely more on narratives and the way information is presented in particular lights (discussed below) instead of applying the probabilities and other numbers critical to understanding science (Peters, 2012a). Of course, highly numerate individuals also sometimes misunderstand numeric information and use heuristic processing, but to a lesser degree (Chapman and Liu, 2009; Peters et al., 2007). Careful attention to how scientific uncertainty and other numbers are presented can reduce the use of heuristics and increase understanding and use of provided numbers, especially among the less numerate (Institute of Medicine, 2014).
One form of mental shortcut is motivated reasoning, defined as the “systematic biasing of judgments in favor of one’s immediately accessible beliefs and feelings [that is] built into the basic architecture of information processing mechanisms of the brain” (Lodge and Taber, 2013, p. 24).5 Most, and perhaps all, people possess this natural reluctance to accept facts, evidence, and arguments that contradict the positions they hold. Because individuals tend to engage in motivated reasoning, the source of communication about a science-related topic and how that information is presented are likely to trigger specific associative pathways and patterns of thinking that will influence their attention to and interpretation of all subsequent information (Kraft et al., 2015).
Cognitive dissonance is the feeling of discomfort that arises from holding two conflicting thoughts or a thought that is discrepant with one’s behavior (Festinger and Carlsmith, 1959). The desire to end this discomfort is a strong motivator to resolve the conflict by changing either one’s behavior or one’s thinking. Cognitive dissonance can lead to attempts to justify one’s existing behavior or way of thinking, making a decision or attitude resistant to change. Indeed, people tend to doubt or reject expert persuasive messages that threaten or could lead to restrictions on freedoms or social activities they value (Byrne and Hart, 2009). For this reason, communicators need to be careful not to question or assault people’s values, and alternative information needs to be provided in a nonthreatening way.
Little research has examined directly how groups and social contexts (e.g., social networks, group norms, group membership, social identity) in which people are situated influence science communication (Bliuc et al., 2015; Pearson and Schuldt, 2015). Research is therefore needed on how groups (ranging from local governments to civic associations to planning boards) may differ in their attentiveness or response to science communication and the mechanisms by which different groups can best be reached and involved as audiences or participants in the science communication process. For example, a National Academies study focused on science literacy proposes that “community-held” science literacy is a concept to be explored in future research (National Academies of Sciences, Engineering, and Medicine, 2016c). Research taking social influences into account could take advantage of statistical and data collection approaches that could provide a more nuanced understanding of group-level influences than currently exists (e.g., Contractor and DeChurch, 2014).

5 Many social science disciplines have studied this phenomenon, although the terms used for it differ. In some areas of psychology and political science, for example, it is called “biased assimilation,” while in decision sciences it is termed “Bayesian updating” (e.g., Lord et al., 1979; Munro and Ditto, 1997; Munro et al., 2002).
A particular need is a better understanding of groups that are underrepresented as audiences or participants in many traditional forms of science communication. Groups that may differ in their attentiveness or response to science communication may be distinguished by race, ethnicity, language status, income, or education level, and their responses may differ as a result of differences in conditions, norms, beliefs, or experiences. Some of these disparities with respect to knowledge acquisition—often called “knowledge gaps”—are discussed in Chapter 4. In the health domain, evidence suggests such gaps extend to disparities in the ability to access information sources and the capacity to process information—which may, for example, be limited by language barriers—and the ability to act on health information—resulting, for example, from a lack of health insurance (Viswanath, 2006).
In other research on group differences, whites and nonwhite racial and ethnic groups are divided along different political lines with regard to specific topics such as climate change (McCright et al., 2016; Schuldt and Pearson, 2016) and receptiveness to new medical technologies (Groeneveld et al., 2006), and whites generally are more deferential to scientific expertise than are nonwhites (Blank and Shaw, 2015). In some instances, these differences may be rooted in objective experiences with science, either contemporary or historical. High-profile abuses, such as the Tuskegee syphilis study, appear to have contributed to higher levels of mistrust of science among African Americans, although the precise nature of the linkage is uncertain (Freimuth et al., 2001), as is the extent of this mistrust. Science communicators at a minimum need to be aware that messages from science may be heard differently by different groups and that certain communication channels, modes, messengers, or messages are likely to be effective for communicating science with some groups and not others.
The way in which scientific information is presented also affects how people interpret it. Whereas formally engaging the public, as discussed earlier, fosters multiway communication, the approaches that follow are used more commonly, though not exclusively, in one-way communication about science.
Framing is the casting of information in a certain light to influence what people think, believe, or do. Framing often is used to communicate that an issue is a priority or a problem, who or what might be responsible for it, and what should be done about it (Iyengar, 1991, 1996). Climate change, for example, could be presented as a grave environmental risk, as a public health risk, or as an opportunity for innovation and economic development (Nisbet and Scheufele, 2009). Frames are used to communicate in a wide range of channels that include conversation, public debate, and media reports. They also are used to deliver persuasive messages, such as those intended to further public health–related goals. Marketing professionals and researchers have long employed and studied approaches to communicating in media, including online environments, to persuade others, including efforts to encourage the adoption of healthful practices (see, e.g., Kotler and Keller, 2015; LeFebvre, 2013; Stewart, 2014). While the methods and tools from these fields may be useful for science communication, any practices drawn from them would need to be tested in contexts of science communication in which persuasion is the goal.
Research suggests that such frames are likely to influence judgments about complex science-related debates when they are relevant to an individual’s existing ways of organizing, thinking about, and interpreting the world (Nisbet and Scheufele, 2009; Scheufele, 2000). In the case of GMOs, for instance, information framed in terms of social progress and improving quality of life may fit one individual’s way of thinking about the issue, while a frame that focuses on public accountability and right to know may appeal to another (Nisbet and Scheufele, 2009).6
Much of the research on framing to date has involved experiments comparing equivalent messages that convey either gains or losses (gain/loss framing) associated with an action or lack thereof. Other research compares the effects on audiences of science reports cast with different emphases (emphasis framing). Some information, for example, may be described in terms of local or personalized stories, while other descriptions offer general knowledge or statistical facts.
6 This discussion focuses on intentional framing, although framing is always present in messages even if the communicator’s use of a frame is not intentional or intended to influence people. The use of framing by scientists is a matter of debate. One issue concerns how to choose a frame that does not inadvertently convey misinformation (similar to the concern scientists can have about the use of analogies to communicate science). Another concern is whether scientists should communicate to influence people, as discussed in Chapter 1. For further discussion about the ethics of framing in science communication, see Keohane et al. (2014).
Hundreds of studies across a range of fields have tested the ability of gain/loss framing strategies to influence specific types of behaviors. Reviews of these studies paint a contradictory, uncertain picture. For example, whether framing in terms of gains or losses is more effective appears to vary depending on the type of message—that is, whether the message is intended to encourage behaviors that maintain health, behaviors that detect illness, or actions that mitigate climate change (Gallagher and Updegraff, 2012; Rothman and Salovey, 1997).
Meta-analyses of studies testing the effects of gain/loss framing on health prevention attitudes, intentions, and behaviors, including safe sex and vaccination, find either weak or no effects on attitudes and intentions and limited effects on behaviors (O’Keefe and Jensen, 2007; O’Keefe and Nan, 2012). Within health promotion research, effects of framing vary depending on whether the message is intended to affect behaviors related to skin cancer prevention, smoking cessation, and physical activity (for which small but potentially important effects were found), or to affect attitudes toward or intentions to seek out screening to detect illness (which showed no effects according to meta-analysis [Gallagher and Updegraff, 2012]). Similar patterns of mixed findings have been found in research on climate change communication. Messages framed in terms of environmental and economic gains rather than losses, for example, have led to greater support for mitigation-related actions (Hurlstone et al., 2014; Spence and Pidgeon, 2010). Other research has found no influence on attitudes when messages emphasize benefits over risks (Bernauer and McGrath, 2016).
In this line of research, frames are rendered as interpretive story lines that communicate what is at stake in a complex policy debate and why the issue matters. The story lines influence decisions by offering different trains of thought, each of which emphasizes one dimension of a complex issue over another—for example, pointing to new applications of an emerging science such as nanotechnology instead of the potential risks and benefits (Anderson et al., 2013). A common type of experiment in emphasis framing compares episodic and thematic framing: scientific information is cast either in terms of specific personalized stories (episodes) or more broadly (themes). Episodic framing, for example, might focus on the effects that forgoing vaccination had on a particular family and its children’s health. Thematic framing, in contrast, might present statistics about the likelihood of adverse health effects should vaccination rates drop below certain levels. As with gain/loss framing, however, evidence for the effectiveness of either approach in changing attitudes or the degree of support for a particular action is mixed. Also mixed are findings on framing complex science-related problems such as obesity or climate change in terms of personalized stories as a persuasive strategy for encouraging behaviors or building support for policy actions. Some studies indicate that this approach is ineffective (Coleman et al., 2011; Hart, 2010), while others show that narrative accounts can lead to greater support for particular policies or intention to vaccinate (Dahlstrom, 2014; Nan et al., 2015; Niederdeppe et al., 2014).
Some argue that framing as a concept has become too broad and overlaps too much with other effects of the media on people’s decisions about the issues and ways of thinking that are salient to them (Cacciatore et al., 2016) (see Chapter 4). This conceptual murkiness may be one explanation for the unclear findings noted above. Results also may appear mixed because the research has not been systematic within or across the disciplines that study framing. It would be fruitful for future experimental research to determine the effects of particular kinds of frames on certain types of outcomes and to assess the degree to which these effects generalize to communicating about a range of issues.
Knowledge of framing effects could be enhanced with national samples and studies of framing in which people were subjected to competing messages from social media and other sources (McCright et al., 2015; Nisbet et al., 2013), as is typical of complex, real-life communication environments (see Chapter 4). Research also is needed to determine the extent to which framing of an issue matters and when it is best done. Important as well is to determine effective ways of continually reframing an issue in response to changes in people’s perceptions and understandings, or misunderstandings, about science as a public controversy evolves.
Research has evaluated various ways of presenting information for greater understanding of complexity and uncertainty. Science communicators frequently use narratives—information, including complicated numeric or statistical information, presented in the form of a story—to help explain complex issues (Entwistle et al., 2011; Reyna and Brainerd, 1991; Shaffer and Zikmund-Fisher, 2012). Narratives can increase audience engagement with and attention to science communication, and they can be easier to remember and process than traditional forms of scientific communication (Bekker et al., 2013; Dahlstrom, 2014; Kanouse et al., 2016; Winterbottom et al., 2008). However, the use of narratives to promote understanding in science communication remains understudied (Dahlstrom, 2014). A common concern about narratives among experts is that they can sway people from using the presented statistical information (Downs et al., 2008).
As might be expected, people who are low on measures of numeracy prefer and are more influenced by explanations that are presented as stories than by those that involve numbers (Dieckmann et al., 2009). People with higher numeracy are less sensitive to how information is presented (Institute of Medicine, 2014). This finding is in line with other evidence suggesting that people with lower numeric skills tend to focus on narrative and other non-numeric information (e.g., information frames) even when it is irrelevant to the point, while ignoring relevant numbers (see Peters, 2012a; Reyna et al., 2009; Volk et al., 2008).
Despite the difficulty that numeric information poses for many people, it is sometimes the best way to promote understanding of the science, as experiments in communication about climate change, health, and the environment have demonstrated (Budescu et al., 2009; Myers et al., 2015; Peters et al., 2014). People who receive numeric information about scientific consensus on climate change, for example, estimate that consensus more accurately than do those given non-numeric descriptive information (Myers et al., 2015).
Policy making occurs in a complex system that includes formal structures and procedures and a broad range of policy networks, actors, and organizations that affect how people involved in the system encounter and interpret information. When policy issues have substantial scientific content, policy networks involved with the issue will usually include individuals and organizations with concomitant scientific expertise (Dietz and Rycroft, 1987; Dietz et al., 1989). Nonetheless, biases in reasoning and assimilation of new information and the tendency of people to network and bond with similar others can prevent the spread and use of information across networks (Henry and Vollan, 2014). (For further discussion of networks in science communication, see Chapter 4.)
Research has shown that the use of science in policy making is not a straightforward process involving a simple, traceable relationship between the provision of information and a specific decision. Even when policy makers have access to and understand all the relevant sources of information, they will not necessarily weigh science heavily or use it to identify and select among policy options (e.g., National Research Council, 2012; Weiss, 1988). There is a paucity of evidence, however, on effective practices and structures for affecting policy makers’ understanding, perception, and use of science (National Research Council, 2012).
As discussed above with respect to decision making generally, policy makers do not rely solely on the scientific information relevant to an issue. Values and personal interests operate at each stage of the policy process, from agenda setting to policy formulation, budgeting, policy implementation, policy monitoring, and policy evaluation. Policy makers, like everyone else, use the mental shortcuts discussed earlier that affect their attention to and use of information, including when making decisions (Carnevale et al., 2011). Overall, evidence on the impact of science communication on policy decisions is still sparse and murky. A broader literature examines how the structure of policy networks influences and is influenced by values and beliefs, but research is only beginning to address scientific understanding per se (Frank et al., 2012; Henry, 2011; Henry et al., 2011). One reason for this lack of evidence is that it is difficult to study and assess how policy makers make sense of, are affected by, and use information from science. Policy makers vary in their science-related abilities and knowledge, and in some respects are influenced by the same factors that affect everyone’s understanding and perceptions of science. For example, like most people (including scientists themselves when they encounter scientific evidence outside their own areas of expertise), they need to have scientific information interpreted and validated by trusted sources.
In a complex policy-making environment, it is almost impossible to know with any certainty that a specific decision made by an individual or group resulted from a specific encounter with relevant information. From the perspective of science, a goal of communicating with policy makers is to ensure that relevant scientific information is received and understood by those who may use it to make a decision. From this perspective, it may be sufficient to assess the effectiveness of science communication by documenting that the information was received, understood, shared, or discussed in a formal policy process (e.g., formal arguments, congressional testimony, public deliberations) (National Research Council, 2012). Research is needed to determine how science communication can influence these processes and how they are affected by science-related controversy.
Both individuals and organizations play a role in aggregating, diffusing, and interpreting scientific information for use by policy makers. The remainder of this section reviews several approaches to science communication within the policy-making context.
One approach to communicating science to policy makers involves one-way communication, such as issue briefs or websites. Examples of communicators using this approach from the health and education arenas include the Cochrane Collaboration, the Campbell Collaboration, and the What Works Clearinghouse. These organizations gather evidence about effective interventions and communicate it through research summaries, lists, or guidelines so that policy makers can readily grasp the relevance of the information to a decision or task at hand. It is unclear, however, whether such translation efforts enhance the understanding or use of evidence-based practices (Ginsburg and Smith, 2016; Glasgow and Emmons, 2007; Green and Siefert, 2005; Lavis, 2006; Slavin, 2006).
Another organized approach to communicating science occurs in the form of brokering. A broker in this context is an intermediary (an organization or a person) that bridges science and policy making by providing information while developing relationships within networks and linking those who produce knowledge with those who may use it. Brokering may include facilitating agenda setting and other conversations among multiple stakeholders regarding complex or contentious societal problems that involve science. Although individuals may serve as brokers, “boundary organizations” also play this role and have been identified as critical to increasing evidence-informed decision making (Dobbins et al., 2009). Boundary organizations are entities that facilitate the flow of information among researchers, stakeholders, and policy makers while refraining from advocating policy positions (Bednarek et al., 2016). They may be think tanks, university-based policy centers, or other nonprofit research and policy organizations. Such organizations are able to devote the needed time, expertise, and resources to the task, and have been particularly useful in connecting researchers and policy makers (Bednarek et al., 2016). A small emerging literature, consisting mainly of case studies of established policy networks, documents the processes, associated outcomes, and conditions for the success of boundary organizations (e.g., Am, 2013; Bednarek et al., 2016; Leith et al., 2016). Other emerging research examines how information flows and evolves as it is used by bridging organizations and in policy networks (Bidwell et al., 2013; Frank et al., 2012; Henry and Vollan, 2014; Lemos et al., 2014; Lubienski et al., 2014). Research is needed to determine how science is best communicated to achieve particular goals in the context of such networks and organizations. What, for example, is the effect of boundary organizations on the quality or outcomes of policy discussions of science-related issues?
Another approach is aimed at fostering communication by positioning data and evidence as complements, not alternatives, to the professional knowledge, values, and practical experience of policy makers and the officials who implement the policies. Through partnerships entailing sustained interaction with members of the policy system and the practitioner community, researchers come to understand local needs and circumstances, while policy makers and practitioners gain a better understanding of the process of research and their role in it. These ongoing relationships among researchers, policy makers, and practitioners can benefit science communication directly by building understanding of science and trust (discussed further below) that can thwart attempts to draw science into public controversy (Bryk et al., 2011; Cohen-Vogel et al., 2015; Tseng, 2012). The formation of partnerships—such as those that have been established for education (e.g., Coburn and Penuel, 2016), criminal justice (Sullivan et al., 2013), and health (Kothari et al., 2011)—thus can increase the usefulness of research to policy decisions. In some cases, these partnerships take the form of assessments of the state of scientific knowledge related to a key policy issue, such as climate change (Morgan et al., 2005; National Research Council, 2007). Because participants in a partnership work together regularly and come to know one another, they learn to trust each other and communicate effectively. This trust and ease of communication make it possible to design research agendas and protocols that are responsive to the needs and goals of all parties, and make the communication of research findings a natural and ongoing element of working life for each group. As a result, both “good news” and “bad news” from the research may be accepted, built upon, and used. Most existing information about such partnerships is descriptive, however, and research is needed on the conditions for success in communicating science through diverse types of partnerships.
An exception is the health sciences, in which a significant body of research has developed in the area of dissemination and implementation science, with the aim of devising strategies to facilitate the effective incorporation of evidence-based practices (e.g., behavioral interventions, diagnostic treatments) into public health, clinical practice, and community settings (Brownson et al., 2012). The dissemination component of the research has focused on diffusing information about evidence-based interventions. The implementation research has proceeded with the recognition that a one-way flow of information in the form of a recommendation or guideline is insufficient to ensure use of an evidence-based practice. Thus the research has included a focus on partnerships between researchers and practitioners, on community-based participatory research, and on the engagement of multiple stakeholders and users of an intervention throughout the development and implementation process. Other scientific disciplines face similar challenges in communicating science but use diverse methods and language for studying the problem. In fields such as health science in which evidence is accruing, there is a need for comprehensive reviews of the factors that affect the communication of science so those findings can be applied and shared with other fields that communicate science for similar reasons in similar contexts.
The types and numbers of people and organizations that are communicating about science are increasing, spurred in part by the growth of online communication and accompanied by a decline in the coverage of science in mainstream media. These communicators include scientists themselves, health care providers, government sources, media, industry, a range of organizations, and individual citizens. Many interpersonal factors affect communication between people. For science, however, whose product is evidence-based information, being trusted as a source of valid and useful information is particularly critical. The following sections describe the factors that affect trust in science communication and the types of outcomes of that communication that are affected by trust.
In science communication, the audience decides whether communicators as sources of information, or the institutions they represent, are trustworthy and credible. People use these assessments to decide what information to pay attention to and often, what to think about that information. While there is no widely accepted definition of trust, many conceptualizations include variations of three elements: integrity (those who can be trusted are fair and just), dependability (they can be relied upon to do what they say they will do), and competence (the ability to do what they say they will do) (Hon and Grunig, 1999; Rahn and Transue, 1998; Roduta Roberts et al., 2011). Others describe trust as a relationship in which both parties are willing to take risks and accept uncertainty about what their future interactions will be like, and still others as a willingness to be vulnerable (Colquitt and Rodell, 2011; Earle, 2010; Henry and Dietz, 2011). Some see trust as a decision to give someone the benefit of the doubt (based on an assessment of their future intentions) (Earle, 2010). And some conceptualizations of trust emphasize deference to or reliance on decision makers or others in the scientific community to know and do what is best (Brossard and Nisbet, 2007; Siegrist et al., 2000). No organizations, institutions, or experts are universally trusted on all issues. People often make judgments about trustworthiness in relation to the information being conveyed (Chryssochoidis et al., 2009; Lang and Hallman, 2005), and they will not necessarily deem a particular source trustworthy and competent across topics.
People generally rely on two kinds of social information to determine whom they will believe and find credible on scientific issues: (1) common interests, defined as the perception that the communicator and listener desire the same outcome from the communication, and (2) perceived expertise (Lupia, 2013; Siegrist et al., 2000; Suhay and Druckman, 2015). Because actual expertise, such as scientific credentials, matters less than perceived expertise (Lupia, 2013), scientists do not automatically have credibility with many audiences. Credibility is attributed by the listener, not a fixed characteristic of the speaker. As with trust, it often includes a dimension focused on the perceived honesty and openness of the communicator (Renn and Levine, 1991). Audiences also may be more likely to find a source credible if they believe they can learn from that source (Lupia, 2013).
In general, the public has had consistently high and stable levels of trust in scientists (Gauchat, 2011, 2012; National Science Board, 2016a; Pew Research Center, 2015a). For information about GMOs, for example, scientists at universities, universities themselves, and medical professionals are seen as relatively trustworthy sources of information, whereas industry sources are seen as the least trustworthy (Corley et al., 2012; Ipsos MORI Social Research Institute, 2014; Lang and Hallman, 2005; Nisbet and Markowitz, 2016). However, the kinds of scientific sources an individual trusts also vary with that person’s political ideology (McCright et al., 2013b). With respect to the public’s confidence in scientific leaders and institutions, the General Social Survey indicates that 41 percent of the public have a great deal of confidence and 49 percent have some confidence in scientific leaders, while confidence in other leaders and institutions has declined precipitously (Gauchat, 2011, 2012; National Science Board, 2016a). People’s confidence in scientific leaders, as measured by the General Social Survey, also appears to vary with gender, age, and race/ethnicity, being somewhat lower among women, older Americans, and nonwhites (National Science Board, 2016a).
Individual and social factors beyond political ideology, such as race or ethnicity, income, religiosity, social capital, education, and knowledge, all can affect public trust in sources of information about science and in science itself, depending on the topic and the nature of the science being conveyed (Brewer and Ley, 2013; Gauchat, 2011, 2012; McCright et al., 2013b; Sturgis and Allum, 2004; Yearley, 2005). In addition, social identity—how people think of themselves in relation to various groups that matter to them—can affect how an individual thinks and feels about science (National Academies of Sciences, Engineering, and Medicine, 2016c), especially when that person has limited knowledge about an issue (Earle, 2010). Valuing scientific authority is positively related to level of education and is also closely associated with trust in scientists and science more generally. This deference appears to be a long-term value on which people rely, especially when their knowledge of a topic is low (Brossard and Nisbet, 2007). Some research suggests that people who defer to scientific authority have higher levels of support for science and its applications, based in part on their trust in the experts (Brossard and Nisbet, 2007).
When information from science is at odds with one’s political ideology, whether conservative or liberal, not only does acceptance of the information decline, as discussed earlier, but so, too, according to one experiment, does trust (Nisbet et al., 2015). The media, particularly partisan media, can affect people’s trust in scientists, which in turn can affect their perceptions of science (Hmielowski et al., 2014; Nisbet et al., 2002). Preliminary research suggests that, at least in some circumstances, creating a less partisan atmosphere for scientists to engage with laypeople and apply logical reasoning to scientific evidence as part of drawing conclusions can help counteract motivated reasoning (Jamieson and Hardy, 2014).
A large literature shows that judgments about the warmth and competence of others are important to how people generally determine whether others have good intentions and the ability to act on them (e.g., Fiske et al., 2007). These factors are related to trust across a number of contexts, including risk communication, political communication, and employment settings (Chryssochoidis et al., 2009; Colquitt and Rodell, 2011; Fiske and Dupree, 2014; Peters et al., 1997). Another influence on trust is the audience’s beliefs about the communicator’s motives (Chryssochoidis et al., 2009; Lang and Hallman, 2005; Rabinovich and Morton, 2012). The perception of financial motives or self-interest, for example, is associated with lower trust in sources (Lang and Hallman, 2005). Other research has found that people resist what they perceive as an effort to persuade them (Byrne and Hart, 2009; Jacks and Devine, 2000).
When the source of information is an organization, the qualities of both the communicator and the organization affect trust. Qualities with a positive impact on trust include being perceived as having a good history of providing accurate information; being truthful; being concerned with public welfare; showing openness, concern, and care; being perceived as less likely to pursue self-interested motives; and being seen as knowledgeable (Chryssochoidis et al., 2009; National Research Council, 2012; Rabinovich and Morton, 2012). Individuals and organizations also may be more trusted when people believe the benefits of their research will be accessible to the general public (Critchley, 2008). Although the evidence that transparency increases trust is limited, a lack of transparency can increase distrust, as can omitting information about uncertainty from discussions of risk (Frewer et al., 2002; National Research Council, 2012). Perceptions of trust and different dimensions of fairness (e.g., the fairness of a process or its outcomes) appear to be closely related, but which perception precedes the other is unclear (Earle and Siegrist, 2008; McComas and Besley, 2011; McComas et al., 2007).
Some issues within the scientific community itself appear to have the potential to affect trust in science and scientists, although these effects have not been studied in great detail. Some of these issues stem from the complexities of translating the ways in which science is conducted; others from how scientists and others outside the scientific community, such as journalists, communicate findings; and still others from conflicts of interest, the deliberate misrepresentation of science, or failures to replicate findings (Ioannidis, 2005; Mnookin, 2012; National Science Foundation, 2015; Open Science Collaboration, 2015). The extent to which these issues actually influence the perceptions of general lay audiences remains unclear.
As noted earlier, trust and credibility are important to science communication because they affect the degree of attention people pay to guidance from scientific experts, as well as whether they believe scientific findings or support science-related decisions (Bleich et al., 2007; Brossard and Nisbet, 2007; McCright et al., 2016; Rabinovich and Morton, 2012). Trust also affects attitudes toward science more generally. Differences in willingness to act on scientific information, for example, have been shown to be mediated by trust (Rabinovich and Morton, 2012). However, it is unclear whether attitudes toward an issue affect trust in the source of information on the issue or vice versa (National Research Council, 2012; Roduta Roberts et al., 2011). For an industry associated with science, preexisting levels of trust affect how informative and how positive news about that industry is perceived to be (Cvetkovich et al., 2002).
If people trust an institution to manage a communicated risk, they perceive the risk as smaller or the potential benefits as larger (Chryssochoidis et al., 2009). For example, trust in institutions has been shown to be a strong influence on support for nuclear power (Besley, 2010; Besley and McComas, 2015; Visschers and Siegrist, 2012; Whitfield et al., 2009).
Trusted friends and family may be important sources of motivation to take action on a science-related issue. For example, members of the public indicate that someone close to them, such as a spouse, child, or friend, would be the person most likely to convince them to take action to reduce climate change (Leiserowitz et al., 2013).
Science communication appears to be affected more by the spread of distrust of science than by the building of trust (Slovic, 1999). Distrust is an assessment that someone lacks credibility and has a malevolent willingness to lie or deceive (Hon and Grunig, 1999). Distrust can arise from issues within the scientific community discussed earlier with respect to either the conduct or the communication of science, or it can be fostered by actions outside of the scientific community, such as misreporting or the deliberate spread of misinformation.
Distrust may be greatest in those science policy contexts in which individuals feel their values, identity, or interests are threatened, regardless of their ideology (Braman et al., 2012; Kahan, 2012; McCright et al., 2013b; Nisbet, 2005, 2011). Overall, across issues and disciplines, trust as an appraisal of a communicator’s motives, future intentions, integrity, and dependability, as well as confidence in the communicator’s abilities, has proved to be important to people’s perceptions of science and the way they interpret scientific information. In addition, aspects of the environment, including mass media, interact with trust in important ways.
However, research on the role of trust, credibility, and related elements in science communication has an important limitation that needs to be addressed: these concepts have been conceptualized and measured in a number of different ways. Moreover, an assumption of the literature related to trust is that people associate a message with a source and come to believe the message based on trust in that source. Yet some research shows that over time, people remember the content of the information they receive but fail to remember the source (e.g., Mares, 1996). And if people are exposed repeatedly to the same information, even from a fictional source, they may come to believe it is true (Shrum, 2007). Research is needed to understand the factors that influence people’s trust in science and scientific information. As part of this research, it will be important to clarify the dimensions of trust and credibility and how they relate to one another. How does trust vary depending on the communicator, the goals for the communication (e.g., informational versus persuasive), and the communication context? How does trust change as science about an issue is communicated over time? What factors affect the trust and credibility of scientists and scientific understandings in science-related controversies? In such cases, for example, what are the effects on trust of science communicators being open about their own values and preferences?
Perhaps some of the most pertinent wisdom about effective science communication comes from experience with large-scale information campaigns, such as those for public health, undertaken to inform the public, shape opinions, and motivate behavior change. The challenges and successes of such campaigns offer a number of lessons, described below.
Too little attention often is paid to providing sufficient exposure to information to reach enough of the target audiences to effect change. An exposure strategy involves defining how often, through what methods, and over what period of time a message should be disseminated and who the intended audiences are (Hornik, 2002). The plans for achieving the desired exposure will then constrain the shaping of messages, since the messages will need to suit the communication methods that the communicators can afford and control. To be effective, some types of messages—such as those addressing a problem that is not yet widely recognized, or counterarguments to messages already received by the public—may require more exposure than others.
Research suggests that communication intended to educate may have more impact if provided before people form strong opinions about the topic. For example, people who have not yet heard of a technology, such as carbon capture and storage, can be influenced by short, relatively uninformative emotional messages. Once people’s views have been shaped, however, it can be difficult to change those views by providing scientific information (Bruine de Bruin and Wong-Parodi, 2014; Lewandowsky et al., 2012). Observed “inoculation effects” in other areas of communication suggest that early communication about science, including equipping people with counterarguments that expose flaws in misinformation, also may “inoculate” the public from the spread of misinformation by those with a stake in misrepresenting the science (Cook, 2016). There is also support for encouraging healthy skepticism of sources of information as a way to reduce people’s susceptibility to misinformation before they encounter it (Lewandowsky et al., 2012). Moreover, research on formal efforts to engage with the public suggests that, when possible, engagement should start long before an issue emerges in public debate (National Research Council, 2008). The case of gene editing, for example, highlights the need to consider public engagement efforts as early as possible in the evolution of a technology, well before it is developed and ready to be applied (Jasanoff et al., 2015). At the same time, however, early efforts to communicate science also have the potential to create a situation in which public debate becomes polarized and to foster activist publics (see, e.g., Grunig et al., 1995). These factors can increase the complexity of the environment for communicating science, as discussed in Chapters 3 and 4. Thus, while early communication generally is better, more needs to be known about timing in light of the goals for communicating and the context.
Long-term and comprehensive approaches may be needed to achieve certain communication goals (Hornik, 2002). In the cases of smoking cessation, high blood pressure control, and prevention of HIV transmission, for example, communication efforts continued for months and even years, multiple agencies and constituencies engaged with the issue, the campaigns entailed routine media coverage, and so on. The underlying notion is that a strategy of repeated exposure to a message delivered in multiple formats by diverse actors via various platforms is effective for conveying a message of consensus to many segments of the public. Alternatives are available, such as attempting to shape news coverage (Wallack, 1993) or the content of entertainment shows (Singhal and Rogers, 2012). Adoption of these types of media strategies may achieve the needed exposure but will require ceding tight control of the message.