“Never overestimate the knowledge of the people and never underestimate their wisdom.”
“Producing & protecting good independent science is one of the most important functions of government in a liberal democracy…this idea of independence is core to the issue of trust.”
Science does not exist in isolation. Public discussions of new discoveries, applications, or concerns about science take place in a sociopolitical context. During the workshop, a panel of current and former scientist policy-makers, moderated by David Goldston of the Natural Resources Defense Council, discussed the intersections of government, politics, and science, and the influence of those intersections on public trust.
As political debates about an issue become more polarized, science is increasingly used as the “ultimate trump card,” Goldston said. Politicians bring science and scientists into the policy arena in an effort to say “the science is on my side.” However, this strategy, which tries to leverage the public’s historical trust in scientists to move political agendas forward, has a serious downside: the American public, which generally has an unfavorable view of politicians, can extend its negative feelings toward science when it perceives that science is being used for political purposes, Goldston warned. That warning echoed the assertion of James Grunig of the University of Maryland that when there is societal debate, public trust often becomes a function more of political ideology than of scientific fact—a consideration that brings trust out of the scientific domain and places it in a political one. Goldston said that scientists’ involvement in such debates offers great opportunity, but also substantial risk. People try to “elevate” scientists to use their credibility in political debates, but the only way for an opposing interest to win the argument is to “knock the scientist off the pedestal.” He explained that “more than science is at stake. Everyone tries to play on trust in science, but that puts both scientists and policy makers in a trap.” Kathleen Hall Jamieson of the Annenberg Public Policy Center expanded on that idea with her observation that “if the motive of the scientist is anything other than seeking knowledge…we run the risk of saying that the scientist is just another form of advocate who is selectively using evidence in order to engage in a persuasive campaign.” Goldston advised, “We should
always be asking, ‘What is the role of science in the policy debate?’” To begin to answer that question, he described four scenarios in which science and public policy intersect (Box 4-1).
Goldston’s remarks provided the context for a panel discussion in which three scientists whose work regularly places them in the political sphere—Ann Bartuska of the US Department of Agriculture, Jo Handelsman of the White House Office of Science and Technology Policy, and former congressman Rush Holt of the American Association for the Advancement of Science—shared their personal accounts of public engagement and lessons learned about developing trust.
Bartuska described her experiences in interacting with the public about a project that involved harvesting trees for research purposes. Although the project aimed to maintain forest health by understanding forest dynamics in response to insects, diseases, and wildfires, Bartuska encountered segments of the public that believed that harvesting of trees for research was something to be avoided except perhaps for wildfire prevention. She described how, with that concern in mind, her group worked with the community to prepare for the project. Her efforts began with bringing environmental advocates together with local stakeholders who had interests in timber, wildlife, and fisheries. Joined by Forest Service scientists, these science-government-community groups walked through the scientific process together, including coming to agreement on the scientific question and evaluating the merits and liabilities of various research protocols. “We actually walked through the scientific process. We asked, what is the most important question to be addressed here? Because we all agreed that we wanted to improve the quality of the oak forest, we laid out a common understanding of the scientific question. The community not only helped us to do the analyses on the monitoring
BOX 4-1
Goldston outlined four scenarios that describe how science can intersect with public policy:
- A policy issue masquerades as a science issue, for example, whether to label genetically engineered organisms.
- A policy issue is supported by scientific consensus, for example, the relationship between ozone exposure concentrations and hospital admissions.
- A policy question is not supported by a scientific consensus, for example, the question of what, if anything, should be done to restore a forest ecosystem after a fire.
- An emerging issue that does not have either a policy position or a scientific consensus, for example, regulatory considerations associated with synthetic biology or nanotechnology research and development.
of the site but helped to inform the results and conclusions. It was a good example of how you could take a practical experiment and bring the community together to help resolve a thorny issue.” Bartuska noted that the National Environmental Policy Act clearly shows that community engagement to identify common goals “is an essential first step.” She added that the community also assisted in monitoring the research and collaborated readily with federal officials in a manner that improved the overall quality and effect of the project. Bartuska said that an ancillary benefit of such a research strategy is that it increases scientific literacy in the communities where the collaborations occur. She emphasized that starting with a common goal—regardless of how simplistic it seems—is a critical aspect of the process. When asked by Goldston whether she did something special to arrive at a common goal before implementation of the research project, Bartuska responded, “You have to go slow to go fast.”
Handelsman shared her experience in addressing the Ebola outbreak, which stimulated “mass hysteria.” From the science perspective, “we know that this is not a very contagious virus and that the probability that people coming into [the United States] from Africa and carrying the virus or infecting others was incredibly low.” Hence, dedicating substantial resources to screening and quarantine would not be a reasonable science-based policy, she said. “If Ebola had the infectivity of influenza, it would be disastrous, absolutely horrible, but it doesn’t.” However, Handelsman said, discussing with members of the public such scientific facts as the low transmissibility of the virus and its inability to travel through the air did little to assuage concerns. Handelsman does not consider the Ebola case to be a success story, although she pointed out that scientific editorials in mainstream papers constituted a useful dimension of public discussions. Goldston asked what approaches might help to separate scientific discussions about the magnitude of risk from social-values discussions about how much risk is acceptable. Handelsman indicated that responsible science and political leadership can make a difference. “The president often speaks about evidence and about science in an evidence-based way. That has entered the American consciousness,” Handelsman emphasized.
Holt drew on his experiences as one of the few congressman-scientists during the 2001 anthrax scare. He recalled the anxiety in the US Capitol after letters tainted with anthrax arrived in congressional offices in 2001. The anxiety was accompanied by a desire on the part of some of Holt’s colleagues to obtain reliable information about anthrax, and these colleagues approached Holt saying, “You’re a scientist, you must know about anthrax.” Holt said he was puzzled by that assumption because his scientific training was in physics, a field that has little to do with life sciences or medicine. But the experience demonstrated to Holt that the public wants facts and perceives scientists as the “keepers” of facts. He pointed out that how we teach in the United States has produced in many people the notions that relatively few people “know about science” and that scientific information is accessible to the public only through those few. Holt believes that those notions divide the United States into two camps—scientists and nonscientists—and that this division affects public trust in science. He challenged the audience to consider how the public can be expected to trust science when most people believe they cannot understand it. He added, however, that it is important for
scientists to leverage their position when people turn to them for information. “Why not help the public to understand the concept of risk? Why not help the public understand statistical reasoning?”
Building public trust takes time, and time is of the essence for such issues as the 2014 Ebola outbreak and the 2001 anthrax attack. Is there a way for the science community to “come out in front of the political or cultural narratives that sometimes arise around issues like Ebola and climate change?” asked Jessica Brooks, of the Science and Technology Policy Institute. Bartuska pointed out that scientific professional societies, such as the Ecological Society of America (ESA), have established processes to provide “policy-relevant” information before a political narrative has been established around an issue. ESA has people in Washington, DC, “who are paying attention to what’s happening in Congress that deals with ecologic and environmental issues” and “a rapid-response team” of member academic and federal scientists to develop white papers “with one of the hallmarks that they can be easily understood…but have the basis of good, solid peer review behind them,” Bartuska said. “No group of people is smart enough to guess what the next emerging public misunderstanding will be,” Holt argued. He emphasized that education—“helping people to understand” not just what scientists have learned, but also how scientists conduct risk assessment and statistical analysis and “what a public interface is about”—is what enables members of the public to understand issues for themselves. David Rejeski, of the Woodrow Wilson International Center for Scholars, countered that an issue that “will be a trust train wreck” is germline modification. The technology that scientists have developed to “modify and edit the genome easily” is front-page news, he said.
Although some US scientists have called for a moratorium20 on human germline editing and Francis Collins, the director of the National Institutes of Health (NIH), has stated that NIH will not fund human germline-editing research,21 scientists in China have already conducted such research.22 “I thought that this was [a research area] that we had decided, morally, religiously, and scientifically, we wouldn’t go there,” he added. Rejeski asked what the White House, US government agencies, or science institutions like the National Academy of Sciences should do. Handelsman responded that although the US government has laws and policies that restrict research on human germline editing, not all countries do. The decision of whether to conduct this research is an ethical one that “definitely warrants an international debate.”
Jamieson talked about the fact that we all have alternative identities that we can prime or suppress as appropriate. For example, colleagues talk to each other as colleagues, not as mothers and daughters or some other identities. If we give an audience cues that we are
20Baltimore, D. et al. “A prudent path forward for genomic engineering and germline gene modification.” Science 348, no. 6230 (April 2015): 36-38.
21The NIH Director, “Statement on NIH funding of research using gene-editing technologies in human embryos,” National Institutes of Health http://www.nih.gov/about/director/04292015_statement_gene_editing_technologies.htm.
22Liang, P. et al. “CRISPR/Cas9-mediated gene editing in human tripronuclear zygotes.” Protein & Cell 6, no. 5 (May 2015): 363-373.
partisan and that we are not giving them all the data, people will treat us as partisan and question our motives, she said. As previously noted, Jamieson thinks that a scientist’s motive should be to seek and preserve knowledge, not to advocate for or engage in persuasive campaigns about public policy.
Jamieson talked about her research, which showed that it is possible to activate different schemata in an audience, depending on the voices that science uses. She worries that the science community is beginning to talk as though being “partisan” is a hard-wired trait and that members of the public are not capable of moving into a space in which they become information seekers who weigh the evidence as best they can. However, she said, the partisan switch can be turned off if a speaker does not come across as partisan. A person who receives a cancer diagnosis does not consult Rachel Maddow or Rush Limbaugh; the person finds someone who has expertise in medicine and asks, “Doctor, if I was your wife or daughter, what would you ask me to do?” Jamieson said that we ask people to perform a surrogacy function for us. That is why she thinks it is so important for scientists to include a dimension of values and caring, because it is intrinsic to all human relationships.