“I do not see a definition in the literature that says trust is about thinking like me.”
“Trust in science means different things to different people.”—Tim Caulfield
There is no universally accepted definition of trust. Available definitions are complex and vary by academic discipline and context. Workshop participants described and discussed several definitions of trust, what is known about the components of trust and distrust, and the history of trust in science.
What is the meaning of trust? “Trust is complex,” according to Tim Caulfield of the University of Alberta. There is a large literature base on trust in different disciplines, and the definitions are “somewhat abstract,” said Cary Funk of the Pew Research Center. Caulfield and Funk highlighted several of the academic definitions of trust (Box 2-1). Key aspects of the definitions include perceived benefits and risks, uncertainty, credibility, and vulnerability. Funk pointed out that although “the credibility of a messenger delivering a message” is an important component of trust, such a narrow definition does not capture “the two-way dialogue concept behind public engagement.” Caulfield particularly embraced a definition developed by science blogger Liz Neeley, which combines multiple academic definitions of trust as: “your willingness to embrace the advice of a group of strangers because you believe they (a) know the truth, (b) will tell you the truth as they know it, and (c) have your best interest at heart,” all of which depend on “(d) who you are, (e) who they are, and (f) what you’re talking about.”4
4 Neeley, L., “What the Science Tells Us About ‘Trust in Science,’” COMPASSblogs, August 12, 2013, http://compassblogs.org/blog/2013/08/12/trust-in-science/.
- A “standing decision” in which someone is given the benefit of the doubt.a
- A relationship among people that facilitates ongoing interactions involving risk-taking and uncertainty about future interactions.b
- A focus on the source of a message, where we believe the source provides credible information.c
- The public’s willingness to be vulnerable to the actions of the designers, creators, and operators of science on the expectation that they will behave in a way beneficial to the public.d
a. Rahn, W.M., and J.E. Transue. “Social trust and value change: The decline of social capital in American youth, 1976-1995.” Political Psychology 19, no. 3 (September 1998): 545-565.
b. Resnik, D.B. “Scientific research and the public trust.” Science and Engineering Ethics 17, no. 3 (September 2011): 399-409.
c. Petty, R.E., and J.T. Cacioppo. Communication and Persuasion: Central and Peripheral Routes to Attitude Change. New York: Springer-Verlag, 1986.
d. Roberts, M.R., G. Reid, M. Schroeder, and S.P. Norris. “Causal or spurious? The relationship of knowledge and attitudes to trust in science and technology.” Public Understanding of Science 22, no. 5 (July 2013): 624-641.
Funk emphasized the need to think about trust in terms of actors. She explained that individuals, groups, and institutions are involved in the scientific enterprise, and all these actors have roles in the trust landscape. Thus, a framework for thinking about trust is to consider it “a set of expectations among this broad amorphous collection of people, groups, and institutions,” Funk said. James Grunig of the University of Maryland emphasized that there is not a single, homogenous “public,” but many “publics.” He identified three ways in which publics interact with science information:
- Active publics seek information and enter into a relationship around an issue.
- Passive publics have a low level of involvement; they are not affected by, and see no connection to, a particular scientific problem.
- Hot-issue publics arise in response to intense media attention on specific issues such as vaccines.
Different publics behave in distinct ways in response to different circumstances, Grunig added. Caulfield described the results of a survey conducted in New Zealand that
mapped out public attitudes that influence trust in science.5 The study identified six population segments with distinct feelings about science: confident believers, educated cynics, concerned supporters, uninformed individualists, confused and suspicious people, and those feeling left behind. “For all of these different groups, trust plays out very differently,” Caulfield said.
Under normal circumstances and when the relationship between us and an actor is good, we assume that the motives of the scientist or scientific institution are good, Funk said. It is easy to trust because we expect favorable future interactions, she explained. However, when the relationship is bad and there is an expectation of unfavorable future interactions, trust can diminish or be eliminated. In this way, Funk emphasized the implicit connection between trust and risk.
On the basis of his decades of research on public relations, Grunig defined trust as the “willingness to open oneself to risk by engaging in a relationship with another party.” He explained that trust has three dimensions: integrity, the belief that a person or organization is fair and just; dependability, the belief that a person or organization will do what they say; and confidence, the belief that a person or organization has the ability to do what they say they will do.6 However, Grunig emphasized that a person’s “openness to risk is the important element” of trust.
Rose McDermott of Brown University explained that trust has a biological component that is determined in part by a person’s oxytocin concentration. Oxytocin, a mammalian hormone released by the pituitary gland, plays a role in social bonding and such other biological functions as contraction of uterine muscles, lactation, and parental care. Social bonding can be seen as a “precursor to trust,” she said. McDermott described research by Paul Zak that demonstrates that people with higher oxytocin concentrations are “more willing to bear risk of all kinds in their social interactions.” However, she stressed that the biological mechanisms that underlie trust involve human universals – broader and more complex psychological processes of which oxytocin concentration is only one component (Box 2-2).
What does it mean to distrust? Is it OK to be distrustful? Grunig explained that trust and distrust are not opposites. “Distrust is not the absence of trust,” nor is trust the absence of distrust, he said. He explained that distrust has two dimensions: lack of credibility, the belief that a person or organization is not accountable, is unethical, or does not respect laws or policies; and malevolence, a person or organization’s willingness to lie to increase profits, to deceive
5 Hipkins, R., W. Stockwell, R. Bolstad, and R. Baker, Commonsense, trust and science: How patterns of beliefs and attitudes to science pose challenges for effective communication (New Zealand Council for Educational Research/ACNielsen for the Ministry of Research, Science and Technology, 2002).
6 Hon, L.C. and J.E. Grunig, Guidelines for Measuring Relationships in Public Relations (Institute for Public Relations, 1999).
McDermott listed examples of human universals, cognitive processes involved in how and why people trust or distrust:
- How we receive and screen information.
- How we process information.
- How we alleviate the unknown (uncertainty).
- How we detect untruth (cheater detection).
SOURCE: McDermott Workshop Slide 4
members of the public, or to take more than is given.7 He added that because trust and distrust are not two sides of the same coin, they should be conceptualized individually in thinking about building public trust in science. Grunig shared his view that “distrust is more salient than trust.” The asymmetry of negative information—the idea that negative information has a stronger effect on public perceptions than positive information—was discussed throughout the workshop in relation to its impact on public trust in science.
“The flip side of trust is betrayal,” McDermott stated. “We talk about trust as though it’s all good, that we should trust everybody, but in fact we shouldn’t,” she noted. She said that people have built-in “cheater detection,” a psychological response to environmental cues that information or a person is not trustworthy. “Our survival has been potentiated by being able to figure out people who are trying to take advantage of us – to cheat us,” she explained.
Although there is a lack of consensus on the definition of trust, several workshop attendees noted that the relationship between science and members of the public has changed. Marcia Kean of Feinstein Kean Healthcare offered a historical perspective on the change through the lens of health care. In the past, scientists made up a small and exclusive “golden circle” that conducted its work in prestigious academic institutions, she noted. That small group, whose research was supported primarily by federal government tax dollars, was validated when its findings were published in peer-reviewed journals. She said that the mechanism through which members of the public learned about new research findings was coverage of peer-reviewed publications by highly regarded journalists who had expertise in science and medicine. Kean described that period as “a trust fabric built on authority”—when health-care research and its societal implications were “framed by a professional class in the biotechnology ecosystem.” She said that the system worked well for 30 years as “an implicit social contract” in which the public played a passive role. The public supported science through taxes and trusted the government and other institutions, such as science-advocacy organizations, to regulate science as they saw fit.
7 Hon, L.C. and J.E. Grunig, Guidelines for Measuring Relationships in Public Relations (Institute for Public Relations, 1999).
In recent years, Kean said, that authoritative trust fabric has deteriorated. She argued that a new social contract is needed between science and the public that reflects the landscape in which science is conducted today and the new ways in which the public intersects with and consumes scientific findings. In the 21st century, there is an increasing convergence of science disciplines—such as combinations of biology, engineering, and digital technologies—to address societal challenges, she noted. Alongside the changing approach to scientific research, the public is exchanging its historically passive role for one that is more active and interactive. Kean warned that the “elite ecosystem” is not prepared for these changes. The historical reliance on funding from federal agencies and charitable organizations is now supplemented by other strategies, and the professional class of scientists that has been leading the charge in science for the last few decades has little experience with the new tactics. Crowdfunding and other innovative models of investment offer the public opportunities to be more active in deciding what science is funded and how it is conducted. The new approaches contrast with the historical model in which decisions about funding support were left to experts who were often selected by other experts in a manner that was obscure to the public.
Kean explained how changes in the availability of data have influenced the public’s expectations of its role in science. Access to individualized data through various platforms—including new research strategies that allow participants to provide and receive data in the course of research activities—has contributed to what Kean called “an increased sense of me” with regard to data availability and sharing. She emphasized that a new trust fabric of “partnership, participation, and peer groups” will need to be built. That backdrop provided a framework for considering the many definitions of trust that were discussed during the workshop.