Atul Gawande, the keynote speaker at the colloquium, said that he has played three different roles at the intersection of science and society. He is a scientist who has gathered and analyzed data in public health experiments designed to reduce deaths and improve the quality of life. He is a staff writer with The New Yorker and the author of four books on critical issues in medicine and health. He is also a surgeon at Brigham and Women’s Hospital, where he has opportunities to put into practice the ideas developed through his research and writing.
In each of these three roles, he has encountered public mistrust of science. Research findings that suggest changes in medical practice often encounter resistance from physicians and patients, he said. He has “learned not to read the comments section” of his articles, in part to avoid the charge that he is disseminating “fake news.” Patients of his have even raised doubts about the trustworthiness of his plans as a surgeon. What, he asked, accounts for this mistrust?
Part of the answer involves a misunderstanding of science. As Gawande defined it, science requires a commitment to a way of building knowledge and explaining nature through factual observation and testing. This “is not a normal way of thinking,” he said. “Much of what we do in science may be counterintuitive. A scientific explanation will not necessarily be the same as the explanation that may come from the wisdom of divinity or the explanations that come from experience or common sense. We watch the sun move across the sky. It’s common sense. It’s moving across the sky. Or people get colds in cold weather, and cold must produce colds.”
In science, intuitions are hypotheses that need to be tested. He particularly likes Edwin Hubble’s description of a scientist as someone with “a healthy skepticism, suspended judgment, and disciplined imagination.” Gawande’s interpretation of this description is that scientists have an experimental mind, not a litigious mind. Scientists are not free of opinion, but evidence contradicting their explanations can always arise. Hubble also said that “the scientist explains the world by successive approximations.” This approach to understanding has proved remarkably powerful over time.
Yet, scientists are not necessarily trusted, despite their openness to experience and evidence, Gawande pointed out. Many people believe that childhood vaccines cause autism, even though a massive evidence base indicates that they do not. People believe they are safer if they own a gun, but they are not. People believe that climate change is not happening or is unrelated to human activities, even though it is.
Once embedded in people’s minds, beliefs that are not in accord with evidence can be hard to change. This is especially the case when people do not trust scientific authorities, and trust in science has been declining among some groups over time. Distressingly, surveys conducted since the 1970s have shown that the higher a person’s educational level, the greater the average decline in trust in the scientific community, particularly among conservatives. In 1974, educated conservatives had the highest level of trust in the scientific community; today they have the lowest.
People do not dismiss scientific authority, Gawande pointed out. “As a society, the belief in the power and the fruits of science is strong.” Rather, people dismiss particular scientific authorities and point to alternative bodies of information. This makes the job of science communication all the more crucial.
A major task for science communicators is to distinguish between science and pseudoscience, Gawande pointed out. He cited five hallmarks of pseudoscience identified by writer Mark Hoofnagle. First, people pushing pseudoscience allege a conspiracy to suppress dissenting views. Second, they provide fake experts with contrary views but no credible scientific track record. Third, they cherry-pick data and papers to discredit a field. Fourth, they deploy false analogies and logical fallacies. Fifth, they set expectations for research that are impossible to meet.
Pseudoscience takes the form of science without the substance of science. For that reason, using science to rebut it typically does not work and can even backfire by strengthening the conviction of believers. “It spreads familiarity with the belief,” said Gawande. “That’s partly because of the nature of the brain. Misinformation sticks.” Debunking a commonsense view of the world can also leave a painful gap, which causes people to raise their defenses to avoid that discomfort.
Rebutting bad science may not be effective, but asserting scientific facts is, Gawande observed. Mental models based on misinformation need to be replaced with explanatory narratives based on science, he said. With vaccines, for example, focusing on vaccine myths is far less effective than focusing on the fact that giving children vaccines has proved vastly safer than not giving them vaccines. When federal policy reduced access to funds for vaccination among the poor, vaccination fell to the point that the United States had an outbreak of 55,000 measles cases and 123 deaths between 1989 and 1991. “It’s safe to give vaccines and it’s deadly not to,” said Gawande, which is the message people need to receive.
Another effective approach is to expose the tactics that are used to mislead people. Bad science has a pattern, and helping people recognize that pattern helps them identify bad science when they see it. “Having a scientific understanding of the world is about helping people understand how you judge which information to trust.”

But the task is complicated by the appearance of new data and the development of new understandings. Bisphenol A (BPA), for example, is a carbon-based compound that has been widely used in plastics. A confluence of studies has suggested that the compound may have negative effects on human health, but the issue has been disputed. How can people decide what to believe? The science surrounding these kinds of issues has become “too vast and too complex” for most people to master. Scientists themselves can become “bullheaded, enamored of pet theories, dismissive of new evidence, and heedless of their fallibility,” Gawande warned. The consensus may also be wrong, as was the case when, for years, physicians assured their patients that they were very unlikely to become addicted to painkillers by taking them after surgery.
The key in deciding what to think, said Gawande, is to consider the difference between science and pseudoscience. Does one of the sides cherry-pick data to support its view? Does one of the sides honestly grapple with evidence that runs counter to its views? Does one of the sides “assess the totality of the views versus taking the litigious position?”
The virtues of a scientific orientation lie more in the community and the body of work than in any one individual, Gawande explained. Science is a social enterprise with an intricate division of cognitive labor. “As a community endeavor, it’s been beautifully, amazingly self-correcting over time. It is not beautifully organized, however. Up close, it can be an extremely rickety-looking vehicle for arriving at truth,” he said. The peer-review process can be muddled, journal articles badly written, and scientific pronouncements pompous. “And it will not get any prettier,” he added. “That’s a contradictory idea for us to embrace: elitism in our method, but a democratization of data.” Yet, that is how science works.
The increasing number of people who question scientific conclusions is adding to the inherent messiness of the scientific process, said Gawande. His surgical practice is a good example. Science is “arguably the most powerful collective enterprise in human history,” yet many of his patients are deeply skeptical about even the most basic scientific knowledge. “In my work as a surgeon, there is fundamental skepticism on a regular basis about the most basic knowledge from ‘mainstream science,’” said Gawande (adding, however, that there is only one kind of science). Furthermore, the ones with the greatest doubts are often the most educated.
In dealing with such patients, it is a mistake, Gawande said, to think that his scientific credentials give him any special authority. “What a scientific education offers is something that’s more important: understanding what real truth seeking looks like.” In his clinic visits, Gawande seeks to engage in a dialogue that reflects the process of seeking truth. “Discussing and pursuing ideas with curiosity, inquisitiveness, openness, and discipline is the way I have to get through my clinic.” When a patient has a curable cancer but refuses to undergo surgery, he asks the patient to watch the progression of the cancer with him. “I’ve never gone through all the way to watching [a cancer] metastasize and kill somebody. Eventually, as we watch our way through that and grapple with the evidence, we have made that change.”
Science communicators need to take a similar approach, he said. They need to engage with ideas and their audiences using curiosity, inquisitiveness, openness, and discipline. As an example, Gawande cited his work on the issue of how to educate clinicians to talk to patients as they face the end of life. Role playing enables clinicians to go beyond just providing facts to patients and then asking them to decide what to do. Rather, clinicians can learn to take the role of a counselor who can elicit people’s goals.
What’s your understanding of where you are with your illness at this time? What would you like to know from me about what might be ahead for you? What are your fears for your future with your health? What are your goals if your health worsens? What are you willing to go through, and what are you not willing to go through, for the sake of more time? What’s the minimum quality of life you find acceptable?
Asking these questions makes it less frightening for both patients and clinicians to talk about uncertainties, establish goals, and then try to meet those goals.
Another useful approach is to focus on positive rather than negative outcomes. Solutions journalism, for example, stresses competence and explains why taking certain steps can lead to improvements. Such stories can introduce people to puzzles or threats in order to identify the crucial questions that need to be answered. For example, rather than writing about the threats and quandaries posed by end-of-life questions, writing about people who work in palliative care and geriatrics can emphasize the skills needed to talk through difficult choices and arrive at the best possible decisions. “People with solutions and directions for solutions for major issues in our lives, major threats to our world, that is something people can attach themselves to and believe in,” Gawande said.
All science communicators can use these kinds of techniques to learn how to communicate better, Gawande concluded, and they need to do so. “The stakes for understanding this could not be any higher than they are today, because we’re not just battling anymore what it means to be scientists. We’re battling for what it means to be citizens.”