Addressing Health Misinformation with Health Literacy Strategies: Proceedings of a Workshop—in Brief

December 2020

On July 29, 2020, the Roundtable on Health Literacy convened a public workshop to explore the challenges resulting from the proliferation of health and medical misinformation and disinformation, particularly as they relate to the coronavirus disease 2019 (COVID-19) pandemic. The virtual workshop explored the role of fact-checking organizations (FCOs) and the technology industry in addressing misinformation and disinformation, the social psychology behind their spread, and health literacy strategies to support this ongoing multidisciplinary work. This proceedings was prepared by the rapporteur as a factual summary of what occurred at the workshop. Statements, recommendations, and opinions expressed are those of individual workshop participants and are not necessarily endorsed or verified by either the Roundtable on Health Literacy or the National Academies, and they should not be construed as reflecting any group consensus.

Lawrence Smith, chair of the Roundtable on Health Literacy; executive vice president and physician in chief at Northwell Health; and dean of the Donald and Barbara Zucker School of Medicine at Hofstra/Northwell, welcomed attendees to the virtual workshop. The workshop, he explained, would examine the rise of health misinformation and would use COVID-19 as a case study to explore health literacy strategies that may be used to mitigate such misinformation. Smith introduced the two moderators for the panel: Ruth Parker, professor of medicine, pediatrics, and public health at Emory University, and Laurie Myers, global health literacy director for Merck.
UNDERSTANDING MISINFORMATION AND DISINFORMATION

Parker opened the panel discussion with Kate Starbird, associate professor in the Department of Human-Centered Design & Engineering and director of the Emerging Capacities of Mass Participation Laboratory at the University of Washington, asking her to explain the differences between misinformation and disinformation. Starbird noted that the distinction between misinformation and disinformation is "really important when we think about strategies for addressing false information, online or elsewhere." The definitions are still in development, as is the field of disinformation studies itself, she added, but misinformation is presently understood by most as "false information that is not intentionally false," whereas disinformation is "false or misleading information spread with some kind of intent, usually political, reputational, or financial." Disinformation, she continued, is not just one piece of information: It is part of a campaign or a set of different narratives, frequently with a factual or plausible core wrapped in layers of false information or removed from its original context. "It is not as simple as saying that piece of information is true or false," she said, "but considering why it is being spread now, who is spreading it, and what is the intent for spreading it. That makes the challenge of identifying disinformation and removing malicious information a much different task than fact checking something as being true or false."

"Misinformation doesn't spread itself," Starbird said. "We spread it," with "we" being "everybody who participates in information spaces." And we, as humans, are particularly vulnerable to spreading misinformation during crisis events like pandemics because of the uncertainty of the information space and the anxieties borne out of that vacuum.

Copyright National Academy of Sciences. All rights reserved.
Parker noted that many workshop registrants had submitted questions before the workshop asking how to identify signals that information they are consuming may not be entirely true. Media literacy has often concentrated on logically examining the information in question along with its source, Starbird replied, but reactions to news or information are not always approached logically. There is a component of emotional manipulation that might compel someone to quickly share information online without considering it. That is not to say that someone should not share information because it made them emotional, she said, but individuals should reflect on the emotion they feel, why they feel that way, and what motives someone might have for inducing that emotion, ultimately slowing the process down between seeing information online and then sharing it with others.

Parker asked Starbird to share her thoughts on the evolution of science, its uncertainty, and how it relates to the current COVID-19 pandemic and misinformation. Fields like crisis informatics and the social psychology of rumor during crisis events tell us that one of the reasons misinformation spreads during crisis events has to do with uncertainty, Starbird explained. "We are not sure what actions we should take. We have a tendency to come together, try to gather information, and try to collectively make sense of it," she added. The COVID-19 pandemic is particularly uncomfortable for people because they are dealing with months of uncertainty, as opposed to the 2 or 3 days after an earthquake. "Things are changing underneath our feet. The best understandings today are very different from the best understandings from a week or a month ago. With facts changing, we have to update our mental model around things, and a lot of us aren't very good at that," she said.
Those vulnerable moments can create opportunities for people interested in spreading disinformation. Individuals can take advantage of the uncertainty in times of crisis to spread or create false narratives, frequently in service of a political objective. She added, "that definitely makes this pandemic an even more complicated situation, and perhaps at this point, we can even begin to consider how that is costing lives in the United States."

Parker asked if it would be helpful for most people using the Internet to consider approaching misinformation and disinformation differently. It is difficult, Starbird answered, because disinformation requires assessing intent, which is harder than assessing whether something is true or false. Several social media platforms have tried to develop policies to prevent the spread of disinformation, and even they struggle with that distinction, she noted. "We need better tools for understanding that, and we need better cues from social media platforms to be able to assess intent," she said, adding that "we also need better information about where the information has originated." While individuals should be perceptive and cautious participants in information spaces, she said, most do not yet have the resources to assess the difference between misinformation and disinformation. "That's a platform design problem as much as anything else."

Starbird explained that she studied rumors during crisis events in 2013 and observed the spread of disinformation.
The disinformation included conspiracy theories that "seemed to be selectively amplified for political objectives." At the time, she said, she thought it was marginal and did not think it worth pursuing further, but 2 years later, "we began to recognize that disinformation was becoming a bigger and bigger part of the picture." There was infrastructure (network structures and connections between accounts) that habitually spread disinformation, she said, and it was beginning to reshape how information moved in information spaces. "Increasingly, we have plenty of cues indicating this is a really big problem: disinformation is being repeated by political leaders, not just in the U.S., but all over the world, and we're seeing it show up in what we would consider mainstream spaces. As it moves from the margins to the center of conversations, I think we can recognize it as a significant problem."

HEALTH EQUITY AT THE SCALE OF THE INTERNET: WORKING WITH FACT-CHECKING ORGANIZATIONS AND TECHNOLOGY COMPANIES TO ADDRESS HEALTH MISINFORMATION

Parker next introduced Nat Gyenes, director of the Digital Health Lab at Meedan and research fellow at the Berkman Klein Center for Internet & Society at Harvard University. Gyenes explained that Meedan works directly with social media platforms and Internet search organizations to "strengthen information equity on the Internet." Through their research conducted at the Digital Health Lab, it has become clear to Gyenes and her team how important it is to reduce the stigma in health misinformation response work. However, she noted, it is a difficult balancing act to attempt to reduce the negative impact of health misinformation while ensuring that community members feel comfortable asking questions about health myths. The world has more Internet users than people with access to essential health services, Gyenes explained, and more than 80 percent of Internet users search for health-related information online.
At the same time, she said, health misinformation is becoming an increasingly difficult and complex issue to address, and its consequences "disproportionately affect communities of color, communities with lower socioeconomic status, and queer communities." Barriers to accessibility, language constraints, and content relevance (or lack thereof) can all exacerbate the negative effects of the proliferation of health misinformation, she noted.
What search engine audiences use and the information they are able to find depend on the Internet availability and digital resources to which they have access. For example, if an individual has access to the Internet through Facebook's "Free Basics" program,[1] which is a collaboration between Facebook and key mobile providers around the world to provide limited Internet access, then that individual can access only select information and not the full World Wide Web. Understanding these barriers to Internet access is important, she said, because there are additional barriers, such as language fluency or lower literacy, which can affect whether an Internet user's search terms match the nuanced language frequently used by public health authorities.

The current COVID-19 crisis has only reinforced the importance of effective collaboration between health authorities and actors in the technology community, Gyenes said. These collaborations can be strengthened by the involvement of FCOs, which are driving forward health communications in the digital information ecosystem. FCOs have existed since the early 2000s, and many are primarily known for their work investigating political misinformation. Gyenes explained there is one major standardizing body, the International Fact-Checking Network (IFCN), which promotes best practices and a shared code of principles for fact checkers.[2] Other organizations may apply to be vetted as verified signatories to the code, she noted. For a few reasons, she said, those FCOs and their verification by IFCN are incredibly important to promoting health literacy. One reason FCOs are so important, she added, is their ability to work in direct collaboration with tech companies in responding to misinformation online.
FCOs are the local actors who respond directly to the questions and misinformation circulating in their own countries and communities. In doing so, they improve equitable access to health information. Their methodologies, and the publication of those methodologies, are both important to improving health literacy and to how fact-checking processes take place, said Gyenes. In her opinion, the most important part of an FCO's role is that it "unpacks claims or pieces of misinformation within the community context in which they are shared." Members of communities are responding to misinformation that is affecting their communities. Because of that important expertise, she said, those responses to misinformation are "relevant" and accessible to the communities they serve.

One issue, however, is that FCOs are often limited by the information or expertise readily available to them. If they cannot get in touch with a public health expert to comment rapidly, misinformation can spread faster than it can be addressed. The important role of health communications and public health literacy experts at this point, Gyenes said, is to "contextualize the latest science in ways that are accessible to fact checkers so that fact checkers can make it accessible to their communities." She added that filling this need is the goal of Meedan's public health expert database and tool kit project to respond to COVID-19 misinformation. In early 2020, Meedan built a team of infectious disease experts, health literacy practitioners, epidemiologists, pandemic preventionists, and vaccine uptake researchers to work directly with FCOs and local newsrooms to provide on-demand contextualization for the latest scientific research related to the COVID-19 pandemic. The project team fields questions from fact checkers and can provide responses in more than seven languages, she added.

Technology platforms around the world are already working with FCOs to address misinformation on their platforms.
They do this by finding new information pathways to audiences or by using fact-checked information to inform the algorithms that determine "which content gets shared with an Internet user either first, or later, or never," Gyenes said. For example, WhatsApp is currently working to address information accessibility for different communities, she added. In collaboration with Meedan, WhatsApp works with fact checkers directly by enabling the creation of "text bots." Through such tools, FCOs can create their own text bots to provide contextualized information to audiences and to receive audiences' questions about content they want fact checked. Gyenes noted that questions from those audiences have provided fascinating insights about the discrepancies between the information that individuals need or are curious about and the information published by public health authorities online. "These insights can definitely serve as an opportunity to improve the health communications and health literacy fields," she said. Gyenes's team also uses those insights to tailor its responses to health misinformation, ensuring that information is culturally relevant and culturally sensitive, she said, adding that "we want to ensure that our content is localized and not just translated."

"Midinformation"[3] differs from misinformation and disinformation in that it characterizes a kind of information crisis that occurs when not all of the facts are available: It is informational ambiguity based on scant knowledge or emerging scientific evidence, Gyenes said. To address it, "it's important to make sure that the information that users see first when they search online is the information you want them to see for a given point in time." One example of this is Google's newer annotation tools, which label and highlight fact checks in Google Search and Google News results. Facebook has also partnered with FCOs; when FCOs identify a piece of content as false and flag it to Facebook, "Facebook can integrate this information into their own content systems to hide or reduce the ability to view a particular piece of misinformation, which significantly reduces its distribution," Gyenes continued. As FCOs have become so central to tech companies' responses to health misinformation, Gyenes said, "public health and health literacy experts have an opportunity to collaborate, acting as a resource to fact checkers, supporting their work, and advocating for their work to keep their communities informed."

[1] For more information, see https://connectivity.fb.com/free-basics (accessed September 16, 2020).
[2] For more information, see https://www.poynter.org/ifcn (accessed September 16, 2020).
[3] For more information, see https://meedan.com/blog/missing-information-not-just-misinformation-is-part-of-the-problem (accessed September 16, 2020).

HEALTH LITERACY AND THE CORRECTION OF MISINFORMATION

Myers introduced the next panelist, Briony Swire-Thompson, a senior research scientist at the Northeastern University Network Science Institute and a fellow at the Harvard University Institute for Quantitative Social Science. Health information is a unique area of the broader misinformation and disinformation issue, Swire-Thompson said, because there are often financial incentives that do not necessarily exist for other topics of misinformation. Health misinformation can have "particularly severe consequences regarding quality of life and risk of mortality," she added (Swire-Thompson and Lazer, 2020).
COVID-19 is something of a perfect storm for health misinformation, Swire-Thompson said, "not least because it takes time for science to establish what is true." Fake experts speak with certainty, she added, because "when you make information up, you don't have to couch everything in the nuance that often accompanies the truth." Also, the urgency with which scientists are publishing means that preprints and final publications may differ in their findings, and journalists, for example, may not realize "the difference between new findings and established published product." Swire-Thompson also noted that predatory journals can pose a problem because they accept publications for monetary gain and lack the traditional editorial processes that control for quality and accuracy. She also observed that some search engines, such as Google Scholar, may not always reflect whether literature has been retracted.

Addressing health misinformation is a young field, Swire-Thompson said. There is evidence to suggest that critical thinking is a skill that can be taught, she added, but gauging the efficacy of critical thinking programs can be difficult, and findings have been mixed. There is also converging evidence that older adults (65 years or older) share disinformation online seven times as frequently as adults aged 18 to 29 (Grinberg et al., 2019; Guess et al., 2019). Teaching critical thinking at universities may not directly affect the older adult population: "We have to think about where we are implementing health literacy strategies."

To correct health misinformation, there are several options supported by science, Swire-Thompson explained. One option is to provide factual alternatives. In the traditional paradigm of correcting misinformation, individuals react well to having one core piece of misinformation replaced by the correct information.
In the case of COVID-19, she noted, we often do not yet know the correct alternative to the misinformation. Another element of providing factual alternatives is that they should ideally be as simple as the original misinformation. Providing warnings that misinformation will appear is also very effective, though this can be difficult if you are not responsible for presenting the information, she noted. Repeating corrections can be effective as well, she said. There is some evidence that what we believe to be true is what we remember to be true, so repeating corrections multiple times should not be a cause for concern (Schacter and Scarry, 2001).

The "backfire effect" occurs when you present an individual with a correction and they strengthen their belief in the misconception you are hoping to rectify, Swire-Thompson said, adding that "this is not a robust empirical phenomenon. There have been widespread failures to replicate, researchers have been unable to elicit it under theoretically favorable conditions, and in some cases, the evidence is not very strong to begin with" (Swire-Thompson et al., 2020).

The backfire effect is often confused with the illusory truth effect, which occurs when people believe incorrect information after repeated exposure to it. However, Swire-Thompson said, as soon as you pair the misinformation component with the correction, belief does not increase. In fact, she added, "if you don't repeat the original misinformation, people often don't even know what you are trying to correct. It is very important to clearly and saliently pair the correction with the original misinformation." To close her talk, Swire-Thompson observed, "When people read clear corrective evidence, they are incredibly good at updating their beliefs."

COMBATING MISINFORMATION DURING THE COVID-19 PANDEMIC: HEALTH LITERACY CONSIDERATIONS

Myers introduced the fourth panelist, Wen-Ying Sylvia Chou, program director of the Health Communication and Informatics Research Branch at the National Cancer Institute (NCI) at the National Institutes of Health (NIH).
Chou explained that the rampant spread of information, especially in the online ecosystem, complicates how health literacy strategies can be developed and deployed. When considering these strategies, it is also important to consider intent, she said: Is it to sow division, to gain profit, to create chaos? A piece of COVID-19-related misinformation that convinces people to drink certain juices is different from misinformation that convinces people not to wear masks or practice social distancing. Also, she said, the medium, and its control over which information is shared and in what way, is an important component.

A growing body of evidence shows that divisive disinformation campaigns erode consensus, or the sense that there is consensus in the scientific literature, and erode trust in experts, Chou explained. Echo chambers perpetuate these divisions, and falsehoods tend to spread more easily and faster. Credible information is often complex, nuanced, evolving, and uncertain, she continued: "These are important things in communication; anyone who has done work in health communication can attest to the importance of source, format, and health literacy of the community or the audience we're communicating with." In addition, she said, industry and government policies and practices toward misinformation and content moderation are rapidly evolving.

Chou developed a working taxonomy identifying six major COVID-19 misinformation topics, with examples of each (see Box 1). Planning for the successful uptake of a yet-to-be-developed COVID-19 vaccine should include traditional and newer health literacy approaches, Chou said.
Traditional health literacy approaches, she continued, would include proactively promoting vaccine literacy, with

• interventions including targeted media campaigns;
• tailored peer-to-peer, school-based, or community-based vaccine education; and
• provider–patient communication.

One suggestion from the digital literacy literature is to have strong, consistent messaging. "We also need to think about strategies that are already being deployed by anti-vaccine groups," Chou added. Those strategies, which cannot be addressed with fact checking alone, include

BOX 1
A WORKING TAXONOMY IDENTIFYING MAJOR COVID-19 MISINFORMATION TOPICS

1. Disease characteristics
   • Denial of pandemic (e.g., "The virus's transmission rate and severity are all overblown.")
   • Downplaying susceptibility and disease severity
   • Unsubstantiated symptoms
2. Origins and spread of virus
   • Conspiracy theories (e.g., "The virus was invented by Bill Gates," "The virus was created in a Chinese lab," or "The virus is caused by 5G signals.")
   • Cultural practices (e.g., "The virus spread from bat soup.")
3. Federal, state, and local government and organization responses
   • Opposing quarantine and stay-at-home policies
   • Misinformation about public health professionals
4. Individuals' prevention behaviors
   • Questioning social distancing guidelines
   • Opposing mask wearing (or casting doubt on its effectiveness)
5. Unproven treatments
   • Home remedies
   • Unproven drugs (e.g., hydroxychloroquine)
   • Dangerous advice or products (e.g., suggestion to consume bleach)
6. Vaccine attitudes
   • Skepticism toward vaccine and its developers
   • Concerns over safety
   • Doubting efficacy

SOURCE: As presented by Sylvia Chou at the workshop Addressing Health Misinformation Through Health Literacy Practices on July 29, 2020.
• Propagating rhetoric related to personal freedom and against government mandates
• Discrediting agents involved in vaccine development
• Targeting already mobilized groups and emotional topics

There are some newer strategies that may be effective, Chou said. While there is not a lot of established literature on the efficacy of these strategies, she said, "I think these are worthy targets." Some novel communication strategies may include

• Inducing skepticism toward disinformation agents (similar to the discrediting of tobacco marketing)
• Developing tools to help identify and access credible information sources and resources for debunking myths and misinformation
• Cultivating science literacy: understanding the uncertain and evolving nature of science
• Combating conspiracy theories by partnering with former members and trusted influencers
• Mobilizing the public health majority to counter online misinformation
• Proactively monitoring, flagging, downranking, and removing content or accounts that promote misinformation, and reconfiguring platform features that amplify misinformation (e.g., Twitter's handling of QAnon and Facebook's and Google's removal of misinformation videos)

These efforts can help address the cognitive, emotional, social, and contextual factors of misinformation spread, she said. Chou described a study in development at NCI, a randomized trial examining the use of storytelling and narrative-based messages to promote recommended COVID-19-related behaviors. The study would assess attitudes, beliefs, and behaviors at baseline and provide people with congruent messages, one in a personal experience narrative format and one in a non-narrative didactic format, to see which is more effective at changing attitudes and behaviors.
Similar research endeavors, Chou added, could help the health literacy and health communication fields better understand how to address health misinformation.

"The priority is to put health literacy in context," she said. "It does not exist in a vacuum, and it's not just about providing good information or filling in the gap where there is a lack of good information. We need to consider the role of technology, identity, values, biases, and emotions, and learn from examples of successful or effective communication." Chou noted that there are three areas in which health literacy approaches or interventions could be conducted in new ways (Chou et al., 2020; Peterson et al., 2020). The first, she said, is digital literacy. "It's not just a matter of discerning a piece of health information. It's about fostering fact-checking skills and an awareness of algorithms or techniques used to make you want to click on something or share a meme that gets you really excited or angry." The second is redefining what is meant by vulnerability. Traditionally, many of us think about limited health literacy in terms of limited English proficiency or limited education, she said, but vulnerability has taken on a new meaning: It can include those who operate in online information silos or who have conspiratorial mindsets. Health literacy interventions need to penetrate those silos, she said. Last, Chou added, any health literacy effort needs to consider the role of trust: How can we foster trust and restore trust as part of any health literacy initiative?

DISCUSSION

Myers observed that each speaker was optimistic about the health literacy community's ability to address health misinformation, having identified a variety of promising tactics and resources.
One theme in questions from audience members was the important role of the many players in the health system, she noted, including health information technology professionals, the technology industry, fact checkers, journalists, and clinicians. All are important in addressing health misinformation. She invited the panelists to reflect on the discussion.

Starbird agreed that health misinformation needs to be addressed from multiple perspectives, including health literacy, education, and social media platforms. Social media platforms have been making changes, but they need to continue to do so, she said, adding that there is also a role for government and policy, though it will be complicated. She noted that she appreciated Swire-Thompson's research suggesting that the backfire effect should not be a consideration when addressing misinformation. "For years, we had been telling people not to correct other people online, telling journalists not to talk about it because it would amplify [the misinformation]. But we gave the wrong advice; it began to fester at the edges and move into the conversation." But, she added, we have started to develop and spread new norms around corrections with empathy.

Another element of disinformation campaigns that is difficult to address, Starbird said, is their intersection with authentic activism, in which politically motivated groups are targeted to become vectors of misinformation and disinformation around health and COVID-19. It is hard to address, she said, but it is important to find a way to help activist communities protect themselves from infiltration by people who have other motives. It is about education, but it is also about platform design and platform policies, she added. "I think we need to work together holistically across the different sectors to address the problems."

Gyenes added that the tech community has learned a lot from the public health community about communication, intervention design, controlling for factors and populations, and understanding that populations have nuanced needs when it comes to outreach. "It's our hope," she said, "that in coming to this discussion from the public health, technology, and psychology sectors, we can work to create more interdisciplinary solutions."

Swire-Thompson echoed Starbird's and Gyenes's observations. The science is affecting the policy and vice versa, she said. Researchers have a responsibility to develop replicable, evidence-based recommendations, and social media platforms have a responsibility to make changes that make sense for a factual, evidence-based world.

Chou added: "We have seen more than 150,000 deaths already. From the perspective of a public health practitioner, and someone who cares about communication and health literacy," she said, "we need to take novel approaches and we need to try something different.
The traditional health literacy approaches have worked for certain things, but they are not working in the information ecosystem." We can't remain naïve, she said. "Health misinformation is obviously not a fringe topic, and we need to work together." The people who study misinformation and disinformation need to be at the table when we are designing public health campaigns and messaging so that we avoid the problem of inaccessible information, she said.

Parker thanked the panelists, noting that she was inspired by their thinking, research, work, and collaborative spirit. "I'm hearing some new horizons for health literacy," she added. "Fact checking is probably a new horizon for health literacy: the role of having people who understand health and public health, and the various entities that are a part of it." She noted her appreciation for Chou's comment that "health literacy does not live in a vacuum," as well as the emphasis on the importance of trust. Concluding the workshop, Parker observed that building trust is "truly foundational to our individual and collective lives."

REFERENCES

Chou, W. S., A. Gaysynsky, and J. N. Cappella. 2020. Where we go from here: Health misinformation on social media. American Journal of Public Health 110:S273–S275. https://doi.org/10.2105/AJPH.2020.305905.

Grinberg, N., K. Joseph, L. Friedland, B. Swire-Thompson, and D. Lazer. 2019. Fake news on Twitter during the 2016 U.S. presidential election. Science 363(6425):374–378. https://doi.org/10.1126/science.aau2706.

Guess, A., J. Nagler, and J. Tucker. 2019. Less than you think: Prevalence and predictors of fake news dissemination on Facebook. Science Advances 5(1):eaau4586. https://doi.org/10.1126/sciadv.aau4586.

Peterson, E. B., W. S. Chou, C. Rising, and A. Gaysynsky. 2020.
The role and impact of health literacy on peer-to-peer health communication. Studies in Health Technology and Informatics 269:497–510. https://doi.org/10.3233/SHTI200058.

Schacter, D. L., and E. Scarry (eds.). 2001. Memory, brain, and belief (Vol. 2). Cambridge, MA: Harvard University Press.

Swire-Thompson, B., and D. Lazer. 2020. Public health and online misinformation: Challenges and recommendations. Annual Review of Public Health 41(1):433–451. https://doi.org/10.1146/annurev-publhealth-040119-094127.

Swire-Thompson, B., J. DeGutis, and D. Lazer. 2020. Searching for the backfire effect: Measurement and design considerations. Journal of Applied Research in Memory and Cognition 9(3):286–299. https://doi.org/10.1016/j.jarmac.2020.06.006.
DISCLAIMER: This Proceedings of a Workshop—in Brief was prepared by Alexis Wojtowicz as a factual summary of what occurred at the workshop. The statements made are those of the rapporteur or individual workshop participants and do not necessarily represent the views of all workshop participants; the planning committee; or the National Academies of Sciences, Engineering, and Medicine.

*The National Academies of Sciences, Engineering, and Medicine's planning committees are solely responsible for organizing the workshop, identifying topics, and choosing speakers. The responsibility for the published Proceedings of a Workshop—in Brief rests with the institution.

The members of the planning committee were Laura Bartlett, National Library of Medicine; Jennifer Dillaha, Arkansas Department of Health; Ellen Markman, Stanford University; Michael M. McKee, University of Michigan School of Medicine; Laurie Myers, Merck Sharp & Dohme Corp.; and Ruth Parker, Emory University School of Medicine.

REVIEWERS: To ensure that it meets institutional standards for quality and objectivity, this Proceedings of a Workshop—in Brief was reviewed by Christopher R. Trudeau, University of Arkansas at Little Rock William H. Bowen School of Law and University of Arkansas for Medical Sciences Translational Research Institute, and Amanda J. Wilson, National Library of Medicine, National Institutes of Health. Lauren Shern, National Academies of Sciences, Engineering, and Medicine, served as the review coordinator.
STAFF: Rose Marie Martinez and Alexis Wojtowicz, Board on Population Health and Public Health Practice, Health and Medicine Division, National Academies of Sciences, Engineering, and Medicine

SPONSORS: This workshop was partially supported by AbbVie Inc.; California Dental Association; Eli Lilly and Co.; Health Literacy Media; Health Literacy Partners; Health Resources and Services Administration; Merck Sharp & Dohme Corp.; National Library of Medicine; Northwell Health; and Pfizer Inc.

For additional information regarding the workshop, visit www.nationalacademies.org/HealthLiteracyRT

Suggested citation: National Academies of Sciences, Engineering, and Medicine. 2020. Addressing health misinformation with health literacy strategies: Proceedings of a workshop—in brief. Washington, DC: The National Academies Press. https://doi.org/10.17226/26021.

Health and Medicine Division

Copyright 2020 by the National Academy of Sciences. All rights reserved.