by Anna Harvey, SSRC President
The World Health Organization defines an infodemic as the spread of “false or misleading information in digital and physical environments during a disease outbreak. It causes confusion and risk-taking behaviors that can harm health. It also leads to mistrust in health authorities and undermines the public health response.” The US Office of the Surgeon General has declared health misinformation to be a significant public health challenge. In an October 17, 2021, interview on MSNBC, outgoing National Institutes of Health Director Francis Collins expressed regret over the US response to COVID-19 misinformation: “I think we underestimated the vaccine hesitancy issue…. I wish we had somehow seen that coming and tried to come up with some kind of a ‘Myth Buster’ approach to try to block all of the misinformation and disinformation that’s gotten out there, all tangled up with politics, and which is costing lives.”
Yet, despite widespread concern about the potential impacts of mis- and disinformation on health outcomes, we know little about the magnitudes of those impacts or about their differential effects across sociodemographic groups. We also know little about cost-effective interventions that may mitigate those impacts and increase the spread and uptake of accurate health information. To respond more effectively and equitably to the new challenges posed by infodemics, we need investment in research that provides policy-relevant evidence about the effects of mis- and disinformation on health outcomes, and about the efficacy of interventions that counter those effects and increase the spread and uptake of accurate health information.
The Mercury Project, a three-year global research consortium recently launched by the Social Science Research Council with funding support from the Rockefeller Foundation, the Robert Wood Johnson Foundation, and Craig Newmark Philanthropies, will build the evidence base for an infodemic response informed by behavioral science. The Mercury Project will fund research projects that estimate the causal impacts of mis- and disinformation on online and offline health, economic, and social outcomes in the context of the COVID-19 pandemic, including their differential impacts across sociodemographic groups and their global costs. It will also fund projects that estimate the causal impacts of online or offline interventions in the United States, Africa, Asia, and Latin America to increase uptake of COVID-19 vaccines and other recommended public health measures by countering mis- and disinformation. (In this context, “online” refers to social media and search platforms, while “offline” refers to other media such as email, mail, and SMS messages.) The Mercury Project will also provide a suite of research-sharing and policy-development activities for grantees and other invited organizations to enable more effective policy and regulatory responses to future infodemics.
Online Information, Misinformation, And Behavior
There is mixed evidence on whether inaccurate and misleading health information circulating on social media platforms reduces COVID-19 vaccination uptake. For example, in a recent survey experiment conducted on samples drawn from the United States and the United Kingdom, Sahil Loomba and colleagues found that exposure to examples of online COVID-19-related health misinformation reduced subjects’ vaccination intentions. However, Santosh Vijaykumar and colleagues found in a United Kingdom-based WhatsApp experiment that exposure to COVID-19-related misinformation had no effect on subjects’ belief in the misinformation, and found in a similar Brazil-based experiment that such exposure actually reduced subjects’ belief in the misinformation. To develop effective responses to infodemics, we need more causal evidence of the impacts of online health misinformation on outcomes.
There is also some concern that the engagement strategies of the social media platforms may incentivize the sharing of misinformation online, despite users’ stated preference to share only accurate information. For example, there is some evidence that social media platforms may accelerate the politicization and polarization of beliefs. Ro’ee Levy showed in a sample of US-based Facebook users that randomized exposure to counter-attitudinal news outlets on the platform led to more positive attitudes toward the opposing party, but also that Facebook pushed more posts to users randomized to follow news outlets consistent with their baseline political attitudes than to users randomized to follow news outlets divergent from those attitudes. Similar empirical strategies may be able to reveal the effects of social media platforms’ engagement strategies on exposure to health information and misinformation, and on downstream online and offline outcomes, for platform users around the globe.
Some interventions may have the potential to reduce the impact of online health-related misinformation on outcomes in a variety of geographical contexts. Levy showed that a significant number of US-based Facebook users randomized to an encouragement to follow counter-attitudinal news outlets did in fact follow those outlets, with positive downstream depolarizing effects, suggesting that such encouragement interventions may counteract the platform’s polarizing effects. Interventions that encourage reductions in social media usage may also counteract these effects. For example, Hunt Allcott and colleagues showed that US-based users who were randomized to temporarily deactivate their Facebook accounts had less polarized policy opinions after deactivation.
Interventions that remind social media users of the value of accurate information may also counteract the effects of platform engagement strategies. For example, Gordon Pennycook and colleagues showed in a US-based survey experiment that an intervention focusing subjects’ attention on accuracy significantly reduced intentions to share inaccurate COVID-19-related information, relative to accurate COVID-19-related information. In a field experiment conducted on a sample of US-based Twitter users, Pennycook and colleagues reported that sending users an accuracy nudge significantly reduced the extent to which they shared inaccurate information on the platform, an important demonstration of the intervention’s efficacy in a real-world context. Jon Roozenbeek and Sander van der Linden reported that a UK-based online intervention engaging subjects in the creation of misinformation significantly reduced the perceived reliability of tweets sharing misinformation, suggesting that the intervention may “inoculate” users against misinformation.
Finally, there is also some evidence that increasing individuals’ exposure to reliable online health information may increase the uptake of that information. Emily Breza and colleagues found that US-based Facebook users randomized to view ads containing video messages from health professionals about the dangers of travel during the 2020 holiday season reduced their distance traveled, and that COVID-19 infection rates fell in intervention counties and ZIP codes. Alex Vernon Moehring and colleagues found in a large international Facebook survey experiment that increasing exposure to accurate information about growing COVID-19 vaccine acceptance increased individuals’ own vaccine acceptance.
The success of these initial interventions is promising and suggests a direction in which to further develop the evidence base for infodemic response.
Offline Information, Misinformation, And Behavior
We have little evidence of the causal impacts of offline health-related mis- and disinformation on outcomes. However, several studies show that increasing individuals’ exposure to reliable offline health information may increase the uptake of that information. For example, several studies have shown that interventions delivering reliable vaccine information via mailed letters, email messages, SMS messages, or a mobile health app can increase flu vaccination uptake. However, studies on flu vaccine uptake may not generalize to the more politicized information environment of COVID-19 vaccine uptake.
Some recent work has evaluated interventions that increased individuals’ exposure to reliable offline COVID-19-related health information. Abhijit Banerjee and colleagues reported that SMS messages containing a video message delivering accurate COVID-19-related information increased the reporting of symptoms, decreased travel, and increased estimated hand washing in West Bengal, India. Jason Abaluck and colleagues reported that an intervention in which mask promoters in public spaces reminded non-mask wearers of the benefits of masking tripled mask wearing and decreased COVID-19 transmission in Bangladesh.
For some interventions, timing may be crucial. Hengchen Dai and colleagues reported that SMS messages containing reliable vaccine information sent to patients by their US-based health care provider relatively early in 2021 increased COVID-19 vaccination uptake. However, Nathaniel Rabb and colleagues reported that a similar intervention implemented by Rhode Island’s public health department later in 2021 had no impact on COVID-19 vaccination uptake.
Mitigating Disparities In Information Uptake
Finally, some studies examine informational interventions that may mitigate racial and ethnic disparities in information uptake. In a US-based survey experiment, Marcella Alsan and colleagues found that physician video messages delivering COVID-19-related health information increased COVID-19 knowledge among Black and Latinx subjects. Alternative interventions tailored to Black and Latinx communities had no additional knowledge effects for either group, although messages delivered by race/ethnicity-concordant physicians did increase COVID-19-related information seeking. In a similar US-based survey experiment, Carlos Torres and colleagues likewise found that physician video messages delivering COVID-19-related health information increased COVID-19 knowledge, the demand for COVID-19 information, and the willingness to pay for a mask among both Black and White subjects, but that videos tailored to Black communities had no additional effects for either group. More work is needed to identify interventions that can mitigate disparities in information uptake around the globe.
The Behavioral Science Of Infodemic Response
As we prepare for the next pandemic, we need social and behavioral scientists and public health experts to work together to advance the behavioral science of infodemic response. Through the Mercury Project, the Social Science Research Council will fund researchers to build on the existing knowledge base and discover new, evidence-based, data-driven tools, methods, and interventions to counter mis- and disinformation and to support the spread and uptake of accurate health information. These solutions will be an essential resource for global policy makers and for social media and technology companies as they build an information ecosystem that supports the sharing of accurate and effective health information.
Reprinted with permission from Health Affairs.