Social and behavioral science to build vaccine demand and healthier information environments.
Mercury Project teams are working around the world to find cost-effective and scalable interventions that support science-based health decision-making.

Teams working across 17 countries
Researchers representing 60 institutions in 20 countries
An average of $600k in research funding per team
Adolescence is a time when youth start engaging more with their own health—and seeking information about their bodies and health choices online. While this is true of adolescents the world over, it can present an extra challenge in contexts of low literacy and limited internet experience. Partnering with local authorities, researchers will evaluate whether inoculating against misinformation through a long-term grassroots training program is effective in combating health misinformation among secondary school students in India. In randomly selected schools, students will participate in two-hour sessions held every month for half a school year, in which they will learn about the problem of health misinformation in India and its consequences, how and why misinformation spreads, and strategies to inoculate against it. The effectiveness of these sessions will be measured by comparing students in schools with and without them, on outcomes including students’ self-reported attitudes and ability to identify misinformation, as well as behavioral outcomes such as whether they flag dubious content online, whether they comply with public health measures, and the quality of their news diet.
Simon Chauchard (University Carlos III Madrid, Spain), Sumitra Badrinathan (American University, United States)
In Tanzania, local radio stations—including station managers, journalists, and DJs—act as trusted messengers and influencers, including on health information and behavior. Early in the Covid-19 pandemic, media restrictions limited the ability of radio to respond to Covid-19. But local radio stations generally want to serve their communities with accurate and relevant public health information. Now, with restrictions loosened, an interdisciplinary team will partner with local radio stations over two years to assess the effects of a nationwide campaign, Afya Yako (“Your Health” in Swahili). Researchers will randomly assign 30 hyper-local radio stations (a mix of for-profit, community, and donor-driven radio) to air this campaign, with another 30 stations acting as a control group. A subset of villages within the campaign’s broadcast radius will also receive an in-person mobilization campaign, and a subset of citizens within the broadcast radius will be invited to participate in Afya Yako WhatsApp groups. Over two years, researchers and radio stations will assess the impact of the program on Covid-19 vaccination knowledge, attitudes, and uptake, as well as trust in and use of essential health services. Beyond the campaign’s immediate impact, researchers hope that the project will equip interested parties at local radio stations with the knowledge and skills to recognize and dispel misinformation as it emerges.
Salma Emmanuel (University of Dar es Salaam, Tanzania), Dylan Groves (Columbia University, United States), Tausi Kida (Economic and Social Research Foundation (ESRF), Tanzania), Constantine Manda (ESRF, Tanzania), John Marshall (Columbia University, United States), Anelisa Martin (National Institute for Medical Research, Tanzania), Said Rashid (ESRF, Tanzania), Noela Ringo (ESRF, Tanzania), George Temba (ESRF, Tanzania), Zakayo Zakaria (ESRF, Tanzania)
In 2018, in response to the digital gender divide, the government of Chhattisgarh (a state in central India) launched the ambitious SKY program, providing free smartphones to two million women in rural areas across the state who live in locations with more than 1,000 residents. With this new internet access, women may have better access to information—including health information—that may be accurate or inaccurate. Since the arrival of the pandemic, India has been a major source and consumer of Covid-19 and other health-related misinformation. While misinformation is widely assumed to be harmful and to impede protective health behaviors, it has been difficult to establish whether this is true or to quantify how harmful it is. To address this gap in our understanding, the research team will compare villages included in the SKY initiative and those not included, allowing them to assess the causal effect of internet access on information-seeking and health-protective behaviors. Researchers will identify who is most susceptible to misinformation (by gender, education, and age), indicating how resources from the global community could be mobilized to target communities vulnerable to misinformation.
Rohini Pande (Yale University, United States), Giorgia Barboni (University of Warwick, United Kingdom), Erica Field (Duke University, United States), Natalia Rigol (Harvard University, United States), Simone Schaner (University of Southern California, United States), Anwesha Bhattacharya (Harvard University, United States), Aruj Shukla (University of Southern California, United States), Charity Troyer Moore (Yale University, United States)
This “megastudy” will simultaneously test different tactics designed to increase Covid-19 booster uptake in the United States. Only half of eligible Americans have received their booster, indicating the need for tactics that counter misinformation, lack of motivation, and logistical barriers and effectively encourage vaccination. The megastudy will be designed by an interdisciplinary team, which will develop about 10 interventions to encourage vaccination and simultaneously evaluate their efficacy among at least 500,000 people. This megastudy aims to identify which tactics effectively increase vaccinations overall and which tactics work best for whom (e.g., by age, gender, and race), which could help address disparities in vaccination rates across demographic groups. The insights gleaned from this megastudy can be incorporated into vaccine promotion campaigns to increase vaccinations at scale.
Katherine Milkman (University of Pennsylvania, United States), Angela Duckworth (University of Pennsylvania, United States), Neil Lewis, Jr. (Cornell University, United States), John List (University of Chicago, United States), Kevin Volpp (University of Pennsylvania, United States), Mitesh Patel (Ascension Health, United States), Dena Gromet (University of Pennsylvania, United States), Sean Ellis (University of Pennsylvania, United States), Joseph Kay (University of Pennsylvania, United States), Rob Kuan (University of Pennsylvania, United States)
Practitioners are currently searching for ways to help people distinguish between true and false information and to reduce the spread of false information in online spaces. Meanwhile, scientists have recently identified and tested a number of interventions designed to do just that. However, these interventions have each been tested under different conditions with different types of users, leaving practitioners unsure which interventions will work best for their particular situation. In other words, there are many problems and many new tools, but it is not yet clear which tool is most useful for each problem. A multidisciplinary, multinational team of 80 misinformation experts will work to identify the eight most promising interventions and test their effectiveness with approximately 30,000 participants. The research team will then test the most effective interventions on English-speaking YouTube. This process will allow the researchers to create a handbook for practitioners, detailing the relative strengths and weaknesses of each intervention and guiding their choices. There will never be one winning intervention that works for all users and all situations, but the project will help practitioners build a toolkit of useful misinformation interventions and understand which interventions will be most effective for their particular problem.
Lisa Fazio (Vanderbilt University, United States), David Rand (Massachusetts Institute of Technology, United States), Stephen Lewandowsky (University of Bristol, United Kingdom), Jon Roozenbeek (University of Cambridge, United Kingdom), Briony Swire-Thompson (Northeastern University, United States), Adam Berinsky (Massachusetts Institute of Technology (MIT), United States), Gordon Pennycook (University of Regina, Canada), Andrew Guess (Princeton University, United States), Panayiota Kendeou (University of Minnesota, United States), Eryn Newman (Australian National University, Australia), Joanne Miller (University of Delaware, United States)
Because misconceptions about Covid-19 and vaccination vary considerably between nations and among demographic groups within them, public health messages are more persuasive when they are tailored and delivered by members of one’s own group, including members of online communities. Partnering with grassroots organizations in Brazil, Mexico, and the US, this study will evaluate the effectiveness of health-information messaging generated by members of a given community. Researchers will recruit organizations with an online presence in each nation and ask them to revise currently used Covid-19 messages to better appeal to their peers. Using traditional online surveys and randomized evaluations on Facebook and YouTube, researchers will assess how well these community-generated messages fare in improving vaccine attitudes and intentions, relative to the messages currently in use, which were created by technical experts representing government and public health organizations. A follow-up study will test how well community-generated messaging combats health misinformation. Finally, project findings will inform the development and validation of a messaging pipeline that, once in place, holds great potential to optimize targeted messaging for a variety of public health attitudes and behaviors and for subcommunities across the world.
Charles Senteio (Rutgers University, United States), David Rand (Massachusetts Institute of Technology (MIT), United States), Antonio Alonso Arechar (Center for Research and Teaching in Economics, Mexico), Luke Hewitt (Rhetorical, LLC, United States), Gordon Pennycook (University of Regina, Canada), Paulo Sérgio Baggio (Mackenzie Presbyterian University, Brazil), Ben Tappin (MIT, United States)
In Haiti, Malawi, and Rwanda, community health workers (CHWs) working with Partners in Health have expressed stress and concern that they do not have all the information and skills they need to encourage vaccination—including for Covid-19—as they work in difficult circumstances and changing information environments. CHWs are the front line of health information and care, as well as the primary entry point into the larger health system. Researchers working in and with Partners in Health and its local affiliates—Abwenzi Pa Za Umoyo (Malawi), Zanmi Lasante (Haiti), and Inshuti Mu Buzima (Rwanda)—will test a new system to support these trusted health messengers as they in turn support their communities. This SMS- and phone-based system will allow CHWs to identify and report misinformation and raise questions, then receive tailored scripts and guidance to use in their communities. Researchers will randomly select areas around 44 community health clinics to better understand what the CHWs there know and how they are communicating about both Covid-19 and mental health. Partners in Health will then provide 12 months of information messaging to CHWs on one of those topics (also randomly selected). The research teams will periodically measure attitudes and intentions about Covid-19 and mental health in those communities and track whether use of services such as Covid-19 vaccination changes after CHWs have more accurate, tailored information to share.
Chiyembekezo Kachimanga (Abwenzi Pa Za Umoyo, Malawi), Bethany Hedt-Gauthier (Harvard Medical School, United States), Kobel Dubique (Zanmi Lasante, Haiti), Erick Baganizi (Inshuti Mu Buzima, Rwanda), Jones Chimpukuso (Abwenzi Pa Za Umoyo, Malawi), Dale Barnhart (Harvard Medical School, United States), Ximena Tovar (Harvard Medical School, United States)
Despite efforts by the Government of Ghana and other interested parties to increase Covid-19 vaccine uptake across the country, widespread misinformation surrounding the vaccine has led to distrust of state institutions and erroneous safety concerns. To date, medical doctors—trusted health messengers in many communities—have not been systematically mobilized to promote vaccination and dispel misinformation. Researchers in this study will test two different approaches over the course of a year. They will randomly assign selected communities to two comparison groups: one group will receive push SMS messages from doctors, and the other will receive the same messages plus a chance to interact directly with local doctors and other participants in WhatsApp groups. Working with vaccine-hesitant individuals in communities in both northern and southern Ghana and across urban and rural settings, this research team will assess how this mHealth intervention causally affects vaccine attitudes, intentions, and, ultimately, behaviors. Participants who indicate their intention to get vaccinated following the intervention will be directed to existing Ghana Health Service facilities at the community, subdistrict, and district levels. In addition to increasing Covid-19 vaccination uptake in Ghana, this project seeks to determine whether a Covid-focused intervention also increases generalized trust in doctors and the medical establishment.
Hubert Amu (University of Health and Allied Sciences, Ghana), Luchuo Engelbert Bain (University of Lincoln, United Kingdom), Mylene Lagarde (London School of Economics, United Kingdom), Alberta Adjebeng Biritwum-Nyarko (Ghana Health Service, Ghana), Salifu Amadu (Innovations for Poverty Action, Ghana)
While fact-checks and digital literacy training can effectively counter misinformation, fact-checks usually have very low reach, and getting social media users to consume such training is challenging. Partnering with the fact-checkers Africa Check (in Kenya, Nigeria, and South Africa) and ChequeaBolivia, this research team will conduct two studies to test approaches to counter misinformation and change users’ engagement with reliable information. In the first study, in Kenya, South Africa, and Bolivia, the team will recruit 600 positive social media influencers—high-profile journalists and social activists with large followings—interested in countering misinformation among their followers, and provide a randomly selected half of those influencers with digital literacy training resources and fact-checks, along with modest financial compensation. They will track followers’ daily online posting and sharing behavior over six months. In the second study, researchers will first test whether providing the fact-checkers in Bolivia, Kenya, Nigeria, and South Africa with data on viral posts by serial misinformation spreaders improves the quality of their fact-checking. The researchers will share such information with the fact-checkers for a randomly chosen half of misinformation spreaders over a six-month period, to see whether it increases the likelihood that the fact-checkers verify those posts. In Bolivia, the researchers will additionally study the effect of the fact-checker directly reaching out to misinformation spreaders and their followers to debunk the misinformation they have shared or been exposed to and to provide digital literacy training materials. They will assess whether such outreach improves the quality of followers’ posting and sharing behavior over six months.
Antonella Bandiera (Instituto Tecnológico Autónomo de México, Mexico), Jeremy Bowles (Stanford University, United States), Kevin Croke (Harvard University, United States), Romain Ferrali (Aix-Marseille School of Economics, France), Horacio Larreguy (Instituto Tecnológico Autónomo de México, Mexico), Shelley Liu (University of California Berkeley, United States), John Marshall (Columbia University, United States), Daniela Pinto Veizaga (Instituto Tecnológico y de Estudios Superiores de Monterrey, Mexico)
One potential barrier to internalizing and acting on new health information is whether we are able to hear it in private and ask questions of a trusted messenger. Building on early lessons from providing “health ambassadors” with skills to address individuals’ concerns about the Covid-19 vaccine through brief but meaningful interactions, this research collaborative will work in Côte d’Ivoire, Senegal, Malawi, and Zimbabwe. In each country, the multinational, multidisciplinary collaborative will partner with health authorities to recruit and train ordinary citizens (college students and local community representatives) to serve as health ambassadors using a replicable protocol. The health ambassadors will then proactively engage individuals face-to-face about vaccine risks and benefits, offering a direct and private opportunity to discuss vaccination concerns, in an effort to increase public fluency and confidence in reliable scientific information about Covid-19 and, ultimately, to increase vaccination rates. A multisite randomized controlled trial will test the nine-month intervention, with the aim of generating generalizable insights with the potential to scale. The ambassador training protocol will be developed in a formative, qualitative phase of the study, and the RCT will roll out alongside vaccination campaigns in urban and rural areas. The study will help health authorities define a scalable strategy for increasing the speed of public uptake of urgent health measures.
Martin Atela (Partnership for African Social and Governance Research (PASGR), Kenya), Arsene Brice Bado (Center for Research and Action for Peace (CERAP), Kenya), Anthony Mveyange (PASGR, Kenya), Mame Mor Syll (Université Gaston-Berger, Senegal), Cyrus Samii (New York University, United States), Maarten Voors (Wageningen University, Netherlands), Horace Gninafon (EGAP, UC Berkeley, United States), Yapo Félix Boa (Université Félix Houphouët-Boigny, Côte d’Ivoire), Samba Cor Sarr (Ministère de la Santé, Senegal), Susan Hyde (UC Berkeley, United States), Paul Kawale (Kamuzu University of Health Sciences, Malawi), Fortunate Machingura (Centre for Sexual Health and HIV AIDS Research (CeSHHAR), Zimbabwe)
Meta, Twitter, YouTube, and other social media websites have struggled to make effective governance decisions on behalf of billions of users every day, and misinformation continues to run rampant. At an individual level, many users of these systems find themselves trapped in health misinformation bubbles, sometimes built inadvertently via the networks they have created. In an attempt to address these challenges, this team of researchers will design, build, and evaluate network-transforming interventions: software-assisted systems to alter underlying networks that spread health misinformation online. In this context, a health misinformation monitor Twitter account will continuously track emerging health misinformation on English-speaking Twitter and deliver counter-messaging to the recipients of that misinformation—with the aim of motivating users to unfollow the source. The research team will introduce a roadmap for misinformation interventions that can operate outside of corporations and governments, fill the gaps left by the constraints of industry and regulation, and promote safe and trustworthy online experiences.
Eric Gilbert (University of Michigan, United States), Ceren Budak (University of Michigan, United States), Sarita Schoenebeck (University of Michigan, United States), Joshua Ashkinaze (University of Michigan, United States)
Addressing the spread of misinformation online requires durable interventions that adapt social media architectures to promote informed decision-making. In search of such an intervention, this team of researchers will conduct one of the most systematic tests to date of the welfare effects of altering information environments by decreasing exposure to untrustworthy sources. Importantly, their intervention will create sustained changes to information exposure over time by encouraging users to change the composition of the accounts they follow, and the researchers will measure its effects on real-world behavior. In this way, they will assess the extent to which respondents can be nudged to alter the set of accounts they follow and thus the information they consume. Through this study, the team will help popularize a design approach that combines paired individual-level survey and social media data, providing a new framework for conducting and evaluating online behavioral interventions. Use of this design will provide a building block for future research on the effects of online information exposure on offline behavior. Finally, using the results of this study, researchers will be able to provide important guidance to platforms, policymakers, and researchers about how to most effectively counter misinformation about Covid-19 and other topics.
Betsi Grabe (Indiana University, United States), Brendan Nyhan (Dartmouth College, United States), Filippo Menczer (Indiana University, United States), Giovanni Luca Ciampaglia (University of Maryland, United States), Ro’ee Levy (Tel Aviv University, Israel)