How to combat health misinformation online: A research roundup

Is it possible to stem the tide of health misinformation on the internet? We summarized seven recent experimental studies on the efficacy of interventions used to correct false information online.

Many Americans are turning to the internet with their health questions. And their searches for answers aren’t limited to search engines and established health resources. Researchers at Microsoft analyzed survey and search data and found that “a surprising amount of sensitive health information is also sought and shared via social media.”

While social media helps connect people with similar experiences, it also carries significant pitfalls. In an op-ed published in Nature, Heidi Larson, an anthropologist and director of the Vaccine Confidence Project at the London School of Hygiene & Tropical Medicine, writes: “The deluge of conflicting information, misinformation and manipulated information on social media should be recognized as a global public-health threat.”

Is it possible to stem the tide of misinformation online? If it is, what are the most effective ways to do so? We turned to a source of high-quality information – peer-reviewed academic research – to look for answers. Below we’ve summarized seven recent academic studies on the efficacy of interventions used to correct health misinformation. It’s worth noting that the first three studies included in this roundup focus on a small group of students from one university. Additionally, all of these studies are behavioral experiments, which tend to have relatively small sample sizes, and are intended to complement other forms of research.

I Do Not Believe You: How Providing a Source Corrects Health Misperceptions Across Social Media Platforms
Vraga, Emily K.; Bode, Leticia. Information, Communication & Society, October 2018.

If you want to debunk misinformation, back up your claims with a source, this study finds. The authors used the case of Zika to test the effects of providing a source to correct misinformation on Facebook and Twitter. They analyzed 271 online survey responses from a sample of students at a large Mid-Atlantic university. Participants were instructed to read a simulated Facebook or Twitter feed and then answer related questions.

There were three experimental conditions tested in the Twitter and Facebook feeds. The control group saw three pages of control posts about social interactions and news. The two experimental groups viewed the same three pages of control posts, plus an additional page with a single news post: “an anonymous user claiming the Zika outbreak was caused by genetically modified mosquitoes in Brazil and posting a news story from USA Today that validated that claim (in reality, this story was created by researchers).”

One of the experimental groups also viewed replies to the post from two commenters who shared links to articles that debunked the fake news story. The other experimental group saw comments where “two individual posters claimed the information about genetically modified mosquitoes was false but did not provide any outside sources to support their claims.”

A post-experiment survey asked participants to rate their level of trust in the fake USA Today story.

The researchers found that correcting misinformation with a source successfully reduced belief in false information found on both Facebook and Twitter, as compared with the control group.

But in terms of perceptions of credibility, “On Facebook, providing an external source to substantiate the correction significantly improved perceptions of the corrective comments, while adding this source did not affect evaluations of the corrective replies on Twitter.”

The authors suggest the following explanation: “If Twitter is seen as a space for news, including a source may be redundant for evaluations of the social correction, where the expectation is already that a tweet is offering new, and presumably credible, information.”

The authors conclude by stressing the importance of studying social media platforms like Facebook and Twitter as separate entities and the efficacy of correcting misinformation with a source. “These findings offer hope for planning and implementing social media campaigns to address misperceptions on a range of health and science issues: one’s peers can effectively offer social corrections, so long as they provide a source for their information, on diverse social media platforms,” they write.

See Something, Say Something: Correction of Global Health Misinformation on Social Media
Bode, Leticia; Vraga, Emily K. Health Communication, September 2018.

This study analyzes data collected from a different subset of participants — 136 respondents — from the same survey as the preceding experiment. The researchers showed students three pages of posts on a Facebook feed followed by the fabricated USA Today story about Zika and genetically modified mosquitoes.

But for this study, 43 students were shown a feed in which the Zika story was followed by algorithmic correction – two subsequent stories that debunked the claim that genetically modified mosquitoes caused Zika appeared in the news feed, as though delivered by the Facebook algorithm. The stories cited Snopes.com and the U.S. Centers for Disease Control and Prevention as sources. Another experimental condition tested on 48 students involved a “social correction,” in which two commenters discredited the information in the original post and provided links to Snopes.com and the CDC debunking the news story. As with the first experiment, participants in the algorithmic and social correction groups were then asked about their belief in the misinformation about Zika as well as the perceived credibility of the related stories or commenters’ responses.

The researchers find that, compared with the control group, both the algorithmic and social corrections worked to reduce misperceptions about the cause of the Zika virus in Brazil, with the algorithmic correction proving the more effective of the two.

The authors suggest that public health authorities might encourage social media users to refute false health information with appropriate sources as a strategy for combating misinformation. “Such an effort may prove more fruitful than attempting to partner with social media platforms to encourage the presence of refuting information in algorithms that produce stories related to health information, especially given the many limitations of such algorithms,” they write.

Using Expert Sources to Correct Health Misinformation in Social Media
Vraga, Emily K.; Bode, Leticia. Science Communication, October 2017.

This study looks at whether the number and source of corrections on social media have differing impacts on users’ belief in misinformation. The researchers analyzed 1,384 online survey responses gathered from students at a large Mid-Atlantic university. The students viewed a simulated Twitter feed and answered questions relating to what they saw.

The five conditions of interest for this paper include the control – a post linking to a news story about housing prices – and four experimental feeds, in which a post links to a false news story claiming the Zika outbreak in the United States was caused by the release of genetically modified mosquitoes.

In the experimental feeds, the Zika story is followed by responses correcting the misinformation in the following ways: 1) a single user responds, 2) the CDC responds, 3) an individual responds, followed by the CDC, 4) the CDC responds, followed by an individual user. The follow-up survey asked respondents to rate their agreement with a series of statements about the cause of the Zika outbreak, the credibility of the CDC and the trustworthiness of other users on Twitter.

The researchers find that the correction from the CDC alone was able to reduce misperceptions about the spread of Zika. A single correction from an individual user, on the other hand, did not reduce misperceptions. CDC and individual user pairings also worked to correct misinformation. Further, the researchers find that corrections were more effective among participants who, in survey responses collected before the experiment, had indicated they believed Zika was caused by genetically modified mosquitoes.

The researchers note that the credibility of the CDC was not harmed by correcting misinformation on social media, “making this a low-cost behavior for public health organizations.” They conclude, “We recommend that expert organizations like the CDC immediately and personally rebut misinformation about health issues on social media.”

On the Benefits of Explaining Herd Immunity in Vaccine Advocacy
Betsch, Cornelia; et al. Nature Human Behaviour, March 2017.

When people get vaccinated, they protect both themselves and their communities, a phenomenon known as “herd immunity.” Grasping the concept requires understanding the individual’s relationship to a larger collective.
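
A textbook way to quantify the idea, though not one this study reports: if each case of a disease infects R0 others in a fully susceptible population, transmission stalls once more than 1 - 1/R0 of the population is immune. For a disease with an R0 of 5, for example, the herd immunity threshold is 80% coverage.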

With that in mind, the researchers were interested in whether attitudes toward vaccination differ in cultures oriented more to the collective than to the individual. They collected online survey responses from over 2,000 respondents from South Korea, India, Vietnam, Hong Kong, the U.S., Germany and the Netherlands. Participants were recruited through emails sent by researchers, social media and, for the U.S. and India, Amazon Mechanical Turk. Eastern countries were categorized as collectivistic and Western countries as individualistic, a judgment informed by country-specific individualism rankings produced by Hofstede Insights, a research firm focused on cultural management.

In the experiment, participants responded to two hypothetical scenarios, one involving a highly contagious fictitious disease, the other involving a less contagious disease. “In each of the two scenarios, the participants read about the disease, the respective vaccine and the probability of vaccine adverse events,” the authors write. Participants were then asked whether they would get vaccinated in each scenario.

Some participants read an explanation of herd immunity prior to reading the scenarios; others participated in an interactive simulation that illustrated the concept; others received no explanation.
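
The paper does not describe its interactive simulation in implementable detail, but a minimal Python sketch of the kind of simulation that can convey herd immunity might look like the following. Every parameter here (population size, contacts per step, transmission probability) is a hypothetical choice for illustration, not a value from the study.

import random

def outbreak_size(population=1000, coverage=0.0, contacts=5,
                  p_transmit=0.3, seeds=5, rng=None):
    """Return the share of the population ever infected in one simulated outbreak.

    Vaccinated people are treated as fully immune. Each infected person
    meets `contacts` random people per time step and infects any contact
    who is unvaccinated and never previously infected with probability
    `p_transmit`. All defaults are illustrative assumptions.
    """
    rng = rng or random.Random()
    vaccinated = set(rng.sample(range(population), int(coverage * population)))
    # Seed the outbreak; seeds who happen to be vaccinated are dropped.
    infected = set(rng.sample(range(population), seeds)) - vaccinated
    recovered = set()
    while infected:
        newly_infected = set()
        for _ in infected:  # each current case draws its own contacts
            for contact in rng.sample(range(population), contacts):
                if (contact not in vaccinated and contact not in infected
                        and contact not in recovered and rng.random() < p_transmit):
                    newly_infected.add(contact)
        recovered |= infected  # cases recover after one infectious step
        infected = newly_infected
    return len(recovered) / population

if __name__ == "__main__":
    rng = random.Random(42)
    for coverage in (0.0, 0.4, 0.8):
        print(f"{coverage:.0%} vaccinated -> "
              f"{outbreak_size(coverage=coverage, rng=rng):.0%} ever infected")

Run at 0%, 40% and 80% vaccination coverage, the share of the population ever infected drops sharply as coverage climbs, which is the intuition the study’s explanations and simulation aimed to convey.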

The key findings: “People from eastern countries generally showed higher vaccination rates,” the authors write. “We argue that collectivism does assist in a consideration of others, including in vaccination decisions.”

People who received explanations of herd immunity prior to the hypothetical scenarios were more willing to vaccinate compared with those who did not receive any explanation of the concept, particularly for respondents from individualistic cultures.

The authors conclude: “The present findings can be seen as optimistic evidence that the communication of herd immunity in vaccine advocacy materials can increase vaccine uptake in the population, therefore reducing the burden of infectious diseases, and finally eliminating them.”

Prevention Is Better than Cure: Addressing Anti-Vaccine Conspiracy Theories
Jolley, Daniel; Douglas, Karen M. Journal of Applied Social Psychology, August 2017.

This study tested the efficacy of different strategies to debunk anti-vaccine conspiracy theories through two experiments with 267 and 180 online survey participants, respectively, who were U.S. residents recruited through Amazon Mechanical Turk. In the first experiment, “Participants were asked to read one of five combinations of arguments [relating to vaccines]: (a) conspiracy arguments only, (b) anti-conspiracy arguments only, (c) arguments refuting anti-vaccine conspiracy theories, followed by arguments in favor (anti-conspiracy/conspiracy), (d) arguments in favor of conspiracy theories, followed by arguments refuting them (conspiracy/anti-conspiracy), or (e) a control condition where participants were presented with no information.”

Participants then rated their belief in various anti-vaccine conspiracy theories, and provided responses to a hypothetical scenario about their intention to vaccinate a fictional child.

The researchers find that exposure to pro-conspiracy information is linked to reduced intent to vaccinate. Notably, participants who viewed anti-conspiracy arguments before the conspiracy theories were more likely to intend to vaccinate than those presented with conspiracy information alone. However, when participants read conspiracy arguments first and then the debunking, anti-conspiracy arguments, intentions to vaccinate did not improve.

The second experiment repeated conditions (a), (c) and (d) from the first experiment, because the researchers were interested in whether they could replicate the preliminary findings about the impact of timing of anti-conspiracy information. Again, they found that anti-conspiracy arguments improved intention to vaccinate only when shown before conspiracy theories.

“This provides empirical evidence of the success of a technique to address conspiracy theories,” the authors conclude. “We suggest that by presenting anti-conspiracy information first, this may in some way inoculate people from the potential harm of conspiracy theories.”

Does Correcting Myths about the Flu Vaccine Work? An Experimental Evaluation of the Effects of Corrective Information
Nyhan, Brendan; Reifler, Jason. Vaccine, January 2015.

Correcting misinformation about the flu vaccine dispels associated myths, but it doesn’t persuade people concerned about its safety to inoculate themselves, this study finds. The researchers analyzed nationally representative online survey data on attitudes toward the flu vaccine, collected from an initial sample of 1,000 U.S. adults. Before answering questions about the vaccine and intent to vaccinate, participants received information that did one of the following: corrected the myth that people can get the flu from the vaccine, offered pro-vaccination information that described the risks associated with the flu, or provided no information about the flu or associated vaccine.

At baseline, the researchers find that more than 4 in 10 Americans believed that the flu vaccine can cause the flu. Meanwhile, 16% thought that the vaccine was unsafe. In terms of likelihood of getting the vaccine, 34% reported they were “very unlikely” to get it, 37% said they were “very likely” to be vaccinated and 29% were uncertain.

The researchers find that debunking the flu vaccine myth led fewer people to believe it. The finding held both for those who reported high levels of concern about side effects and for those with low levels of concern. On the other hand, pro-vaccine information about the risks of contracting the flu did not affect misperceptions about the vaccine.

Despite the success of the corrective information in debunking the vaccine-related myth, respondents who were “very concerned” about side effects reported being less likely to get the vaccine after learning that it does not cause the flu. The probability of these respondents reporting that they were likely to be vaccinated decreased from 46% to 28%.

The authors conclude, “While corrective information about the flu vaccine had no effect on vaccination intention among respondents with low side effects concern, it significantly decreased the reported likelihood of receiving a flu vaccine among those with high side effects concern. These results are consistent with previous research showing that factual corrections about controversial issues may have unexpected or counterproductive results.”

The Limitations of the Backfire Effect
Haglin, Kathryn. Research & Politics, July 2017.

This study attempts to replicate the findings of Nyhan and Reifler’s study summarized above. Using the same methods and procedures, the researcher surveyed 474 adults whose Internet Protocol addresses placed them in Texas, recruited through Amazon Mechanical Turk. The researcher notes that, unlike the original, her sample is not nationally representative.

This study, like the original, finds that about 40% of respondents believe the myth that the vaccine can give you the flu, but only 15% think the vaccine is unsafe. It also finds a similar overall distribution of intent to vaccinate. Additionally, the study finds that those who received debunking information were less likely to believe the vaccine can give you the flu.

Where the analysis differs, though, is in the so-called “backfire effect.” The researcher finds that debunking the myth “had no effect” on intent to vaccinate among those who were highly concerned with side effects. The author concludes, “These findings suggest that more work is needed to validate the backfire effect and the conditions under which it occurs.”

 

For more on how you can combat misinformation, check out JR’s tip sheets on debunking weather-related hoaxes and covering health research accurately.
