Misinformation and fact-checking: Research findings from social science

From “death panels” and WMD claims to 9/11 conspiracies and falsehoods about President Obama’s religion, misinformation and unverified assertions play an outsized role in American public life and discourse.

A 2012 report from the New America Foundation, “Misinformation and Fact-checking: Research Findings from Social Science,” examines relevant cognitive and psychological research on public policy-related communications. The report, by scholars Brendan Nyhan at Dartmouth College and Jason Reifler at Georgia State University, makes a series of recommendations for news organizations seeking to correct false or misleading information.

These strategy recommendations are based on research findings in the following problem areas:

  • Information deficit fallacy: Simply giving citizens more facts does not always change their policy opinions. Studies that have given respondents more information about such issues as welfare, immigration or the Iraq war have not always shown that opinions shift accordingly. Still, the effects vary across topics, and “information seems to be most effective in shaping preferences about government spending.”
  • Motivated reasoning theory: People are subject to confirmation bias (accepting information that confirms their beliefs) and disconfirmation bias (ignoring information that undermines these beliefs): “Information that challenges beliefs that people hold strongly is generally unwelcome and can prompt a variety of compensatory responses. (By contrast, individuals will be much more likely to accept corrective information if they are not motivated to defend a mistaken belief.) These mechanisms can explain why corrections are sometimes ineffective and can even make misperceptions worse in some cases.”
  • Belief perseverance dynamics: People often have trouble remembering which ideas are true or false over time. This means that once false information is introduced, it can be very difficult to reverse people’s opinions, and combating it with the facts can actually backfire. Some research suggests that negating false information (stating that something is not true) can result in further confusion: Over time, the meaning is flipped as the person remembers only fragments of content. “If the correction makes a [false] claim seem more familiar, the claim may be more likely to be perceived to be true.”
  • Power of particular sources: People are more likely to accept new factual assertions when they come from sources that are perceived as trustworthy and hold the same general values. For example, “in politics, people are more receptive to sources that share their party affiliation or values as well as those that communicate unexpected information.” When race and identity are at issue, corrections can be less effective among audience members whose race differs from that of the person in question (for example, white voters and President Obama).
  • Visual correctives: Research has found that “presenting corrective information in graphical form is generally quite successful at increasing the accuracy of respondents’ beliefs about the number of insurgent attacks in Iraq after the U.S. troop surge, the change in payroll jobs in the U.S. between January 2010 and January 2011, and the change in global temperatures in the last thirty years.” While visuals are helpful “when people have false beliefs about changes in a quantitative variable,” some research shows that graphics do not necessarily help to clarify factual statements about political assertions.

The scholars conclude that the more prominent an issue is in public discourse, the more difficult it is to change beliefs. Still, “it appears to be easier to reduce misperceptions that are technical or quantitative in nature … especially when people do not have strong prior beliefs about these quantities and they are not directly linked to one’s support or opposition to a given policy or candidate.” Moreover, “corrections that require proving a negative (e.g., that President Bush did not allow 9/11) are often especially ineffective given the difficulty of debunking conspiracy theories, their deep psychological roots, and the ineffectiveness of negations.”


Last updated: March 1, 2012
