Pop quiz: Which of these headlines appeared atop a real news story?
(A) Zelenskyy pleads to US Congress: ‘We need you right now’
(B) Biden signed bill to mandate climate change curriculum in all K-8 classrooms
If you answered A, you’re correct — and not alone.
Adults in the U.S. can discern real political news headlines from fake ones about three-quarters of the time, on average, finds a new paper, “Is Journalistic Truth Dead? Measuring How Informed Voters Are About Political News,” forthcoming in the American Economic Review.
The findings are based on a dozen quizzes, conducted from June 2019 to March 2022 by Charles Angelucci, an assistant professor of applied economics at the Massachusetts Institute of Technology, and Andrea Prat, an economics professor at Columbia University, and completed by a total of nearly 8,000 participants representative of the national population.
Angelucci and Prat find participants selected the real headlines 75% of the time, on average. While participants overall were much more likely than not to pick the true headline, demographic factors, not partisanship, were the stronger predictor of avoiding fake news, according to a subsequent statistical analysis the authors conducted. For example, younger, less educated participants were less likely to pick the true headlines than older participants with a bachelor’s degree or higher.
And if you answered B above, don’t worry. That headline was written by a real journalist and meant to be plausible — but it never happened. President Joe Biden did not sign a bill mandating that educators teach climate change in elementary schools.
Both answers are from a quiz Angelucci and Prat conducted in March 2022. Participants had one minute to record which headlines they thought were true and which were fabricated.
“The average person is very well capable of distinguishing mainstream real news,” says Angelucci.
The findings “cast doubt” on media narratives that objective truth is dying, write Angelucci and Prat.
“It’s a really impressive paper, and the most comprehensive and rigorous study I’ve seen that assesses the level of knowledge, the level of news knowledge, in the mass public,” says Andy Guess, assistant professor of politics and public affairs at Princeton University, who was not involved with the research but provided early feedback.
Angelucci and Prat recruited four journalists to create the fake headlines. The journalists came from television, radio and local newspapers. They also selected the real headlines used in the surveys, drawn from Reuters wire articles on U.S. politics and based on the journalists’ judgment of their editorial significance.
(The headlines for the last survey, conducted in March 2022, were drawn from Associated Press stories, because Reuters went behind a paywall in April 2021.)
Specifically, Angelucci and Prat instructed the journalists to put on their editor-in-chief hats: Pick the most important stories in U.S. politics to cover in a given week. By relying on journalists to select a sample of newsworthy, real articles from a global news outlet that covers nearly every notable political and policy event, the paper creates a sampling procedure that can be used in future research, Guess says.
Coming up with a well-defined, replicable procedure to sample articles has been “one of the key challenges” in analyzing news knowledge, Guess says.
Trouble identifying truth across socioeconomic groups
The three most recent survey rounds, conducted in October 2020, February 2021 and March 2022, included two quizzes each. One quiz offered three real headlines and three fabricated headlines created by the journalists. The other quiz had three real headlines and three fake headlines that circulated online, provided by the fact-checking site Snopes.
The real headlines appeared between May 2019 and March 2022 and were presented to participants within days or weeks of publication. The average fake headline, whether created by the journalists or drawn from Snopes, was selected as real 25% of the time.
Angelucci and Prat then used the data they collected to build statistical models to explore differences in people’s ability to evaluate news across socioeconomic and partisan lines. Survey participants were drawn from panels convened by polling firm YouGov, which provides participants’ demographic information, such as age, family income, education and race or ethnicity, along with political party affiliation.
Statistical modeling is a fairly common way social scientists analyze survey results. It allows them to control for what are called latent characteristics — things researchers cannot observe, such as how plausible individual participants perceived the true and fake news headlines to be.
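The broad logic of such a model can be illustrated with a small simulation. The sketch below is a hypothetical, simplified one-parameter logistic (Rasch-style) setup, not the authors’ actual specification: each respondent gets an unobserved “discernment” ability, each headline pair an unobserved difficulty, and the chance of picking the true headline depends on the gap between the two. All numbers are made up for illustration.

```python
# Illustrative sketch only: a simple Rasch-style model of headline discernment,
# NOT the specification used by Angelucci and Prat.
# Respondent i has a latent ability a_i; headline pair j has a latent
# difficulty d_j (roughly, how plausible its fake headline is).
# The probability respondent i picks the true headline in pair j is
# sigmoid(a_i - d_j).

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_respondents, n_pairs = 1000, 12
ability = rng.normal(1.1, 0.8, size=n_respondents)   # hypothetical latent abilities
difficulty = rng.normal(0.0, 0.5, size=n_pairs)      # hypothetical headline difficulties

# Simulate quiz answers: True = picked the real headline, False = picked the fake one.
p_correct = sigmoid(ability[:, None] - difficulty[None, :])
answers = rng.random((n_respondents, n_pairs)) < p_correct

# With these made-up parameters, the overall rate lands in the neighborhood of
# three-quarters, analogous to (but not derived from) the study's 75% figure.
print(f"Share of true headlines picked: {answers.mean():.2f}")
```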
Based on the model, white, college-educated men under age 52 with relatively high incomes had the highest probability, 89%, of choosing the correct answer when presented with a true and a false headline. Women under age 52 from racial or ethnic minority groups, with lower incomes and education levels, had the lowest probability, 71%, of selecting the true headline.
Despite the stark difference between some socioeconomic groups, even those more likely to select the fake headline still had a good chance of picking the true headline.
“They’re still more likely to get it right than wrong,” Angelucci says.
Those more politically engaged and likely to vote, who also tend to be older and have a college education, are more likely to choose the true headline, according to the model.
“We don’t know why young people are not as well informed as older people,” Angelucci says. “But age is the single most important predictor of knowledge.”
Social media confounds this age trend, however, because “people who go on social media tend to be less informed than the general population,” Angelucci says.
The analytic results suggest partisanship is not a major factor in discerning true headlines. Overall, the model indicates that people presented with one true and one fake headline have an 83% chance of picking the true headline when the news is favorable to their political party. That ticks down just two percentage points, to 81%, when the news is unfavorable to their party.
Recognizing truth in a ‘post-truth’ world
The idea that Americans are living in a “post-truth world” gained currency during the second half of the 2010s, though it has existed since at least the 1990s. Angelucci and Prat write that the phrases “death of truth” and “post-truth world” have become “commonplace” in popular books and other media, “and are often accompanied by calls for immediate action to counter this risk.”
The post-truth narrative rose amid a tempest of political upheaval, exemplified by the election of Donald Trump as president in 2016, and technological upheaval, in the form of proliferating social media allowing unfettered self-publishing of dangerous conspiracies.
While it is difficult to narrow down a precise and widely accepted definition of “post-truth,” the phrase “especially refers to a sociopolitical condition perceived as rifer than ever before with dishonesty and distrust, inaccuracies or false knowledge, all corresponding to a crisis of shared trusted adjudicating authorities,” writes Jayson Harsin, an associate professor of communication, media and culture at the American University of Paris, in a December 2018 paper in the journal Communication.
The post-truth world that fascinated many in the news media during the latter half of the 2010s was aptly captured by Barack Obama in the waning days of his presidency. In a November 2016 article, Obama told New Yorker editor David Remnick that “the capacity to disseminate misinformation, wild conspiracy theories, to paint the opposition in wildly negative light without any rebuttal — that has accelerated in ways that much more sharply polarize the electorate and make it very difficult to have a common conversation.”
In her 2018 book, “The Death of Truth,” former New York Times literary critic Michiko Kakutani likewise wondered: “How did truth and reason become such endangered species, and what does their impending demise portend for our public discourse and the future of our politics and governance?”
When Kakutani’s book was released, Trump was making nearly six misleading claims per day, according to a Washington Post tally. Those misleading claims were being shared to varying degrees across social media.
One literature review, published in November 2021 in Trends in Cognitive Sciences, identified several analyses indicating that the proliferation of social media has contributed to political polarization.
While the new paper does not resolve questions about what it means to live in a post-truth society, it does suggest, at some basic level, that many Americans are still able to agree on the truth or falsity of political news.
Angelucci doesn’t dispute that polarization is real. What’s interesting for future research, and troubling at the same time, he says, is what truth means.
“It must be that we just — it’s not so much interpret the information differently, though that, too,” he says. “We literally think about the world differently. And so, throwing information at people, unfortunately, will not solve the problem.”
3 tips for covering political misinformation online
Guess, the Princeton professor, has extensively studied how social media use influences the U.S. electorate, including how algorithms and reshares affect political polarization in recent papers published in Science.
For journalists covering online misinformation, he offers three tips based on his research and other studies on misinformation.
1. Be specific about who is exposed to misinformation and avoid overplaying the extent to which misinformation affects all voters.
Certain segments of the voting population are “awash” in misinformation or, at least, are readily exposed to “news content that is produced by untrustworthy sources that lack standards of journalistic verification,” he says.
For example, fake news tends to circulate most among people over age 65, as well as those who are more politically conservative. While these trends are not fixed and could change in the coming years, they have held since around the mid-2010s.
“The misinformation problem is often stated in very general terms,” Guess says. “In other words, as if misinformation is equally likely to be seen by anyone on social media.”
2. Think of misinformation as a supply and demand equation, and not necessarily the end of the world as we know it.
There is clearly demand among some individuals and groups for false information, Guess says. Why do people seek out information that aligns with their politics, even if that information is untrue?
There are also clearly a host of people and groups willing to produce it.
“Every time there’s a new technological development, like generative AI, the tendency is to frame it as a sort of, all-encompassing technology that’s going to manipulate a passive public without their awareness,” Guess says. “I’m not ruling out that something like that could happen, but I think often the assumption that will happen is sort of baked into the coverage, as opposed to just looking at the nuts and bolts of, who is producing it, what the reach is, and what the impact will be on real people’s news diets?”
3. Give context and remember that denominators are your friend.
Numbers are often reported without context, Guess observes. Say a piece of misinformation created by a Russian troll farm was shared on X, formerly Twitter, a million times. How does that compare with the total number of shares on X over a given time frame, or to shares of real news across the platform?
“You need to take the denominator into account,” Guess says. “And think about also the perspective of individual users. What proportion of their daily news diet are we talking about? And it’s probably just a minuscule fraction. If journalists think these numbers are important, then context is really critical.”
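As a back-of-the-envelope illustration of that point, with made-up numbers rather than figures from Guess’s research, putting a raw share count over a platform-wide total changes the picture considerably:

```python
# Hypothetical numbers for illustration only.
troll_farm_shares = 1_000_000        # shares of one piece of misinformation
total_platform_shares = 500_000_000  # assumed total shares on the platform in the same period

fraction = troll_farm_shares / total_platform_shares
print(f"A million shares sounds large, but it is {fraction:.2%} of all sharing activity.")
# -> A million shares sounds large, but it is 0.20% of all sharing activity.
```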