Expert Commentary

1 in 4 journalists surveyed rarely or never seek out peer-reviewed research to learn about beat topics

The results of our 2021 user survey offer insights into how journalists use academic research, why they don’t use it more often, and more.


We’re grateful to the 1,561 reporters, editors, journalism professors and others who took time out of their busy days recently to participate in our 2021 user survey, designed to help us get to know our audience better — the jobs they do, the beats they cover, how long they’ve been in the field, and so on.  

We asked questions to better understand how journalists use academic research, why they don’t do it more often and how key segments of our audience use our tip sheets, research roundups, explainers and other materials.

An important takeaway: Nearly half the journalists who participated said they seek out peer-reviewed research on an almost daily or weekly basis to learn about a topic they’re covering. But 1 in 4 told us they rarely (once or twice a year), never or almost never do that.

Survey responses also reveal differences in how often journalists mention scholarly findings in their news articles and TV or radio broadcasts. While 14% of journalists surveyed reported “always” citing research in their stories, 11% indicated they never or almost never do. Meanwhile, 41% said they mention research “sometimes.”

Many journalists haven’t been trained to read, interpret and spot flaws in research. More than 60% of the journalists we surveyed said they’re not sure they can tell a high-quality study from a questionable one. About 54% are “somewhat confident” in their ability, and 8% are “not confident” at all. Fewer than 40% rated themselves “very confident.”

This feedback is a clear signal that we at The Journalist’s Resource still have lots of work to do, considering one of our top goals is getting newsrooms to consistently use peer-reviewed research to ground their coverage and help prevent the spread of misinformation. Peer-reviewed studies aren’t infallible, but they are considered the gold standard for reliability in academic scholarship.

Our user survey also asked how we can improve our work, which public policy areas we should focus on in the coming year and what new things we should do to help journalists on deadline find the best research on a particular topic and explain it clearly and accurately to the public.

All of this input will guide our decisions as we plan for 2022. Many of you indicated you’d be “very interested” in webinars on reporting and understanding research and on covering specific policy topics. As a result, we’ve already started organizing a series of online training sessions. The first, “Using Academic Research to Keep Politicians Honest: A Free Online Training Session for Journalists Across Beats,” is scheduled for Feb. 17 at noon ET.

We learned a lot from your feedback. For example, we asked:

  • What is your profession? More than 44% of the people who responded to our survey identified as journalists, journalism faculty, such as college professors and high school teachers, and journalism researchers. Another 23% selected the “other” category, including former journalists, retired journalists, attorneys, librarians, and others representing a wide range of backgrounds and professions. A bunch of people we consider journalists — newspaper editors, part-time journalists and one photojournalist, for example — also chose “other.”
  • How important is it to you that The Journalist’s Resource provides more resources on the following topics? Respondents agree we need to provide more resources on certain subjects. Of the 11 policy topics offered as options, 70% of survey participants said it’s “very important” we offer more on misinformation and 61% said it’s “very important” to provide more on climate change and the environment. The three other highest-rated topics were health, education and understanding how to gauge the quality of research.
  • What did you learn about covering research during the COVID-19 pandemic? Dozens of people weighed in. “Science evolves; it isn’t static,” one person writes. “Be clear with your readers what we know and what we don’t know at this point in time, because things change.” Another writes, “The good, the bad, and the ugly of preprints,” referring to the flood of academic papers examining COVID-19 that researchers posted online before they had been vetted by independent experts.

Carmen Nobel, program director of The Journalist’s Resource, says coverage of COVID-19 has made obvious the need for journalists to build their research literacy. Errors in interpreting findings and an inability to spot problems in research can be dangerous during a global pandemic.

“The survey responses highlighted the challenges of understanding that science is provisional – and the challenges that journalists face explaining research to audiences who are grappling with an onslaught of good information, misinformation and disinformation amid a deadly pandemic,” she says.

It’s important to note that our user survey isn’t scientific. It captures a snapshot of our growing audience at a single point in time, based on responses from individuals who heard about our survey and had the time or inclination to participate during a short window (the user survey was open from late October 2021 to early January 2022). We used our weekly newsletter to promote the survey and enlist participants, which likely skewed the results.

The bottom line: We cannot assume the people who took the survey or their responses represent our audience as a whole. The results we describe in this article apply only to those who participated in the 2021 user survey, and some people skipped some questions.

Audience demographics

We asked a number of questions aimed at learning more about the people in our audience, especially journalists and journalism faculty, who include college professors and high school teachers. Here’s some of what we learned:

  • What is the geographic scale of your primary audience? About 34% of those who identify as journalists work for news outlets with a national reach. And 24% work for international outlets. Meanwhile, 21% work for local news organizations, which cover a state or part of a state, and 13% are in newsrooms with a hyperlocal focus on a specific community or group of communities. The smallest share of journalists — 7% — work for regional outlets covering a section of a country.
  • How long have you worked in journalism? About 64% of journalists who took our survey are industry veterans, with 20 years of experience or more. About 7% have fewer than five years of experience. Meanwhile, 12% have been in the field between five and nine years and 17% have worked in journalism between 10 and 19 years.
  • Where are you based? The vast majority of survey takers — 69% — are spread out across the U.S., with 37% living in the Northeast and Mid-Atlantic states, 17% concentrated in the Midwest and 16% located in the West. Another 14% live in the Southeast and 9% in the Southwest. Significant portions of respondents are from Europe (8%), Asia (6%), Africa (6%), Canada (5%), Latin America (3%) and Australasia/Oceania (2%). Less than 1% are in the Caribbean or Middle East.
  • How would you describe changes to your newsroom’s size over the past year? About 16% of journalists said their newsrooms have shrunk considerably, but a similar portion — 14% — have seen their newsrooms grow considerably. Over that period, 21% of journalists surveyed said their newsrooms shrank a small amount and 19% said their newsrooms grew a small amount.
  • Which of the following beats or topic areas do you personally cover? Health and medical science is the most common beat among journalists who took our survey — 47% cover it as their primary beat or in addition to one or more other beats. About 43% told us they cover economics/business. The next two most common beats are climate/the environment and state and local politics, with 41% and 40% of journalists covering them, respectively. Survey takers were allowed to pick multiple beats.
  • How much education have you had in statistics? We asked this question because without training in that subject, it’s often difficult to understand the methodology and findings of most quantitative research. Of those who answered the question, 34% said they have no training in statistics and another 29% said they studied it some as part of their undergraduate education.  About 15% indicated they’ve had “some basic training through online courses.”
  • How is your work distributed? About 40% told us their work is published in a print newspaper, 21% said their work is published in a print magazine and, for 39%, it’s featured in newsletters or dedicated emails. About 13% said their work is broadcast on TV or through videos and 13% said their work airs on the radio. The vast majority — 79% — told us their work is distributed online. A lot of journalists’ work appears across different types of media.

Barriers to using research

To decide which tip sheets to produce and which trainings to offer, we need to understand the challenges journalists encounter when they try to use academic research in their news coverage. The user survey reveals:

  • What are some of the barriers to your using academic research more often in your journalism? Journalists face multiple barriers to finding and interpreting research studies. The two most common: Difficulty understanding academic jargon and limits on the studies journalists can access, considering many studies exist behind academic journal paywalls. Sixty percent cited paywalls as a barrier and 58% pointed to academic jargon. Another big barrier: Lack of time. About 54% said they don’t have time to read lengthy research papers on deadline while 31% said they don’t have time to even look for it.
  • How could academic experts make it easier for you to communicate with them? For many journalists, communication with academic experts can be problematic, too. When we asked how communication could be made easier, 77% of journalists said it would be “very helpful” if academics included contact information in their online bios and 69% said it would be “very helpful” if they responded to calls and emails more promptly. More than half agreed it would be “very helpful” if academics would do these three things: use less jargon (55%), respond to questions patiently (54%) and articulate the strengths and weaknesses of research (56%).

Reporting habits

We aim to help journalists recognize the importance of relying on peer-reviewed research as well as the dangers of relying on research that hasn’t been peer reviewed, or evaluated by independent scholars. (In fact, we have a whole explainer on it!)

But, do journalists tell their audiences whether a study has undergone peer review? If they’re not sure about the quality of a study or need help explaining scholarly literature to the public, do they contact researchers for interviews?

Here’s what we found when we asked the journalists who took our survey these questions:

  • When you report on academic research, how often do you note whether it has undergone peer review? One-third said they never or almost never note whether research has been peer reviewed. One-fourth of those who took our survey said they do it “sometimes” and almost a fourth “always” do. The remaining 16% said they often note whether a research study featured in a story has survived the peer-review process.
  • When you report on research, how often do you reach out to the researcher(s) for an interview? About 15% of journalists surveyed told us they always reach out to researchers for interviews when they report on research. And 33% often do it. But 32% of journalists said they only do it sometimes and 21% never or almost never do it.

How audience members use our resources

Audience input helps us understand how people learn about and use the resources we create, what they find most useful and what new things we can do to serve them better. We asked everyone questions on this topic, although we asked journalists and journalism faculty for additional details.

We learned:

  • Although there are many ways people find our content, the most common is our weekly newsletter, which generally goes out on Wednesday. More than 60% of respondents said they learn about our resources that way. Almost one-third indicated they go directly to our website. Meanwhile, 19% find it through online search engines and 22% follow links provided by news outlets and other organizations that reference our work. Smaller percentages of people learn about our content when someone shares it with them or when they come across it on social media.
  • Almost 9 out of 10 respondents said they “often” or “sometimes” read or use our articles on new research studies and our research-based explainers that generally focus on policy topics. About 80% “often” or “sometimes” read or use our research roundups — collections of research examining a specific issue. Almost 75% reported “often” or “sometimes” reading or using our tip sheets on how to cover a particular policy topic or tip sheets designed to help journalists understand research methods, spot flaws in research and use academic studies in their news reports.
  • Journalists told us our materials help them in numerous ways. About 67% reported using them to better understand and explain policy issues and 66% said our materials gave them background knowledge on a topic they were covering. Almost half said our resources help them generate story ideas and 40% said they used them to fact-check. Meanwhile, 36% told us our materials helped them improve the accuracy of their own work.
  • Our materials are also useful to faculty and researchers. About half those who took the user survey said they have recommended or assigned our research summaries to students, forwarded our articles or newsletters to students, used our materials to prepare a lesson and used our materials to get up to date on a policy issue. Forty-two percent indicated the resources we provide have helped them in their research or publishing work, and 43% reported that our tip sheets in particular have helped inform their students on a topic.

What else should we be doing?

Audience feedback is crucial to helping us decide which types of tip sheets and other resources to create during the coming year. It also helps us decide what to include and emphasize when explaining a research paper’s findings — for example, whether the findings apply only to the sample of people who participated in a study or are generalizable to a much larger population, such as an entire city or country or even everyone on the planet.

Here are some of the main takeaways:

  • Of the 11 topic choices offered, respondents are most interested in misinformation, which 70% say is “very important” and 17% consider “somewhat important.” The overwhelming majority of those who weighed in indicate they also want us to do more on health, climate change and the environment, education and understanding how to gauge research quality. For example, 61% say it’s “very important” we provide more on climate change and the environment, and another 30% say that’s “somewhat important.” Meanwhile, 54% of respondents say it’s “very important,” and 30% “somewhat important,” that we offer more resources aimed at helping journalists interrogate research.
  • Close to 80% of respondents said it’s “very important” that we point out the strengths and weaknesses of the studies we highlight and that we include a clear explanation of a study’s methods and statistics. Almost 70% indicated it’s “very important” we explain whether findings are generalizable. About half the people who answered the question agreed it’s “very important” we include the authors’ contact information and point out other relevant research.
  • Nearly 8 in 10 respondents would be “very” or “somewhat” interested in webinars about specific policy topics and tip sheets explaining how to apply behavioral science to their work. Meanwhile, about 7 in 10 would be “very” or “somewhat” interested in webinars on reporting and understanding research, and short video explainers.

Audience feedback, in their own words

Although our survey sought feedback primarily through multiple-choice questions, we also asked people to tell us, in their own words, how we’ve helped them in their work or how we can help them do their jobs even better in the future.

Hundreds of people submitted comments, such as:

“I’ve shared your newsletter with my direct team and others, many of whom now have their own subscriptions. Your reporting makes us better as writers ourselves when reporting about research, statistics, analytics. As editors, we use your reporting to help us guide and improve other authors’ work, too. Especially in the development stage, we’ve learned to truly dig into authors’ delivered content and ask some important questions which, if left unasked and unanswered, might lead readers to a different understanding of the topic at hand.”

“I feel myself more confident in ‘the field’ after I’ve read an article or a research [study]. But in general, I wish I had more time to read more materials you suggest.”

“The access I get to your analysis and research helps me formulate better questions for the local authorities I interview, when I do get stories that require the research and resources you offer. Thank you for that.”

“Set an editorial calendar with the topics you will be covering each quarter so we know in advance what’s coming.”

“I often share articles published by The Journalist’s Resource with my students who are interested in journalism and writing about research. My students find them informative and useful for developing their writing, critical thinking and research skill.”

“Though I rarely write for publication these days, when I do, I check here first. Nobody has time to waste any more, and The Journalist’s Resource cuts to the quick, quick.”

“The research roundups help me see local trends in a new light, a more macro, holistic view.”

“For my purposes, The Journalists Resource helps me to parse through the daily influx of news, particularly how to assess it for truth/reliability vs misinformation. I have learned a lot from your organization that helps me to be more fact based & unbiased in my opinions and in discussions with other Americans.”

“It is nice to have one place to find so much research with clear explanations.”

“This site is a collective of not just good research, it’s a place where I and my students can see how journalists think, question, cast doubt on presumptions, and persevere in the hard work of thinking as those called to be journalists and fact-driven media practitioners. I have a T-shirt (provided by my convention fees to the Associated Collegiate Press) that says ‘The World Needs More Journalists.’ I’ve been tempted to use shirt-making materials to add ‘thinking’ to that sentence.”

“JR has collaborated with several newsrooms in the region to produce high quality, data-driven information to local communities. Communities and newsrooms need access to high quality research, but journalists don’t have it on a time-sensitive basis or often don’t know where to look for it. JR can help bridge that gap.”

“What I find most brilliant is the voice in which it’s written, as if letting the readers know that we’re all capable of more.”

Lessons from the COVID-19 pandemic

We also asked respondents what they have learned about covering research during the COVID-19 pandemic. Here are some of their responses, many of which refer to the challenges of how and whether to cover preprint studies:

“Science evolves; it isn’t static. Be clear with your readers what we know and what we don’t know at this point in time, because things change.”

“We had a lot of pandemic-inspired conversations about whether to cover preprints! I don’t think that’s a settled question yet for our (science news) publication, but it would have been out of the question before and is now less so.”

“The good, the bad, and the ugly of preprints.”

“How useful preprint servers can be — the importance of noting that they are preprints.”

“Preprints are alluring but dangerous. But peer review can be slow, and it’s difficult to wait while our lives are on hold and we’re desperate for information. And while I already knew research can prompt changes in understanding over time (don’t mask, mask, take off the masks, put them back on, etc.), the pandemic illustrated this principle in near-real-time.”

“The danger of pre-prints and reporting on them without lots and lots of context.”

“We started being very clear with readers about how peer-reviewed or conditional the studies were; how large the studies were, what other research in the field was like, etc. Before Covid, we might not have gone into that kind of detail, but there was so much new information and research so fast, and much of it was very unsettled.”

“Preprints and press releases are sources of news but need to be handled very carefully. Journalists should always link to the primary source of information and should state clearly where the information comes from and whether it has been vetted through peer review. (Peer review is not a guarantee of good science, though, as many papers are retracted or have flaws.)”

“That the world really was learning together about this novel virus and outsiders could see the findings come to light in real time. Also that science and reality evolve. What we know one day can be eclipsed the following day; experts cautioned against wearing masks at first, then encouraged masking.”

“I learned how to circumvent paywalls.”

“Peer review is slow! Perhaps too slow. Also it can be dangerous for people outside a field (say, virology) to interpret a study. And finally, everyone thinks they’re an epidemiologist — but they are not.”

“There’s a difference between a virologist, an epidemiologist, an MD etc. Many academics/researchers/MDs opined on the state of the pandemic/masks/herd immunity etc, who did not actually have training that made them an expert on said topics and many journalists ate it up.”

“It is difficult to find authoritative work in a fast-changing environment. And it’s hard for people — researchers and journalists — to say, ‘I don’t know.’”

“It is exceedingly perilous to try to cover breaking research, if you will, in the midst of constantly shifting terrain such as in the pandemic. Rushing into print with that kind of research is perilous. You may find yourself reporting shallow or misleading research.”

“That search for scientific knowledge is a continuum, a collection of findings and observations and not a definitive answer that happens on a given day. And that we need to make this clear to our audiences. This is one way we fight mistrust in scientists and scientific knowledge.”
