
How to cover academic research fraud and errors: 4 big takeaways from our webinar

Read on for great tips from Ivan Oransky, Elisabeth Bik and Jodi Cohen, three experts who have covered research misconduct or have hands-on experience monitoring or detecting it.

(Image: Lucas Wendt from Pixabay)

In 2022, academic journals retracted more than 4,600 scientific papers, often because of ethical violations or research fraud, according to the Retraction Watch blog and database.

Although retractions represent a tiny fraction of all academic papers published each year, bad research can have tremendous impacts. Some studies involve new drugs, surgical procedures and disease prevention programs — all of which directly affect public health and safety. Also, government leaders rely on scholarly findings to help guide policymaking in areas such as crime, education, road safety, climate change and economic development.

On Nov. 30, The Journalist’s Resource hosted a free webinar to help journalists find and report on problematic research. Three experts who have covered research misconduct or have hands-on experience monitoring or detecting it offered a variety of tips and insights.

“How to Cover Academic Research Fraud and Errors” — a video of our Nov. 30 webinar

For those of you who missed the webinar, here are four of the big takeaways from our presenters, Ivan Oransky, a former president of the national Association of Health Care Journalists who teaches medical journalism at New York University and co-founded Retraction Watch; Elisabeth Bik, a microbiologist and science integrity consultant who has been called “the public face of image sleuthing;” and Jodi Cohen, an award-winning investigative reporter at ProPublica whose series “The $3 Million Research Breakdown” exposed misconduct in a psychiatric research study at the University of Illinois at Chicago.

1. Retraction Watch and PubPeer are two online resources that can help journalists identify and track research fraud and errors.

Retraction Watch, a blog launched in 2010, is a treasure trove of information about research papers that have been retracted from academic journals. The website features:

  • The Retraction Watch Database, which journalists can use to search for retractions connected to a specific researcher, university or research organization. Use it to look for patterns — for example, retractions among groups of researchers who tend to work together or among multiple researchers working at the same institution. (A brief sketch of one way to search the data in bulk appears after this list.)
  • The Retraction Watch Leaderboard, an unofficial list of researchers with the highest number of paper retractions.
  • A list of scientific sleuths, including self-described “data thug” James Heathers and Michèle B. Nuijten, who, along with Chris Hartgerink, created statcheck, a tool designed to find statistical mistakes in psychology papers. Some of these experts use aliases to protect themselves against retaliation and harassment.
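
If you want to search the Retraction Watch data in bulk rather than through the website, the database has been freely available for download through Crossref since 2023. Below is a minimal Python sketch that filters a local copy of the data with pandas; the file name, the column names ("Author", "Institution" and "Reason") and the example search strings are assumptions and should be checked against the database's own documentation.

import pandas as pd

# Hypothetical local copy of the Retraction Watch bulk export (CSV).
CSV_PATH = "retraction_watch.csv"
df = pd.read_csv(CSV_PATH, low_memory=False)

# All retractions listing a given researcher as an author.
# "Lastname" is a placeholder, not a real example.
by_author = df[df["Author"].str.contains("Lastname", case=False, na=False)]

# Retractions tied to a single institution -- useful for spotting patterns.
by_institution = df[df["Institution"].str.contains("Example University", case=False, na=False)]

print(len(by_author), "retractions match that author name")
print(len(by_institution), "retractions match that institution")

# Tallying the stated reasons can surface clusters worth a closer look.
print(by_institution["Reason"].value_counts().head(10))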

Retraction Watch helped Cohen report on and provide context for a ProPublica investigation into the work of prominent child psychiatrist Mani Pavuluri.

It “was a huge resource in trying to understand this,” Cohen told webinar viewers. “The amount of information there and the ability to use that database — completely amazing.”

In her series, co-published in The Chronicle of Higher Education in 2018, Cohen revealed that Pavuluri “violated research rules by testing the powerful drug lithium on children younger than 13 although she was told not to, failed to properly alert parents of the study’s risks and falsified data to cover up the misconduct, records show.” The University of Illinois at Chicago, Cohen wrote, “paid a severe penalty for Pavuluri’s misconduct and its own lax oversight.” The federal government required the school to return the $3.1 million the National Institutes of Health gave it to fund Pavuluri’s study.

PubPeer is a website where researchers critique one another’s work. Comments are public, allowing journalists to observe part of the scientific process and collect information that could be useful in a news story.

Bik noted during the webinar that PubPeer is “heavily moderated” to reduce the likelihood of name-calling and speculation about a researcher’s work. The website explains its commenting rules in detail, warning users to base their statements on publicly verifiable information and to cite their sources. Allegations of misconduct are prohibited.

“You cannot just say, ‘You’re a fraud,’” Bik explained. “You have to come with evidence and arguments similar to a peer review report.”

PubPeer played a key role in student journalist Theo Baker’s investigation of academic papers co-authored by Stanford University President Marc Tessier-Lavigne. Tessier-Lavigne ultimately resigned, and Holden Thorp, the editor-in-chief of the Science family of journals, announced in late August that two of Tessier-Lavigne’s papers had been retracted.

The Journalist’s Resource published a tip sheet on using PubPeer in August. The first tip: Install the free PubPeer browser extension. When you look up a published research paper, or visit a website that links to one, the extension will alert you to any comments made about it on PubPeer.

2. Early in the reporting process, ask independent experts to help you confirm whether a research study has problems.

Getting guidance from independent experts is critical when reporting on research fraud and errors. Experts like Elisabeth Bik can help you gauge whether problems exist, whether they appear to be intentional and how serious they are.

During the webinar, Bik advised journalists to ask for help early in the reporting process and seek out experts with the specific expertise needed to assess potential problems. Bik specializes in spotting misleading and manipulated images. Others specialize in, for example, statistical anomalies or conflicts of interest.

Bik’s work has resulted in 1,069 retractions, 1,008 corrections and 149 expressions of concern, according to her Science Integrity Digest blog. Journal editors typically issue an expression of concern about an academic paper when they become aware of a potential problem, or when an investigation is inconclusive but there are well-founded indicators of misleading information or research misconduct.

Bik stressed the importance of journalists helping correct the scientific record and holding researchers accountable.

“It seems that there’s relatively very few papers that have big problems that get corrected or retracted,” she said. “Institutional investigations take years to perform and there’s very rarely an action [as a result]. And senior researchers, who are the leaders, the mentors, the supervisors and the responsible people for these things happening in their lab, they are very rarely held accountable.”

Oransky encouraged journalists to get to know the scientific sleuths, some of whom are active on X, formerly known as Twitter.

“You can find dozens of people who do this kind of work,” he said. “It’s like any kind of whistleblower or source that you can develop.”

Oransky also highlighted common types of misconduct and errors that journalists can look out for:

  • Faked data.
  • Image manipulation.
  • Plagiarism.
  • Duplication or “self-plagiarism” — when researchers reuse their own writings or data, taking them from a study that has already been published and inserting them into a newer paper.
  • Fake peer review — a peer review process that has, in whole or in part, been fabricated or altered to ensure a paper gets published.
  • Paper mills — organizations that create and sell fraudulent or potentially fraudulent papers.
  • Authorship issues.
  • Publisher errors.

3. One of the best ways to get tips about research fraud is to report on research fraud.

Oransky shared that he and other people at Retraction Watch continually receive tips about research misconduct. Tipsters will come to journalists they think will report on the issue, he said.

“You write about it and then people come to you,” Cohen added. “They don’t know you’re there unless you’re covering it regularly. And not even regularly, but like you start writing about it and show it’s something you’re interested in, you’re going to get more ideas.”

Another place journalists can go to check for allegations of research misconduct: court records, including subpoenas. They can also ask public colleges and universities for copies of records such as investigative reports and written communication between researchers and their supervisors, Cohen pointed out. If the research involves human subjects, journalists could request copies of reports and communications sent to and from members of the Institutional Review Board, a group charged with reviewing and monitoring research to ensure human subjects’ safety and rights are protected.

Cohen suggested journalists ask local colleges and universities for records tied to research funding and any money returned to funders. The National Institutes of Health maintains a database of organizations that receive federal grant money to conduct biomedical research.
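
One way to dig into that funding data programmatically is through NIH RePORTER’s public web API. The Python sketch below shows a minimal search by institution name; the endpoint is documented by NIH, but the search criteria and response field names used here are assumptions and should be verified against the documentation at api.reporter.nih.gov.

import requests

URL = "https://api.reporter.nih.gov/v2/projects/search"

# Hypothetical example: grants awarded to a named institution.
payload = {
    "criteria": {"org_names": ["University of Illinois at Chicago"]},
    "offset": 0,
    "limit": 10,
}

resp = requests.post(URL, json=payload, timeout=30)
resp.raise_for_status()

for project in resp.json().get("results", []):
    # Field names may differ; check the response schema in the API docs.
    print(project.get("project_num"),
          project.get("contact_pi_name"),
          project.get("award_amount"))

Comparing what federal databases show with an institution’s own records, including any money returned to funders, can point to the kinds of discrepancies Cohen described.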

“You could just start digging around a little bit at the institutions you cover,” Cohen said. “Be skeptical and ask questions of the data and ask questions of the people you cover.”

4. Discuss with your editors whether and how you’ll protect the identities of whistleblowers and experts who want to remain anonymous.

Many experts who leave comments on PubPeer or raise questions about research on other online platforms use aliases because they don’t want their identities known.

“You can imagine that not everybody wants to work under their full name so some of them are using all kinds of pseudonyms, although recently some of these people have come out under their full names,” Bik said. “But it is work obviously that doesn’t leave you with a lot of fans. Especially the people whose work we criticize are sometimes very mad about that, understandably so. But some of them have sued or threatened to sue some of us.”

Oransky said he has no issue with letting scientific sleuths stay anonymous. They can explain their concerns in detail and show journalists their evidence. As with any source, journalists need to independently verify information they get from an anonymous tipster before reporting on it.

“Anonymous sources that are vulnerable — which a whistleblower is, which someone in a lab who’s pointing out problems is, especially a junior person — as long as you know who they are, your editor knows who they are, that’s my rule,” he said. “We want to understand why they want anonymity, but it’s usually pretty obvious.”

Download Oransky’s slides from his presentation.

Download Bik’s slides from her presentation.
