PubPeer, a website where researchers critique one another’s work, has played a key role in helping journalists uncover scientific misconduct in several prominent investigative stories in recent years — including the student newspaper series that led to Stanford University President Marc Tessier-Lavigne’s recent resignation.
The platform was created 10 years ago to encourage discussion of individual academic studies and “accelerate the correction of science,” cofounder Boris Barbour, a neuroscience researcher in France, told The Journalist’s Resource. Conversations generally center on papers that already have been peer reviewed and published in academic journals. When a discussion of a paper begins, PubPeer automatically invites an author to respond, sometimes spurring lengthy, detailed exchanges.
Comments are public, allowing journalists to observe part of the scientific process and collect information that could become the basis of an important news story. The vast majority of comments, however, are made anonymously, allowing scholars to raise questions and concerns without risking retaliation, Barbour noted.
“Although all forms of scientific discussion are welcome on PubPeer, the site has become known as the channel by which an astonishing volume of research misconduct has come to light,” he wrote in an email.
Last year, Stanford student journalist Theo Baker discovered on PubPeer allegations of altered images in Tessier-Lavigne's research and began to investigate. Had journalists checked the website earlier, they would have found criticisms dating back to at least 2015, Baker writes in a July 30 essay for The New York Times.
“Reporters did not pick up on the allegations, and [academic] journals did not correct the scientific record,” he writes. “Questions that should have been asked, weren’t.”
While PubPeer is an excellent reporting tool for journalists across beats, it's crucial that they use the information they find there responsibly. For example, don't assume a comment is accurate or a calculation is correct just because it came from an academic. Likewise, don't shrug off serious accusations because the people making them don't use their real names.
The platform is relatively easy to navigate. On the homepage, you’ll find a list of discussions organized according to their most recent comment. You can search the PubPeer database using an author’s name, key words, a paper’s title or a paper’s numeric or alphanumeric identifier, such as a Digital Object Identifier, or DOI.
This tip sheet aims to help journalists make the most of the website. We created the five tips listed below based on advice several journalists familiar with PubPeer shared with us during phone and email interviews.
Keep reading to learn more from Julia Belluz, a former senior health correspondent at Vox who’s working on a book about nutrition and obesity; Stephanie M. Lee, a senior reporter at The Chronicle of Higher Education who covers research and the academic community; and Charles Piller, an investigative journalist for Science magazine and founding board member of the Center for Public Integrity.
1. Install the PubPeer browser extension.
PubPeer offers browser extensions for the four major web browsers. Once installed, the extension will alert you if PubPeer comments exist for any research paper you’re reading online.
There’s also a PubPeer plugin for Zotero, a web tool some journalists use to organize and share research. The plugin lets you see whether there are PubPeer comments on any of the papers saved in your Zotero collection.
2. Don’t publish a news story simply to point out people have made negative comments about a research study.
The fact that an individual study has drawn a certain number of probing comments on PubPeer is not, on its own, newsworthy, Lee says. She uses PubPeer as a barometer of sorts, a quick way to get a cursory read on the credibility of certain researchers and studies.
“Basically, when I start looking into a group of scientists or a scientist whose work I’m interested in for whatever reason, PubPeer is the first place I’ll go to see if questions have been raised,” says Lee, who received the 2022 Victor Cohn Prize for Excellence in Medical Science Reporting for her investigative reporting about scientific misconduct.
Because PubPeer allows the whole research community to weigh in on a paper, it's not surprising some scholars have spotted problems peer reviewers missed. The peer review process typically involves a small number of academics whose evaluation is limited to, for example, making sure that the authors write clearly, that their research design is sound, that they cite other researchers' work correctly and that what they have written corresponds with the information presented in their tables and figures.
“We know peer review can fail to catch errors or even outright fraud in research before it’s published,” Belluz wrote.
The website, she added, has become “a place where scientists go to whistleblow about problematic research, so anyone interested in scientific misconduct might find sources or ideas for stories on PubPeer.”
3. Verify all claims made on PubPeer before relying on or reporting them.
Piller describes PubPeer as “more of a tip sheet than an authoritative source.”
“I’m not casting aspersions on it — it’s great,” he says. “It’s extremely valuable, and a lot of information on it is correct. The problem is, there’s a lot of anonymous sources.”
Barbour, who helped create PubPeer, estimated that “maybe 90%” of comments are posted anonymously.
“Any criticism of the work of colleagues tends to be badly received, and you never know who will be making — often anonymous — decisions about your future,” Barbour wrote in an email to The Journalist’s Resource. “So, understandably, people worry. Furthermore, even an implicit (but convincing) suggestion of misconduct immediately creates an extraordinarily high-stakes situation. A career might be on the line. It’s unsurprising that those threatened would fight with every weapon at their disposal.”
Piller urges journalists to thoroughly verify everything they find on PubPeer before using it. He recommends enlisting the help of multiple experts with deep experience, including scholars with subject-area expertise and technical experts such as Elisabeth Bik, a well-known image analyst.
Bik helped Baker, the Stanford student journalist, with his investigation. Last year, she helped Piller look into possible image tampering in dozens of Alzheimer’s studies.
“You need to treat anonymous postings [on PubPeer] the same as you would treat anonymous postings of any kind,” Piller notes. “You have no way of knowing if that person has some sort of ulterior motive.”
4. Use caution when describing the likelihood that researcher misconduct happened.
Piller warns journalists to carefully choose words and phrases to convey how certain they are that a study’s findings are erroneous or that a researcher has intentionally participated in some form of misconduct. It can be difficult to establish with complete confidence that data or images have been fabricated or manipulated, especially if journalists lack access to the original data and cannot compare the images that appear in an academic article against the unedited, uncropped, high-resolution, earliest version of those images.
Piller suggests using language that reflects some level of uncertainty — for example, “apparent fabrication” or “potential errors.”
“You can wreck someone’s career, so you have to be really careful and really fair-minded about it, obviously,” he says. “Even if the evidence appears to be incontrovertible, I would use qualifiers.”
He stresses the importance of journalists explaining and making available the evidence they have found.
“Show evidence,” he advises. “It’s better to let the evidence speak for itself most of the time.”
5. No matter how small a role a researcher plays in your story, check PubPeer before including them.
“If you’re looking at a scientist as a source — a credible source of authority — it’s good to know whether that person is regarded as above reproach,” Piller points out.
Many journalists already know they need to look into possible conflicts of interest.
“Another [red flag] would be credible evidence a person has engaged in data manipulation or image manipulation or other ways of operating scientifically that would cause you to want to question their credibility as a source for a story,” Piller adds.
Lee’s advice: Make a habit of checking PubPeer for the researchers and studies you’re considering including in your coverage, keeping in mind that a lack of comments does not mean there aren’t problems.
“If [journalists] come across research that is influential or controversial in whatever field they’re covering, it’s a good, proactive step to plug the URL into PubPeer and see what they get,” Lee says.
Because PubPeer’s search function does not work for all URLs, Barbour recommended using a DOI or other unique identifier.
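For journalists who routinely check many papers, building the search link from a DOI can be scripted. A minimal sketch in Python, assuming PubPeer's search page accepts a `q` query parameter at `/search` (the URL pattern matches what the browser shows, but it is not a documented API, and the DOI below is a placeholder, not a real paper):

```python
from urllib.parse import quote

def pubpeer_search_url(doi: str) -> str:
    """Build a PubPeer search URL for a paper's DOI.

    Assumes the site's search endpoint is /search?q=...,
    which mirrors the URL pattern visible in the browser;
    this is an assumption, not an officially documented API.
    """
    # Percent-encode everything, including the slash in the DOI,
    # so the identifier survives as a single query value.
    return "https://pubpeer.com/search?q=" + quote(doi, safe="")

# Placeholder DOI for illustration only.
url = pubpeer_search_url("10.1000/example.doi")
```

Pasting the resulting link into a browser shows any existing discussion of that paper; an empty result, as Lee cautions above, does not mean the paper is problem-free.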