The once-ignored community of science sleuths now has the full attention of the research world.

February 14, 2024

A community of detectives hunting down errors in scientific research has sent shockwaves through some of the most prestigious research institutions in the world and the scientific community at large.

High-profile cases of alleged image manipulation in articles written by the former president of Stanford University and leaders of the Dana-Farber Cancer Institute have made national media headlines, and some top scientific leaders think this may be just the beginning.

“At the rate things are going, we expect another one of these to pop up every few weeks,” said Holden Thorp, editor-in-chief of the Science family of scientific journals.

The detectives argue that their work is necessary to correct the scientific record and prevent generations of researchers from pursuing dead-end topics because of flawed papers. Some scientists say it’s time for universities and academic publishers to reform the way they handle flawed research.

“I understand why the sleuths who find these things are so angry,” said Michael Eisen, a biologist, former editor of the journal eLife and a leading voice for reform in scientific publishing. “Everyone — author, journal, institution, everyone — is incentivized to minimize the importance of these things.”

For nearly a decade, science sleuths have uncovered widespread problems with scientific images in published articles, posting concerns online but receiving little attention.

This began to change last summer with Marc Tessier-Lavigne, then Stanford’s president. The neuroscientist resigned from the post after a scientific panel appointed to examine allegations of image manipulation in studies he helped author issued a report that also criticized his laboratory’s culture. Tessier-Lavigne himself was not found to have committed misconduct, but the panel found that members of his lab had manipulated images in questionable ways.

In January, a scathing blog post detailed questionable work by senior leaders at the Dana-Farber Cancer Institute; the institute subsequently asked journals to retract six papers and issue corrections to dozens of others.

In his resignation statement, Tessier-Lavigne noted that the panel had not found he was aware of the manipulation and said he had never submitted papers he did not believe were accurate. In a statement from its research integrity officer, Dana-Farber said it had moved decisively to correct the scientific record and that image discrepancies were not necessarily evidence that an author intended to deceive.

“We’re definitely going through a period — of public awareness — that really took a shift when the Marc Tessier-Lavigne thing happened and has continued steadily since then, with Dana-Farber being the latest,” Thorp said.

The long-standing problem is now in the national spotlight, and new AI tools are making it easier to spot problems ranging from decades-old errors and sloppy science to images unethically doctored with photo-editing software.

This intense scrutiny is reshaping how some publishers operate. It is forcing universities, journals and researchers to reckon with new technology, a potential backlog of undiscovered errors, and how to be more transparent when problems are found.

The scrutiny comes at a fraught moment in academia. In a post on X last month, venture capitalist Bill Ackman discussed weaponizing artificial intelligence to detect plagiarism by leaders of top universities with whom he has ideological differences, raising questions about political motivations behind plagiarism investigations. More broadly, public trust in scientists and science has declined steadily in recent years, according to the Pew Research Center.

Eisen said he did not think the sleuths’ concerns about scientific images strayed into “McCarthyism.”

“I think they’re targeting a very specific problem in the literature, and they’re right; it’s bad,” Eisen said.

Scientific publishing underpins what scientists understand about their fields and is the primary way researchers with new findings communicate their work to colleagues. Before publication, journals evaluate submissions and send them to outside researchers in the field for vetting, a process called peer review, which is meant to catch errors and faulty reasoning. Journal editors also screen work for plagiarism and copyedit it before publication.

The system is not perfect, and it still relies on researchers acting in good faith and not manipulating their findings.

Over the past 15 years, scientists have grown increasingly concerned that some researchers are digitally altering images in their papers to skew or exaggerate results. Spotting irregularities in those images, typically from experiments involving mice, gels or blots, has become a bigger part of journals’ work.

Jana Christopher, an image-integrity expert who works for the Federation of European Biochemical Societies and its journals, said the field of image-integrity screening has grown rapidly since she began working in it about 15 years ago.

At the time, “nobody was doing it and people were kind of in denial about research fraud,” Christopher said. “The consensus was that this was very rare and occasionally someone could be found to falsify their results.”

Now, scientific journals have teams dedicated to vetting images and ensuring their accuracy. A record number of papers, more than 10,000, were retracted last year, according to a Nature analysis.

A loose-knit group of scientific sleuths has added outside pressure. They often discover and flag errors or possible manipulations on the online forum PubPeer. Many receive little or no pay for the work, and some remain anonymous.

“There is a degree of wariness around it,” Eisen said.

An analysis of comments on more than 24,000 articles discussed on PubPeer found that more than 62% of the comments concerned image manipulation.

For years, sleuths have relied on sharp eyes, keen pattern recognition and a working knowledge of photo-editing tools. In the past few years, rapidly developing artificial intelligence tools that can scan papers for irregularities have accelerated their work.
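At its simplest, that kind of screening amounts to flagging figure panels that look suspiciously alike. The sketch below is a simplified illustration of that idea rather than any sleuth’s or journal’s actual pipeline; the folder name, the similarity threshold and the use of the open-source Pillow and imagehash packages are assumptions made for the example.

```python
# Simplified illustration only: flag figure panels whose perceptual hashes are
# nearly identical. The "figures/" folder, the PNG format and the distance
# threshold are assumptions for this example, not any journal's real workflow.
from itertools import combinations
from pathlib import Path

import imagehash           # pip install imagehash
from PIL import Image      # pip install Pillow


def find_near_duplicates(folder: str, max_distance: int = 5):
    """Return (file_a, file_b, distance) for panels that look nearly identical."""
    hashes = {path: imagehash.phash(Image.open(path))
              for path in Path(folder).glob("*.png")}
    flagged = []
    for (a, hash_a), (b, hash_b) in combinations(hashes.items(), 2):
        distance = hash_a - hash_b          # Hamming distance between hashes
        if distance <= max_distance:        # small distance means visually similar
            flagged.append((a.name, b.name, distance))
    return flagged


if __name__ == "__main__":
    for a, b, distance in find_near_duplicates("figures"):
        print(f"Possible duplicate panels: {a} vs {b} (hash distance {distance})")
```

A check like this misses crops, rotations, splices and selective contrast tweaks, which is why trained eyes and purpose-built commercial tools remain central to the work.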

Scientific journals are now adopting similar technology to prevent errors from reaching publication. In January, Science announced it was using an AI tool called Proofig to screen peer-reviewed articles edited for publication.

Thorp said the family of six journals had “quietly” added the tool to its workflow about six months before the January announcement. Previously, the journals relied on checks by eye to detect such problems.

Thorp said Proofig had identified several articles late in the editorial process that were not published because of problematic images that were difficult to explain, as well as other cases in which authors had “plausible explanations” for problems and fixed them before publication.

“Serious errors that would cause us not to publish a paper are less than 1%,” Thorp said.

Chris Graf, director of research integrity at publishing company Springer Nature, said in a statement that his company has developed and tested “in-house AI image integrity software” to check for image duplicates. Graf’s research integrity unit currently uses Proofig to help evaluate papers if concerns are raised after publication.

Processes vary among journals, but some Springer Nature publications manually check images for manipulation using Adobe Photoshop tools and compare figures against the raw data for common experiments, such as those that visualize cell components, Graf said.

“While AI-based tools are helpful in accelerating and scaling research, we think the human element is still very important in all of our research,” Graf said, adding that image-recognition software is not perfect and that human expertise is required to protect against false positives and false negatives.

No tool can catch every error or every cheat.

“There are too many people involved in this process. We can never catch everything,” Thorp said. “As journals, institutions and authors, we need to get much better at managing this situation.”

Many science sleuths have been frustrated that their concerns were ignored or that investigations moved slowly and ended without a public resolution.

Sholto David, who went public with his concerns about the Dana-Farber research in a blog post, said he had largely “given up” on writing letters to journal editors about the errors he discovered because their responses were so inadequate.

Elisabeth Bik, a microbiologist and longtime image detective, said she often flags image problems and “nothing happens.”

Leaving public comments questioning research figures on PubPeer can spark a public debate about questionable research, but authors and research institutions often do not directly respond to online criticism.

While journals may issue corrections or retractions, it is usually the responsibility of a research institution or university to investigate cases. When cases involve biomedical research supported by federal funding, the federal Office of Research Integrity may investigate.

Thorp said institutions need to move more quickly to take responsibility when errors are discovered and speak openly and publicly about what’s happening to gain the public’s trust.

“Universities are very slow to react and execute their processes, and the longer this goes on, the more damage is done,” Thorp said. “We don’t know what would have happened if, instead of starting this investigation, Stanford had said, ‘These papers are wrong. We will retract them. This is our responsibility, and for now, we’re taking the blame and owning it.’”

Some scientists worry that concerns about images only scratch the surface of science’s integrity problems; flaws in images are far easier to spot than errors in spreadsheet data.

While weeding out flawed papers and seeking accountability are important, some scientists think these measures treat only the symptoms of a larger problem: a culture that rewards researchers who publish the most exciting results rather than results that hold up over time.

“Scientific culture itself does not say that we care about being right; it says we care about getting fancy papers,” Eisen said.

This article first appeared on NBCNews.com.
