Nature | News

Problematic images found in 4% of biomedical papers

Giant survey suggests journals should pay more attention to detecting inappropriate duplications.



Around 1 out of every 25 biomedical papers contains inappropriately duplicated images, a huge analysis of 20,621 research articles suggests1. The finding has prompted renewed calls for research journals to routinely check images in accepted papers before they publish them.

Previous studies have analysed image duplication rates, but the latest analysis is unusually large and has the advantage of spanning multiple journals, says Bernd Pulverer, who is chief editor of The EMBO Journal in Heidelberg, Germany. He notes that although most duplications do not point to fraud or malevolence, they do misrepresent experiments.

“The rates were higher than I had initially expected,” says Elisabeth Bik, a sharp-eyed microbiologist who led the analysis. Though she still thinks most scientists are honest, her expectations of them were even higher when she began the project, she says.

Bik, who is at Stanford University in California, spent two years looking at articles published from 1995 to 2014 in 40 different journals, hunting for instances in which identical images were used to represent different experiments within the same paper. She cross-checked the duplications that she found with her two co-authors, both microbiologists.

Overall, 4% of the inspected papers contained such images, the researchers found. But rates ranged from more than 12% in the International Journal of Oncology to 0.3% in the Journal of Cell Biology, which has systematically scanned images in its accepted papers before publication since 2002. Journals with higher impact factors generally had lower rates of duplicated images.

The study, which has not yet been peer-reviewed, was posted on the biology preprint server bioRxiv on 20 April, and was first reported by the Retraction Watch blog.

Sloppy mistakes?

Many of the problems were probably sloppy mistakes where people selected the wrong photograph, says Bik. But half or more look deliberate — because images are flipped or rotated or the same features occur twice in the same photograph.

The 4% duplicated image rate is about what Pulverer would expect. Unlike most journals, his has screened images in papers it accepts since 2012 – and last year reported that it finds image aberrations (including problems other than duplication) in around 20% of papers2. When the journal editors look at the raw data, they usually find that the issue can be corrected before publication, Pulverer adds; only in a few cases do they end up rejecting the paper.  

Previous studies have found higher rates of image duplication than Bik's team did, but it is not clear that they are directly comparable. A small study of 120 cancer-biology papers in three journals, published last year, found that one-quarter contained duplicated figures — but only around 1 in 8 (13%) were the inappropriate kind that Bik's team was hunting for, Bik notes3. She adds that she screened 427 papers in the same journals and found an average problematic-duplication rate of 6.8%.

Still, both Bik and Pulverer think that their screens miss duplications because they would not detect well-executed fraud or duplications across different papers.

Quality control

Images used in biomedical experiments can be inappropriately presented in papers. (Paolo Cipriani/Getty)

In 2013, Enrico Bucci, who heads the biomedical-services and information consultancy BioDigitalValley in Aosta, Italy, told Nature that he had examined gel images in biomedical papers authored by Italian scientists with close links to researchers who had retracted papers. He found that about one-quarter contained anomalous images (including issues wider than duplications), and that around 10% contained breaches such as cutting and pasting of gel-electrophoresis bands.

Overall, fewer than 1% of research papers are corrected each year, according to data from Thomson Reuters' Web of Science; retraction rates, despite a recent rise, are at around 0.02%4.

Bik’s findings “add to a body of evidence that has indicated there is an urgent need to institute better quality-control mechanisms at journals”, says Pulverer. He says that several publishers have contacted him to learn how to set up systematic screening for accepted papers, which should help to stamp out inaccurate data before it is printed. The scientific community also needs to improve with more training, self-policing and a stronger sense of publication ethics, he adds.

Bik says that she has reported all the papers with duplicated images to journals, which has so far resulted in 62 corrections and 6 retractions.

The original story mischaracterised Enrico Bucci’s survey by implying he had looked at all biomedical papers authored by Italian scientists. In fact, his probe was limited to gel images in papers authored by Italian scientists who had been co-authors at least three times with authors of retractions. The text has been corrected to reflect this.


  1. Bik, E. M., Casadevall, A. & Fang, F. C. Preprint at bioRxiv (2016).

  2. Pulverer, B. EMBO J. 34, 2483–2485 (2015).

  3. Oksvold, M. P. Sci. Eng. Ethics 22, 487–496 (2016).

  4. Fanelli, D. PLoS Med. 10, e1001563 (2013).
