Nature | Research Highlights: Social Selection

Collaborate and listen to reproduce research

Better communication between labs may resolve many reproducibility problems, according to a report.


Cell-biology labs often struggle to reproduce the research results of other groups. But a 15 July report suggests that many of those troubles would vanish if scientists reached out to the original experimenters. The report, released by the American Society for Cell Biology (ASCB), includes survey results from hundreds of ASCB members and calls for changes in scientific culture to make results easier to confirm. Besides better communication, it urges scientists to adopt more-uniform standards within their fields and to focus more on data quality than on publishing in high-impact journals. Arturo Casadevall, a microbiologist and immunologist at Johns Hopkins Bloomberg School of Public Health in Baltimore, Maryland, was among those who commented on the report on Twitter.

More than 70% of the ASCB members who completed the survey reported having had trouble reproducing results from other labs. One of the report’s authors, Mark Winey, a molecular biologist at the University of Colorado in Boulder and chair of the ASCB’s Data Reproducibility Task Force, notes that only about 11% of its 8,000 members responded to the survey, so the results should be interpreted with caution. Still, he says, there is no doubt that unrepeatable experiments are a common phenomenon. “I see it as an overall quality-control issue,” he says. “We need to think about ways to do our jobs better.”

The problem of irreproducible research continues to attract attention, particularly with the publication of a study in PLoS Biology1 last month suggesting that the United States spends $28 billion each year on preclinical research that cannot be replicated (see Nature http://doi.org/477; 2015). Responding to the PLoS Biology paper at the time, ASCB executive director and task-force member Stefano Bertuzzi wrote in a blog post that the paper probably overestimated the prevalence of irreproducible results, creating the “wrong impression” that all of that money was wasted. In reality, he wrote, many reproducibility problems could be resolved by modifying and refining experimental approaches.

The ASCB survey supports that view: 60% of those who reported problems with reproducibility said that they were able to resolve them by contacting the lab that had conducted the original experiment and clarifying its methods. “The first thing to do is contact the other lab and try to sort it out,” Winey says. But some researchers may be reluctant to do so, because of competition or other reasons.

To enable fundamental change, researchers should adopt consistent standards of proof in their fields, Winey says. As an example, the report highlights the work of Daniel Klionsky, a cell biologist at the University of Michigan in Ann Arbor. He led a community effort to establish specific criteria for determining when a cellular process can be called autophagy (the breakdown of unnecessary or non-functional cell components). Klionsky, also editor-in-chief of the journal Autophagy, worked with many others in his field to codify those standards in two papers in the journal, which have been highly cited by autophagy researchers.

The report also highlights support for the San Francisco Declaration on Research Assessment (DORA), created after discussions at the 2012 ASCB annual meeting. It reiterates that scientists should put less emphasis on the impact factors of publications and focus instead on finding other ways to judge the merits of published work. A majority of respondents to the survey said that the quest to publish in high-profile journals hampered reproducibility, an opinion shared by Casadevall.

“Science is messy, and high-impact journals often demand clean stories with a clear punchline,” he says. “That creates perverse incentives for cherry-picking data.”

Journals, however, are helping to improve reproducibility, says US cell biologist Kenneth Yamada, an academic editor of the Journal of Cell Biology (JCB). The JCB, for instance, screens all images — not to look for misconduct, but to make sure that the images follow best-practice guidelines. There could be bigger steps to come, Yamada notes. “My personal preference is that all primary data should be published,” he says. “That would solve a significant number of problems.” Yamada says that he has already approached the JCB about this possibility: “They’re receptive to the idea.”

Journal name: Nature
Volume: 523
Pages: 385
DOI: doi:10.1038/523385f

Corrections

Corrected:

The original version incorrectly gave Arturo Casadevall’s main affiliation as Albert Einstein College of Medicine of Yeshiva University in New York City, where he holds an adjunct position.

References

  1. Freedman, L. P., Cockburn, I. M. & Simcoe, T. S. PLoS Biol. 13, e1002165 (2015).

