Nature | News

Cancer reproducibility project scales back ambitions

Budget problems force replication project to drop one-quarter of its workload.


An effort to reproduce the key findings of 50 influential cancer studies has announced that it will have to settle for just 37, citing budgetary constraints.

The Reproducibility Project: Cancer Biology aims to get a better, quantitative estimate of the reproducibility of important work and to understand the challenges such efforts present. Begun in 2013, the project is run jointly by the Center for Open Science (COS) in Charlottesville, Virginia, and Science Exchange in Palo Alto, California. A related project looking at key results in psychology attracted attention in August with a meta-analysis1 that failed to replicate the findings of 61 out of 100 studies.

Critics have dismissed the cancer-study endeavour as time-consuming, out of touch with the realities of basic science and unlikely to produce interpretable results. “It’s a naïveté that by simply embracing this ethic, which sounds eminently reasonable, that one can clean out the Augean stables of science,” says Robert Weinberg, a cancer biologist at the Whitehead Institute for Biomedical Research in Cambridge, Massachusetts. A study2 from his lab about converting cells to a stem-cell-like state is one of the replication attempts that have been put on hold.

Tim Errington, a project manager at the COS, says that he and his team chose to continue with experiments that had made the most progress, and stop those that were at an early stage or that involved pricey animal studies. “The best thing for us to do is move forward with what we have,” Errington says. Although the decision to scale back the effort was made in June, the project, which maintains detailed records online, revealed the sidelined experiments only this week. The authors of the original studies had not been formally notified of the decision.

Repetitive strain

With a budget of US$1.6 million, the Reproducibility Project: Cancer Biology planned to choose key experiments from highly cited research papers in cancer biology from 2010–12, consult with the original authors about methods and publish peer-reviewed plans for the replication attempts. It would then identify a lab to do the work and publish the results in the journal eLife.

The project originally budgeted $25,000–$35,000 on average for each experiment, but Errington says that this figure turned out to be too low. Confronted with time-consuming peer reviews, materials-transfer agreements and costly experiments involving animals, the team determined earlier this year that the figure should be roughly $40,000 per experiment on average.

“I don’t think that was truly appreciated at the beginning of the project,” Errington says. The team scrutinized the 23 replication studies that had made the least progress and chose to stop pursuing 10 that involved animal experiments and another three for which contact with the original authors had been minimal. Most of the papers put on hold are Nature publications. (Nature's news team is editorially independent of its research editorial team.)

“This is news to me,” says René Bernards, a cancer biologist at the Netherlands Cancer Institute in Amsterdam. Replication of his team's paper3, exploring why a cancer drug behaves differently in distinct but related cancers, did require animal studies. Errington says that authors were not notified of the change in status in case his team was able to secure extra funding and resume efforts on the stalled projects. He is hopeful that his group or someone else may be able to pick up the work eventually.

Project leaders expect that a small batch of replications for the cancer project will be published early next year. The rest, along with a meta-analysis of all the results, are anticipated to be released before the end of 2017.

Although he is disappointed by having to scale back the project's ambitions, Errington says that it has been an important part of the learning process. “This gives us a nicer understanding for what the costs are of replication,” he says. “It’s so concrete. It’s a nice way to watch where every dollar goes.”



  1. Open Science Collaboration. Science 349, aac4716 (2015).

  2. Chaffer, C. L. et al. Proc. Natl Acad. Sci. USA 108, 7950–7955 (2011).

  3. Prahallad, A. et al. Nature 483, 100–103 (2012).
