Here’s a one-two punch to spark camaraderie among scientists. First, ask: “How long did it take to get your PhD?” Then follow up with: “How long would it have taken if all your experiments had worked the first or second time?”
Part of the probable time difference is due to inexperience, but not all of it. News last month brought a powerful reminder that access to detailed methods can be essential for getting experiments to work. In 2013, the US$1.6-million Reproducibility Project: Cancer Biology set out to repeat key experiments from 50 high-profile cancer papers, and so assess the extent to which published results can be replicated. Instead, the project has decided to stop at 18 papers. One big reason for this was the difficulty of working out what exactly was done in the original experiments. Protocols — precise step-by-step recipes for repeating experiments — are missing from published research more often than not, and even the original researchers can have trouble pinpointing particulars years later.
I sympathize with those who struggle to chase down these details, but I am glad that their efforts have generated such a buzz. It will accelerate a shift towards better reporting.
Now should be springtime for methods sharing. Mobile-friendly, web-based technologies are maturing just as the need to improve reproducibility has gained widespread attention. A new era of more-efficient, more-confident science is ours to lose.
My own obsession with sharing methods started with an all-too-common frustration. I spent the first year and a half of my postdoc working out a protocol for single-cell microscopy. Assiduous tinkering showed that subtle changes to sample preparation were crucial. Instead of 1 microlitre of a chemical, I needed 5. Instead of a 15-minute incubation, I needed an hour. Alas, the general technique had already been published, so I got no credit for the work. Anyone else using the published recipe would either have got misleading results or have shared my frustration at wasting time discovering the necessary adjustments for themselves — hence my enthusiasm for a central place to update protocols and share tips.
In 2012, two colleagues and I decided to launch exactly this kind of resource. It turns out that we were far from the first to recognize the need. When my PhD co-adviser realized I was serious about leaving my postdoc to pursue such a project, he connected me with a researcher who had tried something similar a dozen years earlier. That project, BioProtocol.com, raised US$1 million in venture capital to build a protocol repository. This was in the era of flip phones with green-and-black screens, and before online tools and ideals of sharing had surged. The company shut down during the dotcom bust, but the entrepreneurs retained a wealth of experience and insight, which they generously shared. Another venture, Protocol Online, was launched around the same time to organize disparate life-science protocols in a central database. Later efforts include OpenWetWare (a Wikipedia-like site for sharing step-by-step protocols) and Protocol Exchange (a preprint server for protocols, hosted by the publisher of Nature).
We launched protocols.io (a company in which I own equity) in 2014. This open-access repository of science methods lets researchers create and modify protocols, update versions and share them, either with select collaborators or with everyone. The protocols are dynamic and interactive, rather than static PDFs, so researchers can take notes and track time on smartphones as they perform experiments. When they publish papers, they can get a persistent identifier (in the form of a digital object identifier, or DOI) for their protocol and add it to the methods section. More than 10,000 protocols have now been uploaded, attracting 100,000 views every month. We and the authors frequently receive e-mails from researchers thanking us for saving them time.
One reason these efforts have taken off is that increased attention to problems with reproducibility has spurred initiatives among publishers, vendors, funders and individual laboratories. Some 200 journals now instruct authors to link to protocols.io or a similar repository; reviewers have also started to push for this. And funders including the US-based Gordon and Betty Moore Foundation, the Chan Zuckerberg Initiative and the childhood-cancer charity Alex's Lemonade Stand Foundation either state explicitly in their guidelines, or plan to state, that methodological resources are to be shared in a public repository.
But the gap between meticulous methods and adequate description remains. To fill it, efforts must start at the bench, well before results are ready to be written up. Lab members and lab heads should be on the lookout for tools that facilitate tracking, and be willing to give them a try. And decisions about how to document and share methods should be made when researchers are designing their experiments, not when they are writing their manuscripts.
Of course, fully described, shared protocols will not fix everything in science. Knowing how much to document is a judgement call; some conditions will necessarily vary between labs. And even a detailed method will not produce reliable results if an experiment is vulnerable to artefacts, or if assays have not been thoroughly validated.
Still, skimpy protocols stall science, and the scientific community must mobilize itself to do more. Writing “we used a slightly modified version of the protocol from PaperZ”, whether or not the full protocol can be found there, is clearly easier than spelling out all the steps — but it’s not good enough. One day soon, publishing a paper without a link to a useful protocol in a methods repository will seem as outdated as using a mobile phone that can’t connect to the Internet.
Nature 560, 411 (2018)
Lenny Teytelman is an employee of protocols.io and owns equity in the company.