When Danielle Fraser first submitted her paper for publication, she had little idea of the painful saga that lay ahead.

She had spent some 18 months studying thousands of fossil species from the past 36 million years, spread across North America, and now she had an intriguing result: animal populations were spread widest across latitudes in warm, wet climates. Her work, crucial to earning her PhD at Carleton University in Ottawa, Canada, might be used to make predictions about the response of mammals to climate change — a key question in ecology today. So, with her PhD adviser's encouragement, she sent it to Science in October 2012.

Ten days later, the paper was rejected with a form letter. She sent it to another prestigious journal, the Proceedings of the National Academy of Sciences. Rejected. Next, she tried Ecology Letters. Bounced. “At this point, I definitely was frustrated. I hadn't even been reviewed and I would've loved to know how to improve the paper,” recalls Fraser. “I thought, 'Let's just get it out and go to a journal that will assess the paper'.”

In May 2013, she submitted the paper to Proceedings of the Royal Society B, considered a high-impact journal in her field. The journal sent it out for review — seven months after her initial submission to Science. “Finally!” Fraser thought. What she didn't know was that she had taken only the first steps down the long, bumpy road to publication: it would take another three submissions, two rejections, two rounds of major revisions and numerous drafts before the paper would finally appear. By that point, she could hardly bear to look at it.

Fraser's frustration is widely shared: researchers are increasingly questioning the time it takes to publish their work. Many say that they feel trapped in a cycle of submission, rejection, review, re-review and re-re-review that seems to eat up months of their lives, interfere with job, grant and tenure applications and slow down the dissemination of results. In 2012, Leslie Vosshall, a neuroscientist at the Rockefeller University in New York City, wrote a commentary that lamented the “glacial pace” of scientific publishing1. “In the past three years, if anything, it's gotten substantially worse,” she says now. “It takes forever to get the work out, regardless of the journal. It just takes far too long.”

But is the publication process actually becoming longer — and, if so, then why? To find out, Nature examined some recent analyses on time to publication — many of them performed by researchers waiting for their own work to see the light of day — and spoke to scientists and editors about their experiences.

The results contain some surprises. Daniel Himmelstein, a computational-biology graduate student at the University of California, San Francisco, analysed all the papers indexed in the PubMed database that had listed submission and acceptance dates. His study, done for Nature, found no evidence for lengthening delays2: the median review time — the time between submission and acceptance of a paper — has hovered at around 100 days for more than 30 years (see 'Paper wait'). But the analysis comes with major caveats. Not all journals — including some high-profile ones — deposit such time-stamp data in PubMed, and some journals show when a paper was resubmitted, rather than submitted for the first time. “Resetting the clock is an especially pernicious issue,” Himmelstein says, and it means that the analysis might be underestimating publication delays.
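
To make the method concrete, here is a minimal sketch in Python of the kind of calculation Himmelstein describes: the median number of days between submission and acceptance, grouped by year. The input file and column names ('received' and 'accepted') are hypothetical placeholders, not his actual code or data.

```python
# Minimal sketch of a review-time analysis, assuming a hypothetical CSV of
# PubMed records with 'received' and 'accepted' date columns. Illustrative
# only; this is not Himmelstein's actual pipeline.
import pandas as pd

records = pd.read_csv("pubmed_dates.csv", parse_dates=["received", "accepted"])

# Review time: days between submission ('received') and acceptance.
records["review_days"] = (records["accepted"] - records["received"]).dt.days

# Discard records with missing dates or negative spans (bad metadata).
records = records.dropna(subset=["review_days"])
records = records[records["review_days"] >= 0]

# Median review time per acceptance year.
by_year = records.groupby(records["accepted"].dt.year)["review_days"].median()
print(by_year)
```

Using the median rather than the mean keeps a handful of extremely slow papers from dominating the yearly figure.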

Some data suggest that wait times have increased within certain subsets of journals, such as popular open-access ones and some of the most sought-after titles. At Nature, the median review time has grown from 85 days to just above 150 days over the past decade, according to Himmelstein's analysis, and at PLoS ONE it has risen from 37 to 125 days over roughly the same period.

Many scientists find this odd, because they expect advances in digital publishing and the proliferation of journals to have sped things up. They say that journals are taking too long to review papers and that reviewers are requesting more data, revisions and new experiments than they used to. “We are demanding more and more unreasonable things from each other,” says Vosshall. Journal editors counter that science itself has become more data-rich, that they work to uphold high editorial and peer-review standards and that some are dealing with increasing numbers of papers. They also say that they are taking steps to expedite the process.

Publication practices and waiting times also vary widely by discipline — with social sciences being notoriously slow. In physics, the pressure to publish fast is reduced because of the common practice of publishing preprints — early versions of a paper before peer review — on the arXiv server. Some of the loudest complaints about publication delays come from those in biological fields, in which competition is fierce and publishing in prestigious journals can be required for career advancement. This month, a group of more than 70 scientists, funders, journal editors and publishers are meeting at the Howard Hughes Medical Institute campus in Chevy Chase, Maryland, to discuss whether biologists should adopt the preprint model to accelerate publishing. “We need a fundamental rethinking of how we do this,” Vosshall says.

Paper wait (chart). Sources: Review time, Daniel Himmelstein; Wait times, Stephen Royle; Production time, Daniel Himmelstein; Data, Ref. 5

The pitch

In March 2012, Stephen Royle, a cell biologist at the University of Warwick, UK, started on a publication mission of his own. His latest work answered a controversial question about how cells sense that chromosomes are lined up before dividing, so he first sent it to Nature Cell Biology (NCB), because it is a top journal in his field and an editor there had suggested he send it after hearing Royle give a talk. It was rejected without review. Next, he sent it to Developmental Cell. Rejected. His next stop, the Journal of Cell Biology, sent the paper out for review. It came back with a long list of necessary revisions — and a rejection.

Royle and his lab spent almost six months doing the suggested experiments and revising the paper. Then he submitted the updated manuscript to Current Biology. Rejected. EMBO Journal. Reviewed and rejected.

Finally, in December 2012, he submitted it to the Journal of Cell Science (JCS), where it was reviewed. One reviewer mentioned that they had already assessed it at another journal and thought that it should have been published then. They wrote that the work was “beautifully conducted, well controlled, and conservatively interpreted”. A second reviewer said that it should not be published. The editor at JCS decided to accept it. The time between first submission to Nature Cell Biology and acceptance at JCS was 317 days. It appeared online another 53 days later3. The work went on to win the JCS prize as the journal's top paper for 2013.

Despite the accolade, Royle says that the multiple rejections were demoralizing for his student, who had done the experiments and needed the paper to graduate. He also thinks that the paper deserved the greater exposure that comes from publication in a more prestigious journal. “Unfortunately, the climate at the moment is that if papers aren't in those very top journals, they get overlooked easily,” he says. And Royle, who has done several publication-time analyses and blogged about what he found, has shown that this experience is not unusual. When he looked at the 28 papers that his lab had published in the previous 12 years, the average gestation time from first submission to publication was the same as a human baby's — about 9 months (see go.nature.com/79h2n3).

But how much of this delay was his own doing? To publish the chromosome paper, Royle indulged in the all-too-familiar practice of journal shopping: submitting first to the most prestigious journals in his field (often those with the highest impact factor) and then working his way down the hierarchy. (NCB's current impact factor is 19; JCS's is 5.) Journal impact factor or reputation is widely used by scientists and by grant-review and hiring committees as a proxy for the quality of a paper. On the flip side, critics say that editors seek out the splashiest papers to boost their publication's impact factor, something that encourages journal shopping, increases rejection rates and adds to the wait time. Journal editors reject this charge; Ritu Dhand, Nature editorial director in London, says that Nature's policy of selecting original, important work “may lead to citation impact and media coverage, but Nature editors aren't driven by those considerations”.

How much time does journal shopping add? In the analysis of his group's research papers, Royle found that more than half were shopped around, and that this consumed anywhere from a few days to more than eight months. He went on to analyse all the papers published in 2013 that are indexed in PubMed, and examined whether higher impact factor correlated with longer median publication times. He found an inverted bell-shaped curve — the journals with the lowest and highest impact factors had longer review times than did those in the middle. For the vast majority of those in the middle, review times stood at around 100 days — matching Himmelstein's analysis. Those with the very highest impact factors (30–50) had a review time of 150 days, supporting the idea that pitching a paper to a series of top journals could result in significant delays in publication.
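
A sketch of the kind of comparison Royle describes might bin papers by their journal's impact factor and compute the median review time in each band; the file and column names below are hypothetical, and this is not his actual analysis code.

```python
# Sketch of the impact-factor comparison described above: bin papers by
# journal impact factor and compare median review times per band.
# Hypothetical input; not Royle's actual analysis.
import pandas as pd

# Assumed columns: journal, impact_factor, review_days
papers = pd.read_csv("papers_2013.csv")

bins = [0, 2, 5, 10, 20, 30, 50]
labels = ["<2", "2-5", "5-10", "10-20", "20-30", "30-50"]
papers["if_band"] = pd.cut(papers["impact_factor"], bins=bins, labels=labels)

# An 'inverted bell' would show longer medians in the lowest and highest
# bands than in the middle ones.
print(papers.groupby("if_band", observed=True)["review_days"].median())
```

On data shaped like his, the middle bands would sit near 100 days and the highest band closer to 150.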

What’s in a journal name?

Many scientists, editors and publishers have long acknowledged that journal name is a flawed measure of the quality and value of a piece of research — but the problem shows no signs of going away. “Where your paper is published doesn't say anything about you, your paper's impact or whether it's right or wrong,” says Maria Leptin, director of EMBO, an organization of Europe's leading life scientists and publisher of the EMBO Journal. “Nobody has the courage to say, we, as a funding organization, or we, as a tenure committee, are not going to look at where you publish as opposed to what you publish.”

And the obsession with prestigious journals is just one source of delay — as Fraser, who was battling to publish her paper on ancient animal populations, was about to find out.

Peer review

By October 2013, a full year had passed since Fraser had first submitted her paper to a journal, and she had pretty much stopped caring about impact factor. By this point, the paper had spent two months in review at Proceedings of the Royal Society B, before coming back with mixed reports — and a rejection. So Fraser decided to try PLoS ONE, a journal that says it will publish any rigorous science, regardless of its significance, scope or anticipated citations. It has an impact factor of 3, and a reputation for rapid publication.

PLoS ONE sent the paper out to a single reviewer. Two months later, Fraser got a decision letter that essentially stated that the paper was rejected but might be eligible for re-review if the suggested revisions were made. She made the revisions, adding citations and a small amount of reanalysis. In March 2014, she resubmitted the manuscript, which PLoS ONE sent out to a different reviewer. Another two months passed before she received the new review: major revisions, please.

“I'm just happy they didn't tell me to go away,” recalls Fraser. “I do have e-mails from the time that say, '1-millionth draft'!” She persevered, making more revisions to meet the reviewer's demands, and in June 2014 submitted the paper to PLoS ONE for a third time. Success! The paper4 was published online 23 months after she had first sent it to Science. The long peer-review and revision process did improve the paper, Fraser says now. “It was really much better.” But did the main conclusion of the paper change? “Not really.”

Last year, Chris Hartgerink, a behavioural-sciences graduate student at Tilburg University in the Netherlands, ran an analysis of the Public Library of Science (PLOS) family of journals since the first one launched in 2003. (He chose the journals largely because they make the data easily accessible, and because he was waiting for a paper to be published in PLoS ONE.) He found that the mean review time had roughly doubled in the past decade, from 50–130 days to 150–250 days, depending on the journal (see go.nature.com/s3voeq). And when Royle looked at eight journals that had published cell-biology papers over the past decade, he found that publication times had lengthened at seven of them, mostly because review times had stretched out.

One contention is that peer reviewers now ask for more. When Ron Vale, a cell biologist at the University of California, San Francisco, analysed biology papers that had been published in Cell, Nature and the Journal of Cell Biology in the first six months of 1984 and compared them with papers from the same period in 2014, he found that both the average number of authors and the number of panels in experimental figures had risen two- to fourfold5. This showed, he argued, that the amount of data required for a publication had gone up, and Vale suspects that much of the added data come from authors trying to meet reviewers' demands. Scientists grumble about overzealous critics who always seem to want more, or different, experiments to nail a point. “It's very rare for the revisions to fundamentally change a paper — the headline doesn't change,” Royle says. His analysis of his group's publication times showed that almost 4 months of the average 9-month gestation was spent revising papers for resubmission.

Many scientists also blame journal editors, who, they say, can be reluctant to provide clear guidance and decisions to authors when reviews are mixed — unnecessarily stringing out the review and revision process. Journal heads disagree, and say that their editors are accomplished at handling mixed reviews. Cell editor-in-chief Emilie Marcus in Cambridge, Massachusetts, says that editors at her journal take responsibility for publication decisions and help authors to map out a plan for revisions.

Technological advances mean that research now involves handling more and more data, editors say, and there is greater emphasis on making that information available to the community. Marcus says that her journal is working to cut review times by, for example, increasing the number of papers that go through only one round of revision — 14% of its papers did so in 2015. In 2009, Cell also restricted the amount of supplemental material that could accompany papers as a way to keep requests for “additional, unrelated experiments” at bay.

PLOS executive editor Veronique Kiermer, based in San Francisco, declined to discuss the specifics of Fraser's paper, but she called its total review time of nine months an “outlier” and said that it was “not ideal to have research being evaluated by a single person”. She acknowledges that PLoS ONE's publication time has risen; one factor is that the volume of papers has, too — from 200 in 2006 to 30,000 per year now — and it takes time to find and assign appropriate editors and reviewers. (PLOS used 76,000 reviewers in 2015.) Another, says Kiermer, is that the number of essential checkpoints — including competing-interest disclosures, animal-welfare reports and screens for plagiarism — has increased in the past decade. “We'll do everything we can both in terms of technology and looking at workflows to bring these times to publication down,” she says.

Dhand says that at Nature, too, editors find it harder to find reviewers than in the past, “presumably because there are so many more papers that need reviewing”. Himmelstein found that the number of papers in PubMed more than doubled between 2000 and 2015, reaching nearly 1 million articles.

Technology advances

Digital publishing may have had benefits in shortening 'production' time — the time from acceptance to publication — rather than time in review. In Himmelstein's analysis, time spent in production has halved since the early 2000s, falling to a stable median of 25 days.

Several new journals and online publishing platforms have promised to speed up the process even more. PeerJ, a family of journals that launched in 2013, is one of several that now encourage open peer review, in which reviewers' names and comments are posted alongside articles. The hope is that the transparency will prevent unnecessary delays or burdensome revision requests from reviewers.

The biomedical and life-sciences journal eLife launched in 2012 with a pledge to make initial editorial decisions within a few days and to review papers quickly. Reviewers get strict instructions not to suggest the 'perfect experiment', and they can ask for extra analysis only if it can be completed within 2 months. Otherwise, the paper is rejected. Randy Schekman, a cell biologist at the University of California, Berkeley, and editor-in-chief of eLife, says that these policies mean that more than two-thirds of the journal's accepted papers undergo just one round of review.

In a 2015 analysis, Himmelstein created a ranking by the median review time for all 3,482 journals that had papers with time stamps in the PubMed database from January 2014 to June 2015 (see go.nature.com/sscrr6). PeerJ had a relatively fast time: 74 days after submission. At eLife, it took 108 days, and PLoS ONE took 117. By comparison, Cell's review time was 127 days; Nature's was 173 days; PLoS Medicine took 177 days; and Developmental Cell was among the slowest of the popular biomedical journals, at 194 days. Marcus notes that comparison between journals is difficult because the publications define received, revised and accepted dates differently, and that Developmental Cell places a high priority on timely review.
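
Such a ranking amounts to a per-journal median and a sort. The sketch below shows the shape of that calculation on a hypothetical table of time-stamped papers; the file and column names are illustrative, and any real comparison would face the caveat Marcus raises about inconsistent date definitions.

```python
# Sketch of ranking journals by median review time, in the spirit of the
# analysis described above. Input file and column names are hypothetical.
import pandas as pd

# Assumed columns: journal, review_days
papers = pd.read_csv("pubmed_2014_2015.csv")

ranking = (
    papers.groupby("journal")["review_days"]
          .agg(n_papers="count", median_review_days="median")
          .query("n_papers >= 20")          # skip journals with few time-stamped papers
          .sort_values("median_review_days")
)
print(ranking.head(20))  # fastest journals first
```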

Preprints reconsidered

One way for biologists to accelerate publication is by embracing preprints. These allow work to quickly receive credit and critique, says Bruno Eckhardt, associate editor of Physical Review E and a theoretical physicist at the University of Marburg in Germany. “It is almost like going on Facebook — it means you are ready to go public,” he says. A preprint submitted to bioRxiv — a server run by Cold Spring Harbor Laboratory in New York — is published online within 24 hours and given a digital object identifier (DOI); subsequent revisions are time-stamped and anyone can read and comment on the paper. “The minute a research story gets into the public domain, it benefits from the collective power of different brains looking at a problem,” says Vale. What's more, proponents say, preprint publishing can simply be added onto the conventional publication process. F1000Research, which launched in 2012, does this by publishing papers first, then inviting open peer review and revision.

Some scientists are going a step further, and using platforms such as GitHub, Zenodo and figshare to publish each hypothesis, data collection or figure as they go along. Each file can be given a DOI, so that it is citable and trackable. Himmelstein, who already publishes his papers as preprints, has been using the Thinklab platform to progressively write up and publish the results of a new project since January 2015. “I push 'publish' and it gets a DOI with no delay,” he says. “Am I really gaining that much by publishing [in a conventional journal]? Or is it better to do what is fastest and most efficient to get your research out there?”

But preprints and real-time digital publishing platforms are no panacea. Vosshall says that many biologists are “terrified” of preprints because they fear getting scooped by competitors or losing credit and intellectual-property rights for their ideas. And even after preprint publishing, scientists can still find themselves slogging through peer review and chasing high-impact journals for a final publication to adorn their CV. Vosshall says that the scientific community relies on conventional journals to serve as a 'prestige filter' so that important papers are brought to the attention of the right readers. Without them, “How do we find the good stuff?” she asks.

For Fraser, her PLoS ONE publication proved a success. When the paper was finally published after its almost-two-year wait, she got positive responses, she says. It has been viewed nearly 2,000 times, shared 51 times on Facebook and Twitter and downloaded 280 times. The publication also helped her to secure her current position — as a postdoctoral fellow at the Smithsonian Institution Museum of Natural History in Washington DC. “I pretty much got the top postdoc that I could have gotten.”

Still, the whole process is not something she wants to endure again — so these days, she tends to send her papers to mid-range journals that are likely to publish her work right away. “If my ultimate goal is to get a faculty job, I can't afford to wait two years on a single paper,” she says.