Like many other scientific journals, Nature Biotechnology places a strong emphasis on novelty when selecting research for publication. As a result, studies describing replications or confirmations of previously published reports are less likely to be chosen. And studies detailing null or negative findings may not meet stringent editorial requirements for significance and relevance to our broad readership. Why then are we publishing on p. 965 a replication of a report published elsewhere in the literature?

One reason is that the new report, the result of a collaboration between miRagen Therapeutics and Monsanto, clarifies the controversial findings of a paper previously published in Cell Research (22, 107–126, 2012). That study, led by Chen-Yu Zhang of Nanjing University, China, required a corrigendum (Cell Res. 22, 273–274, 2012) and sparked vigorous debate because it reported the presence of plant microRNAs (miRNAs) in human blood plasma and suggested that one in particular, miR168a from ingested rice, could pass into the circulation of mice and modulate miRNA target genes in the animal.

In contrast to these findings, the report on p. 965 finds no evidence for uptake of plant miR168a into the plasma and liver of mice fed a rice diet. Enzyme-linked immunosorbent assay data from the current study also contradict western blots from the Zhang paper suggesting that miR168a directly suppressed levels of low-density lipoprotein receptor adapter protein 1 (LDLRAP1) in mice. Finally, the miRagen study suggests that differences in diet composition, rather than miRNA-mediated cross-kingdom gene regulation, likely account for alterations in low-density lipoprotein levels in mouse plasma.

But why put the paper in Nature Biotechnology rather than Cell Research, where the original report was published? In fact, the miRagen investigators did submit their paper to that journal but were told that “it is a bit hard to publish a paper of which the results are largely negative.”

We disagree with this assessment and believe the paper is worthy of publication precisely because it is a negative result that sheds light on a key research question.

The original finding from Zhang and colleagues that plant miRNAs are capable of cross-kingdom gene regulation was an extraordinary claim. It went against a large body of research showing that systemically administered double-stranded RNAs are incapable of triggering the RNA interference pathway in humans (and mice). It also raised concerns that plant miRNAs could pose health risks to humans. Indeed, last March, an article published in Environment International (55, 43–55, 2013) went so far as to claim that genetic modification of plants using gene-silencing mechanisms raises concerns for human health and that these concerns are not adequately considered in food safety assessments. This prompted the regulator Food Standards Australia New Zealand to undertake an assessment of the scientific literature on the issue and to publish a position statement on the regulation of genetically modified crops developed using gene silencing.

When an initial report prompts this level of concern and demands a considerable investment of time, effort and resources from both researchers and regulators in evaluating its findings and understanding its implications, a carefully controlled and executed replication study clearly warrants publication. It is unfortunate that it was not published in Cell Research, where it could have been bidirectionally linked to the original paper.

Providing space for publishing a replication study is one way in which Nature Biotechnology and other top-tier journals can facilitate the self-correction of the scientific literature. Journals could also actively solicit papers that seek to replicate studies for which corroboration by independent laboratories would be of particular interest (e.g., controversial findings).

The Reproducibility Initiative (Nat. Biotechnol. 30, 806, 2012) represents another route to replicating research. A collaboration between Science Exchange and PLOS ONE, the initiative offers to broker independent validation of a researcher's work in return for a fee, with subsequent publication in the journal. In October, the Laura and John Arnold Foundation provided $1.3 million to the initiative to authenticate 50 high-profile cancer papers from the past two years (only $26,000 per study).

But the sector with the greatest motivation to replicate academic findings must surely be industry. Companies have the deepest financial resources, and they have the most to gain. And it was groups at Amgen and Bayer that raised the recent chorus of concern about irreproducibility in the literature in the first place. Then again, corporations have few incentives to jump through all the hoops of peer review when they fail to reproduce results; in this respect, miRagen deserves praise for seeking to publish its negative findings.

Apart from the above post-publication correction mechanisms, efforts are also underway to improve reproducibility before findings become papers. For example, one idea being floated by certain funders is to set aside a portion of a research grant specifically for independent verification of the main study's results before publication; in this scheme, submission to a journal would proceed only after the results were corroborated.

This summer, the journal Cortex started offering yet another means of improving reproducibility and reducing bias. The mechanism, termed a “Registered Report,” involves peer review of an investigator's experimental design before data are collected. If the scientific question and methods are deemed sound, then authors are offered “in principle acceptance” of their article, irrespective of the study's outcome.

Replication is a difficult and thankless task. Until now, journals, funders and academics have shown little interest in it. Nature Biotechnology will remain open to publishing replication studies and rigorous efforts that fail to reproduce findings from other publications of high interest to our readers. It is our view, however, that the best practice is to publish such replication failures in the journal where the original findings appeared. That way, the power of the scientific process to consolidate and modify our understanding of initial findings is clearly visible to all.