As many of our readers know, early this year (Nat. Neurosci. 16, 1, 2013; doi:10.1038/nn0113-1) we developed a set of guidelines for the reporting of basic methods information in our pages, along with a checklist for papers (available at http://www.nature.com/neuro/pdf/sm_checklist.pdf) to prompt authors to disclose technical and statistical information in their submissions and to encourage referees to consider these issues in their reviews. Authors are asked to fill out any relevant portions of this checklist before their paper is reviewed, and the checklist is made available to the referees during review. Following acceptance of the paper, the editors work with the authors to ensure that all crucial methods-related information is contained in the final paper. The checklist is by no means exhaustive; rather, it focuses on a small number of experimental and analytical design elements that are critical for interpreting research results yet are often reported incompletely. For example, authors are required to describe methodological parameters that may introduce bias or influence robustness. The checklist also consolidates several existing policies about data deposition and data presentation. Our experience indicates that many referees find the checklist useful and that the resulting published papers are clearer. We are pleased to announce that Nature and many of the Nature research journals have now adopted a similar checklist, and as part of this broader initiative we will now require authors to include minimal statements on reproducibility and replicability as a condition of publication.

We recognize that there is no single prescribed way of conducting an experimental study. Exploratory investigations are often not amenable to the same degree of statistical rigor as hypothesis-testing studies. Indeed, most academic laboratories do not have the means to carry out the level of validation that would be required, for example, to translate a finding from the laboratory to the clinic. However, there is no justification for failing to report, with full transparency, how a study is designed, conducted and analyzed, so that reviewers and readers can adequately interpret and build on the results. For studies using biological samples, we will require authors to state whether statistical methods were used (or not) to predetermine sample size, and what criteria they used to identify and handle outliers during the experiment. Likewise, all papers should include a statement on randomization and blinding, although we fully recognize that these are not possible or practical in many experiments. Authors will also be required to state how many times each experiment shown in the figures was replicated in the laboratory.
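By way of illustration, a prospective power analysis is one common statistical method for predetermining sample size. The minimal sketch below (in Python, using the statsmodels library; the design parameters are purely hypothetical and would need to be justified for any real study) computes the group size needed to detect an assumed effect in a two-sample comparison:

    # Illustrative sketch only: prospective power analysis for a
    # two-sample t-test. All parameter values are hypothetical.
    from statsmodels.stats.power import TTestIndPower

    effect_size = 0.5  # anticipated standardized difference (Cohen's d), assumed
    alpha = 0.05       # two-sided false-positive rate
    power = 0.80       # desired probability of detecting the effect

    # Solve for the per-group sample size that achieves the target power.
    n_per_group = TTestIndPower().solve_power(
        effect_size=effect_size, alpha=alpha, power=power,
        alternative='two-sided',
    )
    print(f'required sample size per group: {n_per_group:.1f}')  # ~63.8

Reporting the assumed effect size, the error rates and the resulting n, or stating plainly that no such calculation was performed, is precisely the kind of disclosure the checklist asks for.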

Specifically, we will require more precise descriptions of statistics. To help improve the statistical robustness of papers, Nature and its sister biology journals will now employ statisticians as consultants on the rare papers that raise serious concerns about the statistical techniques used, at the editor's discretion and guided by the referees' suggestions.

To allow authors to describe their experimental designs and methods in enough detail for others to interpret and replicate them, the participating journals, including Nature Neuroscience, are removing length restrictions on Methods sections. In addition, references cited solely in the Methods section will no longer count toward the reference limit.

To further increase transparency, we now also encourage authors to provide, in tabular form, the data underlying the graphical representations used in figures. This is in addition to our well-established data-deposition policy for specific types of experiments and large data sets. The source data will be made accessible directly from the figure legend for readers who wish to examine them for themselves. We also continue to encourage authors to share detailed methods and reagent descriptions by depositing the step-by-step experimental protocols used in their study in Protocol Exchange, an open resource that is linked directly from the primary research article.

Ensuring systematic attention to reporting and transparency is only a small step toward solving the reproducibility issues that have been highlighted across the life sciences, and particularly in biomedical research. Much bigger underlying issues contribute to the problem. Too many biologists still do not receive adequate training in statistics and other quantitative aspects of their subject. Mentoring of young scientists on matters of rigor and transparency is inconsistent at best. In academia, the ever-increasing pressure to publish and to obtain the next level of funding provides little incentive to pursue and publish studies that contradict or merely confirm previously published results. Those who would put effort into documenting the validity or irreproducibility of a published piece of work have little prospect of seeing that effort valued by journals and funders; meanwhile, money and time are wasted on follow-up work built on false premises.

Tackling these issues is a long-term endeavor that will require the commitment of funders, institutions, researchers and publishers. It is encouraging that funding agencies such as the National Institute of Neurological Disorders and Stroke and the National Cancer Institute have led community discussions and are considering recommendations both for researchers and for the agencies themselves. We hope that these efforts will expand further and translate into noticeable improvements. Meanwhile, our effort is a small step toward improving how science is reported. We trust that our authors will grasp the significance of this step, and we hope that other publishers will adopt similar initiatives.