In April 2013, Nature announced new editorial measures to improve the consistency, transparency and quality of reporting in life sciences articles [1]. These measures, which are now implemented in all of our sister life sciences journals, include removing length restrictions on the methods section to ensure that key methodological details are reported, examining statistics more closely and encouraging the deposition of data in public repositories. Key to this initiative is a checklist [2], which prompts authors to disclose in their submissions all the information necessary for others to reproduce the work, and guides referees to consider these issues during peer review. Importantly, many journals, including ours, are now united in this drive to improve reproducibility [3].

Beginning in January 2015, we will ask authors of all life sciences submissions that are sent out for peer review to complete the relevant portions of the same checklist (available at http://www.nature.com/authors/policies/checklist.pdf), and we will make the document available to referees during review. On acceptance of a paper, editors will work with authors to ensure that all the key methods-related information is contained in the final manuscript. To this end, like other participating journals, we will relax our word limits on the methods section as necessary to accommodate all the essential details.

The checklist is not exhaustive (and is not meant to be onerous). It is intended to ensure good reporting by reminding authors to describe in sufficient detail the important experimental designs and methods that are often reported incompletely but are crucial for others to interpret and replicate the work. For example, authors are required to report parameters such as sample size, the number and type of replicates, the standards or references used, the definition and justification of the statistical methods, the precise characterization and description of key reagents and materials and their potential variability, and the criteria used to include or exclude any data. To improve the statistical robustness of papers, we will ask the pool of consultant statisticians used by our sister journals to examine certain papers, at the editors' discretion or on the referees' suggestion.

There is of course no single way to run an experiment. Exploratory research, for example, might not always be done with the same statistical rigour as hypothesis-testing studies, and not all laboratories may have the means to perform the level of validation required. There is, however, no good reason for not reporting in full how a study was designed, conducted and analysed. We appreciate that some communities might find a checklist containing requirements specific to their field more useful, and we welcome working with these communities to create customized checklists as appropriate.

The checklist also includes a section on data deposition, which consolidates our existing policies on the availability of data and materials, aimed at increasing transparency [2]. Under these policies, authors are “required to make materials, data and associated protocols promptly available to others without undue qualifications” as a condition for publication. We will prompt authors to deposit datasets in community-endorsed public repositories, and to provide the raw data in tabular format for the figures and graphs presented in the paper. These source data will be made available to interested readers. To enhance the reusability of datasets deposited in public repositories, authors can publish data descriptors in metadata journals such as Scientific Data (part of Nature Publishing Group). Publishing such metadata, which describe how the data are collected and formatted, facilitates the discovery, reuse, linking and mining of the data. Furthermore, we will encourage authors to use open resources such as Protocol Exchange to share detailed methods and reagent descriptions, which can be linked back to their primary research article.

The nanoscience and nanotechnology community does not have a comprehensive public repository dedicated to the field. There are, however, nano-specific options such as the recently developed nanomaterials registry [4] and the cancer nanotechnology laboratory portal [5], both funded by the US National Institutes of Health, and we will encourage authors to deposit data in these repositories. We recommend that authors choose repositories that provide expert curation, to ensure the data are discoverable and can be linked to the paper (examples include Dryad and Figshare). Over time, it might be valuable for the nanoscience community to create public repositories for sharing actual nanomaterials.

Tackling these issues is a long-term endeavour that requires the commitment of all those involved, from funders, institutions and researchers to editors and publishers. Experience with the new practice at our sister life sciences journals has been positive: editors and referees find the checklist useful. Although manuscript processing was slightly slower in the initial stages, while authors and referees became familiar with the requirements, the practice has quickly become routine. By implementing these steps, we hope to further improve the clarity and quality of the papers appearing in our journal.