For the past five years, the Nature journals that publish life science research have required authors to complete a reporting summary that compiles key information about the materials, methods and analysis used in a manuscript1. This checklist has in general been well received2, and studies suggest that it has improved the quality of published research3,4. However, the reporting summary previously focused mainly on laboratory-based and biomedical sciences, making it a poor fit for many ecology and evolution studies, particularly those based on field-collected data. Therefore, while we are reluctant to ask our authors to complete yet more forms, we hope that adding an ecological, evolutionary and environmental sciences module to our reporting summary will improve both its usability and its usefulness (https://www.nature.com/authors/policies/ReportingSummary.pdf).

The reproducibility crisis confronting all of science, and certain disciplines in particular, has been widely reported and discussed. Over the past decade or so, several approaches have been introduced to tackle the crisis. These include improved data transparency5, greater respect for attempts to replicate published studies, enhanced statistical rigour, and checklists to help authors, reviewers and editors avoid egregious oversights. A recent development at the Nature journals is that the reporting summary is now published alongside the paper6, making reproducibility efforts more transparent and accessible to readers.

The reporting summary has also evolved in response to feedback and the needs of different disciplines. Most recently, the editors of Nature Ecology & Evolution, together with expert advisors from the research community and our colleagues from other Nature journals, have designed a new set of checklist questions better suited to the research we publish. These questions have been combined with the previous ones into a modular reporting summary that allows authors to select only the sections that apply to their manuscript. For example, authors of studies that do not involve fieldwork will not be required to answer questions on geographic location, and questions about dating and specimen provenance will apply only to archaeological and palaeontological studies. We envisage that most of our authors will fill in the ecological, evolutionary and environmental sciences module, but some may select either the original life sciences module or the new behavioural and social sciences module as more relevant (the latter is described in more detail in a recent Nature Editorial7). The sections of the reporting summary that are completed by all authors, such as the statistical checklist, also now include questions that are more relevant to ecologists and evolutionary biologists; for example, they now ask for specific information about Bayesian methods.

While revising our reporting summary, we made considerable use of the Tools for Transparency in Ecology and Evolution (TTEE; https://osf.io/g65cb/), hosted on the Open Science Framework. The TTEE is at the forefront of promoting best practice in the ecology and evolution community, and has guided initiatives at several journals; for example, the new guidelines for statistical practice recently announced by Conservation Letters8. A recent blog post by TTEE associates discusses where ecology and evolution stand in relation to the reproducibility crisis in science more broadly, what we can learn from other disciplines, and why we should challenge assumptions about a priori expectations and the likelihood of bias (http://www.ecoevotransparency.org/2018/01/31/a-conversation-where-do-ecology-and-evolution-stand-in-the-broader-reproducibility-crisis-of-science/).

We will continue to investigate, in collaboration with researchers, ways to make the Nature journal reporting summary more useful. One area for improvement is that its focus is still predominantly on empirical research rather than on theory or research synthesis. So please do contact us with suggestions, which we will endeavour to take into account the next time the document is revised.

Our checklists are designed to help authors, reviewers and editors verify scientific integrity, and they are compulsory for publication. But informal checklists can also guide the publication process. In this issue, a Perspective by Parker et al. sets out a checklist for peer reviewers. The authors present ten questions that they suggest reviewers ask themselves when assessing a manuscript, covering the reporting of methods, data and statistics as well as potential biases of interpretation. The idea is not to add to the burden on peer reviewers, who may already have journal checklists to complete, but to provide an optional aid to systematic assessment. New peer reviewers could use it as a training exercise, whereas more established reviewers might use it to streamline the process and reduce the chance of oversights. The peer-reviewer checklist can be used to assess manuscripts submitted to any journal, whether or not that journal uses an author checklist, and although it has been developed primarily for ecology and evolution, most of the questions apply far more widely.

Checklists are not going to solve all of science’s problems with reproducibility and transparency. Major changes in training and research culture are also needed. However, checklists do provide a useful tool to address a subset of problems and can help increase the momentum for wider changes. As Parker et al. note, checklists are also used by surgeons, pilots and architects to ensure minimum standards of safety and free up intellectual capacity for the more interesting creative problems.