Recent years have seen mounting concerns around reproducibility in the scientific literature. While the specific form and severity of those concerns vary from discipline to discipline, field to field, and sub-field to sub-field, it’s clear that more needs to be done at many different levels to address them. An important part of our job as journal editors is to listen to these concerns from our authors, reviewers and readers and to change the way we operate accordingly. To this end, we have been introducing new transparency requirements and reporting standards that we hope will make the science published in our journal more transparent and more replicable for the scientific community.

One such introduction was our data availability policy. Since late 2016, Nature Energy and the other Nature journals have required papers to include a statement explaining how to access the data they contain, including any restrictions that prevent the data from being shared [1]. This policy works in conjunction with our policy on the availability of other materials, including code, during submission, to help other researchers build on the findings we publish.

In February of this year, we also expanded our policy on competing interests [2]. Previously, authors were required to declare only competing interests of a financial nature — funding support, employment, stocks and shares, patents, and so on — where individuals or organizations stood to gain or lose financially as a result of the publication. Now, authors must also declare any competing non-financial interests, such as membership of governmental, non-governmental, advocacy or lobbying organizations, or service as an expert witness, to name just a few examples. By including this information, we hope to give reviewers and readers greater transparency about potential influences on the researchers whose work they are reading.

To facilitate this change — and to make it easier for authors to ensure compliance with all our editorial policies on research ethics and reproducibility — Nature journals have now introduced an editorial policy checklist that must be completed for all submitted manuscripts before they can be sent to reviewers. The information in the checklist should in turn offer reviewers a quick reference to high-level information on issues such as competing interests, data availability and custom code.

Policies like those above apply to all disciplines and so are by necessity general in nature. However, each discipline and each field faces its own challenges with transparency and reproducibility, and Nature journals have thus been working for many years to develop more tailored solutions that meet the needs of specific groups of researchers. Many of our authors and reviewers may be familiar with our reporting summary for solar cell research, which was implemented in 2015. Arising from discussions between experts in the solar photovoltaics community and editors of Nature-branded journals, this reporting summary is intended to increase the transparency of reported solar cell characterizations and to facilitate the reproducibility of published results. Previously provided only to reviewers, these summaries will now be published alongside their papers. Key technical information on measurement and analysis procedures, such as the light source used, should still be included in the published articles themselves; the published summary will help readers locate this crucial information in each article and identify potential limitations. We also hope that the summaries may help readers approach and compare the procedures and most common practices used in different areas of photovoltaics research.

We are also introducing a reporting summary for the behavioural and social sciences. As with the solar cell reporting summary, this summary arose from discussions among the behavioural and social science editors across Nature journals and from engagement with experts in the relevant disciplines. The social sciences cover a remarkable range of disciplines that approach similar questions from different perspectives and methodological traditions, including experimental, survey and observational designs, quantitative and qualitative analysis, and lab-based and field studies. With this breadth in mind, the behavioural and social science reporting summary was designed to be flexible with regard to the kinds of responses authors may provide. For instance, the sampling considerations for a quantitative psychology experiment conducted in the lab are necessarily very different from those for a qualitative analysis of semi-structured expert interviews. In both cases, however, authors should be able to justify the sample used, whether it comprises university students or key stakeholders, and to report fully how the sample size was determined, whether by convenience, by the availability of suitable subjects, or by statistical methods. It is our hope that describing study elements in a standardized way across this broad range of methodologies and types of data will foster greater appreciation and understanding of different research approaches among our interdisciplinary readership.

These reporting summaries are not intended to be exhaustive, nor do they seek to impose prescribed standards on the communities to which they apply. Their aim instead is to provide clear and concise information on key aspects of a study and to present it in a consistent manner that can be easily consulted outside the main text of the paper. We fully expect that they will continue to evolve over time so that they can better serve their communities, and in that spirit we encourage authors, reviewers and readers to provide feedback on all of the above. Our policies arise from the conversations we have with the many different types of researcher we engage with; by working together on such reporting and transparency measures, we hope to make a useful contribution to countering reproducibility concerns in the communities we serve.