The computer code that underpins some scientific papers will now get more attention during peer review. Credit: Getty

Computer code written by scientists forms the basis of an increasing number of studies across many fields — and an increasing number of papers that report the results. So more of these executable algorithms should be included in the peer-review process. From this week, Nature journal editors handling papers in which code is central to the main claims or is the main novelty of the work will, on a case-by-case basis, ask reviewers to check how well the code works and report back.

The move builds on growing demand in recent years for authors to publish the details of bespoke software used to process and analyse data. And it aims to make studies that use such code more reliable. Computational science — like other disciplines — is grappling with reproducibility problems, partly because researchers find it difficult to reproduce results based on custom-built algorithms or software.

This policy is the latest stage in the evolution of our editorial processes, which aim to keep pace with technological change across the research community. All Nature journals, for example, already require that authors make materials, data, code and associated protocols promptly available to readers on request, without undue qualifications. In 2014, the Nature journals adopted a “code availability” policy to ensure that all studies using custom code deemed central to the conclusions include a statement indicating whether and how the code can be accessed, and explaining any restrictions on access.

Some journals have for years gone a step further, ensuring that new code or software is checked by peer reviewers and published along with the paper. Where relevant, Nature Methods, Nature Biotechnology and, most recently, journals including Nature and Nature Neuroscience encourage authors to provide the source code, an installation guide and a sample data set, and to make this code available to reviewers for checking.

To assist authors, reviewers and editors, we have updated our guidelines to authors and have developed a code and software submission checklist to help authors compile and present code for peer review. We also strongly encourage researchers to take advantage of repositories such as GitHub, which allow code to be shared for submission and publication.
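As a purely illustrative sketch (the file names, data format and function below are hypothetical and are not taken from Nature’s checklist), a code package prepared for review might pair a small, documented analysis script with a bundled sample data set, so that reviewers can rerun the reported analysis end to end:

```python
# analyse.py — hypothetical example of an analysis script prepared for review.
# Assumes a sample data set, sample_data.csv, with columns "group" and "value",
# is shipped alongside the code so that reviewers can reproduce the numbers.
import csv
from statistics import mean


def summarise(path="sample_data.csv"):
    """Compute the mean value per group from the bundled sample data."""
    groups = {}
    with open(path, newline="") as handle:
        for row in csv.DictReader(handle):
            groups.setdefault(row["group"], []).append(float(row["value"]))
    return {group: mean(values) for group, values in groups.items()}


if __name__ == "__main__":
    # Print one summary line per group, mirroring a figure or table in the paper.
    for group, value in summarise().items():
        print(f"{group}: {value:.3f}")
```

A short installation note alongside such a script, stating for instance the required Python version and any third-party packages, would cover the installation-guide element mentioned above.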

According to the guidelines, authors must disclose any restrictions on a program’s accessibility when they submit a paper. Nature understands that in some cases — such as commercial applications — authors may not be able to make all details fully available. Together, editors and reviewers will decide how the code or mathematical algorithm must be presented and released to allow the paper to be published.

Occasionally, other exceptions will be made — for example, when custom code or software requires supercomputers or specialized hardware, or has running times so long that it would be unfeasible for reviewers to run the necessary checks. We also recognize that preparing code in a form that is useful to others, or sharing it, is still not common in some areas of science.

Nevertheless, we expect that most authors and reviewers will see value in the practice. Last year, Nature Methods and Nature Biotechnology between them published 47 articles that hinged on new code or software. Of these, approximately 85% included the source code for review.

As in other areas of science, the impact of computational tools depends on their uptake. Open implementation increases the likelihood that other researchers can use and build on a technique. So, although many researchers already embrace the idea of releasing their code on publication, we hope that this initiative will encourage more to do so.