Trust in scientific research relies on transparency and reproducibility. Yet when it comes to the complex computer code that underlies many of the earth and planetary science papers published in Nature Geoscience and elsewhere, the view can be frustratingly opaque. Researchers face obstacles in adequately documenting and sharing their code, and peer reviewers and readers face further obstacles in installing, running, and testing it. Tackling shortcomings in computational reproducibility requires, in part, making the sharing and peer review of code easier.


A decade ago, Nature Portfolio journals first introduced a policy on computer code. For all papers where code is central to the main conclusions, it required a code availability statement describing how and where the code could be accessed, as well as any restrictions on access. The policy also encouraged the sharing of code as best practice. At the time, there were many hurdles to researchers sharing code, despite the benefits to reproducibility, so more often than not authors stated that their code was not available. Still, we hoped our policy would help nudge geoscientists working with code towards a more transparent future1.

In 2018, the policy for Nature Portfolio journals evolved to more actively encourage the peer review of code, with papers where code is central to the main claims reviewed on a case-by-case basis2. This policy, which remains in force today, states that authors are required to make any new custom computer code used to generate their results available to editors and reviewers upon request (although exceptions can be made if the software is infeasible to share or for reviewers to run, or if there are legal restrictions). However, the practicalities of code peer review are complex and, until now, Nature Geoscience has not been set up to run code peer review systematically and consistently.

Since July, authors submitting primary research articles to any Nature Portfolio journal can share their code for peer review on the cloud-based reproducibility platform Code Ocean as part of the submission process. Authors who opt to use this service will, if their paper is sent for review, receive instructions on how to set up a “Compute Capsule” on Code Ocean’s web interface. Reviewers will then be sent instructions on how to access the capsule confidentially. An advantage of Code Ocean is that reviewers can run and check the code anonymously, without having to install it themselves.

If the paper is published, the capsule is made publicly available with a citable DOI. If the paper is not accepted, the authors have the option to maintain the capsule. There are no strings attached: the Code Ocean service is free for both our authors and reviewers to use.

Authors do not have to use Code Ocean to share their code, and there are limitations on the types of code that Code Ocean can support. Alternatively, authors may use another repository, such as GitHub, or provide the code as supplementary material with their submission. Whether or not Code Ocean is used, authors whose papers rely on custom code are asked to let us know where their code can be accessed as part of the submission process, and reviewers are asked whether or not they have checked the code as part of their review.

We continue to require all primary research papers where computer code is central to the main findings to provide a code availability statement. Although we do not yet require custom code to be made publicly available, as we do for new research data3, we strongly encourage our authors to share code whenever possible. Our editors have found that publicly available code is increasingly cited in the papers we read and publish, and community norms are shifting towards an expectation of that transparency. We hope to evolve with the communities we serve and strengthen our code sharing policies in the future. The partnership between Nature Portfolio and Code Ocean to enable and actively support the sharing and peer review of code is one step in that direction.