In 2012, Nature Nanotechnology published an Editorial titled “Join the dialogue” that discussed the issue of reproducibility in the field of nanotoxicology [1]. The Editorial voiced the concerns of the community regarding the lack of reporting standards for the characterization of the physicochemical properties of nanomaterials (both in isolation and in the experimental environment), and the absence of clear guidelines on how to log vital experimental details. It also prompted our readers to consider how best to tackle this issue, and to comment on whether some kind of checklist should be created and enforced by the journal in an effort to reduce inconsistencies between experiments. A few months later, we published a collection of articles that outlined possible steps towards transparency in the field: the overall consensus was that a minimum set of requirements would be beneficial if coupled with parallel initiatives, such as extending the standardization procedure beyond academic publications, adopting a specific ontology and creating repositories and databases for data organization [2].

Despite this initial push, we have to admit that no editorial action followed. This did not, however, reflect a passive attitude towards the issue; rather, it resulted from the unique challenges that distinguish multidisciplinary areas like nanotechnology, where the term ‘nano’ can be interpreted differently according to the application and where, in many cases, the properties described are closely tied to the technique used to measure them, to say nothing of the variability that arises in measurements involving biological specimens. Indeed, since 2015 Nature Nanotechnology, in line with all of the Nature Research journals, has implemented a series of compulsory reporting summaries to improve data transparency in papers published across different areas of science, such as lasing, solar cells and the life sciences. The aim of these forms is to ensure that authors provide the information necessary for others to reproduce their experiments, in the hope that this might lead to better research practices by raising awareness of the experimental aspects known to cause variability in results. A recent survey shows that the reporting summaries are helping to improve the quality of the research published in Nature and to guide the design of subsequent research projects, though it seems clear that the ‘reproducibility crisis’ has not yet been solved [3].

We would like to start that conversation again. In a Perspective in this issue, Faria et al. propose a minimum set of information to be reported in papers in the bio–nano field. The suggested checklist is available at https://osf.io/smvtf/ and can be updated at any time. We would like to stress that the proposal is intended as a starting point for discussing which parameters should potentially be reported, and that by publishing it we are neither endorsing the checklist as such nor immediately planning to implement an official one. Instead, once again, we would like to ask the community to come forward and share their thoughts with us.

No single checklist can cover the entire experimental space and, considering the complexity intrinsic to the field, we appreciate that in specific instances certain aspects might merit a more in-depth description, while others do not require the same level of scrutiny. With this in mind, we would like to pick our readers’ brains: please send us comments on whether you think the suggested checklist is a valuable tool for the field and, if so, whether it should be modified. More generally, our question is whether a checklist should serve only as an unofficial aid for editors to make sure that the most important experimental parameters have been reported, or whether we should instead ask for a reporting summary to be submitted and published alongside each paper.

Alternatively, a checklist might only succeed if implemented in sync with other initiatives. One idea is the introduction of a community-led nanomaterial data bank. As previously pointed out, despite the existence of a few repositories for the deposition of nanomaterial data (such as MIAN; https://go.nature.com/2PmjwFM), what is missing in the nano community is a concerted effort to establish a universal, easy-to-use data bank that could serve as a reference point and self-checking tool for researchers [4]. The Protein Data Bank (https://www.rcsb.org/), for example, receives direct submissions from its users, checks that all the requirements have been met, and tabulates the methodological details and the results (that is, the spatial coordinates) from each experiment. What’s more, academic journals and funding agencies rely on this validation when accepting a paper or a proposal. Having an analogous streamlined tool in the nanomaterial community could prove fundamental for research advancement, though the problem of defining a set of widely accepted parameters and experimental conditions to be reported, deposited and validated still remains.

For this reason, we think that the checklist approach should be explored again. We are now in a better position to draft a reporting summary for nanobiomedicine. First, Faria et al. present a defined set of points that might facilitate a focused and productive debate on the pros and cons of having a nanomaterial checklist. Second, we can now leverage the experience gained with the implementation of reporting summaries in other areas.

The checklist proposed here mainly concerns papers in the nanobiomedicine field, but we think that researchers working in the environmental sciences might come across similar problems when reporting data on nanomaterial characterization and on the interactions of nanomaterials with the environment, and therefore we would be happy to hear from this community as well.

If you would like to join in the discussion, please send your comments to naturenano@nature.com by 30 November 2018. As we have done in the past, we will consider publishing a selection of these as short correspondences.