After our call to action regarding the suitability and usefulness of a checklist for papers in the field of nanobiomedicine, we received mixed replies.
As the hopes raised by the success of nanoformulations such as Doxil and Abraxane were not fully realized, the nanomedicine community — in particular those working in cancer nanomedicine — has taken a step back to rethink the progress and mistakes made over the years.
While the question of whether nanomedicine has really failed to deliver is controversial and multifaceted, the disparity between the number of scientific publications in nanomedicine claiming clinical relevance and the actual number of nanoformulations in clinical trials is unquestionable. The reasons for this translational gap are diverse, but the lack of reproducibility of reported results has contributed to slowing progress.
In the September 2018 issue of Nature Nanotechnology, we published a Perspective introducing MIRIBEL (Minimum Information Reporting in Bio–Nano Experimental Literature)1, a guideline suggesting parameters that — according to the authors — need reporting in papers that present research in the field of nano–bio interactions. The guideline aims to standardize the way nano–bio research is presented and should (1) aid data comparison among different papers, and facilitate data availability for meta-analyses and in silico modelling; (2) promote quantitative rather than qualitative reporting, to reduce the gap between preclinical and clinical relevance; (3) be easy to implement; and (4) ensure data reusability and reproducibility.
In the accompanying Editorial2, we invited our readers to comment on whether transforming the MIRIBEL guidelines into a mandatory checklist, to be completed by our authors and submitted alongside their paper, would help improve transparency and reproducibility in nanobiomedicine. We also asked whether the aspects covered by MIRIBEL were sufficient to reach the above goals, or whether they needed to be complemented with other initiatives or additional points.
Extracts from the Correspondences that we received are now available in the current issue. By publishing short summaries, we want to provide a space for the various voices within the community and to highlight the complex nature of the topic (readers can access the complete pieces in the Supplementary Information). Two of the Correspondences are set apart from the others. One, by Twan Lammers and Gert Storm, provides a snapshot of the heterogeneous views of the community on the utility of MIRIBEL with regard to the four points that it sets out to tackle, something that also emerges from the short summaries. The other, by Helena Florindo, Asaf Madi and Ronit Satchi-Fainaro, presents an analysis of published literature that points to a correlation between an author's seniority and research experience (understood broadly to include factors such as — but not limited to — lab funding availability, familiarity with specific methodologies and interdisciplinary collaborations) and the level of adherence to a potential checklist that requires several experiments and techniques to be satisfied.
Generally, our contributors agree that a reporting checklist would be beneficial to develop a more rigorous approach to presenting data, but several think that it would best serve the community if used as a training and reference tool for young researchers, reviewers and editors, rather than as a mandatory requirement for publication. Some argue that good practices in the lab and a scrupulous peer review process should be enough to ensure that all the necessary controls and characterizations have been performed, without the need to follow prescriptive formulations. A recurrent point is that mandating a checklist that requires a high level of standardization might end up stifling innovation and becoming a burden for researchers. On the other hand, many authors indicate the need to include more characterization points, especially considering how the properties of nanomaterials change according to their environment, which would further complicate the requirements for a potential reporting checklist. Authors with a foot in industry warn about the deep differences between academic and industrial settings, which can affect even the way one defines reproducibility. From our side, we recognize that the term 'mandatory' in our original question was confusing: it is not our intention to mandate ticking all the boxes of a checklist, but rather to promote transparency, and hopefully reproducibility, by asking researchers to describe more systematically the procedures adopted to characterize a material and its biological interactions, without enforcing which ones to report.
It seems, however, that because of the complexity and kaleidoscopic nature of the issue, further discussion is needed before a consensus can be reached. For the time being, therefore, a simpler and more straightforward approach is to ask authors to deposit their data in public repositories, easily accessible by peers via an accession code. Since April 2017, when we started requiring a data availability statement in published papers, we have noticed that the majority of the statements indicate that the original data can be obtained from the corresponding authors upon reasonable request. We would like to change this. While we currently mandate data deposition only where a curated database exists, as in the case of proteomics data, our intention now is to strongly encourage authors to use public repositories as much as possible, even when none is dedicated to their specific data type. We think that this, together with a more critical upfront analysis of the advantages and limitations of one's research — as advocated by Lammers and Storm — might help with the nanobiomedicine transparency issue, while setting more realistic expectations of the research that is being presented.
1. Faria, M. et al. Nat. Nanotechnol. 13, 777–785 (2018).
2. Nat. Nanotechnol. 13, 765 (2018).
Nature Nanotechnology (2020)