Giving research a sporting chance

Science is a competitive business and in any competition there can be ‘shortcuts’ to success. Transparency is the only way to guarantee that research results can be trusted more than some sporting achievements.

Sport has a long and inglorious history of athletes using unethical means to defeat their competitors. In recent times, the most prevalent route to an unfair advantage has been the taking of performance-enhancing drugs, a near-universal problem yet one that seems to afflict some sports more than others. Professional cycling, for example, has for decades seen a constant stream of doping crises, with winners of major events retrospectively stripped of their titles and those victories passed to others, or left blank, as is the case with the Tour de France from 1999 to 2005. Athletics is particularly prone too: last year a much-reduced Russian team competed at the Olympic Games in Rio de Janeiro, Brazil, after the World Anti-Doping Agency (WADA) recommended a blanket ban on all athletes from the country.

Of course, any law or regulation can be transgressed inadvertently or by accident. The tennis player Maria Sharapova is currently attempting a comeback following a 15-month suspension for an anti-doping rule violation that the Court of Arbitration for Sport (CAS) classified as involving “No Significant Fault” while also ruling that “she bore some degree of fault”; injury not injunction has prevented her from competing in this month's All England Championships at Wimbledon.

In science, the equivalent to doping is data manipulation. Tweaking a blot or ‘tidying up’ a micrograph to better support a study's conclusions can be thought of as similar to taking a banned steroid in training. Mislabelling a figure or using a control from a different study might share the moral and ethical status of presenting a blood or urine sample from a colleague as your own to pass a spot check.

But unlike sport, science has no ultimate authority or court of appeal; no equivalent to WADA or CAS. Instead, the responsibility for investigating and punishing cases of alleged scientific misconduct lies with universities or research institutions. It can be argued that these organizations are not disinterested parties and that it is in their own interests to downplay the seriousness of any irregularities that take place within their domain. However, it is important for a scientific institution's reputation that justice is seen to be done, as brushing embarrassing events under the carpet can, and should, be more damaging than exposing the extent of a failure in integrity and tackling the problem head on.

Journals, editors and referees must be alert to the possibility of figure manipulation, data fabrication and other forms of scientific misconduct. We must identify any possible occurrences, question them, correct them and, where appropriate, report them to the relevant authority. However, like trackside officials, we have jurisdiction only over what is happening in front of us: that is, the particular study under consideration. We must in the first instance trust the work as it is presented to us, and when serious problems arise we must trust scientific institutions to investigate and resolve them appropriately.

This issue of Nature Plants includes a paper from Patrice Dunoyer and colleagues at the Institut de Biologie Moléculaire des Plantes du CNRS, Strasbourg (article no. 17094). It describes a mechanism employed by peanut clump virus to suppress its host's defences by hijacking small inhibitory RNA molecules, which are essential to the plant's anti-viral response, and directing them into peroxisomes where they cannot impede the virus's spread. The work is accompanied by a News & Views from José-Antonio Daròs of the Instituto de Biología Molecular y Celular de Plantas, Valencia, Spain (article no. 17098).

Dunoyer has been a long-time colleague and collaborator of Olivier Voinnet, and recently a number of their studies, three with Dunoyer as first author, have been retracted, while several more have had formal corrections published to address problems with presented data. However, these instances were investigated by the CNRS, and Dunoyer served a temporary suspension as a result. We therefore treated the study we received as we would any other. It was accepted following two rounds of review, during which it was seen by four reviewers. The published paper contains substantial supplementary information (SI). Along with 10 additional figures, there are a further 12 pages presenting the raw data from which the presented figures were assembled.

We believe that it is our role to ensure that the work of our authors commands the highest possible level of trust. We feel that using SI in this way is not exceptional, but should be the norm. It is our position that whenever a figure is assembled from multiple experiments or cropped to display only the parts relevant to the described study, the unedited raw data should be provided in SI. Information is inevitably lost when figures are edited, however benignly, but making the original data available ensures that the reproducibility of the result can be judged.

As part of our continuing drive towards greater reproducibility, we and the other Nature Research journals have implemented some changes in our use of reporting summaries. For some time, we have asked authors to complete a Reporting Checklist detailing specific information about the experimental design, the statistical and other treatments of the data, and their availability. These checklists were available to reviewers, but from now on we will be publishing a new, standardized Life Science Reporting Summary (http://go.nature.com/2sVAv9G) with all research papers. More information about these policies can be found in a recent editorial in Nature (http://go.nature.com/2sYgazu).

We are also developing additional documents to cover more specialized types of experimental data. At the moment, these cover chromatin immunoprecipitation sequencing, flow cytometry and magnetic resonance imaging, but more accessory summaries will be coming soon. With these, we are not trying to enforce a set of standards for how experiments should be performed, but rather to improve the reporting of experiments so as to establish trust in the reproducibility of their results.

Despite the prizes, science is not a sport. Retrospective disqualification may be disastrous for a sportsperson's reputation, but the retraction of a scientific study undermines all subsequent work that builds upon it. A culture of transparency ensures the trust upon which scientific advances rely.