All scientists should find the time to understand the software packages that they use to collect, analyse and display their data, and share this knowledge with new researchers.
Proven cases of misconduct in science are rare but they do happen, and this means the research community — including journal publishers — must have policies to prevent and detect misconduct. In recent years, various Nature journals have been randomly 'spot checking' figures from papers that have been 'accepted in principle'. At Nature Nanotechnology, we are thankful that we have experienced only the occasional case of “oops, my student was not aware of this” and “sorry, but we thought it was okay to make the images clearer”. These authors were quick to provide the original data and the issues were rapidly sorted out before publication. However, it is clear that there is still plenty of teaching and learning to do among senior and junior scientists to keep up with our ever-changing digital era.
Scientific wrongdoing ranges from sloppy record keeping to more serious offences such as intentional fraud and the misuse of confidential information obtained from reviewing grant applications for a funding agency or manuscripts for a journal. The Office of Research Integrity in the United States defines research misconduct as “fabrication, falsification, or plagiarism”1 and makes it clear that this can occur at any stage of the research process — from writing grant applications, to performing or reviewing research, and reporting results. Fabrication is the making up of data, whereas falsification refers to manipulating data such that the manuscript does not accurately reflect the results obtained. Plagiarism is stealing someone else's ideas, results or words without properly crediting the source (and thus misleading the reader about the contributions of the author). Those found guilty of research misconduct can face dismissal and/or be barred from applying for grants or submitting papers. It is everyone's responsibility in the scientific community — not just that of institutions, funding agencies and journal publishers — to prevent misconduct at all levels.
A prevalent form of scientific misconduct in recent times has been the manipulation of images using image-processing software. In the wake of several controversies, scientific journal publishers have drafted guidelines for the preparation of images2,3. Although most guidelines are targeted at certain fields of biology, the digital processing of data extends into many other disciplines. Not understanding how data-acquisition and analysis software works can easily lead to the distortion of data in any field.
Whether the aim is to combat intentional fraud or plain ignorance, training in ethics at all levels (from undergraduates to faculty) and face-to-face mentoring are needed. Tying funding to research integrity — whereby institutions are rewarded with 'centre of excellence' status and allowed to apply for certain grants if they show a high commitment towards reinforcing responsible behaviour — has recently4 been proposed. However, creating a culture of awareness and a sense of personal responsibility is just as important5. Several countries around the world6 have also started to address these issues, and the Second World Conference on Research Integrity was held in Singapore in July7.
Specific examples aimed at promoting awareness are the training courses on Responsible Conduct of Research run by Colorado State University, Duke University and others that cover topics ranging from bioethics to publication practices and mentor/trainee responsibilities. As well as these general courses, departments should consider offering practical courses on data preparation methods that meet basic standards set by professional organizations of the respective scientific fields. And because digital technologies evolve so rapidly, researchers may need refresher courses on these topics throughout their career. Last year, the United States National Academies identified the integrity, accessibility and stewardship of research data as important issues resulting from the expanding use of digital technologies in research, and recommended that all researchers should receive training in the management of research data8.
So, what are considered best practices and what constitutes unacceptable image manipulation? Intentional distortion of data by moving, adding or removing parts of an image to mislead the reader, or to downplay and/or emphasize certain features, is not acceptable. The best policy to adopt is to submit images that are minimally processed and, if processing is necessary, to retain all original files so that they are readily available for assessment if needed. All key settings and processing manipulations must be documented in the methods or supplementary sections.
Touch-up tools (such as cloning and healing in Photoshop) should be avoided, and changing brightness and contrast is only acceptable when applied across the entire image and applied equally to controls. (Researchers should be aware that modern imaging software contains powerful tools that can be used to check whether images have been manipulated.) In the case of biological data, if several different gels are spliced together, controls and molecular-weight markers should be included, and splice sites must be clearly visible and documented in the figure caption. It is impossible to compile a comprehensive set of guidelines that covers all research disciplines but, as a rule, authors should always provide enough information for someone else to reproduce the work.
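To illustrate what 'applied across the entire image' means in practice, here is a minimal sketch in Python using NumPy; the function name, gain and offset values are our own illustrative choices, not part of any journal guideline. The key point is that every pixel undergoes the same linear transformation, so the relative ordering of intensities — and hence the scientific content — is preserved:

```python
import numpy as np

def adjust_levels(image, gain=1.2, offset=10):
    """Apply the SAME linear brightness/contrast change to every pixel.

    gain scales contrast and offset shifts brightness. Because the
    transformation is global (no region is treated differently), the
    relative ordering of pixel intensities is preserved. The values
    used here are arbitrary illustrations, not recommendations.
    """
    out = gain * image.astype(np.float64) + offset
    return np.clip(out, 0, 255).astype(np.uint8)

# A uniform adjustment keeps the ordering of pixel intensities intact:
img = np.array([[0, 50], [100, 200]], dtype=np.uint8)
adjusted = adjust_levels(img)  # [[10, 70], [130, 250]]
assert np.all(np.argsort(img, axis=None) == np.argsort(adjusted, axis=None))
```

By contrast, applying a different gain or offset to a selected region — or to the sample lanes but not the control lanes — breaks this property and crosses the line into falsification.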
Senior investigators and corresponding authors are responsible for the accurate representation and reporting of data, and are in charge of educating newcomers in their team on appropriate scientific conduct. It is worthwhile making it routine for everyone to explain and question the way in which data is collected, analysed and processed during lab meetings, and for senior investigators to discuss the fine line between making an image look nice and presentable (that is, beautification), and fraud.
If you discover that you have made a mistake (whether in the beautification, collection or analysis of data), it is best to own up, even though this might be uncomfortable and bad for your career in the short term. And if you suspect someone else of misconduct, speak up: a guide by Keith-Spiegel, Sieber and Koocher offers useful advice on how to avoid these problems and methods to resolve potential issues of misconduct9. Above all, find time to close the gap between what you know and what others know.
Rossner, M. & Yamada, K. M. J. Cell Biol. 166, 11–15 (2004).
Titus, S. & Bosch, X. Nature 466, 436–437 (2010).
Koocher, G. & Keith-Spiegel, P. Nature 466, 438–440 (2010).
Ensuring the Integrity, Accessibility, and Stewardship of Research Data in the Digital Age (The National Academies, 2009); http://go.nature.com/gEKQ6J
Keith-Spiegel, P., Sieber, J. & Koocher, G. P. Responding to Research Wrongdoing: A User-Friendly Guide (2010); http://go.nature.com/hTYwOE