  • Perspective

Empowering peer reviewers with a checklist to improve transparency

Abstract

Peer review is widely considered fundamental to maintaining the rigour of science, but it often fails to ensure transparency and reduce bias in published papers, and this systematically weakens the quality of published inferences. In part, this is because many reviewers are unaware of important questions to ask about the soundness of the design and analyses and the presentation of the methods and results; some reviewers may also expect others to take responsibility for these tasks. We therefore present a reviewers’ checklist of ten questions that address these critical components. Checklists are commonly used by practitioners of other complex tasks, and we see great potential for their wider adoption in peer review, especially to reduce bias and facilitate transparency in published papers. We expect that such checklists will be well received by many reviewers.

Fig. 1: Relationship between prior probability, statistical power and the false positive report probability.
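
The relationship shown in the figure follows the standard formulation of the false positive report probability; the sketch below assumes that formulation (with $\alpha$ for the significance threshold, $1-\beta$ for statistical power and $\pi$ for the prior probability that the tested hypothesis is true) rather than reproducing the figure's exact parameterization:

$$\mathrm{FPRP} = \frac{\alpha\,(1-\pi)}{\alpha\,(1-\pi) + (1-\beta)\,\pi}$$

Under this formulation, a low prior probability or low statistical power each inflate the probability that a reported positive result is in fact false.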

Acknowledgements

We thank A. Moore for suggestions that improved the manuscript.

Author information

Contributions

T.H.P. composed the original draft of this manuscript in consultation with S.C.G. and S.N. S.N. made the figure. The manuscript was edited substantially over multiple rounds with input from all co-authors.

Corresponding author

Correspondence to Timothy H. Parker.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information

Supplementary Text

About this article

Cite this article

Parker, T.H., Griffith, S.C., Bronstein, J.L. et al. Empowering peer reviewers with a checklist to improve transparency. Nat Ecol Evol 2, 929–935 (2018). https://doi.org/10.1038/s41559-018-0545-z
