
Psychologists update their beliefs about effect sizes after replication studies



Self-correction—a key feature distinguishing science from pseudoscience—requires that scientists update their beliefs in light of new evidence. However, people are often reluctant to change their beliefs. We examined belief updating in action by tracking research psychologists’ beliefs in psychological effects before and after the completion of four large-scale replication projects. We found that psychologists did update their beliefs; they updated as much as they predicted they would, but not as much as our Bayesian model suggests they should if they trust the results. We found no evidence that psychologists became more critical of replications when it would have preserved their pre-existing beliefs. We also found no evidence that personal investment or lack of expertise discouraged belief updating, but people higher on intellectual humility updated their beliefs slightly more. Overall, our results suggest that replication studies can contribute to self-correction within psychology, but psychologists may underweight their evidentiary value.

Fig. 1: Overview of study procedures.
Fig. 2: A visual example of calculating a Bayesian posterior.
Fig. 3: Summary of results for all studies.

Data availability

All data are available on the Open Science Framework.

Code availability

The algorithm for computing Bayesian posteriors is available on the Open Science Framework.
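The authors' actual algorithm is the one archived on the Open Science Framework; as an illustrative sketch only, belief updating about an effect size can be approximated with a standard conjugate normal-normal update, combining a prior belief (mean and uncertainty) with a replication estimate (effect and standard error). The function name and the example numbers below are hypothetical, not taken from the study.

```python
import math

def normal_posterior(prior_mean, prior_sd, obs_effect, obs_se):
    """Conjugate normal-normal update: combine a prior belief about an
    effect size with a replication estimate, both treated as normal
    distributions. Precisions (inverse variances) add; the posterior
    mean is the precision-weighted average of prior and observation."""
    prior_prec = 1.0 / prior_sd ** 2
    obs_prec = 1.0 / obs_se ** 2
    post_prec = prior_prec + obs_prec
    post_mean = (prior_mean * prior_prec + obs_effect * obs_prec) / post_prec
    post_sd = math.sqrt(1.0 / post_prec)
    return post_mean, post_sd

# Hypothetical example: prior belief d = 0.40 (SD 0.20);
# a replication estimates d = 0.05 (SE 0.10).
mean, sd = normal_posterior(0.40, 0.20, 0.05, 0.10)
print(mean, sd)  # posterior mean 0.12: pulled strongly toward the replication
```

Because the replication's standard error is smaller than the prior's uncertainty, it carries more weight, which is why a fully trusting Bayesian updater would move further toward the replication result than the surveyed psychologists did.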




Acknowledgements

We received funding from grant 1728332 from the National Science Foundation (A.M.T. and S.V.). The funders had no role in study design, data collection and analysis, decision to publish or preparation of the manuscript. The authors are grateful to K. Finnigan and J. Sun for their assistance in recruiting study participants; J. Miranda for helping upload and organize data files on the Open Science Framework; C. Ebersole, R. Klein and D. Simons for providing updates on the timelines for publication of Many Labs 2 and Many Labs 5; and W. Hart and D. McDiarmid for feedback on statistical analyses.

Author information




A.D.M., A.M.T. and S.V. designed the study. A.D.M., A.M.T. and C.M.W. developed stimuli and collected data. A.D.M. created the analytic plan and analysed data with contributions from A.M.T., P.E.S. and E.E.S. A.D.M. and A.M.T. wrote the manuscript, and all authors edited the manuscript and gave conceptual advice.

Corresponding authors

Correspondence to Alex D. McDiarmid or Alexa M. Tullett.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Peer review information Nature Human Behaviour thanks Barbara Spellman and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information


About this article

Cite this article

McDiarmid, A.D., Tullett, A.M., Whitt, C.M. et al. Psychologists update their beliefs about effect sizes after replication studies. Nat Hum Behav (2021).
