Abstract
Self-correction—a key feature distinguishing science from pseudoscience—requires that scientists update their beliefs in light of new evidence. However, people are often reluctant to change their beliefs. We examined belief updating in action by tracking research psychologists’ beliefs in psychological effects before and after the completion of four large-scale replication projects. We found that psychologists did update their beliefs; they updated as much as they predicted they would, but not as much as our Bayesian model suggests they should if they trust the results. We found no evidence that psychologists became more critical of replications when it would have preserved their pre-existing beliefs. We also found no evidence that personal investment or lack of expertise discouraged belief updating, but people higher on intellectual humility updated their beliefs slightly more. Overall, our results suggest that replication studies can contribute to self-correction within psychology, but psychologists may underweight their evidentiary value.
Data availability
All data are available on the Open Science Framework (https://doi.org/10.17605/OSF.IO/JTP4B).
Code availability
The algorithm for computing Bayesian posteriors is available on the Open Science Framework (https://doi.org/10.17605/OSF.IO/Y5N3F).
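For intuition about what such a posterior computation involves, the sketch below applies Bayes' rule to a single binary question (is the effect real?) given a replication outcome. This is an illustration only, not the archived algorithm: the function name and the prior, power and alpha values are hypothetical placeholders.

```python
# Illustrative sketch only: a minimal Bayes-rule update of belief in an effect
# after observing a replication outcome. The authors' actual posterior algorithm
# is the one archived at the OSF link above; the numbers used here are
# hypothetical placeholders.

def posterior_belief(prior: float, power: float, alpha: float, replicated: bool) -> float:
    """Return P(effect is real | replication outcome) via Bayes' rule.

    prior: P(effect is real) before seeing the replication result.
    power: P(significant replication | effect is real).
    alpha: P(significant replication | no effect), i.e. the false-positive rate.
    replicated: whether the replication returned a significant result.
    """
    if replicated:
        like_real, like_null = power, alpha
    else:
        like_real, like_null = 1.0 - power, 1.0 - alpha
    numerator = like_real * prior
    return numerator / (numerator + like_null * (1.0 - prior))


if __name__ == "__main__":
    # Example: starting from a 70% prior belief, a well-powered replication
    # (90% power, alpha = 0.05) that fails to find the effect lowers the
    # rational belief to roughly 20%.
    print(round(posterior_belief(prior=0.70, power=0.90, alpha=0.05, replicated=False), 3))
```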
Acknowledgements
We received funding from grant 1728332 from the National Science Foundation (A.M.T. and S.V.). The funders had no role in study design, data collection and analysis, decision to publish or preparation of the manuscript. The authors are grateful to K. Finnigan and J. Sun for their assistance in recruiting study participants; to J. Miranda for helping upload and organize data files on the Open Science Framework; to C. Ebersole, R. Klein and D. Simons for providing updates on the timelines for publication of Many Labs 2 and Many Labs 5; and to W. Hart and D. McDiarmid for feedback on statistical analyses.
Author information
Contributions
A.D.M., A.M.T. and S.V. designed the study. A.D.M., A.M.T. and C.M.W. developed stimuli and collected data. A.D.M. created the analytic plan and analysed data with contributions from A.M.T., P.E.S. and E.E.S. A.D.M. and A.M.T. wrote the manuscript, and all authors edited the manuscript and gave conceptual advice.
Ethics declarations
Competing interests
The authors declare no competing interests.
Additional information
Peer review information Nature Human Behaviour thanks Barbara Spellman and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.
Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Supplementary information
Supplementary Data 1–10 and Methods.
About this article
Cite this article
McDiarmid, A.D., Tullett, A.M., Whitt, C.M. et al. Psychologists update their beliefs about effect sizes after replication studies. Nat Hum Behav 5, 1663–1673 (2021). https://doi.org/10.1038/s41562-021-01220-7
This article is cited by
- Considering the Purposes of Moral Education with Evidence in Neuroscience: Emphasis on Habituation of Virtues and Cultivation of Phronesis. Ethical Theory and Moral Practice (2024)
- A meta-analysis of correction effects in science-relevant misinformation. Nature Human Behaviour (2023)
- Predictors and consequences of intellectual humility. Nature Reviews Psychology (2022)
- Can scientists change their minds? Nature Human Behaviour (2021)