
  • Perspective

Misunderstanding the harms of online misinformation


The controversy over online misinformation and social media has opened a gap between public discourse and scientific research. Public intellectuals and journalists frequently make sweeping claims about the effects of exposure to false content online that are inconsistent with much of the current empirical evidence. Here we identify three common misperceptions: that average exposure to problematic content is high, that algorithms are largely responsible for this exposure and that social media is a primary cause of broader social problems such as polarization. In our review of behavioural science research on online misinformation, we document a pattern of low exposure to false and inflammatory content that is concentrated among a narrow fringe with strong motivations to seek out such information. In response, we recommend holding platforms accountable for facilitating exposure to false and extreme content in the tails of the distribution, where consumption is highest and the risk of real-world harm is greatest. We also call for increased platform transparency, including collaborations with outside researchers, to better evaluate the effects of online misinformation and the most effective responses to it. Taking these steps is especially important outside the USA and Western Europe, where research and data are scant and harms may be more severe.
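The claim that consumption is concentrated "in the tails of the distribution" can be made concrete with a simple statistic: the share of total exposure accounted for by the heaviest consumers. The sketch below illustrates this on synthetic, heavy-tailed data only; the function name `top_share` and the simulated exposure counts are illustrative assumptions, not data from the studies reviewed here.

```python
import random

def top_share(exposures, fraction=0.01):
    """Share of total exposure accounted for by the top `fraction` of users."""
    ranked = sorted(exposures, reverse=True)
    k = max(1, int(len(ranked) * fraction))
    total = sum(ranked)
    return sum(ranked[:k]) / total if total else 0.0

# Synthetic heavy-tailed exposure counts (Pareto-like), for illustration only.
random.seed(0)
users = [int(random.paretovariate(1.2)) for _ in range(100_000)]

print(f"Top 1% of users account for {top_share(users, 0.01):.0%} of exposure")
```

With a distribution like this, a small fraction of users accounts for a disproportionate share of exposure, which is the pattern the empirical literature reviewed here reports for false and extremist content.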


Fig. 1: Academic research on the use of social media by country.





Author information

C.B., B.N., D.M.R., E.T. and D.J.W. wrote and revised the paper. D.M.R. collected the data and prepared Fig. 1.

Corresponding author

Correspondence to David M. Rothschild.

Ethics declarations

Competing interests

The authors declare no competing interests, but provide the following information in the interests of transparency and full disclosure. C.B. and D.J.W. previously worked for Microsoft Research and D.M.R. currently works for Microsoft Research. B.N. has received grant funding from Meta. B.N. and E.T. are participants in the US 2020 Facebook and Instagram Election Study as independent academic researchers. D.J.W. has received funding from Google Research. D.M.R. and D.J.W. both previously worked at Yahoo!.

Peer review

Peer review information

Nature thanks Stephan Lewandowsky, David Rand, Emma Spiro and the other, anonymous, reviewer(s) for their contribution to the peer review of this work.

Additional information

Publisher’s note Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Budak, C., Nyhan, B., Rothschild, D.M. et al. Misunderstanding the harms of online misinformation. Nature 630, 45–53 (2024).




