Psychologist Sander van der Linden talks to Nature about the science of misinformation.

In the latest episode of Nature hits the books, psychologist Sander van der Linden joins us to discuss his new book Foolproof, which focuses on misinformation and what can be done to prevent people being duped, particularly by the falsehoods found online.

We discuss how misinformation messages are crafted, why they can be hard to shake once someone is exposed, and how Star Wars: Episode III helped in the fight against them…

Foolproof, Sander van der Linden, Fourth Estate (2023)

Music supplied by Airae/Epidemic Sound/Getty Images.

Never miss an episode. Subscribe to the Nature Podcast on Apple Podcasts, Google Podcasts, Spotify or your favourite podcast app. An RSS feed for Nature Podcast is available too.

TRANSCRIPT

Benjamin Thompson

Hi, Benjamin Thompson here. Welcome to episode three of Nature hits the books, the show where I chat with authors about their science books. In this episode, I'm joined by Sander van der Linden, a psychologist from the University of Cambridge here in the UK. Sander's new book Foolproof focuses on misinformation, particularly the falsehoods that many of us have come across online. Misinformation can often be a simple mistake, but it can also be produced and posted with nefarious intent, to deliberately deceive someone. And this confusion can create serious problems, say around the safety of a vaccine or whether an election result is valid. In fact, early in the pandemic, the WHO's director-general Tedros Adhanom Ghebreyesus said the world wasn't just fighting a pandemic, it was fighting an infodemic of misinformation, conspiracy theories and fake news that needed to be tackled. Sander and I talked about a range of things: how misinformation messages are crafted; why they can be hard to shake once we're exposed to them; and how Star Wars: Episode III helped in the fight against them.

Benjamin Thompson

Sander, thank you so much for taking the time today.

Sander van der Linden

Pleasure to be on the show.

Benjamin Thompson

So your book is all about misinformation, which is different from disinformation. Could we sort of define what we mean when we're talking about misinformation?

Sander van der Linden

Yeah, so what I mean by misinformation is just any information that is either false or misleading in some way. Whereas I would say that disinformation is a subset of misinformation with some psychological intent to either deceive or harm other people. And so disinformation is a bit more nefarious.

Benjamin Thompson

And what are the sort of places that one would come across misinformation?

Sander van der Linden

Well, I think misinformation is everywhere, in the sense that, you know, there's wrong information, either according to scientific consensus, or fact checks, or even, you know, according to what we would normally refer to as bias or lacking context, whereas disinformation is often more targeted. It's manipulators trying to dupe people. And you can find that increasingly online.

Benjamin Thompson

And in terms of being a problem, it doesn't seem like it's a new one. In your book, you talk about campaigns in ancient Rome, for example, between Mark Antony and Gaius Octavian (beg your pardon if I'm not pronouncing that correctly). Why have you written this book now?

Sander van der Linden

It's true that, you know, people say, okay, 'well, misinformation has always been around'. But I think what's different now is obviously we're living in a time where misinformation is incredibly harmful. Examples abound. People have burned down 5G masts in the UK because they're linking conspiracy theories about 5G cell towers to the coronavirus. People in Iran have died because they've ingested methanol, because they thought it was a cure for the coronavirus, because they read it on social media. People stormed the Capitol in the US and Brazil. You know, misinformation on WhatsApp has caused mob lynchings. And so, you know, there's, I think, widespread concern about the consequences of misinformation in society. And to some extent, that is not, you know, entirely new. But I think what we have now is a massive amplifier, which is, of course, social media, which can really spread misinformation in a split second. And I talk about research that kind of shows how falsehoods can travel on social media faster and reach more people than factual information. And so I think, you know, misinformation really is growing at an accelerating rate online, which we didn't have before. You know, if somebody puts out false information that can actually harm people personally, or harm democracy, it can make its way around the world before the truth has even had a chance to put its pants on, right? And so that's, I think, the issue we're facing.

Benjamin Thompson

And is it possible to assess what the scale of misinformation is? Like, how often someone is exposed to it in a day, for example, or a week, or, you know, whatever it is?

Sander van der Linden

Yeah, absolutely. There are two ways of doing that. One is to ask people, and the other is to scrape data from the Internet and social-media platforms to try to quantify how often misinformation is shared. And it turns out there's a huge discrepancy. So when you look at surveys, most people say that they are exposed to misinformation multiple times a week. Then, when you look at what we find from digital data online, it tends to be much more infrequent than what people are saying. And there are two issues here. One is maybe people not remembering correctly, or not reporting correctly; the other is that we don't have access to the right amount of data to make accurate estimates. And I think that has to do with the fact that most studies make these estimates based on single platforms, like Facebook or Twitter, in a limited period of time, for a limited number of people. And then you get an estimate that, okay, well, people were exposed to a few articles during the election, or during a year, or during a week, that were fake. And it hugely depends on your definition of misinformation, which differs across studies. So if you have a study which uses the term 'fake news' in a very technical, purely fake sense, totally fabricated, then you're going to see that the prevalence is very low. If you use a study that says 'misinformation, including biased and misleading news', then all of a sudden that becomes a much larger fraction of misinformation exposure. And that still doesn't account for the cumulative effect of all the technology we're on, both offline and online. And those estimates don't actually exist. And that makes me concerned that we're hugely underestimating the problem.

Benjamin Thompson

I mean, you talk there about some research into misinformation. You yourself are a researcher who's studied this for a long time. What, to your mind, are some of the key things that researchers have learned about misinformation?

Sander van der Linden

Well, I think one of the key things that we've learned, and this is a really robust finding, is that repetition tends to increase the persuasiveness of misinformation. And so if you keep repeating a falsehood, it creates associations in the brain, and the brain mistakes what we call fluency for truth. And fluency has to do with the ease with which a piece of information can be processed. So if I repeat something very often to you, it becomes familiar, and it becomes easier for the brain to process. So literally, that means the brain is faster at processing things you know than things you don't know. And that is taken as a signal that something is likely to be true. That's why, when you repeat a lie so often, and we can all think of examples of, let's say, political elites who've done that, it's so successful: it creates these automatic associations for people, and it starts to feel true. And to some extent, people even find it less morally objectionable to start posting it later on, because they've heard it so often it almost seems to feel natural. This is what we call illusory truth. And so, you know, it's hard when you do psychology to say something that's universally true for all people. But our best estimates suggest that it affects about 75% of people, and even children, you know, from the age of five.

Benjamin Thompson

And this isn't a new area of research, right? There's been studies into this for decades.

Sander van der Linden

Yeah, absolutely. So people have always wondered how people deal with incorrect information. So that's definitely true. But if you look at some of the, you know, the biggest leaps: during the 1960s and the 1970s, we didn't have social media. And so there was a crucial element lacking there in terms of research. So we've come a long way in trying to understand how misinformation on social media affects people, maybe differently from misinformation that is circulated in other ways. And so that's, you know, the interaction between algorithms and information. One of the things we didn't know back then is that you can now micro-target people based on their digital footprint. So we leave traces behind on the Internet, you know, every time we click on something. People can collect that stuff and build models that are quite predictive in trying to understand your behavior, and then target you with personalized ads, so that the ads people see during elections and other situations are somewhat unique to them. And people don't know that. And so, in some ways, it can undermine people's informed decision-making, particularly if the information that's targeted at you is incorrect. And so we have these new, what are called 'weapons of mass persuasion' that didn't exist a long time ago. So I think it has been a really fascinating area of research. And as I say in the book, what's interesting is that psychology has spent so much time over the years, since the 1960s, doing research on how to persuade other people. And there isn't actually that much research on how to resist persuasion when we don't want it, or when it's targeted and malicious. And that's what has fascinated me: this area of how we can help people identify and resist persuasion, which seems more relevant now than ever, when it's coming at us from all different directions, on all different sides, in all different modes.

Benjamin Thompson

And on that, then, what makes a good misinformation campaign? I don't necessarily want to give people any ideas, of course, but what do misinformation campaigns have in common?

Sander van der Linden

It turns out that there are actually these discernible tropes that are used over and over again, that you can identify. We know that emotional content is more likely to go viral on social media, especially negative emotional content. And so one strategy that a lot of misinformation producers use is to appeal to anxiety and fear in the crafting of their messages, because that's what gets people going. That's what gets shared. That's what's exciting. You know, boring fact-checks don't go viral. They don't have sensationalist headlines. They don't have emotive words. But then, as we talk about in some of our interventions, you can't be too ridiculous. So people always talk about conspiracy theories. And, you know, the reason why they're so psychologically attractive is because they use the sort of clever tricks that the mind is partly predisposed to want to accept. And one of these tricks is to reinterpret randomness. The brain loves connecting the dots and finding patterns, and we're quick to make causal inferences. So here's an example: 5G phone masts and coronavirus outbreaks, right? This is a correlation. Most people understand that: two things are happening at the same time. Now, you can easily wrap that into a convincing causal story, that the phone masts are the cause of coronavirus outbreaks. And you can link that to past health concerns people have had about phone masts, that they might cause cancer and other sorts of things. Of course, what's really happening here is that it's explained by a third variable, which is population density: where there are more people, there are more phone masts, and more coronavirus outbreaks. But it's easy to trick the brain into wanting to see that pattern, especially if you wrap it up into a convincing story. And smart manipulators use a grain of truth to get their message out. A favorite example of mine is the 'Birds Aren't Real' satire, because it actually leverages some of these concepts. So, for people who don't know, there's this fake conspiracy theory that birds aren't real, that in fact they're government drones. And I think they could have made this into a real conspiracy if they'd made it a bit less ridiculous. So one adaptation that I would use as a manipulator is that I wouldn't say that all birds are drones, because that can be easily falsified. You want to say that only some birds are drones, and then you can go into patterns. Like, what do they look like? And, you know, how are the government drone-birds that are spying on you different from regular birds? Then you can have whole sub-conspiracies about that. It's a joke, but it taps into deep-seated worries about privacy, about control. And so a lot of disinformation producers leverage those fears and anxieties, and those tricks that get traction with misinformation.
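To make the third-variable point concrete, here is a minimal Python sketch. Every number in it is invented for illustration, and the linear relationships are assumptions rather than epidemiological estimates; the point is only that when population density drives both mast counts and case counts, the two become strongly correlated even though neither causes the other, and the correlation largely vanishes once density is held fixed.

```python
# Toy simulation of the confounding described above: population density (the
# third variable) drives both 5G mast counts and coronavirus case counts,
# producing a spurious mast-case correlation. All numbers are made up.
import random

random.seed(42)

density, masts, cases = [], [], []
for _ in range(1000):
    d = random.uniform(100, 10_000)               # people per km^2 (invented)
    density.append(d)
    masts.append(d * 0.01 + random.gauss(0, 5))   # masts scale with density
    cases.append(d * 0.05 + random.gauss(0, 25))  # cases also scale with density

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r_mc = pearson(masts, cases)     # looks as if masts "cause" cases
r_md = pearson(masts, density)
r_cd = pearson(cases, density)

# Partial correlation of masts and cases, controlling for density:
r_partial = (r_mc - r_md * r_cd) / ((1 - r_md**2) * (1 - r_cd**2)) ** 0.5

print(f"masts vs cases, raw:                r = {r_mc:.2f}")       # near 1
print(f"masts vs cases, density held fixed: r = {r_partial:.2f}")  # near 0
```

The same partial-correlation check, run on real data, is a standard first test when a confounder is suspected behind a striking correlation.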

Benjamin Thompson

Which brings us on to what to do about it, and you put some ideas forward. But in an early chapter, you say that what you're putting forward isn't a way to persuade people about what is true and what isn't, it's about inoculating them against misinformation. And this theme of, you know, vaccinating people against the virus of misinformation runs right through your book. And you're saying that just debunking a falsehood with facts doesn't cut it anymore?

Sander van der Linden

Yeah. And it's important people don't misunderstand: I think fact-checking and debunking are clearly important, because they send a signal that somebody's out there keeping track and holding people accountable. But unfortunately, as you said, it's not sufficient, because we know that people continue to rely on falsehoods even when they acknowledge a correction. This is what we call the continued influence of misinformation. And that, you know, has to do with memory networks in the brain; they kind of operate like a spiderweb of links. And so when you integrate incorrect knowledge into your mental model, it kind of spreads everywhere, and it leaves traces behind in lots of the different concepts that we have. So when you're trying to undo it, it's like a game of whack-a-mole: when you deactivate one node, another falsehood pops up somewhere else. And so even when people know that something isn't true, they continue to use it. And that's how difficult it is to get rid of some of these falsehoods. And that's why we focus on this idea of inoculation, or pre-bunking, trying to get ahead of the situation. And that's what I introduce in the final part of the book as the potential solution that we're looking at now, which is really premised on the vaccination analogy. So just as vaccines introduce a weakened dose, or an inactivated strain, of a virus into the body to jump-start the production of antibodies and help confer resistance against future infection, it turns out you can do the same with misinformation. You know, initially this was a bit of a controversial idea, because the idea really is that instead of giving people more facts, and so on, you actually expose people to a weakened dose of the misinformation that they might come across in the future, or of the techniques used to spread misinformation, and refute those in advance, so that people can build up cognitive or intellectual antibodies and resist it in the future.

Benjamin Thompson

And in broad terms, what does a pre-bunking campaign look like, then? What does it consist of?

Sander van der Linden

So, to give you a practical example of a pre-bunking campaign: we conducted one with Google on YouTube. You know, YouTube doesn't have headlines, but it does have misinformation, right? And one of the rhetorical techniques political gurus often use to lure people into more extremist modes of thinking is what's called a false dilemma, or a false dichotomy, which is very prevalent in misinformation, in the sense that they present two options to people and pretend that those are the only two options when, in fact, there are many more. So, you know, an example would be that 'we can't address immigrants and their needs before fixing domestic homelessness issues'. I mean, of course there's, you know, a constraint on resources, but most of the time we can actually work on multiple problems at the same time, right? And so all the nuance gets lost. Extremists use this a lot. And that's a very common trick, also, in the production of misinformation. And so what we wanted to do is actually pre-bunk this technique, so that regardless of the example that people may see, whether it's a conspiracy theory that uses a false dilemma, or a politician who is trying to influence people in ways that are not very, let's say, accurate, people can then spot it. So what we did is expose people to a very short animated video, about a minute and a half, that pre-bunks this technique, in the ad space on YouTube. And in the video, the inactivated strain here is a clip from Star Wars. I'm not sure if your audience is familiar with this particular installment, but it's Revenge of the Sith, and the video cuts to Anakin Skywalker and Obi-Wan Kenobi. Anakin says "Either you're with me, or you're my enemy," upon which Obi-Wan replies, "Only a Sith deals in absolutes." And then the narrator explains the false dilemma that's going on here, right? It's not just two options. And how bad actors can use that to dupe us into endorsing manipulative content. We were able to scale that to millions of people, and found that when we later presented people, within 24 hours, with a little test that contained misinformation using this technique, they were better able to identify it and neutralize it.

Benjamin Thompson

I mean, you're by no means the only group researching this pre-bunking way of getting ahead of misinformation. What have other studies shown about the efficacy of this method?

Sander van der Linden

So, when you look at all of the evidence, and, you know, there are more and more groups now studying this phenomenon, replicating the findings and probing the boundary conditions to see what works and how, overall, I think what it shows is that pre-bunking is a good first line of defense: it can help people spot the techniques of manipulation in advance, and it can help people identify misinformation before they encounter it. One major obstacle that we're facing is actually examining behavior in social-media settings. So what do people do after they've been vaccinated? Do they go on to share less misinformation? Do they click on fewer things? There was one new study, conducted by another group, that actually used a simulated social-media platform. And after they had inoculated people, they found that people liked and shared misinformation less. And of course, we need to replicate that on an actual social-media platform. But, you know, that requires long-term cooperation and negotiations with social-media companies. I mean, they are implementing this approach, and they are scaling it to millions of people. One of the things is that they want to test it on their own; I mean, they're doing this research too.

Benjamin Thompson

One thing that people might come back to you on this, Sander, is to say that, really, aren't you just teaching folk critical thinking? To not take things at face value? What would you say to that?

Sander van der Linden

Well, I don't mind if people refer to it as critical thinking. But at the conceptual level, I do think that there's a difference. And one way to illustrate that is with what I call 'the magic show analogy'. So the first time you go see an illusionist, or a magic show, you might be totally duped by how it works. And then the traditional critical-thinking approach is to give you maybe a blueprint of how it works at a technical level. And then you have to engage your analytical skills and really wonder about how this works. Or you can let people play around with the weakened dose of it, by letting people step into the shoes of the magician for a while and figure out how the trick works on their own, so that they're definitely not going to be duped by it again. And that's kind of more analogous to the inoculation situation where you get people to put themselves in the shoes of a manipulator and figure out how the trick works on their own from experience. And that's, I think, how I like to explain the difference.

Benjamin Thompson

And does this idea of vaccinating yourself, and you also talked about maybe teaching or imparting these skills to your friends and neighbors, for example, does that put the responsibility for this on regular folk, rather than on the social-media companies that are showing this sort of content?

Sander van der Linden

Yeah, that's a great question. So, working with a lot of these companies, we do sometimes worry that, at the end of the day, they'll use this as an excuse not to do the hard things, like taking down content, or enforcing regulation, or making changes to their algorithms, because they can just say, 'look, we can empower people and pre-bunk everything', right? And I think that, even though that's going to be helpful, it doesn't mean that they still don't have to do the hard things. I think, you know, sometimes the most effective thing you can do is just to de-platform a repeat offender spreading harmful misinformation. And some people don't like those solutions, because they cut into free speech and, you know, moderation and things like that. And so, for those reasons, I think it's helpful to have other solutions, like pre-bunking. And, you know, it's funny, sometimes people say, 'well, why are you working with governments or social-media companies?' But I think the thing for us is, if they're implementing our findings and helping people spot these general techniques (scapegoating groups, false dichotomies, polarization, conspiracy theories), then if these entities ever engaged in dubious tactics themselves, they've only invested in making their users immune to their own tactics. And so, for us, it's kind of a win-win situation.

Benjamin Thompson

Obviously, the advent of social media has changed the game, right, in the way that misinformation can be shared. What are some of the hurdles to learning about it? Because I guess the algorithms behind the scenes at these companies might have changed since we started talking, right? It's impossible for us to know. So how hard is it to get meaningful data to study? And are the things that you have found and shown, and that other groups have shown, already out of date, like, since yesterday, kind of thing?

Sander van der Linden

Well, it's not moving that fast, but I do think that it is a moving target. And so we're definitely always behind the curve in trying to understand what these algorithms are doing, often with little help from these companies, right? So we have to either build our own simulations, or we have to scrape what's publicly available. And so it takes a lot of effort, a lot of time, a lot of resources for independent scientists trying to ascertain what goes on on these platforms, and we're always slightly behind the curve. So that's frustrating. And ultimately, I think it's all very unequal. You know, we have access to certain data and can do experiments on certain social-media platforms because we've spent years trying to build relationships with them, and trust and mutual cooperation, but other researchers don't, and they don't have access at all. So it's a very unequal environment for research. And we're trying to democratize this. We're actually trying to get all of these companies in the same room soon to think about how we can diversify and democratize access to social-media research.

Benjamin Thompson

And briefly, I mean, what's the future of misinformation? Because since your book came out, obviously, we have these large language models, chat AIs, that can pump out content, and we have deep-fake videos that are becoming easier and easier to make, for example. So where does it go from here, do you think?

Sander van der Linden

Ultimately, for the most part, I think we're going to be better off trying to focus on the larger techniques that are used to mislead people, because that's a much more stable property that we can target. Sure, we can teach people some ways to spot deep fakes. But with more sophisticated technology, that's going to change. And so we need a long-term way to actually inoculate people, and the solution is probably not going to be about the deep fake itself, it's going to be about the context. And the same is true for ChatGPT as well. Manipulation is often apparent from the context in which something appears. And that kind of comes back to these larger manipulation techniques, which are going to provide the clues for how reliable we should find something. Because I do think that, in the future, deep fakes are going to be so realistic that it's almost going to be impossible to tell whether something is fake or not. I mean, we made some in my lab the other week, and I couldn't tell which were real and which were not. GPT can generate, you know, human-like text very well. And so what's more useful, I think, is the larger context in which all of this is deployed.

Benjamin Thompson

Finally then, can you envision a scenario or a world where it is possible to escape this stuff? Because even you, yourself, in your book, say that you've been caught out and shared stuff that turned out not to be true, for example. So is there a world where we can turn this around, do you think?

Sander van der Linden

Yeah, I mean, I do talk in the book about how I was duped by a fake NASA video of the Martian wind that, it turns out, wasn't live. And I think that's an example of the fact that anyone can be duped. But I'm not sure if we can turn this around. I do think that we can build resilience in this kind of post-truth, sort of, era. So what I'm saying is that we can build resilience to manipulation in a way that's going to make it manageable. And maybe that's the more realistic expectation. That requires what I call a multi-layered defense system. So you start off with pre-bunking and inoculation as the first line of defense. You know, if something falls through the cracks, if people are still fooled by a deep fake or, you know, something like that, we need real-time fact-checking happening at a large, coordinated level. And if people somehow don't see the fact-check, then we need the most effective ways we know to debunk something after the fact. And we also need to change the incentives on social media at the same time, to promote accuracy rather than misinformation. When we have all of that in place, I think we can manage it. That's kind of what I'm thinking. The problem is that we're a fairly long way off from having that multi-layered defense system. But I think we're making progress on all of those fronts, and we can create solutions at both the systemic and the individual level. And once we're there, because some of the challenges, you know, that we've seen around information warfare are looming, we can hopefully manage it. I don't know if that makes me an optimist or a pessimist compared to you, but I like to think that that's fairly optimistic.

Benjamin Thompson

Sander van der Linden thank you so much.

Sander van der Linden

Pleasure.

Benjamin Thompson

Sander van der Linden there. If you want to read more on mis- and disinformation, check out his book Foolproof. And that's it for episode three of Nature hits the books. If you have any feedback on the show, why not send an email to podcast@nature.com with the subject line 'Nature hits the books'. Otherwise, look out for the next episode later in the year. The music used in this episode was 'To Clarity' by Airae, via Epidemic Sound and Getty Images. I'm Benjamin Thompson. Thanks for listening.