Invisible Rulers: The People Who Turn Lies into Reality. Renée DiResta. Public Affairs (2024).

Scientific institutions, public-health authorities and academics routinely face criticism and angry denunciations from ideologically motivated detractors who wish to bury inconvenient scientific evidence. With the rise of the Internet and social media, misinformation researchers, especially, have become targets for online partisan attacks (see Nature 630, 548–550; 2024). And in many countries, academics have to ward off political interference1.

Renée DiResta knows this only too well. A former research manager at the Stanford Internet Observatory (SIO) in California, she has been on the receiving end of online attacks for years, owing to her academic work combating misinformation about elections and vaccine efficacy. After a barrage of unsubstantiated accusations — including those levelled in a controversial investigation by the US House of Representatives’ judiciary committee, chaired by Republican congressman Jim Jordan — DiResta found that her research group at the SIO was suddenly dismantled in June, reportedly because of a change in institutional priorities.

In Invisible Rulers, DiResta documents her stormy personal and professional journey into what she describes as the “fantasy–industrial complex”. It’s an insightful account of how, over the past two decades, social-media influencers, algorithms and crowds have hijacked public debate on consequential topics — from vaccination campaigns to the validity of elections. The book’s central thesis is this: a few social-media propagandists increasingly have the power to profoundly shape public opinion. And the only maxim that seems to guide their actions is, as DiResta puts it: “if you make it trend, you make it true”.

The book’s title is a reference to public-relations pioneer Edward Bernays’s 1928 work Propaganda, which describes the ‘invisible’ people who fashion public sentiment — including public-relations experts and advertising executives. Today, that power can be in anyone’s hands.

Charismatic individuals with large online followings are the new invisible rulers. The most elite among them, DiResta writes, possess the storytelling skills of a leading marketing executive and the audience size of a television anchor, yet create the cosy, intimate feeling of a phone call with your best friend. They can also make immense profits, she notes, by pretending to be an ordinary person who is helping their audience to “break free of the lying mainstream media”.

Invisible Rulers is DiResta’s attempt to lay out the motivations and methods of these individuals, who, she explains, might project themselves as being anti-elite but are, in fact, a new breed of elite. They often wield enormous power while bearing no commensurate responsibility.

Alternate realities

DiResta’s own journey into the world of misinformation began as a concerned mother trying to work out why classroom vaccination rates were declining in California amid a measles outbreak in 2014. She documents how, after joining the vaccine debate in support of a state bill that sought to remove ‘personal belief’ as a valid ground for seeking exemption from mandatory vaccination programmes, she was deluged by online attacks from bots and trolls.

Although most children in California are vaccinated — signalling broad public consensus that vaccines are beneficial — DiResta describes the jarring experience of stumbling upon a seemingly alternate reality online.

There, she found a small yet vocal band of people promoting the idea that the government and pharmaceutical industry were colluding to cover up a supposed link between vaccines and autism — a decades-old argument that has been dispelled by research2,3.

Studies show that a growing minority of the US population now holds this sceptical view. Without intervention, anti-vaccination sentiment might dominate vaccine discourse on social media in the next decade4. Research also affirms DiResta’s contention that those who promote anti-vaccination rhetoric are organized and overlap with groups that champion other pseudoscience topics, such as unproven forms of alternative medicine and COVID-19 misinformation.

DiResta’s book shines a light on the why. Often, these influencers aren’t conventional celebrities, but ordinary citizens who talk about things that interest them. Such influencers typically don’t start out peddling rumours and disinformation. But some notice that, once they start talking about a certain controversial topic, they receive more engagement on social media. The more they talk about it, the more people ‘like’ and share what they have to say, leading algorithms to recommend their content even more.

Social-media influencers speak at a rally held by US presidential candidate Donald Trump. Credit: Al Drago/Bloomberg/Getty

The consequences of these misinformation spirals can be felt in the real world. For example, in August, violent riots engulfed the United Kingdom after the tragic stabbing of several young children. Among the triggers were false reports spread on social media — and amplified by far-right influencers — that the perpetrator was a Muslim asylum seeker who arrived in England by boat. The actual assailant was Christian, born in Cardiff and of Rwandan origin.

Because the rumour and its context were moral, emotional and shocking — qualities that help rumours to spread5 — the story received a lot of attention on social media. The trinity of influencers, algorithms and crowds had created an alternative reality, and that misinformation gave far-right groups the excuse they needed to exploit a tragedy and unleash violence across the country.

How a rumour is born

False rumours can have other nasty side effects, too. DiResta relates how her team was on the receiving end of such rumours while working as part of the Election Integrity Partnership, co-run by Kate Starbird, a computer scientist at the University of Washington in Seattle, who has also been a target of smear campaigns1. In March 2021, the team issued a public report documenting viral false and misleading narratives that had circulated online during the 2020 US presidential election (see go.nature.com/472ney8).

In late 2022, statements from that report were twisted by right-leaning social-media influencers, who put forward a fantastical story about how academics, social-media companies and the US Department of Homeland Security had colluded to skew the 2020 election by taking down “millions” of social-media posts — an alleged act of mass censorship (see go.nature.com/3ak4ih0).

In reality, DiResta explains, the project’s aim was not to censor partisan statements, but to fact-check misleading claims about the electoral process in general. Only a small proportion of posts — those containing blatant election disinformation — were flagged by the project to social-media companies for further action: about 0.01% of the 22 million posts in the sample. Fewer than 400 were eventually taken down for violating the platforms’ terms of service.

Nonetheless, DiResta became the subject of rumours and conspiracy theories, including that she had undisclosed ties to the US Central Intelligence Agency, on the basis that she had done an internship there 20 years before.

Of the multiple lawsuits that have been filed against her since these online rumours surfaced, one was dismissed by the US Supreme Court in June because the plaintiffs lacked legal standing.

As I read the book, much of it resonated with my own experience. I’ve found myself facing online accusations of being part of a government conspiracy, for example, for helping the US State Department to educate citizens to spot common techniques used in disinformation campaigns. As the online attacks continued, the motivation behind my research was misrepresented and harassment campaigns were launched against me, my colleagues and even my students. It was stranger than fiction.

However, I also wondered about the role of another class of actors in the fantasy–industrial complex: the apologists. Think of doctors trained in unrelated medical specialties who question the efficacy of vaccines, or philosophers who weaponize postmodern principles to question whether an identifiable category called ‘misinformation’ even exists. DiResta overlooks them, but academics who are congenial to the messages promoted by influencers can provide troubling intellectual cover for anti-scientific claims.

Prevention better than cure

In terms of solutions, DiResta offers a nuanced discussion of the roles of free speech, content moderation and education in our fractured media landscape. One suggestion is to give power back to the people and let audiences decide how much moderation and algorithmic ranking they want in their social-media feeds. Another is to teach the public about the techniques of propaganda, because those techniques can be used by anyone.

DiResta also offers an important tip for scientists facing political threats: instead of sticking your head in the sand, pre-emptively release the facts and prebunk falsehoods before an alternative reality takes on a life of its own. This coheres with what I know about fighting misinformation: prevention is better than cure. But to fix our societal ills, people need to share the same reality. DiResta’s book is a powerful and compelling read on how we might achieve just that.