In this episode:

How tweaking social media algorithms affects polarization

Societies are becoming increasingly polarized, with people reportedly shunning those with differing political views. Social media is often thought to be exacerbating these divides by creating echo chambers and filtering out dissimilar views. Many hoped that tweaking the algorithms that drive these platforms could reduce polarization. But a group of studies shows that such changes have little to no effect on polarization, implying that solutions to this issue are trickier than previously thought.

Research Article: Nyhan et al.

News and Views: Influence of Facebook algorithms on political polarization tested

News: Tweaking Facebook feeds is no easy fix for polarization

Subscribe to Nature Briefing, an unmissable daily round-up of science news, opinion and analysis, free in your inbox every weekday.

Never miss an episode. Subscribe to the Nature Podcast on Apple Podcasts, Google Podcasts, Spotify or your favourite podcast app. An RSS feed for the Nature Podcast is available too.

TRANSCRIPT

Nick Petrić Howe

Hello there, Nick Petrić Howe here, coming to you, somewhat unusually, on a Thursday. We have departed from our usual schedule to bring you a special report about some work in Nature that has only been published today, along with a few accompanying papers in Science. And it's all about social media. I'm sure that you're familiar with the idea that social media can polarise people, with impacts on everything from politics to family gatherings. But how much are these sites really impacting our views? Reporter Adam Levy has been finding out.

Adam Levy

From MySpace to TikTok, social media has changed the way we connect with those around us — allowing us to do everything from staying in touch with old friends to forming new bonds across the globe. But simultaneously, social media could also divide us. This possibility is known as “social media polarisation”.

Jaime Settle

The idea, mostly, is that by using social media, people become either more extreme in the policy attitudes or political preferences that they have, or that they become more judgmental and more stereotyped about the political outgroups that they're not a part of.

Adam Levy

This is political scientist Jaime Settle. She and her collaborators have a study out in this week's Nature, and there'll be more on that study, and the unusual team behind it, in a moment. Now, connected to concerns of polarisation on social media is the notion of echo chambers. This is the idea that we naturally surround ourselves with like-minded views, which could be exacerbated by social media, meaning that we rarely hear differing opinions. And as social media has become a bigger and bigger part of our lives, many researchers have become concerned that echo chambers and polarisation on these platforms could be having profound impacts.

Jaime Settle

That could not only affect how we communicate on those platforms themselves, but also the attitudes that we develop towards each other and our political preferences, and that could then leak back into the real face-to-face world, with implications for the way we vote, and even the way that we interact with our fellow citizens.

Adam Levy

But the truth is that while there has been plenty of speculation on how widespread echo chambers are, and on their impacts on our views and our actions, such questions have been hard to answer. But now researchers from Meta, the parent company of Facebook, and independent US researchers, like Jaime, have teamed up to try to provide some solid evidence.

Jaime Settle

So this was a very exciting collaboration to be a part of; this project has been more than three years in the making.

Adam Levy

So what did this collaboration find? Well, the first question the team asked was just how prevalent echo chambers are on the social media platform Facebook.

Jaime Settle

We were able to use observational data from the entire population of active adult Facebook users, that was about 231 million users. Most people see most of their content from like-minded sources. But at the same time, a much smaller proportion of users get the vast majority of their content from like-minded sources. We think that this idea that, you know, there are rampant echo chamber problems on social media is overblown. Yes, people are seeing a lot of content from sources that share their point of view, but it's not the overwhelming majority.

Adam Levy

This result gives us a sense of the information we receive on Facebook. But what about the effects that this information has on us? Well, the team investigated that too, running a three-month test with over 20,000 Facebook users during the 2020 US election, where they changed the content that those people received.

Jaime Settle

And we were able to reduce their exposure to this information from like-minded sources by about a third. But it didn't have any of the anticipated effects on changing people's attitudes that are really the driving concern of this echo chamber problem.

Adam Levy

So profoundly changing the algorithm, and thus the content that Facebook users received, at this crucial political moment did not profoundly, or even measurably, alter the attitudes of the participants. This is tough news for those who had thought that such tweaks to social media could heal deep divides in societies. Social and behavioral scientist David Garcia — who wasn't involved in the study — was taken aback by these results.

David Garcia

So first, of course, I was, like, disappointed. I had more hope for what an algorithm could do to a person's attitudes. But then when I thought more about it, trying to change people's attitudes just because the algorithm is different, it's a bit of wishful thinking. And we did have this wishful thinking before. And this experiment is showing us that the algorithm only does so much. And solutions that go through technology alone, and only through changing algorithms, are not going to make society less polarised. We need more than that.

Adam Levy

As surprising as these results are, though, David remains impressed by just how robust the study is.

David Garcia

So if you'd asked me a while ago whether a collaboration like this between academics and Facebook was going to be successful, and that they would actually get to publish something like this, I wouldn't have believed it. I always thought that the company would have an issue with this. I don't know how this could have been done better, honestly.

Adam Levy

And this collaboration has done a host of other work. Some, Jaime tells me, is still to come later this year. But the team are also publishing studies in the journal Science at the same time as this Nature study.

David Garcia

Two of them are similar experiments in the sense that they changed the algorithm, and those also didn't have an effect on polarisation. So there's, like, three possible solutions that didn't work in this case for polarisation.

Adam Levy

So these studies didn't find ways to reduce polarisation, but they have their limitations. For example, the researchers deliberately targeted the 2020 US election for their investigation because of the importance of polarisation to such events. But it could be that less political periods yield different results. On top of that, other social media platforms use different algorithms, which may have different effects on their users. Plus, focusing on the US may yield unusual results.

David Garcia

It could happen that in a place that is not so polarised, changing the algorithm could actually have an effect. It is not far-fetched to think that the results could be different in another country.

Adam Levy

For both David and Jaime, these results aren't saying that social media algorithms are irrelevant to causing, or addressing, division in society. Rather, the work indicates that there's still so much we simply don't know about social media's effects. And that healing divides may be a more nuanced task than some previously believed.

Jaime Settle

And so we're hoping that these findings, which many people will find counterintuitive, will help push the field towards better, clearer definitions of what the problem might be. There's a lot of room for further study in this area to better understand what other changes are possible, and what other kinds of changes you can make to the way that users interact with and engage with content on social media that do have the desired effects on their psychology.

Adam Levy

Future work could address a host of questions about social media's impact on our lives, investigating other ways these platforms connect and divide us, or even investigating how interventions could influence the mental health of users. But answering many such questions requires more collaborations like this one, between researchers at social media companies and at independent universities. So how can such unusual collaborations become far more usual?

David Garcia

So my first approach would be to have regulations where this is kind of mandatory for a platform, basically, that scientists can come with one of these questions and the platform can't really say no. And they have to fulfill a social responsibility the same way a car company needs to pass particular tests on whether they're polluting or not. As I said, a platform would have to pass particular tests that academics propose. There's so much that can be done with academics, and not just inside the company.

Nick Petrić Howe

That was David Garcia of the University of Konstanz, in Germany, and Jaime Settle from the College of William and Mary in Williamsburg, in the US, talking to Adam Levy. For more on that story, check out the show notes for some links. I'm Nick Petrić Howe. Thanks for listening.