Neil Johnson has analysed pro-ISIS networks on social media. Credit: Univ. Miami

Neil Johnson, a physicist at the University of Miami in Florida, studies patterns that emerge in complex systems. This month, he and his colleagues — who included social scientists and computer scientists — published two papers [1,2] discussing the results of an effort to track the connectivity of more than 100,000 people who follow online communities that support the Islamist terrorist organization ISIS. Using VKontakte, a large social-media network based in Russia, the researchers uncovered 196 pro-ISIS communities. Nature talked to Johnson about what he learned from the studies. This interview has been edited for length and clarity.

Why study pro-ISIS communities online?

I have always been interested in anything that involves extreme beliefs. There are many studies on the evolution of online extremism, but they tend to look at chatter between individuals, or at who follows whom on Twitter. The topic is full of rather vague theories about how people behave in groups, but there are not many quantitative analyses of what is actually happening.

Around the start of 2014, we began searching for keywords and hashtags to follow pro-ISIS sentiment on VKontakte, where pro-ISIS communities — or aggregates — tend to live longer than they do on Facebook, which shuts down their activity very quickly. Some of the communities live for just a day or two; others can be active for 4–5 months. By September 2015, many of the groups that we had looked at began to shut down or go private. We were very fortunate to be able to monitor the aggregates in the time we had.

What did you learn about the online behaviour of pro-ISIS communities?

We were surprised to find that 40% of followers declared themselves to be female. Women hold an unexpected position in the pro-ISIS networks — they tend to be centres of information flow between followers, and to increase the lifespan of the communities. They typically do not have similarly prominent roles in comparable networks from the everyday world, such as innovation networks for patents in industry and academia.
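
To make the idea of a "centre of information flow" concrete, here is a minimal Python sketch on a toy network of my own devising (not the papers' data or necessarily their exact metric). Betweenness centrality is one standard way to quantify how often a node sits on the paths that connect everyone else.

    # Toy illustration: betweenness centrality scores a node by how often it
    # lies on shortest paths between other nodes, one common way to quantify
    # a "centre of information flow". The network below is invented.
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([
        (1, 2), (2, 3), (3, 1),   # one tightly knit cluster
        (5, 6), (6, 7), (7, 5),   # a second cluster
        (3, 4), (4, 5),           # node 4 is the only bridge between them
    ])

    centrality = nx.betweenness_centrality(G)
    for node, score in sorted(centrality.items(), key=lambda kv: -kv[1]):
        print(node, round(score, 2))
    # Node 4 scores highest: every path between the two clusters runs through
    # it, so removing it would cut off the flow of information entirely.

In this toy example, node 4 plays the bridging role that the followers identifying as women tended to play in the aggregates we studied.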

The communities shifted very quickly: around 15% of the groups changed their names, and others adjusted their privacy settings, or disappeared and re-emerged under another identity but with most of the same followers. That's different from the communities that appear online in other situations — such as the protests we examined in Brazil in June 2013 and Venezuela in February 2014. The aggregates that appeared in those cases didn't adapt or shift in the same way, probably because no one was trying to disrupt them.

How can this research help those trying to shut down ISIS?

A key implication of our work is that once the communities are found, you have your hand on the pulse of the entire organization. Instead of having to sift through millions of Internet users and track specific individuals, an anti-ISIS agency can simply follow the relatively small number of communities. 

In light of the massacre in Orlando this week and the stabbing in Paris, our research also suggests that any online ‘lone wolf’ actor will only truly be alone for short periods of time: as a result of the coalescence process that we observe in the online activity, any lone wolf either was recently in an aggregate or will soon be in another one.

Pro-ISIS support will quickly grow into one large super-community — which could very rapidly spread material across the Internet — if anti-ISIS agencies aren't active enough in shutting the smaller groups down. But we show mathematically that it should be possible to prevent the development of large aggregates by breaking up smaller ones. For example, to prevent aggregates of 1,000 or more from forming in the future, agencies can focus on breaking up groups of, say, 100. Without these smaller pieces, the larger ones cannot develop.
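
To illustrate the logic, here is a toy coalescence sketch in Python. It is my own simplification, not the authors' published model: aggregates merge at random, with larger ones involved more often, and an optional policing rule shatters any group that reaches a chosen threshold back into individuals. With the rule switched on, the largest surviving aggregate stays below the threshold, so a super-community never forms.

    # Toy coalescence model (a simplified sketch, not the authors' equations):
    # aggregates merge at random, weighted by size; a policing rule shatters
    # any aggregate that reaches a threshold back into individuals.
    import random

    def simulate(n_agents=1000, steps=10000, shatter_threshold=None, seed=0):
        rng = random.Random(seed)
        sizes = [1] * n_agents                     # everyone starts alone
        for _ in range(steps):
            if len(sizes) < 2:
                break                              # everything has merged
            # pick two aggregates, weighted by size (bigger groups attract more)
            i, j = rng.choices(range(len(sizes)), weights=sizes, k=2)
            if i == j:
                continue
            a, b = (i, j) if i < j else (j, i)
            sizes[a] += sizes[b]                   # coalescence: merge b into a
            sizes[b] = sizes[-1]                   # swap-remove the emptied slot
            sizes.pop()
            # intervention: break up any group that has grown past the threshold
            if shatter_threshold and sizes[a] >= shatter_threshold:
                broken = sizes[a]
                sizes[a] = sizes[-1]
                sizes.pop()
                sizes.extend([1] * broken)
        return max(sizes)

    print("largest aggregate, no intervention:", simulate())
    print("largest aggregate, shattering at 100:", simulate(shatter_threshold=100))

Without the intervention, the sketch quickly collapses into a single aggregate containing everyone; with shattering at 100, no aggregate of 1,000 can ever appear because its building blocks are removed first.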

I am hesitant to say that monitoring the number of aggregates could help to predict when attacks might happen in the real world — but we did see that communities proliferated before bursts of real-world attacks.

At the start of our work, we shared our preliminary findings with some people in the intelligence community, but I suspect that they didn't feel sufficiently comfortable with our complex-systems approach. The community seems to be dominated by social scientists who favour narrative discussions and theories about human behaviour, whereas ours is data-driven. I hope this situation will change in the future, with natural, mathematical and social scientists working more closely together on such topics, as we did in our team.

Are you still following ISIS groups online?

Yes, but we also want to start looking at other groups that self-organize online around extreme beliefs, such as homophobia. In the future, extreme groups will undoubtedly emerge around some other set of common beliefs.