Published online 4 April 2011 | Nature 471, 566-568 (2011) | doi:10.1038/471566a

News Feature

Social science: Web of war

Can computational social science help to prevent or win wars? The Pentagon is betting millions of dollars on the hope that it will.

If George Clooney stars in a movie, will it be a hit? Or will it flop, like his 2008 comedy Leatherheads?

That question, at least in broad outline, made its way to Ken Comer, deputy director for intelligence at the Joint Improvised Explosive Device Defeat Organization (JIEDDO) of the US defence department, and the man at the centre of the US military's war on roadside bombs. He recently made time for a briefing by scientists from the US Department of Energy, who were honing their modelling skills by working with a film studio on the formula for a successful blockbuster.

Comer listened to them describe how they had analysed and reanalysed the data that Hollywood vacuums up about its audiences, slicing the results in every way they could think of, only to come to the same conclusion every time: you can't tell. "You can dress George Clooney up," recalls Comer, "you can dress him down, you can put a beard on him, and yet there's no reliable method for predicting whether or not a George Clooney movie is going to be a blockbuster."

And that, says Comer, is a perfect illustration of why the Department of Defense (the Pentagon) is rethinking its data-mining approach to the problem of roadside bombs — not to mention counter-insurgency and other aspects of warfare. "I speak as one who has been swimming in data for three years," he says, referring to the reams of information that the department has compiled about roadside bombs after nearly a decade of war in Iraq and Afghanistan: "Data is not our solution to this problem."

Instead, Comer and other officials are placing their bets on a new generation of computer models that try to predict how groups behave, and how that behaviour can be changed. This work goes under a variety of names, including 'human dynamics' and 'computational social science'. It represents a melding of research fields from social-network analysis to political forecasting and complexity science.

Figures on total funding for this work are difficult to come by. But one of the field's major supporters, the Office of the Secretary of Defense, is planning to spend US$28 million on it in 2011, almost all on unclassified academic and industrial research. And separate computational social-science programmes are being funded by bodies such as the Defense Advanced Research Projects Agency (DARPA), the Defense Threat Reduction Agency and the research arms of the Army, Navy and Air Force.

The Pentagon's embrace of this work has been so enthusiastic that some scientists have urged a slow-down, for fear that such nascent models will be pushed into operation before they are ready.

In their current state of development, says Robert Albro, an anthropologist at American University in Washington DC, the models are often a waste of time. "I am not saying that computational social science is voodoo science," says Albro, a member of the American Anthropological Association's Commission on the Engagement of Anthropology with the Security and Intelligence Communities. "I'm saying that voodoo science is all too frequently being generated from the work of computational social science."

Cloudy, with an 80% chance of war

One often-cited inspiration for the current modelling work is an episode in 2003, when coalition forces in Iraq were searching in vain for deposed dictator Saddam Hussein. With conventional methods leading nowhere, a group of US Army intelligence analysts decided to aggregate the available information about Saddam's social network using a link diagram to depict relationships. As they factored in key variables such as trust, the analysts began to realize that the most-wanted government officials — those pictured on the 'personality identification playing cards' that had been widely distributed among US troops — were not necessarily the people whom Saddam trusted most, and were thus not likely to know where he was hiding. Instead, the diagram led the analysts to focus their attention on trusted lower-level associates — including one key bodyguard, whose information led the trackers to the dictator's underground hideaway on a farm near Tikrit.
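
A rough sense of that kind of analysis can be had with off-the-shelf network tools. The sketch below is a minimal illustration using the open-source NetworkX library; the names and trust scores are entirely invented (nothing here comes from the actual 2003 analysis), and it simply ranks a target's direct contacts by the trust weight on each tie rather than by seniority or notoriety.

```python
import networkx as nx

# Purely hypothetical data: every name and trust score below is invented for
# illustration and is not drawn from the actual 2003 analysis.
G = nx.Graph()
G.add_edge("target", "senior_official_A", trust=0.2)
G.add_edge("target", "senior_official_B", trust=0.3)
G.add_edge("target", "bodyguard_C", trust=0.9)
G.add_edge("target", "driver_D", trust=0.8)
G.add_edge("senior_official_A", "senior_official_B", trust=0.5)

# Rank the target's direct contacts by the trust weight on each tie,
# rather than by seniority or notoriety.
contacts = sorted(G["target"].items(), key=lambda kv: kv[1]["trust"], reverse=True)
for name, attrs in contacts:
    print(f"{name}: trust={attrs['trust']}")
# The invented bodyguard and driver surface first, echoing the shift of
# attention towards trusted lower-level associates.
```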

Today's simulations are similar in concept, but with one crucial difference: the Army analysts' diagram was static, constructed by hand and analysed manually. Now the goal is to do all that with algorithms, using computers to integrate vast amounts of data from different sources, and then to keep the results up to date as the data evolve.

A prime example of a system that creates such models is the Organization Risk Analyzer (ORA): a 'dynamic network analysis' program devised by Kathleen Carley, a computer scientist at Carnegie Mellon University in Pittsburgh, Pennsylvania, who has emerged as a leading figure in Pentagon-funded computational social science. "We build the psychohistory models," Carley jokes, referring to the 'science' of group behaviour invented in the 1940s by the science-fiction author Isaac Asimov for his classic Foundation novels. "We are the Foundation!"

To create an ORA model for a politically unstable region such as Sudan, explains Carley, she uses her own program, AutoMap, to trawl through publicly available news reports and automatically extract names and other key data (see 'The conflict forecast'). The ORA might then use that information to identify people — or, in the lexicon of social-network analysis, nodes — with a high degree of 'betweenness', meaning that they lie on many of the shortest paths connecting other people in the network. These individuals "are often those that are considered influential because … they broker information between groups and so on", says Carley.
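
That brokering role is what betweenness centrality captures: how often a node lies on the shortest paths connecting other nodes. A minimal sketch, using the open-source NetworkX library on an invented network (the nodes and links below are hypothetical, not output from ORA or AutoMap):

```python
import networkx as nx

# Invented example network; in a system like ORA the nodes and links would be
# extracted automatically from news text rather than typed in by hand.
edges = [
    ("A", "B"), ("A", "C"), ("B", "C"),   # one tightly knit group
    ("D", "E"), ("D", "F"), ("E", "F"),   # a second group
    ("C", "G"), ("G", "D"),               # G bridges the two groups
]
G = nx.Graph(edges)

# Betweenness centrality: the fraction of shortest paths between other pairs
# of nodes that pass through a given node.
scores = nx.betweenness_centrality(G)
for node, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(node, round(score, 2))
# "G" ranks highest despite having only two direct ties: it brokers all paths
# between the two groups, which is why betweenness differs from simply
# counting connections.
```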

The same types of model can be used to predict how a terrorist ideology might catch on in the local population and propagate from person to person like a spreading virus. Carley's system can factor in cultural variables, using records of the opinions and attitudes that tend to prevail among specific ethnic groups. The goal, she says, is to produce an effective strategy for stopping the epidemic of radicalization or for destabilizing the terrorist networks by identifying key individuals and groups to be targeted with diplomatic negotiation or military action.
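
The 'spreading virus' analogy can be illustrated with a simple susceptible-infected contagion on a network: at each step, every adopter has some probability of converting each susceptible neighbour. The sketch below is a generic toy model on a random graph, not Carley's system; the graph, the seed node and the adoption probability are all assumptions made for illustration.

```python
import random
import networkx as nx

random.seed(0)

# Invented contact network; in a real analysis the structure and the
# transmission probabilities would be informed by cultural and attitudinal data.
G = nx.erdos_renyi_graph(n=50, p=0.08, seed=1)
ADOPTION_PROB = 0.15     # assumed chance of passing the ideology along one tie per step
adopted = {0}            # a single seed node assumed already radicalized

for step in range(10):
    newly_adopted = set()
    for u in adopted:
        for v in G.neighbors(u):
            if v not in adopted and random.random() < ADOPTION_PROB:
                newly_adopted.add(v)
    adopted |= newly_adopted
    print(f"step {step}: {len(adopted)} adopters")

# Removing a few high-centrality nodes before running the cascade typically
# shrinks the final count, which is the logic behind targeting 'key
# individuals' to slow the spread.
```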

Another example is the Spatial Cultural Abductive Reasoning Engine (SCARE) developed by Venkatramanan Subrahmanian, a computer scientist and co-director of the Laboratory for Computational Cultural Dynamics at the University of Maryland in College Park. Subrahmanian says that SCARE was able to predict the locations of arms caches in Baghdad to within half a mile, using a combination of open-source data on past roadside bomb explosions and constraints based on distance (terrorists didn't want to carry their explosives very far for fear of getting caught) and culture (most of the attacks that they tracked came from Shiite groups with ties to Iran, so the caches were probably not in Sunni neighbourhoods). Subrahmanian says that he has given copies of the program to the military, and "they're clearly trying it out".
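
SCARE's actual method is a form of geospatial abductive reasoning, but the flavour of combining a distance constraint with a cultural one can be sketched crudely: score candidate locations by how many past attacks fall within an assumed carrying distance, and rule out locations in areas the cultural constraint excludes. The coordinates, the 2-kilometre radius and the excluded zone below are all invented placeholders, not SCARE's data or algorithm.

```python
from math import hypot

# All data here are invented: attack sites and candidate cache locations on an
# abstract kilometre grid, plus a stand-in for a culturally excluded area.
attacks = [(1.0, 1.2), (1.4, 0.8), (1.1, 1.6), (5.0, 5.0)]
candidates = [(1.2, 1.0), (3.0, 3.0), (5.1, 4.9)]

MAX_CARRY_KM = 2.0     # assumed limit on how far attackers will move explosives

def culturally_excluded(x, y):
    # placeholder for the cultural constraint, e.g. 'not in these neighbourhoods'
    return x > 4.0 and y > 4.0

def score(cell):
    x, y = cell
    if culturally_excluded(x, y):
        return 0
    # count past attacks within the assumed carrying distance of this candidate
    return sum(1 for ax, ay in attacks if hypot(ax - x, ay - y) <= MAX_CARRY_KM)

ranked = sorted(candidates, key=score, reverse=True)
print(ranked)   # the candidate near the cluster of attacks, outside the excluded area, ranks first
```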

Can a model predict a war? The Integrated Crisis Early Warning System (ICEWS) is being developed with DARPA funding by university researchers working with defence giant Lockheed Martin. A revival, at least in part, of a more primitive, 1970s-era DARPA forecasting project, the current incarnation of the ICEWS focuses on predicting political events such as insurgency, civil war, coups or invasions. The system draws its data mainly from the Reuters online news feed, and combines them with models incorporating previous behaviour of ethnic or political groups, economic factors such as a country's gross domestic product and geopolitical relationships with neighbouring countries. The result is an ICEWS forecast that might predict, for example, that 'Country X has a 60% probability of civil war'.
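
A forecast phrased as 'a 60% probability of civil war' is the natural output of a statistical classifier over country-level features. The toy logistic model below only shows the shape of such a calculation; the features, weights and bias are invented for illustration and bear no relation to ICEWS's actual models or data.

```python
from math import exp

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

# Invented, already-scaled country-level features and invented weights;
# a real system would fit the weights to historical event data.
features = {
    "recent_protest_events": 0.8,
    "gdp_growth": -0.4,
    "neighbour_in_conflict": 1.0,
}
weights = {
    "recent_protest_events": 1.2,
    "gdp_growth": -0.9,
    "neighbour_in_conflict": 0.7,
}
bias = -1.0

z = bias + sum(weights[name] * value for name, value in features.items())
print(f"Estimated probability of onset: {sigmoid(z):.0%}")   # about 73% with these made-up numbers
```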

The ICEWS has been producing monthly predictions since March 2010, says Sean O'Brien, programme manager for the effort at DARPA. He believes that such models, although imperfect, are already nearing the point at which they can be useful for military leaders. O'Brien has considerable company elsewhere in the Pentagon: the Office of the Director of Defense Research and Engineering, for example, is sponsoring its own programme in Human, Social, Cultural and Behavior modelling. And although the office did not provide details, it says that some of its simulations are already being used by the US Special Operations Command and the US Africa Command.

A generation away

Even among researchers working on models with Pentagon funding, there is concern that such enthusiasm may be premature. It seems, for example, that neither computer models nor human analysts were able to precisely predict this year's uprisings in the Middle East.

When it comes to prediction, "I would say the weather guys are far ahead of where we are", says Subrahmanian, who notes that meteorologists are frequently accused of being wrong as much as they are right. "And that might give you some relative understanding of where the science is."

Carley points to the pitfalls of automated data collection. "One of the issues," she says, "is that you will get people who are … talked about as part of the networks who aren't technically alive." In the ORA model for Sudan, for example, the textual analysis resulted in a network in which one of the key individuals was Muhammad — the Islamic prophet who died in AD 632.

[Figure: models used bomb attacks in Iraq.]

Albro, who has reviewed a number of computational social-science models as part of the US National Research Council's Committee on Unifying Social Frameworks, worries that much of the work is being done by computer scientists, with only token input from social scientists, and that minimal attention is being paid to where the data come from, and what they mean. He points to some models that look for signs of extremist violence by tracking phrases such as "blow up" in online social-media discussions. "There's the constant implication that discursive violence adds up to real violence, and that's crazy," he says.

Robert Axtell, a computational social scientist at the Krasnow Institute for Advanced Study at George Mason University in Fairfax, Virginia, and a pioneer of agent-based modelling, argues that there simply aren't enough accurate data to populate the models. "My personal feeling is that there is a large research programme to be done over the next 20 years, or even 100 years, for building good high-fidelity models of human behaviour and interactions," he says.

Similar notes of caution can be heard within the defence department. "We're at the very beginnings of this," says John Lavery, who manages a programme of complex-systems modelling at the Army Research Office in Research Triangle Park, North Carolina, and who compares the current state of computational social science with physics in the early nineteenth century.

"It's a tool, and if you can leverage it, that's great," agrees Brian Reed, a behavioural scientist at the Network Science Center of the US Military Academy at West Point, New York, who was a key architect of the network analysis that led to Saddam's capture. "But you can get too much information," he warns, "and someone has to provide a focus." Reed cites an example from his own return to Iraq, where he was deployed from 2008 to 2009 in the province of Diyala. Wanting to stop roadside bomb attacks, he asked his intelligence organization for a network analysis of the insurgent network. They provided an overload of data. "What they crunched, no one at our end could understand," says Reed.

Critics such as Albro worry that too many researchers are unaware of the real limitations of their work. Many of the models that Albro has seen focus on verification — ensuring that the simulations are internally consistent — but give short shrift to validation: making sure that they correspond to something in the real world. The models might provide an interpretative tool that allows policy-makers or military leaders to think critically about a problem, he suggests, but the technique's limitations are sometimes overlooked. "It does not answer our questions for us," says Albro. "It does not solve that dilemma of what decision I need to make."

Indeed, it is often far from clear whether the current generation of models is telling people anything that an expert in the relevant subject wouldn't already know. Carley recalls a conference at which she presented her results about the key individuals whom her ORA model had identified in Sudan. "Yeah," came the response from the regional specialists in the audience, "we kind of knew most of this."

For all the caveats, however, the need to help soldiers on the ground carries an acute sense of urgency back at JIEDDO headquarters. "We have a few instances of models that have docked with the data successfully," says Comer, citing an agent-based simulation of the Iraqi city of Samarra, which was funded by JIEDDO. "The big magic trick is to move those models to a point where they can be predictive."

The model of Samarra was able to match specific changes in US military strategy to decreases or increases in the incidence of roadside bombs, but it was specific to that city. The researchers "did a great model and it was really useful", says Comer. "Just as soon as they delivered it we said, 'Gee, thanks. Now you'll have to rewrite that for Afghanistan.'"

Comer acknowledges the irony that as the world's most technologically advanced military spends tens of millions of dollars on sophisticated computer tools to predict insurgent behaviour, the insurgents in question are busy building crude bombs with little more than fertilizer and basic electronics.

"The enemy is holding his own," says Comer, "not only without the data, but without the computer power, without the Internet, without the databases — and without the science." 

Sharon Weinberger is an Alicia Patterson Foundation Fellow based in Washington DC.
