State of the art: a model from the UK National Centre for Atmospheric Science and the Met Office running on Japan's Earth Simulator. Credit: P. L. VIDALE, NCAS CLIMATE, WALKER INST., UNIV. READING

Climatologists have called for massive investment in computer and research resources to help revolutionize modelling capabilities. The eventual aim is to provide probabilistic climate predictions that are as useful, and usable, as weather forecasts.

At the end of a four-day summit held last week at the European Centre for Medium-Range Weather Forecasts in Reading, UK, the scientists made the case for a climate-prediction project on the scale of the Human Genome Project. A key component of this scheme, which could cost a billion dollars or more, would be a world climate research facility with computing power far beyond anything currently used in the field.

Questions on how severe the effects of global warming will be, and which regions will be hit in what ways, are beyond the capabilities of current climate science, at least in part because of computing constraints. Today's climate models are run on computers in the 10-teraflop range, meaning they are capable of 10 trillion operations a second. Despite this speed, models on these computers are still coarse-grained, cutting the world into cells more than 100 kilometres across.

Increasing computing power 10,000-fold, to speeds in the hundreds of petaflops, would allow modellers to run simulations at the kilometre scale, enabling better predictions of hurricane activity and, eventually, of the local deep convection that transfers much energy into the upper atmosphere (see 'A real solution?'). This research could then be fed into operational models.
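The arithmetic behind that 10,000-fold figure can be sketched with a rough back-of-envelope calculation. This assumes cost scales only with the number of horizontal grid cells; real models also pay for shorter timesteps and more vertical levels, so the sketch is illustrative, not a cost model:

```python
# Back-of-envelope: how many horizontal grid cells does a global model need
# at a given resolution? Assumes square cells tiling Earth's ~510 million
# km^2 surface; timestep and vertical-level costs are deliberately ignored.

EARTH_SURFACE_KM2 = 510e6  # approximate surface area of Earth

def horizontal_cells(grid_km: float) -> float:
    """Approximate number of horizontal cells at a given grid spacing (km)."""
    return EARTH_SURFACE_KM2 / grid_km ** 2

coarse = horizontal_cells(100)  # ~100 km cells: today's coarse-grained models
fine = horizontal_cells(1)      # ~1 km cells: the kilometre-scale target

print(f"cells at 100 km: {coarse:.0f}")  # 51000
print(f"cells at 1 km:   {fine:.0f}")    # 510000000
print(f"ratio: {fine / coarse:.0f}x")    # 10000x
```

Shrinking cells from 100 km to 1 km multiplies the horizontal cell count by 100² = 10,000, which is why hundreds-of-petaflop machines, roughly 10,000 times today's 10-teraflop systems, are the stated requirement.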

The scientists think they could answer at least some of the 'big' questions on the effects of global warming if the technology were available. But national climate-modelling efforts, such as those of the Met Office in Exeter, UK, or the National Center for Atmospheric Research (NCAR) in Boulder, Colorado, aren't attracting the required level of funding. Although Japan's Earth Simulator in Yokohama was once the world's fastest computer, there are now 29 faster ones, with the first petaflop machines only months away.


“We need to be breathtakingly bold, frankly, in terms of some of the calculations that we're going to do in order to push the climate-prediction effort forward,” says Leo Donner, a physical scientist at the Geophysical Fluid Dynamics Laboratory in Princeton, New Jersey. Antonio Navarra, a climate modeller at the National Institute of Geophysics and Volcanology in Bologna, Italy, spells out the implication: “We're reaching the point where national resources are insufficient to answer the scientific questions.”

More money and cutting-edge challenges would also provide some hope of retaining highly trained programmers with expertise in climate modelling. Conference chair Jagadish Shukla of the Institute of Global Environment and Society in Calverton, Maryland, says this resource is “decreasing faster than the sea ice” as staff are lured from research by the financial rewards and job security provided by companies such as Google.

Researchers from around the world gathered in Reading, UK, for the summit. Credit: R. HINE, ECMWF

Addressing the summit on its opening day, economist Jeffrey Sachs, the director of the Earth Institute at Columbia University, New York, said that there would be “a lot of interest among politicians in investing the hundreds of millions of dollars necessary, if scientists can provide answers to key questions … such as future food supply”. Although governments are the obvious source of funding, Lawrence Gates, a now-retired climate scientist from the Lawrence Livermore National Laboratory in California, urged the attendees to explore philanthropic options.

How increased investment is divided between new facilities and existing ones is likely to be controversial. Some fear that a single global institute could threaten national centres, potentially taking the onus off governments to fund the institutions that are closest to stakeholders and could be expected to provide the predictions that have most real-world use. “Everyone is agreed that there needs to be a substantial investment in climate modelling, but whether a single centre is the solution is another question. There may be other ways,” says John Mitchell, the chief scientist at the Met Office. Donner says that a sketch, presented on the conference's last day, of how the global facility might fit into the research world “seems to relegate national centres to little more than distributors of data”.

However, Shukla was adamant that “every science breakthrough leads to the formation of an institute to address the problem”. “We are facing a Darwinian change in the way we are working and we shouldn't be afraid of that,” says Navarra. The meeting could have come to grief on such differences, according to Julia Slingo, the director of the Centre for Global Atmospheric Modelling at the University of Reading, but in the end the level of consensus, she says, was “fantastic”.

Various attendees expressed frustration at the fact that the new facility could not be funded purely on the basis of the world-class science it would do — and indeed the fact that it would produce great research might count against it, making it seem more like a “toy for the boys” than a policy-informing instrument.

“If we just ask for enhanced understanding, then we have very little chance of getting the necessary funding,” warned Shukla. But as Mitch Moncrieff from NCAR put it, “we need a quantum leap in research to provide better predictions, even if the politicians don't get that”. And there was widespread agreement that they need to get it fast. “We need a revolution as it has got to be done extremely quickly,” said Brian Hoskins, director of the Grantham Institute for Climate Change at Imperial College London, UK.