Few scientific creations have had greater impact on public opinion and policy than computer models of Earth's climate. These models, which unanimously show a rising tide of red as temperatures climb worldwide, have been key over the past decade in forging the scientific and political consensus that global warming is a grave danger.

Now that that consensus is all but universal, climate modellers are looking to take the next step, and to convert their creations from harbingers of doom to tools of practical policy. That means making their simulations good enough to guide hard decisions, from targets for carbon dioxide emissions on a global scale to the adaptations required to meet changing rainfall and extreme weather events on regional and local scales.

Today's modelling efforts, though, are not up to that job. They all agree on the general direction in which the climate will move as greenhouse gases build up, but they do not reliably capture all the nuances of today's climate, let alone tomorrow's. Moreover, each model differs from reality in different ways.

It was in recognition of this that a cross section of climate modellers gathered for a 'summit' at the European Centre for Medium-Range Weather Forecasts in Reading, UK, last week (see page 268). The meeting called for an ongoing project aimed at understanding and modelling the climate system well enough to provide the sorts of prediction that policy-makers and other stakeholders need — or, at the very least, to show why such prediction might not, in fact, be achievable. Key to this project would be one or more dedicated facilities offering world-class computational resources to the climate-modelling community.

A clear resolution

Those resources are notably lacking at the moment. The world's very fastest computers run at hundreds of teraflops (which is to say, hundreds of trillions of mathematical operations a second), and the first forays into the petaflop range are expected by the end of the year. But today's climate models rarely run on machines that can manage more than a few tens of teraflops. This translates into spatial resolutions of a hundred kilometres or so. There was general agreement at the summit that more realistic models will require resolutions in the tens of kilometres, at least. And even higher resolutions — a kilometre or less, say — may well be needed to handle such critical issues as cloud formation realistically. Hence the need for computers a couple of generations beyond the current state of the art.
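To get a feel for why finer grids demand machines a couple of generations beyond today's, a rough scaling sketch helps. If the cost of a run grows with the number of horizontal grid cells (roughly the inverse square of the grid spacing) and with the number of timesteps (roughly the inverse of the spacing, because numerical stability ties the timestep to the grid), then refining the grid tenfold multiplies the work by about a thousand. The short Python sketch below works through that arithmetic; the cubic exponent and the baseline figures of 100 kilometres and 10 teraflops are illustrative assumptions, not numbers from the summit.

    # Back-of-envelope estimate of how compute cost grows as grid spacing shrinks.
    # Illustrative assumptions: cost scales with the number of horizontal cells
    # (~ 1/dx^2) times the number of timesteps (~ 1/dx, via a stability-limited
    # timestep), so cost grows roughly as (dx_old / dx_new) ** 3.

    BASELINE_RESOLUTION_KM = 100.0    # typical grid spacing of today's models
    BASELINE_SUSTAINED_TFLOPS = 10.0  # order of magnitude for current climate runs

    def required_tflops(target_km: float) -> float:
        """Estimate sustained teraflops needed at a finer grid spacing."""
        refinement = BASELINE_RESOLUTION_KM / target_km
        return BASELINE_SUSTAINED_TFLOPS * refinement ** 3

    for target_km in (100.0, 10.0, 1.0):
        print(f"{target_km:>6.0f} km grid -> ~{required_tflops(target_km):,.0f} sustained TFLOPS")

    # Prints roughly 10 TFLOPS at 100 km, 10,000 TFLOPS (10 petaflops) at 10 km,
    # and 10,000,000 TFLOPS (10 exaflops) at 1 km.

On these crude numbers, kilometre-scale modelling sits several orders of magnitude beyond the first petaflop machines, which is why the call is for a sustained programme rather than a single purchase.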

Meeting this need is not just a matter of buying a supercomputer. It means moving climate modelling up the petaflop pecking order for a sustained period of time. One plausible goal might be to ensure that, by 2012, the most powerful supercomputer in the public realm is devoted to climate work, and that the field's lead is renewed regularly for several computer generations after that. After all, the fastest computers are nearly always paid for out of the world's public purses, often for use in areas of national security such as communications intelligence or nuclear weapons design. And climate prediction is a national security issue if ever there was one.

If funding agencies were to embrace such a goal, the implications would go well beyond money. Profound changes would be required of the community itself. Because the cost over a decade or more might easily top a billion dollars, such an investment in cutting-edge climate modelling would almost certainly have to be made multinationally, or even globally. This would pull climate modelling into the world of 'big science' alongside space telescopes and particle accelerators — a transformation that would require new, and possibly disruptive, institutional arrangements.

Living large

Aware of budgetary realities and the history of such scientific centralization, national centres of climate modelling and expertise such as Britain's Hadley Centre or the US National Center for Atmospheric Research might reasonably see this development as a threat. International collaborations such as CERN — the European particle-physics laboratory — and the European Southern Observatory have served the scientific communities of their member states well, yet have undoubtedly taken their toll on national facilities. With the upcoming inauguration of the Large Hadron Collider (LHC) at CERN, Europe will have the world's best particle-physics facility — but few national particle-physics facilities beyond it.

This analogy is not, however, fully convincing. Building an LHC does not make it significantly easier to build lesser accelerators. But advances in supercomputing do make it easier to build computers formerly known as super: a petaflop will seem slow in less than a decade. A world facility where teams of researchers try out very high resolutions and new techniques, and where software engineers and programmers learn how to get the most out of bleeding-edge hardware, will require a network of more modest centres around the world from which to draw its problem-list and into which to feed its insights. A range of operational climate-prediction capabilities could help keep modelling close to stakeholders' needs, and lessen the risk of all-eggs-in-one-basket groupthink at a single global facility.

An ambitious climate-modelling facility dedicated to solving problems beyond the capability of today's national programmes carries risks, but they are risks worth taking. The world's governments — and even, conceivably, its high-tech philanthropists — should listen to the modellers. Big science is often, and gloriously, justified on the basis of pure intellectual excitement. This field offers that — and a chance to improve the world as well.