Climate models project large decreases in permafrost by 2100. Some models used for the IPCC's next assessment will include important feedbacks associated with increased releases of the greenhouse gases methane and carbon dioxide. Image adapted from ref. 9. Credit: DAVID LAWRENCE

The climate scientists who make up the Intergovernmental Panel on Climate Change (IPCC) don't do predictions, or at least they haven't up until now1. Instead, the panel has in the past made projections of how the future climate could change under a range of 'what-if' emissions scenarios. But for its fifth assessment report, known as AR5 and due out in 2013, the UN panel plans to examine explicit predictions of climate change over the coming decades. In AR5's Working Group I report, which focuses on the physical science of climate change, one chapter will be devoted to assessing the skill of climate predictions on timescales out to about 30 years. These climate forecasts, which should help guide decision-makers on how to plan for and adapt to change, will no doubt receive much attention.

Another chapter will deal with longer-term projections, to 2100 and beyond, using a suite of global models. Many of these models will attempt new and better representations of important climate processes and their feedbacks — in other words, those mechanisms that can amplify or diminish the overall effect of increased incoming radiation. Including these elements will make the models into more realistic simulations of the climate system, but it will also introduce uncertainties.
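As an aside to make this concrete (the relation below is a standard textbook illustration, not something drawn from the AR5 models themselves), the combined effect of feedbacks is often written as a gain applied to a reference response:

```latex
% Standard textbook feedback relation (illustrative only):
% \Delta T_0 is the response with no feedbacks operating,
% the f_i are individual feedback factors and f is their sum.
\[
  \Delta T = \frac{\Delta T_0}{1 - f}, \qquad f = \sum_i f_i .
\]
% Positive f (an amplifying feedback) increases the response, negative f
% diminishes it; and as f approaches 1, \Delta T becomes increasingly
% sensitive to errors in f, which is one reason that adding feedbacks
% tends to widen, rather than narrow, the spread of model results.
```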

So here is my prediction: the uncertainty in AR5's climate predictions and projections will be much greater than in previous IPCC reports, primarily because of the factors noted above. This could present a major problem for public understanding of climate change. Is it not a reasonable expectation that as knowledge and understanding increase over time, uncertainty should decrease? But while our knowledge of certain factors does increase, so does our understanding of factors we previously did not account for or even recognize.

From projection to prediction

In previous IPCC assessments1, changes in the atmospheric concentrations of greenhouse gases and aerosols over time were gauged using 'idealized emissions scenarios', which are informed estimates of what might happen in the future under various sets of assumptions related to population, lifestyle, standard of living, carbon intensity and the like. The changes in future climate were then simulated for each of these scenarios. The output of such modelling is usually referred to as a projection, rather than a prediction or a forecast. Unlike in weather prediction, the models in this case are not initialized with the current or past state of the climate system as derived from observations. Instead, they begin with arbitrary climatic conditions and examine only the change in projected climate, thereby removing any bias that could be associated with trying to realistically simulate the current climate as a starting point. This technique works quite well for examining how the climate could respond to various emissions scenarios in the long term.
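As a minimal sketch of this anomaly approach (the variable names and the particular base period are assumptions made for illustration, not details of any specific model), the projected change is reported relative to each model's own baseline climate, so that much of the bias in simulating the present day cancels out:

```python
import numpy as np

def projected_anomaly(model_series, years, base_start=1986, base_end=2005):
    """Report projected change relative to the model's own base-period mean.

    model_series : simulated global-mean temperature, one value per year
    years        : the corresponding calendar years
    The base period (here 1986-2005) is an illustrative choice; each study
    uses its own.
    """
    model_series = np.asarray(model_series, dtype=float)
    years = np.asarray(years)

    in_base = (years >= base_start) & (years <= base_end)
    baseline = model_series[in_base].mean()

    # Subtracting the model's own baseline removes much of its mean bias,
    # leaving the *change* in climate rather than its absolute state.
    return model_series - baseline
```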

Climate models have, however, improved in the past few years, and society is now demanding ever more accurate information from climate scientists. Faced with having to adapt to a range of possible impacts, policymakers, coastal planners, water-resource managers and others are keen to know how the climate will change on timescales that influence decision-making. Because the amount of warming that will take place up to 2030 is largely dependent on greenhouse gases that have already been released into the atmosphere, it is theoretically possible to predict, with modest skill, how the climate will respond over this time period.

In recent years, several modelling groups have published such predictions for the coming decades2,3,4 (Fig. 1). In weather prediction, and in this newer form of climate prediction, it is essential to start the model with the current state of the system. This is done by collecting observations of the atmosphere, oceans, land surface, soil moisture, vegetation state, sea ice and so forth, and assimilating these data into the models — which can be challenging, given model imperfections. Although important progress has been made in this area, the techniques are not yet fully established5. In part because it takes at least a decade to verify a ten-year forecast, evaluating and optimizing the models6 will be a time-consuming process. The spread in initial results is therefore bound to be large, and the uncertainties much greater than for the models used in the last IPCC assessment. There are simply more things that can go wrong.
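For illustration only, the sketch below shows the simplest conceivable form of initialization, a 'nudging' (Newtonian relaxation) step that pulls the model state towards observations during a start-up period; the systems cited above2,3,4 use far more sophisticated assimilation schemes, and the function name and relaxation timescale here are assumptions for the example.

```python
import numpy as np

def nudge_towards_observations(model_state, observed_state, dt, tau=10.0):
    """One 'nudging' (Newtonian relaxation) step: relax the model state
    towards observations with relaxation timescale tau (same units as dt).

    This is only the simplest illustration of initializing a forecast from
    observations; operational decadal-prediction systems use far more
    elaborate assimilation schemes.
    """
    model_state = np.asarray(model_state, dtype=float)
    observed_state = np.asarray(observed_state, dtype=float)

    # Where observations are missing (NaN), leave the model untouched.
    increment = np.where(np.isnan(observed_state),
                         0.0,
                         (observed_state - model_state) * dt / tau)
    return model_state + increment
```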

Figure 1: Big spread.

Ten-year mean global surface temperatures from observations (red) and from three independent hindcasting studies (green, blue and black)2,3,4. Blue bars and grey shading represent statistical error. Separate vertical bars to the right show forecasts of the future. Observations are from two UK Met Office Hadley Centre datasets, HadISST 1.1 and HadCRUT3. Anomalies are departures in temperature from the mean of a base period, which is different for each of the three studies. See ref. 5 for further details.

Factoring in feedbacks

For the longer-term projections up to 2100 and beyond, the models used in AR5 will be of unprecedented complexity and will better describe important climate processes and feedbacks. Some examples include the release of greenhouse gases from thawing permafrost and the fertilizing effect of atmospheric carbon dioxide on vegetation. How realistically the models can reproduce such feedbacks depends critically on their inclusion of many other factors, such as the availability of water, nutrients and biogeochemical trace elements, as well as the nitrogen and hydrologic cycles7. All of these are represented to varying extents in the climate models for AR5.
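Purely as a toy illustration of how such a feedback closes on itself (none of the parameter values below come from any actual model; they are invented for the example), warming can be iterated together with the extra forcing from permafrost carbon release until the loop converges:

```python
def toy_permafrost_feedback(forcing, sensitivity=0.8, carbon_feedback=0.2,
                            n_iter=100):
    """Toy fixed-point iteration of a warming <-> permafrost-carbon loop.

    forcing         : imposed radiative forcing (W m^-2)
    sensitivity     : warming per unit forcing (degC per W m^-2)
    carbon_feedback : additional forcing per degree of warming (W m^-2 per degC)
                      from greenhouse gases released by thawing permafrost
    All parameter values are invented for illustration only.
    """
    delta_t = 0.0
    for _ in range(n_iter):
        extra_forcing = carbon_feedback * delta_t   # permafrost carbon release
        delta_t = sensitivity * (forcing + extra_forcing)
    return delta_t


# Without the feedback: 0.8 degC per W m^-2 * 3.7 W m^-2, about 3.0 degC.
# With it, the loop converges to a larger warming, roughly 3.5 degC here.
print(toy_permafrost_feedback(3.7))
```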

As another example, it has long been known that aerosols affect the climate both directly and indirectly, in multiple and complex ways8. Aerosols directly interfere with radiation, typically acting as cooling agents in the atmosphere, especially at the Earth's surface. They also affect the climate indirectly by redistributing cloud moisture and changing the brightness of clouds, as well as their lifetime and the precipitation they produce.

The direct effects of aerosols on radiation were included to some degree in the atmospheric component of the climate models in the last IPCC assessment. Now the indirect effects are also being included, even as researchers are continually gaining new insights about these processes from satellite observations. Because different groups are using relatively new techniques for incorporating aerosol effects into the models, the spread of results will probably be much larger than before.

Trial and error


It has been said that all models are wrong but some are useful. A climate model is a tool, albeit a very sophisticated one that includes complexity and nonlinearities in ways that are impossible to comprehend analytically. Ideally, a model should encapsulate the state of our knowledge. When that knowledge is incomplete, one strategy is to omit certain complex processes and to assume that they are constant, even when it is known that they cannot be. Adding complexity to a modelled system when the real system is complex is no doubt essential for model development. It may, however, run the risk of turning a useful model into a research tool that is not yet skilful at making predictions.

It is essential to take on the challenge of decadal prediction. Confronting the model results with real-world observations in new ways will eventually lead to better models and to the advancement of climate science. It could also, as an added benefit, provide information that is potentially useful to society. The same holds true for longer-term projections that now include far more processes and feedbacks than before. The question is not whether to do this work, but whether experimental model results should be included as part of a process that is used to inform policymakers and society of the changes to come. If the merits of a given technique have not yet been thoroughly established through the peer-reviewed literature, is it appropriate to employ it under the banner of the IPCC?

Performing cutting-edge climate science in public could easily lead to misinterpretation, and it will take a great deal of careful communication with the public and policymakers to ensure that the results are used appropriately. A wise approach would be to ensure that the IPCC's usual criteria — assessing peer-reviewed publications, reconciling results where possible and appropriately expressing uncertainties — are applied, and that the panel continues to uphold the principle that it does not do research but rather assesses published studies. The first set of model projections for AR5 must be completed by the last quarter of 2010. The timescale dictated by the IPCC process brings with it the risk of prematurely exposing problems with climate models as we learn how to develop them. In other disciplines, this might not matter so much, but what to do about climate change is a high-profile, politically charged issue involving winners and losers, and such results can be misused. In fact — to offer one more prediction — I expect that they will be.