News & Views

Climate change

Models change their tune

Nature volume 430, pages 737–738 (12 August 2004)


Climate models are usually tuned to match observations. A new approach, in which the models are detuned instead, increases our confidence in projections of future warming.

In 2001, the Intergovernmental Panel on Climate Change published its report stating that the Earth will warm by between 1.4 °C and 5.8 °C by the end of the twenty-first century1. Many argue that this range is too large to justify action to reduce the rising concentrations of greenhouse gases, principally carbon dioxide, in the atmosphere. Two factors are responsible for the wide estimated range: uncertainty in our understanding of the physical processes that influence climate and our ability to model them; and uncertainty over the so-called ‘emissions path’ — how the demographic, technological and ecological development of human societies will affect energy consumption and greenhouse-gas emissions.

Murphy and colleagues (page 768 of this issue)2 have used a climate model to estimate the uncertainty associated with modelling the physical processes. Climate modellers use mathematical formulations, called parameterizations, to describe processes that cannot be resolved because of their scale or complexity, or that are not well understood. Murphy et al. quantify how the choice of parameterizations affects the results of modelling the evolution of temperature, precipitation and other climate variables when the greenhouse-gas concentration increases. The good news for the modelling community is that none of these choices alters the basic response of the model to a common scenario, a doubling of CO2 concentration: all simulations show consistent warming worldwide, most likely between 2 °C and 3 °C, with the characteristic amplification of warming at high latitudes of the Northern Hemisphere. So this is a robust result over a wide range of parameterizations.

Simplifications are unavoidable when trying to capture the climate system — with its cascades of scales in time and space — in a finite numerical model. Some of these parameterizations may be rigorously derived from fundamental physical principles3, but many contain poorly constrained parameters that are used to complete the mathematical construct. Although it is often true that (in John von Neumann's words) “the justification of such a mathematical construct is solely and precisely that it is expected to work”, observational data must be used to set limits on the physically realistic range of these parameters.

Murphy and colleagues' model2 consists of a state-of-the-art atmospheric general circulation model coupled to a ‘slab’ ocean. The latter provides a rudimentary lower boundary condition for the atmospheric model, with lateral fluxes of heat being prescribed and kept fixed, while sea surface temperatures are allowed to change. Ocean dynamics are not taken into account, which precludes projections of quantities such as changes in sea-ice cover in the Arctic and Antarctic, or in the ocean uptake of CO2. In predicting climate changes in the next several decades — the transient climate response — the ocean cannot be neglected4,5. However, the equilibrium climate sensitivity, a key value in characterizing a climate model, can be reasonably estimated using this reduced set-up. Climate modellers are sometimes preoccupied with ‘tuning’ their models to achieve best agreement with observations, but Murphy et al. take the opposite tack: instead of aiming for a best estimate by careful tuning, they quantify the range of outcomes that results when parameter values are changed. This detuning is a neat way of perturbing some of the physics of the climate model.
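
The role of the slab in such a set-up can be made concrete with a toy calculation. The sketch below is emphatically not Murphy and colleagues' model; it is a minimal, zero-dimensional mixed-layer energy balance in Python, in which the mixed-layer depth, the CO2-doubling forcing and the climate feedback parameter are assumed, round values chosen only for illustration. It shows how a slab with a prescribed (here zero) flux correction relaxes towards an equilibrium warming given simply by the forcing divided by the feedback parameter.

```python
# Toy zero-dimensional slab-ocean energy balance (illustrative values only).
# dT/dt = (F + Q - lambda_fb * T) / C, where T is the global-mean temperature anomaly.
rho_w = 1025.0            # sea-water density (kg m^-3)
c_w = 3990.0              # specific heat of sea water (J kg^-1 K^-1)
depth = 50.0              # assumed mixed-layer depth (m)
C = rho_w * c_w * depth   # heat capacity per unit area (J m^-2 K^-1)

F2x = 3.7                 # radiative forcing for doubled CO2 (W m^-2), typical value
lambda_fb = 1.3           # assumed climate feedback parameter (W m^-2 K^-1)
Q_flux = 0.0              # prescribed, fixed heat-flux correction (anomaly form, so zero)

dt = 86400.0              # time step: one day, in seconds
n_years = 30
T = 0.0                   # temperature anomaly (K)
for _ in range(n_years * 365):
    T += dt * (F2x + Q_flux - lambda_fb * T) / C

# Without ocean dynamics, the equilibrium warming is simply F2x / lambda_fb.
print(f"warming after {n_years} years: {T:.2f} K")
print(f"equilibrium climate sensitivity: {F2x / lambda_fb:.2f} K")
```

With the values assumed here the slab reaches roughly 2.8 °C of warming within a few decades, essentially its equilibrium sensitivity; varying lambda_fb within plausible bounds is the quickest way to see how strongly the answer depends on the feedbacks that the parameterizations control.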

Within given bounds, they varied 29 parameters, one by one, and analysed the results from 20-year simulations under present-day and doubled CO2 conditions. In this linear approach, in which each parameter is perturbed in turn, it is assumed that the effect of changing several parameters at once can be roughly estimated by adding together their individual effects. Murphy et al. define a climate prediction index, or quality index, which measures the agreement of the simulated fields with observations and permits an objective determination of the critical parameters in the perturbation experiments. It is no surprise that fields associated with the hydrological cycle, particularly cloud processes, show the widest spread in this quality index. In consequence, the partitioning of the radiative fluxes (short-wave versus long-wave radiation, downwards versus upwards) varies widely when the parameterizations are perturbed. This was expected from earlier comparisons6; it indicates that the models must be improved if they are to be reconciled with high-precision data on components of the radiative balance7.
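
The two ingredients of this procedure, the additivity assumption and an observation-based quality index, can be sketched schematically. The Python snippet below is not the authors' actual climate prediction index, which combines many climate fields and regions; it uses invented data, approximates the combined effect of several parameter perturbations by summing their individual responses, and takes a simple inverse root-mean-square error as a stand-in for the quality index.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins: a control simulation, pseudo-observations, and the
# individual responses to perturbing three parameters one at a time.
# In the real study there are 29 parameters and many climate fields.
n_grid = 100                                        # points of some simulated field
control = rng.normal(288.0, 5.0, n_grid)            # control-run field (e.g. temperature, K)
obs = control + rng.normal(0.0, 1.0, n_grid)        # pseudo-observations

single_responses = {                                # field change per perturbed parameter
    "entrainment_rate": rng.normal(0.3, 0.2, n_grid),
    "ice_fall_speed": rng.normal(-0.2, 0.2, n_grid),
    "cloud_water_thresh": rng.normal(0.1, 0.2, n_grid),
}

# Linearity assumption: the response to perturbing several parameters at once
# is approximated by the sum of the individual responses.
combined_estimate = control + sum(single_responses.values())

def quality_index(field, observations):
    """Crude stand-in for a climate prediction index: the inverse of the
    root-mean-square error between a simulated field and observations,
    so that better-matching simulations receive a larger score."""
    rmse = np.sqrt(np.mean((field - observations) ** 2))
    return 1.0 / rmse

for name, response in single_responses.items():
    print(f"{name:20s} quality index: {quality_index(control + response, obs):.3f}")
print(f"{'combined (linear)':20s} quality index: {quality_index(combined_estimate, obs):.3f}")
```

A simulation that tracks the observations closely receives a large index and, in the probabilistic step described below, a correspondingly large weight.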

In a further step, Murphy et al. calculate probability density functions for climate sensitivity, defined as the equilibrium increase in global mean temperature under doubled CO2 conditions. Earlier studies have produced such estimates with either small ensembles of comprehensive models4 or simplified climate models8. Depending on the various model simulations to which the quality index is applied, and giving more weight to those that show better agreement with observations, the warming in the 5–95% probability range is between 2.4 °C and 5.4 °C. In spite of the relatively wide range of systematic parameter variation, the fundamental response of the climate model is consistent with the range reported by the Intergovernmental Panel1,9.
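
The step from such an ensemble to a probability statement can be sketched in the same spirit. The snippet below is an assumed, simplified procedure rather than the authors' method: every ensemble member contributes one climate-sensitivity estimate, each estimate is weighted by a quality-index value of the kind sketched above, and the 5th and 95th percentiles of the weighted sample give the quoted range. All numbers are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed, illustrative ensemble: one climate-sensitivity estimate (K) and one
# observation-based quality index per perturbed-parameter simulation.
n_members = 53
sensitivity = rng.gamma(shape=9.0, scale=0.38, size=n_members)  # values centred near ~3.4 K
quality = rng.uniform(0.2, 1.0, size=n_members)                 # stand-in index values

weights = quality / quality.sum()   # better-performing members count for more

def weighted_percentile(values, w, q):
    """Percentile (q in percent) of a weighted sample."""
    order = np.argsort(values)
    cdf = np.cumsum(w[order])
    return np.interp(q / 100.0, cdf, values[order])

low, high = (weighted_percentile(sensitivity, weights, q) for q in (5, 95))
mean = np.sum(weights * sensitivity)
print(f"weighted mean sensitivity: {mean:.1f} K")
print(f"5-95% range: {low:.1f} K to {high:.1f} K")
```

Members that agree better with observations pull the percentiles towards their own sensitivity values, which is how the observational constraint narrows the estimated range.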

Uncertainty is inherent in any prediction. The public have become used to probabilities reported in weather forecasts. But the public, and decision-makers, have yet to accept that a similar approach is necessary to guide steps to reduce and mitigate the impact of climate change. Work such as that of Murphy et al., together with other initiatives, should reduce these uncertainties to the point where the choice of emissions path used in modelling remains the dominant source of uncertainty. This is already the case for predictions that look at the second half of this century10. To shift that time horizon closer to the present will require improved climate models that can be compared in a systematic way11, larger model ensembles for testing12,13, and a systematic exploration of uncertainties in the radiative balance.

References

  1. Houghton, J. T. et al. (eds) Climate Change 2001: The Science of Climate Change (Cambridge Univ. Press, 2001).

  2. Murphy, J. M. et al. Nature 430, 768–772 (2004).

  3. Dynamical Paleoclimatology (Academic, London, 2002).

  4. Nature 416, 723–726 (2002).

  5. J. Clim. 15, 124–130 (2002).

  6. J. Clim. 14, 3227–3239 (2001).

  7. Geophys. Res. Lett. 31, L03202 (2004).

  8. Nature 416, 719–723 (2002).

  9. et al. in Climate Change 2001: The Science of Climate Change (eds Houghton, J. T. et al.) 525–582 (Cambridge Univ. Press, 2001).

  10. Nature 416, 690–691 (2002).

  11. et al. Glob. Planet. Change 37, 103–133 (2003).

  12. Nature 419, 228 (2002).

  13.

Author information


Thomas F. Stocker is in the Division of Climate and Environmental Physics, Physics Institute, University of Bern, Sidlerstrasse 5, CH-3012 Bern, Switzerland.
e-mail: stocker@climate.unibe.ch



DOI: https://doi.org/10.1038/430737a
