Perspective

Expert judgement and uncertainty quantification for climate change

Nature Climate Change volume 6, pages 445–451 (2016)

Abstract

Expert judgement is an unavoidable element of the process-based numerical models used for climate change projections, and the statistical approaches used to characterize uncertainty across model ensembles. Here, we highlight the need for formalized approaches to unifying numerical modelling with expert judgement in order to facilitate characterization of uncertainty in a reproducible, consistent and transparent fashion. As an example, we use probabilistic inversion, a well-established technique used in many other applications outside of climate change, to fuse two recent analyses of twenty-first century Antarctic ice loss. Probabilistic inversion is but one of many possible approaches to formalizing the role of expert judgement, and the Antarctic ice sheet is only one possible climate-related application. We recommend indicators or signposts that characterize successful science-based uncertainty quantification.
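The probabilistic inversion mentioned above can be sketched, in its simplest form, as reweighting an ensemble of model-based samples so that the weighted distribution reproduces expert-elicited quantiles. The following is a minimal illustrative sketch, not the paper's method or data: the sample-generating distribution, the elicited quantiles and the interquantile target masses are all hypothetical numbers chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model-based samples of 21st-century Antarctic ice loss
# (cm sea-level equivalent); the lognormal is an illustrative stand-in.
samples = rng.lognormal(mean=2.0, sigma=0.8, size=100_000)

# Hypothetical expert-elicited 5th, 50th and 95th percentiles (cm SLE),
# and the probability mass the expert assigns between them.
expert_q = np.array([2.0, 8.0, 40.0])
targets = np.array([0.05, 0.45, 0.45, 0.05])

# Assign each sample to an interquantile bin, then reweight so the
# weighted mass in each bin matches the expert's target probability.
bins = np.searchsorted(expert_q, samples)
weights = np.empty_like(samples)
for b, t in enumerate(targets):
    in_bin = bins == b
    weights[in_bin] = t / in_bin.sum()  # equal weight within each bin

# The reweighted ensemble now reproduces the expert quantiles:
# its weighted median sits at the elicited 50th percentile.
order = np.argsort(samples)
cdf = np.cumsum(weights[order])
median = samples[order][np.searchsorted(cdf, 0.5)]
```

With a single variable one reweighting pass matches the constraints exactly; with multiple experts or multiple constrained quantities, iterative schemes such as iterative proportional fitting are used to reconcile the targets.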



Acknowledgements

We thank J. Hall (Oxford University, UK), K. Keller (Pennsylvania State University, USA), R. Kopp (Rutgers University, USA), J. Rougier (University of Bristol, UK), D. Sexton (Met Office Hadley Centre, UK) and C. Tebaldi (National Center for Atmospheric Research, USA) for either helpful comments on an earlier draft of the manuscript, or useful discussions of the issues raised, or both.

Author information

Affiliations

  1. Department of Geosciences and Woodrow Wilson School of Public and International Affairs, Princeton University, Princeton, New Jersey, 08544, USA

    • Michael Oppenheimer
  2. Atmospheric and Environmental Research, Inc., Lexington, Massachusetts, 02421, USA

    • Christopher M. Little
  3. Resources for the Future, 1616 P St NW, Washington DC, 20036, USA

    • Roger M. Cooke
  4. Strathclyde Business School, University of Strathclyde, 199 Cathedral Street, Glasgow G4 0QU, UK

    • Roger M. Cooke

Authors

  1. Michael Oppenheimer

  2. Christopher M. Little

  3. Roger M. Cooke

Contributions

M.O., C.M.L. and R.M.C. designed the research, conducted analysis of data and results, and contributed to writing, editing and revision. C.M.L. and R.M.C. performed statistical modelling.

Competing interests

The authors declare no competing financial interests.

Corresponding author

Correspondence to Michael Oppenheimer.

Supplementary information

PDF files

  1. Supplementary Information

Excel files

  1. Supplementary Information

About this article

Publication history

Received

Accepted

Published

DOI

https://doi.org/10.1038/nclimate2959
