There are various — and confusing — targets to limit global warming due to emissions of greenhouse gases. Estimates based on the total slug of carbon emitted are possibly the most robust, and are worrisome.
It is one thing to agree on a goal of international policy, quite another to achieve it. The 192 signatories of the 1992 United Nations Framework Convention on Climate Change (including the United States) have committed themselves to reducing the emissions of carbon dioxide and other greenhouse gases to avoid “dangerous interference in the climate system”. But policy-makers around the world are still trying to figure out how, specifically, to do that.
The European Union has adopted a goal of keeping temperatures below 2 °C above pre-industrial levels. Others argue for a stabilization of atmospheric CO2 concentrations at 350 parts per million (p.p.m.), or 450 p.p.m., or higher. In the United States, the administration of President Barack Obama has proposed reducing emissions by 80% by the year 2050. These schemes are all intended to be solutions to the same problem. But relating and comparing one to another is not straightforward. For instance, it remains unclear what level of atmospheric CO2 would be required to avoid a 2 °C temperature rise, and what emission pathways might achieve that goal. Papers elsewhere in this issue by Meinshausen et al.1 and Allen et al.2 explore the uncertain relationships between carbon emissions and climate response, with the aim of better estimating how much additional CO2 might indeed be too much.
Meinshausen and colleagues1 (page 1158) take a comprehensive probabilistic approach, combining the uncertainties in climate sensitivity and carbon-cycle feedbacks, and integrating the two over a large range of potential emission pathways. Their target is to avoid a peak global mean warming from the pre-industrial level of more than 2 °C (equivalent to a further rise of about 1.2 °C from today). It should be noted that there is nothing special about 2 °C that would make warming of less than this magnitude 'safe'. The threshold is more analogous to a speed limit on a road, a guide to the scale of the problem. With 2 °C of global warming (more over land and at high latitudes), Earth would probably be warmer than it has been in millions of years, a huge change.
Meinshausen et al. find that the maximum temperature that Earth will experience to the year 2100 depends most reliably on the total amount of CO2 emitted to the year 2050, rather than on the final stabilized CO2 concentration. Their base-case estimate is that total emissions from today (2009) to 2050 need to stay below 190 GtC (equivalent to 700 GtCO2; 1 GtC = 10¹² kg of carbon) for us to have a good chance (75%) of staying below 2 °C (Fig. 1). The probability drops below 50% if we emit more than 310 GtC in that time. This is significantly less than the amount of carbon contained in proven reserves of gas, oil and coal, let alone reserves of non-traditional fossil-fuel sources such as tar shales, oil sands or methane hydrates. Last year, we probably emitted more than 9 GtC, and annual emissions have been growing at around 1–3% a year. At that rate, we will reach 190 GtC in under 20 years.
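The timing here is simple compounding arithmetic. A minimal sketch, assuming the round figures quoted in the text (roughly 9 GtC per year today, growing at about 2% annually), shows how quickly the 190-GtC and 310-GtC budgets are exhausted:

```python
def years_to_budget(budget_gtc, annual_gtc=9.0, growth=0.02):
    """Count whole years of emissions until the cumulative total
    reaches the given carbon budget, with annual emissions growing
    at a fixed fractional rate."""
    total, years = 0.0, 0
    while total < budget_gtc:
        total += annual_gtc       # emit this year's carbon
        annual_gtc *= 1.0 + growth  # emissions grow next year
        years += 1
    return years

print(years_to_budget(190))  # budget for a 75% chance of staying below 2 °C → 18
print(years_to_budget(310))  # budget for a 50% chance → 27
```

At 2% growth the 190-GtC budget is used up in about 18 years, consistent with the "under 20 years" in the text; faster growth shortens the window further.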
Allen and colleagues2 (page 1163) take a slightly different tack, using a combined climate and carbon-cycle model, and varying uncertain parameters, to produce a series of simulations that attempt to span the range of projections that are consistent with already observed changes. They agree with Meinshausen et al. that it's the total slug of carbon that matters most, and define a term they call the cumulative warming commitment (CWC) as the peak temperature change expected as a function of the total anthropogenic carbon.
Comparing the bottom-line results from the two studies is tricky because of the use of different units, different base periods and different experimental designs3. However, given that humans have already emitted roughly 520 GtC to the end of 2008, Allen and colleagues' best-estimate CWC — 2 °C per 1,000 GtC emitted from 1750 to 2500 (compared with 2000–2050 in Meinshausen et al.) — implies that another 480 GtC would put us over 2 °C with more than 50% likelihood. This is broadly consistent with the 310-GtC estimate from Meinshausen et al. over a much shorter time frame. For comparison, two scenarios with cuts of 80% in emissions by 2050, in developed countries and globally, give an additional 325 GtC and 216 GtC, respectively (Fig. 1).
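The remaining-budget arithmetic implied by the CWC framing can be sketched directly, assuming the figures quoted above (a best-estimate CWC of 2 °C per 1,000 GtC, and roughly 520 GtC already emitted to the end of 2008):

```python
def remaining_budget_gtc(target_warming_c=2.0,
                         cwc_c_per_1000_gtc=2.0,
                         emitted_gtc=520.0):
    """Carbon (GtC) that can still be emitted before peak warming is
    expected to exceed the target, given a linear cumulative warming
    commitment (CWC) in degrees C per 1,000 GtC."""
    total_allowed = 1000.0 * target_warming_c / cwc_c_per_1000_gtc
    return total_allowed - emitted_gtc

print(remaining_budget_gtc())  # → 480.0 GtC left for a ~50% chance of staying below 2 °C
```

The linearity of the CWC is what makes this a one-line calculation: the total allowable emissions (1,000 GtC for 2 °C) are independent of when the carbon is emitted, so the remaining budget is simply the total minus what has already been released.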
A lot rides on the questions addressed in these papers, and they are unlikely to be the last words written on the topic. There is certainly room for further debate on the definition of 'dangerous'; the maximum global temperature is a good place to start, but ice sheets and sea level, for example, probably depend on the integrated climate impact rather than on peak warming4,5,6. Also, these studies1,2 use the traditional, short-term 'Charney' climate sensitivity, which includes some fast feedbacks such as variations in atmospheric water vapour and clouds, but not the slower feedbacks such as changes in vegetation or ice sheets, or feedbacks in atmospheric aerosols. The true sensitivity of the Earth system may well be higher7, implying that any temperature-based target will become progressively harder to maintain as slower feedbacks kick in.
Finally, both studies1,2 make different assumptions about how non-CO2 factors (anthropogenic methane, ozone, black carbon, sulphates and so on) will change. These effects can't be shoehorned into the same cumulative-emissions metric as CO2 because their effects over time are much more closely tied to contemporaneous emission levels. However, they remain a tempting additional policy target that might usefully limit near-term temperature rises8.
The bottom line? Dangerous change, even loosely defined, is going to be hard to avoid. Unless emissions begin to decline very soon, severe disruption to the climate system will entail expensive adaptation measures and may eventually require cleaning up the mess by actively removing CO2 from the atmosphere. Like an oil spill or groundwater contamination, it will probably be cheaper in the long run to avoid making the mess in the first place.
1. Meinshausen, M. et al. Nature 458, 1158–1162 (2009).
2. Allen, M. R. et al. Nature 458, 1163–1166 (2009).
3. Allen, M. R. et al. Nature Reports Climate Change www.nature.com/climate/2009/0905/full/climate.2009.38.html (2009).
4. Hansen, J. et al. Atmos. Chem. Phys. 7, 2287–2312 (2007).
5. Mann, M. E. Proc. Natl Acad. Sci. USA 106, 4065–4066 (2009).
6. Smith, J. B. et al. Proc. Natl Acad. Sci. USA 106, 4133–4137 (2009).
7. Hansen, J. et al. Open Atmos. Sci. J. 2, 217–231 (2008).
8. Shindell, D. & Faluvegi, G. Nature Geosci. 2, 294–300 (2009).