
Thinking about worst-case scenarios is nothing new — climate scientists have been doing it for more than 20 years. In 1988, after intense heat waves baked the eastern and central United States, Robert Watson, later to chair the Intergovernmental Panel on Climate Change (IPCC), and I briefed Bill Bradley, the Democratic senator for New Jersey, on the risks of disproportionate surprises from rapid, major climate change. The nature of those surprises was then, as it is now, unclear in its details, although we had our hunches.

What is new is the assertion that we know the level of warming required to pass tipping points for potentially irreversible outcomes, such as unstoppable ice-sheet melt in Greenland [1]. In truth, we don't know the precise values of those tipping points, but we can estimate them reasonably, with medium confidence, by looking at palaeoclimates and recent ice-sheet behaviour [2]. For Greenland, I estimate, after listening to expert judgements, a few per cent chance that meltwater transporting heat downwards has already begun to obliterate the ice cover irrevocably. At 1 °C more warming, I'd raise my odds to perhaps 25%; at 2 °C, to 60%; and at 3 °C, because the system is highly non-linear, to 90%. Deficiencies in current knowledge allow us to make only subjective probabilistic estimates, which must be revised as new knowledge arrives.

But what if the worst-case scenario came to pass? An atmosphere in 2100 containing 1,000 parts per million of carbon-dioxide equivalent would be catastrophic. To understand why, we need to peer into what Harvard University economist Marty Weitzman calls the 'fat tail' [3] of the probability distribution for climate damage. Although the likelihood of such outcomes is uncertain, and probably low, we should give them more attention, because ignoring them could prove disastrous.
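To make the fat-tail intuition concrete, here is a minimal sketch (my own illustration, not Weitzman's formulation): suppose the probability of climate sensitivity exceeding a value $s$ falls off only polynomially, while damage grows polynomially with $s$:

\[
\Pr(S > s) \sim c\,s^{-\alpha}, \qquad D(s) \sim k\,s^{\beta}.
\]

The expected damage $\mathbb{E}[D] = \int D(s)\,p(s)\,\mathrm{d}s$ is then dominated by the tail, and formally diverges when $\beta \ge \alpha$: rare, extreme outcomes, not the median, drive the expected cost.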

An unthinkable scenario?

In 2000, the IPCC published its Special Report on Emissions Scenarios, a now-famous set of storylines for future greenhouse-gas emissions. Even its most optimistic scenario, called B1, in which emissions-reducing technologies spread throughout a world with low population growth and a more egalitarian distribution of resources, projected a doubling of pre-industrial CO2 levels by 2100. At the other end of the spectrum was a 'fossil intensive' scenario called A1FI [4], which tripled CO2 to roughly 950 p.p.m. by 2100. I describe this scenario as business as usual, with economic growth deemed more important than conservation.

Studies of climate change often use B1 and A1FI as 'bookends' to bracket future projections. The authors of the Special Report on Emissions Scenarios could not agree on which scenario was most probable and deemed them all “equally sound”. Recent history, however, suggests otherwise. Until the economic downturn in late 2008, actual emissions since 2000 were above even the worst-case A1FI scenario [5]. Of course, short-term trends, whether above or below long-term scenarios, cannot reliably be extrapolated. Nevertheless, if the pattern of the past decade resumes, emissions will be on track to make 1,000 p.p.m. of CO2 more probable than the B1 storyline.


How would 1,000 p.p.m. translate into temperature change? The amount of global warming associated with any level of radiative energy added to the Earth–atmosphere system, called 'radiative forcing', depends on the 'sensitivity' of the system. Sensitivity is a measure of how much the surface will warm if CO2 levels double from pre-industrial values; the IPCC estimates its “likely” range (implying a 66–90% probability) to be 2 °C to 4.5 °C, with a “best guess” of 3 °C. That implies a 5–17% chance that sensitivity lies above or below those endpoints. The IPCC estimates the “likely” range for warming by 2100 under A1FI to be about 2.5 °C to 6.4 °C, so there is a 5–17% chance that temperatures will rise by more than 6.4 °C by 2100.
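The arithmetic behind these figures can be sketched briefly (a back-of-the-envelope calculation of mine, using the standard logarithmic rule for CO2 forcing; it gives equilibrium values, not the smaller transient warming expected by 2100). The tail probability on each side of a “likely” range is

\[
P_{\text{tail}} = \frac{1 - P_{\text{likely}}}{2} \in \left[\tfrac{1-0.90}{2}, \tfrac{1-0.66}{2}\right] = [5\%, 17\%].
\]

And because forcing grows with the logarithm of concentration, the eventual warming for a concentration $C$ relative to a pre-industrial $C_0 \approx 280$ p.p.m., given a sensitivity $S$ per doubling, is roughly

\[
\Delta T \approx S \log_2(C/C_0), \qquad \log_2(1000/280) \approx 1.8 \text{ doublings},
\]

so 1,000 p.p.m. implies about $5.5\,^{\circ}\mathrm{C}$ of eventual warming for $S = 3\,^{\circ}\mathrm{C}$, and about $8.3\,^{\circ}\mathrm{C}$ for $S = 4.5\,^{\circ}\mathrm{C}$.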

Many will argue that warming above 6.4 °C is unthinkable. Unfortunately, when I talk to analysts and economists such as Weitzman, I am told that it is precisely these warmer endpoints that they want us to examine further, to alert society to catastrophic outcomes that are more than 5–10% likely to happen. That probability is far above the threshold at which people usually buy insurance, or at which defence departments plan deterrence strategies.

The IPCC's Fourth Assessment Report attempted to assess the greatest risks posed by climate change. It suggested that the five “reasons for concern” examined in the Third Assessment Report remained a valid way to approach risks. But the temperature thresholds at which such damages might be triggered had to be lowered. The figure, published independently by IPCC authors after the report was approved, illustrates this evolution of author judgement [6]. It also extends the possibility of warming to 7 °C.

Figure 1: Updating the embers.

The Intergovernmental Panel on Climate Change assessed five reasons for concern in terms of the societal, economic and natural damage that would be caused by climate change. The result was the 'burning embers' diagram, first seen in 2001. Updates to judgements about the thresholds at which such damages might occur revised those thresholds downwards [6]. Plotting the range of temperature increases caused by a doubling of carbon-dioxide levels by 2100 (B1) and a tripling (A1FI) gives a range of risk levels associated with these scenarios for the five reasons for concern. A 1,000-parts-per-million scenario would plot slightly higher than A1FI.

What could be lost

In a 1,000 p.p.m. scenario, many unique or rare systems would probably be lost, including Arctic sea ice, mountain-top glaciers, most threatened and endangered species, coral-reef communities, and many high-latitude and high-altitude indigenous human cultures.

People would be vulnerable in other ways too: Asian mega-delta cities would face rising sea levels and rapidly intensifying tropical cyclones, creating hundreds of millions of refugees; valuable infrastructure such as the London or New York underground systems could be damaged or lost; the elderly would be at risk from unprecedented heat waves; and children, who are especially vulnerable to malnutrition in poor areas, would face food shortages.

Fairness must also be taken into account, given that some people would be at much greater risk than others: poor people in hot countries with little adaptive capacity, for instance; indigenous peoples; and those exposed to hurricanes or wildfires or living in low-lying areas. The elderly, and children with asthma or other lung ailments, would be particularly affected by urban air pollution and wildfire smoke plumes exacerbated by the extreme warming.

The economic outlook is no better. With warming of just 1–3 °C, projections show a mixture of benefits and losses. With more than a few degrees of warming, however, aggregate monetary impacts become negative virtually everywhere; and in a 1,000 p.p.m. scenario the current literature suggests that outcomes would be almost universally negative and could amount to a substantial loss of gross domestic product. Millions of people at risk from flooding and water-supply problems would pose further economic challenges [7].

The number and intensity of abrupt events, and the possibility of irreversible damage, go up non-linearly with warming. If CO2 levels were to reach 1,000 p.p.m., a sea-level rise of up to 10 metres over many centuries, from the melting of the Greenland and West Antarctic ice sheets, would become more likely [2]. So would damage to corals and oceanic phytoplankton, whose calcium carbonate skeletons could dissolve in acidified oceans. Tropical rainforests would become more vulnerable to wildfire, and in some models such forests switch from being CO2 sinks to sources, adding yet more emissions. Extinction of some half of known plant and animal species would become much more likely, particularly if climate sensitivity lies in the middle-to-upper part of the bell curve [2,8].

Responses to 1,000 p.p.m.

An atmosphere with 1,000 p.p.m. CO2 would produce a rapidly multiplying set of interconnected risks, and would undoubtedly spur calls for geoengineering schemes to try to offset the worst effects. But how effective would such schemes be? Injecting dust into the stratosphere to reflect sunlight and offset some of the radiative forcing, for example, would do nothing to stop the oceans becoming increasingly acidic.

Similarly, because anthropogenic increases in CO2 concentration and the associated warming are predicted to last for a millennium or so [9], geoengineering (and the global cooperation it would require) would have to be sustained without interruption from wars or other political stresses. Moreover, severe unexpected climatic events occurring while such climate control was in effect could lead to unprecedented liability claims [10].

It seems obvious that we need to stay well below 1,000 p.p.m. A whole range of policies will be needed, with international cooperation, and they cannot wait for the outcome of the current political negotiations. These include policies that discourage polluting technologies and provide incentives for using cleaner ones, as well as penalties for non-compliance. “There's no silver bullet, but there's a lot of bronze buckshot” is something of a cliché among those determined to address climate policy. In other words, we have to do a lot of things as part of a climate–energy policy portfolio. Even if governments spend several more years debating the emissions-cap target or how to distribute the burden of paying a price for carbon, rapid implementation of performance standards and clean-technology development would let us get on with the job in a less politically contentious way.