It could take less of the greenhouse gas to reach a particular level of warming.
Atmospheric carbon dioxide levels may have been lower in warm eras of the Earth's distant past than once believed, scientists reported this week.
The finding raises concern that carbon dioxide levels from fossil fuel burning may, in the near future, be closer to those associated with ancient hothouse climates.
More immediately, the work brings one line of palaeoclimate evidence — that deduced from ancient soils — into agreement with other techniques for studying past climate.
"It makes a major revision to one of the most popular methods for reconstructing palaeo-CO2," says Dana Royer, a palaeobotanist at Wesleyan University in Middletown, Connecticut, who was not involved in the work. "This increases our confidence that we have a decent understanding of palaeo-CO2 patterns."
In a paper in the Proceedings of the National Academy of Sciences [1], Dan Breecker, a soil chemist at the University of Texas at Austin, and colleagues report studying modern soils from Saskatchewan to New Mexico [2] to determine the conditions under which the mineral calcite forms.
Calcite occurs in limestone and can be produced by the action of carbon dioxide in arid soils. Scientists trying to puzzle out ancient climate conditions often use it as an indicator of the amount of carbon dioxide in the atmosphere. Previous studies had concluded that calcite formation indicates atmospheric carbon dioxide levels as high as 3,000 to 4,000 parts per million. The new study, however, lowers the calcite-formation threshold in soil to about 1,000 parts per million.
Breecker's team reached the conclusion by studying the outgassing of carbon dioxide from modern soils during times when calcite minerals are forming. "You can just put a box on top of the soil and let it fill up with carbon dioxide," he says. "The rate at which the concentration increases gives you the flux into the atmosphere." That information, in turn, can be used to determine the conditions under which calcite forms.
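The chamber measurement Breecker describes amounts to a simple flux calculation: fit a slope to the rising CO2 concentration inside the closed box, then scale by the chamber's volume and footprint. A minimal sketch follows; the chamber dimensions and concentration readings are hypothetical, not values from the study.

```python
# Illustrative sketch of a closed-chamber soil CO2 efflux estimate.
# All numbers below are made-up for demonstration.

V = 0.01   # chamber volume, m^3 (hypothetical)
A = 0.05   # soil surface area covered by the chamber, m^2 (hypothetical)

# CO2 concentration readings inside the chamber at 60-second intervals
times = [0, 60, 120, 180, 240]                      # seconds
conc = [0.0160, 0.0163, 0.0166, 0.0169, 0.0172]     # mol per m^3

# Least-squares slope dC/dt (mol m^-3 s^-1): the rate at which
# the concentration increases inside the chamber
n = len(times)
t_mean = sum(times) / n
c_mean = sum(conc) / n
slope = sum((t - t_mean) * (c - c_mean) for t, c in zip(times, conc)) \
        / sum((t - t_mean) ** 2 for t in times)

# Flux of CO2 out of the soil into the atmosphere:
# F = (dC/dt) * V / A, in mol m^-2 s^-1
flux = slope * V / A
print(f"CO2 efflux ~ {flux:.2e} mol m^-2 s^-1")
```

With these example readings the concentration rises linearly, so the fitted slope equals the per-interval increase and the efflux works out to about 1e-6 mol per square metre per second.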
The team then looked at what the new estimates of calcite formation would mean for fossil soils from warmer eras over the past 450 million years. "We plugged in our new conditions and out come new atmospheric carbon dioxide concentrations that are decreased by as much as four times," Breecker says.
The new result, he says, brings carbon dioxide calculated from fossil soils into line with results obtained from other methods, such as measuring the spacing of pores on fossil leaves. Estimates based on these other techniques have generally produced lower carbon dioxide concentrations than those derived from carbonate levels in fossil soils, Breecker says. But the higher levels derived from soil carbonates were thought to be more accurate, especially for eras when atmospheric carbon dioxide was high.
"I think they've made a persuasive enough case," comments Neil Tabor, a sedimentary geochemist at Southern Methodist University in Dallas, Texas. "What is encouraging about it is that it comes in line with the other estimates."
Atmospheric carbon dioxide levels are rising today, and the new finding suggests that climate might be considerably more sensitive to changes in carbon dioxide than previously thought. "This may have implications for near-future climate change," Royer says.
Breecker cautions that fossil soils reflect the Earth's adjustment to long-term climate changes, on scales of millions of years, rather than the more rapid, and possibly shorter-lived, changes likely to result from fossil-fuel burning. But, he notes, his study still indicates that the difference in carbon dioxide levels between ice ages and hothouse climates is less than previously believed.
"That's what makes this important," he says.
1. Breecker, D. O. et al. Proc. Natl Acad. Sci. USA doi:10.1073/pnas.0902323106 (2009).
2. Breecker, D., Sharp, Z. D. & McFadden, L. Geol. Soc. Am. Bull. 121, 630-640 (2009).