Climate scientists are hoping to develop a database of land temperature readings from around the world. Credit: AP Photo/Tohru Saito

Meteorologists are meeting this week to hammer out a solution to one of the thorniest problems in climate science: how to make raw climate data freely available to all.

The workshop, to be held in Exeter on 7-9 September, will be hosted by Britain's Met Office. It follows years of discussion within the climate-science community, which wants to draw disparate climate data together into a single, comprehensive repository to streamline research.

But the effort has been given fresh urgency over the past year by the backlash against climate science that was sparked by the leaking of e-mails from the Climatic Research Unit (CRU) at the University of East Anglia in Norwich, UK (see 'Storm clouds gather over leaked climate e-mails'). The episode came just months after Nature revealed that Phil Jones, the director of CRU, was being bombarded with requests under the Freedom of Information Act to make raw climate data available to the public (see 'Climate data spat intensifies').

"This workshop is an exercise in climate-science openness," says Peter Thorne, a climate scientist at the US National Oceanic and Atmospheric Administration's Cooperative Institute for Climate and Satellites in Asheville, North Carolina, and chair of the workshop's international organizing committee.

Currently, there are glaring holes in land temperature measurements, with some regions and time periods severely lacking data. In some cases, measurements simply haven't been taken, but often they are not readily accessible because the raw data have yet to be digitized.

Data availability can also be limited for political and economic reasons. Meteorological services in some countries need the revenue that comes from selling their data, for example, and are reluctant to provide free public access.

Thorne and Peter Stott, head of climate monitoring and attribution at the Met Office, recently authored an Opinion article in Nature outlining how the creation of an international databank will help to overcome these limitations, while bolstering research and public confidence in climate science.

Collating land temperature data into a central bank will expose exactly where the information gaps are, the organizers say, potentially encouraging efforts to fill them. And although the creation of the database itself will not alleviate the political and economic pressure that currently limits data accessibility, the scientists behind the scheme hope that this workshop will engage and encourage the international community to be more open with climate data.

The new databank will provide data at daily, or even shorter, intervals and at a spatial resolution of only a few kilometres: detail that could be crucial for refining climate prediction models and formulating advice for policy-makers.

"This is such a terrific idea to me," says John Christy, director of the Earth System Science Center at the University of Alabama in Huntsville. "All of the data will be accessible, plus all of the expert information about where the [weather] stations were," he adds.

But creating the databank will be a formidable task. Before they can be deposited, data must be analysed and corrected to account for any long-term changes to the local environment around each measurement site, says Thorne. Working out how to homogenize the data will be a key topic of discussion at the workshop. Other ideas slated for debate include the possibility of enlisting armies of Internet users to help digitize the data, similar to other 'crowdsourcing' efforts in science such as the SETI@home or Galaxy Zoo projects.
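One common homogenization idea, sketched below purely for illustration (the workshop has yet to settle on a method, and the data and station names here are hypothetical), is to compare a station's record against a reference series built from neighbouring stations, locate the largest step change in their difference, and shift the later segment to remove the spurious jump, such as one caused by a station move or instrument change:

```python
def homogenize(station, reference):
    """Return an adjusted copy of `station`, removing the largest step
    change found in the station-minus-reference difference series."""
    diff = [s - r for s, r in zip(station, reference)]
    n = len(diff)
    # Find the split point where the mean of the difference series
    # changes the most: the likeliest location of an artificial break.
    best_split, best_gap = 1, 0.0
    for k in range(1, n):
        before = sum(diff[:k]) / k
        after = sum(diff[k:]) / (n - k)
        if abs(after - before) > best_gap:
            best_split, best_gap = k, abs(after - before)
    before = sum(diff[:best_split]) / best_split
    after = sum(diff[best_split:]) / (n - best_split)
    shift = after - before  # size of the spurious jump
    return station[:best_split] + [s - shift for s in station[best_split:]]

# Hypothetical example: a 1 degree C jump appears halfway through the
# record, while the neighbour-based reference stays flat.
raw = [10.0, 10.1, 9.9, 10.0, 11.0, 11.1, 10.9, 11.0]
ref = [10.0, 10.1, 9.9, 10.0, 10.0, 10.1, 9.9, 10.0]
adjusted = homogenize(raw, ref)
```

Operational methods are considerably more sophisticated, handling multiple breaks, gradual trends and missing values, but the core principle of adjusting a station against its neighbours is the same.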

Even if the workshop produces a framework for developing the database, Stott says that it will take years, and millions of dollars, to achieve: "It is a great challenge and will require international engagement."