Meteorologists are planning a coordinated global drive to recalibrate space-based weather measurements, confident that better calibration will yield better data and a fuller picture of global climate change.

Meeting last week in Geneva, the World Meteorological Organization announced plans for a Global Space-based Inter-Calibration System (GSICS). The initiative will ask national satellite agencies to ensure that measurements made by different instruments and satellites are comparable, and to tie those measurements to absolute references.

“As the requirement for monitoring global climate becomes clearer, there is need for more accurate measurements,” says Don Hinsman, director of the World Meteorological Organization's space programme. “To permit early detection of climate change, it is vital that satellite instrument calibration is of the highest quality, and that a capability exists to cross-calibrate satellite sensors.”

Remote sensing by some 30 satellites forms the backbone of global weather and climate monitoring today. Such measurements are vital because reliable ground-based observations are available for only about a quarter of Earth's surface. Continuous measurement of oceans, deserts and other remote and sparsely populated areas can come only from space.

But such measurements are prone to error, with problems arising from instrument degradation over time, small deviations of the satellites from their planned orbits, and faults in the algorithms used to process raw numerical data into meaningful geophysical information.

Flawed satellite data have caused disagreements between scientists in the past over such matters as temperature trends in the troposphere (ref. 1). One radiometer onboard a US National Oceanic and Atmospheric Administration satellite — the only instrument to measure temperature in the stratosphere before 1998 — is thought to have transmitted grossly biased temperature measurements since 1979 (ref. 2).

Even small temperature discrepancies, if undiscovered, can seriously disrupt the study of climate trends. “Inter-calibration has to be almost perfect if we want to look at climate trends — otherwise the bias will be stronger than the signal you want to address,” says Jean-Noël Thépaut, who heads the satellite section at the European Centre for Medium-Range Weather Forecasts in Reading, UK.
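Thépaut's point can be made concrete with a small numerical sketch: an uncorrected instrument drift adds directly to any fitted climate trend, so a drift of the same order as the signal roughly doubles the apparent warming. All figures here are illustrative assumptions, not real data.

```python
# Illustrative only: a hypothetical 0.02 K/yr climate trend observed by
# a sensor with an undetected 0.015 K/yr calibration drift.

def fit_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

years = list(range(30))          # 30 years of annual means
true_trend = 0.02                # K per year (assumed signal)
drift = 0.015                    # K per year (assumed sensor drift)

truth = [true_trend * t for t in years]
measured = [(true_trend + drift) * t for t in years]

print(f"true trend:     {fit_slope(years, truth):.3f} K/yr")
print(f"apparent trend: {fit_slope(years, measured):.3f} K/yr")
```

With these assumed numbers, the apparent trend (0.035 K/yr) is nearly twice the true one, which is why inter-calibration "has to be almost perfect".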

The onboard calibration of instruments is costly and technically challenging, and provision for it has been incorporated only into new satellites. But just as important is the occasional lack of consistency between data collected from different satellite missions.

“The development of new sensor technology is progressing much faster than our capability to validate data,” explains Gerhard Adrian, head of research at the German Weather Service in Offenbach.

The GSICS will make use of the exceptionally well-calibrated sensors onboard the latest generation of European and US meteorological satellites — such as Europe's MetOp-A satellite, which became operational last week — to validate data from older instruments.

“Cross-calibration is very much in our own interest,” says Johannes Schmetz, head of the meteorology division at EUMETSAT, the European Organisation for the Exploitation of Meteorological Satellites in Darmstadt, Germany, and a member of a panel that will run the GSICS. “Ideally, what we would like to have is an operational system that could precisely define, and correct for, any orbital and instrumental biases in real time.”
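The core of the cross-calibration idea can be sketched simply: collocate measurements of the same scenes from a well-calibrated reference sensor and an older target sensor, fit a linear gain and offset between them, and use that fit to correct the target. The sensor model and all values below are invented for illustration; operational GSICS processing is far more involved.

```python
# Hedged sketch of linear cross-calibration against a reference sensor.
# Scene values and the 2% gain / +1.5 K offset error are assumptions.

def fit_gain_offset(ref, tgt):
    """Least-squares fit of tgt ≈ gain * ref + offset."""
    n = len(ref)
    mr = sum(ref) / n
    mt = sum(tgt) / n
    gain = (sum((r - mr) * (t - mt) for r, t in zip(ref, tgt))
            / sum((r - mr) ** 2 for r in ref))
    offset = mt - gain * mr
    return gain, offset

# Simulated collocations: brightness temperatures (K) of the same
# scenes as seen by the reference instrument.
reference = [210.0, 235.0, 250.0, 270.0, 288.0, 300.0]
# Older target sensor with an assumed gain error and warm offset.
target = [1.02 * r + 1.5 for r in reference]

gain, offset = fit_gain_offset(reference, target)
# Invert the fitted relation to correct the target measurements.
corrected = [(t - offset) / gain for t in target]

print(f"gain = {gain:.3f}, offset = {offset:.2f} K")
```

A real-time operational system of the kind Schmetz describes would, in effect, keep re-estimating such corrections continuously as the satellites' orbits and instruments change.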

Reprocessing recently archived data using improved algorithms is also part of the plan. At EUMETSAT, robots can now do this quite quickly. Cumbersome manual 'data archaeology' is required only for old data sets stored on unwieldy magnetic tape.

Satellite data are becoming ever more abundant. At the European weather centre in Reading, for example, more than 5 million data points are processed every day, with the volume of data likely to triple in the next few years.