Technology will soon allow the world to be mapped in near-real time and at high resolution. Declan Butler investigates the potential for operational monitoring of the planet.
Forecasting is a tricky business. You can be let down by your initial data or your model of the processes, by an unrecognized bias or just bad luck. But a dramatic forecast still has the power to grab the attention. Take this one, about the state of Earth monitoring in a couple of decades: "A user will be able to get, on demand, climate, or any other information for any place on the planet, on the land, in the oceans, or in the atmosphere, at any time, past, present and future."
The speaker is Rick Anthes, president of the University Corporation for Atmospheric Research in Boulder, Colorado, and chair of a US National Academies panel that in January released an influential 428-page blueprint on the future of Earth monitoring. The forecast he is making is based on clear and established trends: satellites are getting more cost-effective in their capabilities, and the computers and supercomputers that make use of their data are speeding up exponentially.
If that increase in technological capability can be turned into usable systems, then the ability to monitor Earth's environment will be revolutionized. Real-time and near-real-time data will be available on soil moisture, greenhouse-gas concentrations, biological productivity, aerosol concentrations and so on, all around the world. With those data, scientists will be able to build and study models of Earth as a system far beyond what they have today.
To make that real, though, will require coherent and sustained political and institutional support, and on this front the news from the National Academies is less compelling. The US Earth-monitoring programmes are adrift without leadership, warns the academies report; the number of observational satellites and instruments has already peaked, and is set to decline over the next two decades (see page 782). No repairs are likely before the next administration, even if then. Anthes and other US scientists are keenly aware that to get a glimpse of the sort of sensible and forward-looking Earth observation strategy the academies panel proposed, they have to look to the European Union (EU).
Europe's approach to Earth monitoring is not flashy. Its underlying philosophy flows not so much from cutting-edge research as from what amounts to weather forecasting writ large, building ever more capacity for monitoring the planet onto the day-to-day activities of meteorology – delivering data, images and products to users 24 hours a day, 365 days a year. The idea is to roll out similar easily used maps, models and forecasts on an ever-increasing range of data and processes – for example flood risks, soil and coastal erosion, crop and fish resources, air pollution, greenhouse gases, iceberg distribution and snow cover. The systems that do so would, like today's meteorological systems, generate continuous, cross-calibrated, long-term data sets on the state of the planet and its atmosphere.
It is by embedding scientific Earth-observation needs within an operational system that serves paying customers that the financial case can be made for the sorts of sensors attuned to various climatic and other parameters now seen only on research satellites – sophisticated spectrometers, sounders, lidars and radars. Operating these sensors entails the recurrent costs of running fleets of satellites for decades, with regularly scheduled replacements.
A key part of the European process is the Global Monitoring for Environment and Security (GMES) programme run by the European Commission and the European Space Agency (ESA). GMES is explicitly charged with bringing the sorts of data that have previously been the province of research satellites to the citizens of Europe and beyond.
The GMES suite of 'Sentinel' satellites will be operated around the clock to routinely supply data similar to those now provided by research satellites into the foreseeable future. Sentinel 1, slated for launch at the end of 2011, will be designed for radar and build on some of the databases from Envisat, an 8-tonne ESA research behemoth that has tested out a wider range of instruments; the Sentinel 2 series will be imaging satellites with fine spectral resolution, building on the SPOT satellites; the Sentinel 3 satellites will carry forward the ocean-observing aspects of Envisat; Sentinels 4 and 5 would monitor atmospheric chemistry. The data provided by these assets would be integrated with data from future research satellites, as well as with national and international data from airborne, ground and ocean sensor webs (see Nature 440, 402–405; 2006). The idea is that by the mid-2020s, Europe would have monitoring systems akin to those now in place for meteorology for all areas of environmental monitoring, says Josef Aschbacher, head of the GMES space office in Frascati, Italy, with forecasts and data on everything from global climate change to town-by-town air pollution levels.
The programme is loosely modelled on that of the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT), which supplies weather data to national met offices, and other government and commercial users. EUMETSAT is not a large organization – its annual budget is normally in the €300-million range (about US$440 million) – but unlike ESA, or for that matter the commission, it has a track record of running operational systems in a way that works for users. The first of its Meteosat weather satellites was slotted into geosynchronous orbit in November 1977. It also now runs a weather satellite in a low-Earth orbit that complements US satellites in similar orbits. EUMETSAT recently agreed to join GMES, and is discussing directly operating future GMES satellites and ground operations, as well as hosting GMES instruments on its own weather satellites.
But GMES is a much more ambitious undertaking. It has €1.97 billion in approved funding from the EU and ESA to carry it through until 2013, covering the launching of the first three Sentinels (the contract for the first of which was signed earlier this year). In 2008 ESA will ask its member states (which differ slightly from both those of the EU, and of EUMETSAT) for a further €700 million–€900 million to cover development of the Sentinel series, operations and spare spacecraft, and the commission will ask the EU for the €2.5 billion needed to operate the system until 2023.
These requests for funding are not the only reasons that 2008 will be the crunch year for GMES, says Paul Counet, head of EUMETSAT's Strategy and International Relations Division. Many governance issues remain to be resolved, such as who is responsible for which aspects of operations or services. And then there is the vast task of integrating national observing systems into GMES, and of finding ways for private industry to manage or add value to specific data sets and provide new services. The resolution of these governance issues, and the forging of the long-term relationships needed to underpin operational systems, will determine whether GMES blossoms into a full-blown system or goes belly up, says Counet.
Regardless of its implementation, the operational logic of the GMES programme gets the thumbs up from across the Atlantic. "The Europeans certainly have a robust programme and are moving to make it happen," says Scott Goetz, a researcher at Woods Hole Research Center, Falmouth, Massachusetts, who uses remote sensing to model ecosystems. "GMES is a step in the right direction," agrees Kevin Trenberth, a climate researcher at the National Center for Atmospheric Research in Boulder, Colorado, who says that he would like to see a similar approach expanded in the United States and internationally, to generate a global 'climate-information system'. The fact that operational satellites must meet needs other than those of scientists – a cornerstone of the GMES approach – is no obstacle to research, he says. Researchers just need to get involved to make sure their needs are taken into account.
The data that such information systems could make available will have implications both for how well scientists can run environmental models and for what those models can do. Operational systems launched in the late 2010s and onwards are likely to generate order-of-magnitude improvements in both temporal and spatial resolution. Today, a typical weather model might have 30–50-kilometre horizontal resolution. Climate research models are even coarser – often 100–200 km. But by 2025, improvements in both data and computing will mean that weather will be modelled at 1-km resolution, and climate models at 5–10 km, predicts Anthes.
Anthes has high hopes for the impacts such improvements could allow. "We have seen beyond a doubt in weather prediction that as the resolution increases, something almost magical starts to happen in the models, even without an increase in the observations," says Anthes. "As we go for example from a 30-km resolution model to a 5-km model, hurricane prediction, precipitation patterns and so forth become far more realistic."
More data obviously demand more computing power to make sense of them – and computing power is already the overriding limitation on how realistic models are, say many researchers. "It's a major issue," says Trenberth. To get good results you need to run the same model again and again with slightly different inputs, which eats up computing power. And every twofold increase in resolution demands roughly a tenfold increase in computing power. "Climate prediction is probably the most computationally challenging problem in science," says Tim Palmer, a scientist at the European Centre for Medium-Range Weather Forecasts (ECMWF) in Reading, UK.
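The article's rule of thumb – a tenfold cost increase per doubling of resolution – can be turned into a back-of-envelope estimate of what the resolution jumps discussed here would cost. The sketch below is illustrative only, not any modelling centre's actual costing; the `per_doubling` factor is simply the figure quoted above.

```python
import math

def cost_multiplier(res_now_km, res_target_km, per_doubling=10.0):
    """Rule-of-thumb relative compute cost of refining a model grid.

    Each twofold increase in resolution (two horizontal dimensions,
    a shorter time step, extra vertical levels combined) is taken to
    cost roughly ten times as many floating-point operations.
    """
    doublings = math.log2(res_now_km / res_target_km)
    return per_doubling ** doublings

# Halving the grid spacing costs the quoted factor of ten:
print(round(cost_multiplier(10, 5)))    # 10

# Refining a 30-km weather model to 5 km is about 2.6 doublings,
# so the rule implies a few-hundred-fold increase in compute:
print(round(cost_multiplier(30, 5)))
```

Under this rule, the jump from today's 100–200-km climate models to the 5–10-km models Anthes predicts would require several orders of magnitude more computing – which is why the petaflop machines discussed next matter.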
But Palmer is optimistic that the bottleneck will soon be alleviated. He points to the arrival next year of the first petaflop computers, running at peak speeds of up to 3,000 teraflops (see Nature 448, 6–7; 2007). By way of comparison, the ECMWF's fastest machines today run at less than a hundredth of that. Palmer predicts that by 2010, 10-petaflop machines will allow climate scientists who can get hold of them to run century-long simulations of the climate at 10-km resolution. That could be 1 km within a decade after that. Accurate modelling of cloud processes at the 1-km level, a key component currently missing from global climate models, could vastly improve predictions of regional climate change. Trenberth is one of a group of researchers planning to propose the creation of one or more international multi-petaflop computing facilities for climate prediction, with a ball-park cost of $1 billion over 5 years. The idea will be presented at the international climate negotiations opening in Bali, Indonesia, this week, with a formal proposal to be published next year in the Bulletin of the American Meteorological Society.
Integrate and accumulate
The data expected will not just be more precise – they will also be more wide-ranging, providing new impetus to models that seek to treat the Earth system as an integrated whole. Until recently, Earth observation has been less than the sum of its oceanic, terrestrial and atmospheric parts, according to Stephen Briggs, head of science, applications and future technologies at ESA. "Integrating the components is something we are really bad at," says Briggs. "This is where we are going to see the major advances."
Incorporating more geophysical observations made from multiple instruments obviously makes models more complex. To integrate such disparate data sets, which differ not just in their spatial and temporal resolution but also in their error profiles, modellers are borrowing the 'data assimilation' techniques used by weather forecasters. A model producing a weather forecast will start off with reasonable best estimates of initial global conditions informed by the data to hand. As more come in – as low-Earth-orbit satellites pass over new places, for example – the model's evolution is iteratively compared with reality. So sparse and infrequent sources of data can still play a role.
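The core of the data-assimilation idea can be shown in miniature. The toy sketch below (invented for illustration; real assimilation systems work on millions of variables at once) blends a model's estimate of a single quantity with incoming observations, weighting each by its uncertainty – a one-dimensional version of the Kalman-style updates forecasters use.

```python
def assimilate(state, state_var, obs, obs_var):
    """Blend a model estimate with one observation.

    The gain weights the observation by how uncertain the model is
    relative to the observation; the update both moves the estimate
    toward the observation and shrinks its uncertainty.
    """
    gain = state_var / (state_var + obs_var)
    new_state = state + gain * (obs - state)
    new_var = (1.0 - gain) * state_var
    return new_state, new_var

# First guess at one grid point: 15 degrees C, with variance 4.
state, var = 15.0, 4.0

# Two later satellite passes report ~17 degrees C, variance 1 each
# (all numbers invented). Each pass nudges the state and tightens it.
for obs, obs_var in [(17.2, 1.0), (16.8, 1.0)]:
    state, var = assimilate(state, var, obs, obs_var)

print(round(state, 2), round(var, 2))   # estimate pulled toward ~16.8
```

Even a single sparse observation changes the estimate in proportion to how much it is trusted, which is why infrequent data sources – a buoy here, an overpass there – still earn their keep in an operational system.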
Counter-intuitively, more data sources can also often simplify modelling, as they can help to better define other variables, adds Trenberth. For example, raw measurements from a buoy might be misleading if it were in an eddy of the warm Gulf Stream rather than somewhere more representative of the Atlantic as a whole. A system that could use other data to know that the buoy was in the Gulf Stream would not be misled so easily, and the model would be made more realistic. "As one can resolve features better," he predicts, "one can utilize data better." The resolution and the data can be provided, if the institutions allow; it will then be up to Trenberth and his colleagues to make good on that forecast.
Butler, D. Earth Monitoring: The planetary panopticon. Nature 450, 778–780 (2007). https://doi.org/10.1038/450778a