Washington
The United States has fallen behind other countries in its ability to model long-term climate change, says a panel of the National Research Council (NRC), the operating arm of the National Academy of Sciences.

The lag has been partly caused by US researchers being prevented from buying powerful supercomputers from Japan, says the NRC climate research committee's report. It adds that the US climate modelling community is likely to “remain behind the rest of the world in terms of computational facilities for the next several years”.
Although US researchers still hold the lead in small- and intermediate-scale models, some feel that they lag behind in developing higher-resolution, integrated climate models of the kind needed to address policy issues raised by the Kyoto climate treaty.
“The reliance of the United States upon other countries for high-end climate modelling must be redressed,” the panel argues, for reasons that have more to do with policy than with science. It points out that decisions on the Kyoto Treaty could have far-reaching economic impact, and US decision-makers should not depend on simulations done by countries “with different priorities than those of the United States”.
The study was led by Thomas Karl of the National Climatic Data Center in Asheville, North Carolina. It was prompted by a letter written in 1995 by several prominent climate modellers to leaders of the US Global Change Research Program (USGCRP). The scientists had warned of a crisis due to a lack of computing power, and said the United States had already been eclipsed by Germany and the United Kingdom as world leaders in climate modelling, with Japan gaining rapidly.
The panel based its conclusions in part on an analysis carried out by Bill Buzbee, director of scientific computing for the National Center for Atmospheric Research (NCAR) in Boulder, Colorado. NCAR's most powerful computers can sustain just 5 gigaflops (5 billion operations per second) when running a single application, whereas comparable centres in Australia, Canada, England and elsewhere have systems that can sustain from 20 to 100 gigaflops, said Buzbee.
Much of the blame goes to a controversial Commerce Department ‘anti-dumping’ ruling of 1997, which blocked the Japanese supercomputer maker NEC from selling powerful SX-4 machines to NCAR (see Nature 389, 432; 1997).
The decision “seems to have discouraged other institutions within the United States from considering the purchase of foreign computers,” says the NRC report. “This lack of routine access by [US] climate researchers to the world's most powerful computers has become a quite serious problem.”
Jerry Mahlman, who heads the Geophysical Fluid Dynamics Laboratory (GFDL) in Princeton, New Jersey, a leading US climate modelling centre, was one of the outside reviewers of the NRC report. Even though Japanese machines offer more computing power, he says, his laboratory, which wants to advance its computing capability to 150 gigaflops, assumes it will have to buy American computers.
The NRC committee found that US climate modellers face other problems, including the lack of an integrated government strategy for tackling important scientific and policy questions. There is little standardization of key computer subroutines or model outputs, for example. Component models of the atmosphere, oceans and land are not always incorporated effectively into more comprehensive models.
Adding to the problem, it says, is that US researchers do not always have “full, open and timely access” to output from foreign climate models. The situation could get worse as commercial considerations — for example, the privatization of European meteorological data — come into play. But it warns that any proposed solution to this nonreciprocity “should not involve the imposition of access restrictions to US data”.
The panel's conclusions have already generated debate. Mahlman believes it may overstate the gap between US and other climate modellers. His own centre is running simulations at 15 gigaflops, he says, and publishing results similar to those from foreign modelling groups. He also worries that the NRC recommendations could be taken as jingoistic. “These so-called foreigners are our friends and colleagues,” he says, and collaboration is common.
US supercomputing capability may not trail the competition for long. The Department of Energy is expected to receive $70 million of a proposed $366-million increase for information technology (see Nature 397, 285; 1999), an initiative that aims to increase computing power by three orders of magnitude, to 40 teraflops (40 trillion operations per second).
The first scientific applications for the Department of Energy's advanced supercomputing capacity would be combustion systems and research on “global systems”. The computers would come on-line in 2003, and support the Intergovernmental Panel on Climate Change's fourth assessment in 2005.
But Mahlman echoes one of the panel's cautions: fast machines alone won't improve climate prediction. There must be investment in the underlying science, he says, to create a “balance between supercomputing power and brain power”.
Reichhardt, T. Curb on foreign computers puts damper on US climate modelling. Nature 397, 373 (1999). https://doi.org/10.1038/16961