Nearly 10 years, US$386 million and many grey hairs after it got the go-ahead, an enormous US ocean-observing network is finally up and running.
On 6 June, the National Science Foundation (NSF) announced that most data are now flowing in real time from the Ocean Observatories Initiative (OOI), a collection of seven instrumented arrays. Oceanographers have the chance to test whether the technologically complex and scientifically unprecedented project will ultimately be worth it.
“It has been stressful,” says Richard Murray, the NSF’s director for ocean sciences. “It’s not for the faint-hearted.”
The raw data streams came online in April — months behind schedule, in part because of a 2014 switch between university subcontractors.
Through an open-records request, Nature obtained more than 1,200 pages of e-mails between project managers at the NSF and the Consortium for Ocean Leadership in Washington DC, which built the observatory. The records reveal an extraordinary level of tension throughout 2014 and into early 2015, as the final instruments were installed in the water and the contract for handling the data streams was switched from the University of California, San Diego, to Rutgers University in New Brunswick, New Jersey.
“Please excuse my display of stress in this email, but the InBox is overflowing with high-priority, short-fuse items — none of which deserve to be ignored — but all of which cannot be completed within the requested time frames,” Timothy Cowles, then programme director at the Consortium for Ocean Leadership, wrote to the NSF in January 2014.
The NSF cited cost overruns and performance delays in changing the cyberinfrastructure contract later that year. In April 2015, an underwater volcano laden with OOI instruments erupted, just as scientists had predicted — but the live data were not yet flowing to the wider scientific community.
Now, about 85% of OOI data are available in real time on the project’s website, with the percentage growing every week, says Greg Ulses, the current programme director at the Consortium for Ocean Leadership. The information — on factors such as temperature and salinity — streams from more than 900 sensors at the seven sites.
The OOI consists of one high-tech cable on the tectonically active sea floor of the northeast Pacific Ocean, together with two lines of oceanographic instruments — one off the US east coast and the other off the west coast — and four high-latitude sites, near Greenland, Alaska, Argentina and Chile. Each array involves a combination of instruments, from basic salinity sensors to sophisticated underwater gliders.
The NSF built the network as a community resource, hoping to stimulate an era of virtual oceanography in which scientists explore real-time data sets open to all (see ‘Virtual view’).
“We know the data are valuable,” says Lisa Campbell, a biological oceanographer at Texas A&M University in College Station. “How to implement it is what we’re working on.”
Those involved in the OOI’s painful birth are happy to see it working at last. “When I finally got through and saw the real-time data, I shouted so loud someone had to come down the hall and close the door,” says Glen Gawarkiewicz, a physical oceanographer at the Woods Hole Oceanographic Institution in Massachusetts.
The array off the coast of Massachusetts has already captured some unprecedented observations, he says. In 2014, it measured air–sea fluxes when a hurricane passed overhead. The following winter, it measured dramatic shifts in the boundary at which shallow waters interact with deep ones. “That has tremendous practical implications, because there’s a lot of commercial fishing in that area,” Gawarkiewicz says. Using OOI data, he is now working with local fishers to share real-time information on changes in temperature and currents.
The west-coast array has studied a warm blob of water linked to weather patterns that are strengthening the ongoing drought in California. And in the North Atlantic, off the coast of Greenland, OOI scientists have coordinated their measurements with those of others, such as an international programme to measure heat flow in this key region. “These are high-scientific-value sites that we have dreamed about, and now we have occupied them,” says Robert Weller, a physical oceanographer at the Woods Hole Oceanographic Institution.
But the OOI’s future remains murky. A 2015 review of US ocean-science priorities suggested that the programme’s operational budget should be slashed by 20%, to around $44 million a year. Yet each of the arrays must be serviced every year or two to replace broken instruments and install new ones. The NSF has not yet decided how it will make that 20% saving — whether to cut back dramatically on servicing one particular site, spread the cuts across the entire system, or come up with some other plan.
Later this year, the agency will be soliciting bids from organizations to manage the OOI for the next five to ten years. Who responds, and with what suggestions, will help to determine what gets cut. “We built this thing, and will be funding operations for what the community feels is best,” says Murray.
Ultimately, there is no metric for what constitutes a successful OOI. Ulses says that the project needs to run for a full year before managers can assess which scientists are using which data, and how stable and successful the data streams are.
Weller would like to see a set of OOI measurements become as iconic as the records of atmospheric carbon dioxide levels taken at Mauna Loa, Hawaii, since the 1950s. “On any given day, I step back,” he says, “and am still sort of amazed that it’s all out in the water and most of it’s working.”