The US National Institutes of Health (NIH) is increasingly moving to adopt schemes that promote open access to scientific information. Most scientists favor the plans, but the push is adding to confusion over how to balance openness with rising security concerns about biodefense research.

Beginning in October, the NIH plans to require grant applicants to describe how they will make their data available to other scientists. The agency has also announced plans to make all NIH-funded research papers freely available six months after publication.

There is other support for the open sharing of data as well. The US National Research Council in September released a report firmly advocating continued public access to genome sequences of microbial pathogens, saying the benefits of sharing data outweigh the potential risks.

But as spending on biodefense research continues to grow, some scientists and administrators are concerned about the lack of specific criteria for judging sensitive content. Funding for applied research on infectious diseases has ballooned to a proposed $1.7 billion in the 2005 budget, a 30-fold increase over the 2001 investment.

Some journal publishers are taking security issues into their own hands by flagging papers they think could pose security risks. One scientist developed a gene therapy technique that he realized could be used in nefarious ways. “I thought, what do I do with this? Publish or not publish? There are no mechanisms that tell me what to do,” he says. After calls to various agencies, including the US Central Intelligence Agency, he published the research. Several months later, nothing has happened, but “I'm still worried,” he says.

“I am not surprised by the tension and confusion within the community,” says Ronald Atlas of the University of Louisville, Kentucky, a former president of the American Society for Microbiology. Atlas advocates openness wherever possible. “Some information is inherently dangerous and should be constrained,” he says, “but most information should be openly dispersed.”
Atlas last year served on a National Academy of Sciences (NAS) committee that recommended establishing a National Science Advisory Board for Biosecurity (Nat. Med. 10, 319; 2004). The board should guide federal and local agencies on so-called 'dual-use' research—legitimate biological research that could pose a threat to public health or national security—the committee said.

The board, to be housed at the NIH's Office of Biotechnology Activities (OBA), was approved in March and is slated to have a budget of $2.8 million per year. No board members have yet been chosen, but the OBA is selecting the first director and the initial meeting will probably be held next year, says Mary Groesch, senior advisor for scientific policy at the OBA.

Board members will begin by addressing criteria for identifying dual-use research. They are also expected to draft an international code of conduct for scientists and devise guidelines to oversee dual-use research at the local level.

In the meantime, scientists concerned about the safety of their research should turn to their institution's biosafety committee, says Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases. Although his agency funds biodefense research, the number of cases in which security concerns will pop up is small, Fauci says.

Some administrators say the confusion could be the push the community needs to step up the debate. “Scientists need to engage the security community so that concerns on each side can be shared,” says Stephen Morse, head of the Columbia University Center for Public Health Preparedness in New York. “If they don't, those decisions may be made [for them].”