Slowly but surely, a key advisory committee is helping the scientific community act more responsibly when conducting and publishing biological research that could carry security risks.
At an open meeting last week, the US National Science Advisory Board for Biosecurity (NSABB) demonstrated that it has made welcome progress in its daunting mission to sort out the responsibilities of stakeholders in relation to biological research that might put national and global security at risk. And although the board answers to the US government, its proceedings are relevant everywhere.
For example, the board's working group on communications has proposed a checklist to be used by anybody confronted with research results whose dissemination might assist terrorists or state-run bioweapons labs. The questions themselves are straightforward (for example: “Is novel scientific information provided that could be intentionally misused to threaten plant or animal health?”). They address both the potential risks and benefits of openness, and invite the user to weigh up one against the other. That is hardly profound or original: what's new is the idea that such a checklist should itself be widely disseminated to raise awareness and to help peer reviewers, university administrators, students, government officials and experienced investigators examine research critically.
Those familiar with such issues know that biosecurity risks are perceived by most people to outweigh the benefits of publication only where direct weaponization techniques are involved. But the draft checklist also raises questions about presentation: might the results as written cause public misunderstanding or encourage sensationalism? If the answer is yes, the working group lists a menu of options: improving the presentation or, if the results themselves are deemed too risky for open publication, deciding against publication altogether or issuing the results on a ‘need to know’ basis.
This last option raises the spectre of a category of information defined as ‘sensitive but unclassified’, first promoted in the biological context by the Bush administration following the attacks of 11 September 2001. The NSABB working group, not empowered to decide on the merits of such a concept, had to consider it as a potential outcome. But neither the NSABB nor anyone else has come close to working out exactly how to implement such a category in practice.
The communications working group also espouses a set of overarching principles of responsible dissemination. These reiterate the fundamental benefits of scientific openness, stating that results should be communicated as fully as possible, and that communication should only rarely be restricted. This may sound like a truism to readers of Nature, but the working group includes individuals from security and military backgrounds. The endorsement of openness as a fundamental principle is significant, in effect placing the burden of proof of potential risk on those who might restrict communication.
The NSABB working group rightly places the onus on investigators and their institutions to assess their own work throughout the research process, rather than on government or journals (although many publishers will also play their part). Researchers and their institutions know the work best, and keeping decisions in local hands is most likely to yield sensible results. It also minimizes the opportunities for overly burdened bureaucracies to opt for the least risky option, as they may see it, of restricting dissemination.
However helpful the draft guidelines are (they have yet to be formally adopted by the NSABB), they can only achieve so much. Ultimately, results that represent a biosecurity risk will get submitted, and scientists whose papers are rejected by one journal may submit them for publication elsewhere. And the guidelines don't specifically speak to the informal communication of ‘dual-use’ results in non-moderated forums, such as conferences or web postings.
The best the NSABB can do is plant the idea in scientists' minds that they should always think carefully about their work and its security implications — an idea that has not taken root in parts of the scientific community, which fears restrictions to its freedom. The NSABB is showing that the best way to keep the pursuit of knowledge open and free is for researchers to exercise a demonstrable sense of responsibility. At times, this may require scientists to ask and answer difficult questions. But if scientists don't learn to do this for themselves, the hard questions will come instead from outside parties, as happened recently with a modelling paper on botulinum toxin, whose publication was delayed following concern from the US government (see http://www.pnas.org/cgi/content/full/102/28/9737).
The NSABB has not yet addressed how to help scientists who wish to explore the potential risks of their results, or how to resolve disagreements (with government agencies, for example) about how results should be communicated. This guidance is needed. In the meantime, scientists and institutions engaged in research whose outcomes may carry security risks should prepare to incorporate such guidelines into their working practices once the NSABB's final recommendations are established.
Previous attempts to address the problem of communicating dual-use research have arrived at broad principles that don't go far enough to help either scientists or administrators. By exploring specific questions and criteria that could be used to identify dual-use research, and by detailing how to determine whether the communication of its results should be handled with extra care, the NSABB is doing a service to the whole scientific community across the world. Its progress may seem to come at a snail's pace, but the board must be commended for taking a sensible approach, and for maintaining a consensus in the process.