Dual-use research—research that could be misused to pose a threat to public safety—needs to be regulated, but the best way to do so is far from straightforward.
Bruce Ivins' death has provided a grim reminder of the dangers of dual-use research, raising questions about the appropriate level of regulation that research of this type requires.
Between 2001 and 2008, biosecurity spending in the US amounted to $41 billion. Compared to the amount spent before the terrorist attacks of 2001, the annual biosecurity budget is now almost ten times larger. A consequence of the increased funds available for biodefense is that there are now so many people working in this field—as many as 14,000 individuals by some estimates—that their activities are not being adequately monitored.
This lack of oversight arises partly from the fact that current US regulations largely focus on the transfer and possession of a series of pathogens and 'select agents' listed by the government. It is therefore a criminal act to possess or transfer a select agent without the knowledge of the US Centers for Disease Control and Prevention. However, this ruling does not cover dual-use research—for example, transferring a toxin-encoding gene to a nonpathogenic microorganism that could then infect humans.
To regulate dual-use research, the US government has leaned on the authority of the National Institutes of Health (NIH). But, in contrast to countries such as the UK, where the government has direct legal authority over all dual-use research, the US relies on funding mechanisms to bring scientists into its regulatory schemes—one is bound by the existing regulations only if one receives NIH funds. Moreover, the NIH has increasingly handed regulatory oversight to Institutional Biosafety Committees (IBCs), which review dual-use research in much the same way that Institutional Review Boards (IRBs) evaluate the ethical aspects of experimentation. However, unlike IRBs, not every institution has an IBC, and those that exist receive little guidance for doing their job.
This situation has created a regulatory vacuum, and, as a result, efforts are being made to put better rules in place (see page 893). What shape these rules will take is still unclear—as unclear, perhaps, as the definition of dual-use research itself. Could the identification of a mechanism to weaken the human immune system be construed as dual-use research? If the answer is yes, the regulatory consequences for many labs performing what they thought was purely basic research could be profound.
Should the US government have legal authority over all dual-use research, similar to the UK's strategy? Or should researchers police themselves, despite the conflict of interest that such self-regulation may represent? There are no obvious answers to these questions, and it is fortunate that the US government has not taken the same type of 'Big Brother' approach that it has adopted in other areas relevant to its 'war on terror'. It is also fortunate that scientists have taken advantage of this opportunity for dialogue, urging stakeholders to reflect carefully on what best serves the government, the public and scientists' own interests. As regulatory decisions of this sort might apply to all biomedical research, a cautious approach that allows time for the community to grasp all the issues relevant to the regulation of dual-use research is to be commended.
From the international perspective, the legal vacuum is even more profound. The most important international agreement on the regulation of dual-use biological agents emerged from the 1972 Biological Weapons Convention (BWC), which prohibited the development of biological agents for hostile purposes but did not regulate research. Moreover, the recommendations from the BWC are not legally binding, and since 2001 the US has consistently opposed efforts to change their status.
Aiming to fill this void, the BWC countries have held annual meetings since 2003 to promote “common understanding and effective action” on a series of biosecurity issues agreed upon in advance. For example, as this issue of Nature Medicine went to press, the 2008 meeting was taking place in Geneva, with a focus on measures to promote biosafety and on oversight, education and development of codes of conduct to prevent misuse of advances in biotechnology. Regrettably, again at the insistence of the US, the participants of these annual meetings do not have decision-making authority, raising serious doubts about their real influence.
Deciding on the right level of regulation for dual-use research, at both the national and international levels, is a difficult problem. Scientists should continue to get involved in the decision-making process to make sure that their point of view is heard until clear guidelines are in place. If the outcome of this domestic discussion is successful, it may provide a blueprint for a global regulatory scheme—a sorely needed opportunity for the US to lead by example.
Playing it safe. Nat Med 14, 891 (2008). https://doi.org/10.1038/nm0908-891