After acquiring just a handful of mutations, an artificially created variant of the H5N1 avian influenza virus began to spread among ferrets in virologist Ron Fouchier's laboratory in Rotterdam, the Netherlands, late last year. This was ominous, because ferrets are a model for human-to-human transmission of flu.

J. Robert Oppenheimer (third from left) and General Leslie Groves (centre) examine the remains of the first successful atomic bomb test in 1945. Credit: CORBIS

Fouchier's work, along with parallel studies by virologist Yoshihiro Kawaoka of the University of Wisconsin–Madison and the University of Tokyo, ignited a worldwide debate. Earlier this year, experts in microbiology and in science policy wrestled over whether the specifics of such potentially dangerous research should be openly published in scientific journals1.

Some insisted that publishing detailed information on how to produce mutant flu strains is crucial to researchers who want to fine-tune surveillance strategies for pandemic outbreaks and hone potential vaccines. Others argued that releasing this information will increase the chance of a deadly virus devastating human populations — either by enabling bioterrorists to manipulate strains, or by aiding an accidental release from a lab. Many even questioned whether such research should have been funded in the first place.

The furore, which is far from settled (see go.nature.com/wxeijg), has sparked passionate claims and counterclaims about the risks of imposing tight controls over the flow of biological information. Missing from the discussions has been a clear-eyed look at history.

The problems posed by dual-use research — which can benefit the public but might also be co-opted for harmful purposes — are hardly new. Even in ancient Greece, Archimedes applied his mathematics to improve the devices used to defend city walls in siege warfare. Revisiting the successes and mistakes of the recent past would clarify the risks and benefits of various proposals for biomedical research. We argue that, although scientists are right to be wary of heavy-handed approaches to security, self-censorship alone has rarely proved sufficient.

Flu freeze

Reacting to the wrangling over whether the H5N1 work by Fouchier and Kawaoka should be published, virologists adopted a voluntary moratorium on H5N1 transmission research in the early months of 2012. Several practitioners have argued that such self-governance is adequate for today's challenges in the life sciences, and that further checks on the open sharing of information would hamper scientific progress1,2. This surprises us: no matter the field of research, can anyone be expected to step outside the excitement and momentum of their own work to make objective decisions in risky situations?

Biologists' resistance to meddling from outsiders probably stems, at least in part, from fears of the kinds of restriction and paranoia that constrained many nuclear physicists working in Europe and the United States during and after the Second World War.

Early in the Second World War, even before the Manhattan Project to develop the atomic bomb was established, US nuclear physicists agreed among themselves to restrict or avoid publication of certain information about nuclear reactions3. But formal schemes for classifying such information as secret were soon codified and written into law. The US Atomic Energy Act of 1946, for instance, made it a federal crime to circulate information about the rates of some nuclear reactions without extensive review and declassification. Whole categories of information were thereby deemed secret: the default became to classify first and to declassify only when needed.

Many physicists at the time chafed at the new rules. The restrictions were also ripe for abuse. Historians have uncovered dozens of instances in which authorities in the United States, the United Kingdom and elsewhere used secrecy and classification to undermine legitimate discussion and debate. In the United States, for example, more physicists were called to testify before the House Un-American Activities Committee than members of any other academic discipline4.

Classification of information was sometimes used unfairly to disadvantage defendants. This happened in personnel security hearings (including, most famously, that of J. Robert Oppenheimer, scientific director of the wartime Los Alamos Laboratory in New Mexico), in criminal espionage cases (such as that of accused atomic spies Julius and Ethel Rosenberg) and even in intellectual-property claims. Nor was the system foolproof. There were several instances of espionage at Los Alamos despite the elaborate security precautions4.

Yet this monolithic regime of classification, emblematic of the Second World War and the post-war 'red scare' era of anticommunism, is only one of many examples of concerned non-scientists stepping into the research fold.

Recombinant research

In 1975, molecular biologists met at the Asilomar Conference Grounds in Pacific Grove, California, to discuss scientists' concerns about the emerging field of recombinant-DNA technology. A major worry at the time was that introducing genes from, say, viruses could transform innocuous microbes into deadly pathogens that were resistant to antibiotics or into agents that cause cancer. The biologists agreed to a voluntary moratorium on all research involving recombinant DNA until they could hammer out appropriate safety protocols5.

Maxine Singer, Norton Zinder, Sydney Brenner and Paul Berg (left to right) at the 1975 Asilomar meeting. Credit: US NATL LIB. MED.

In the recent debate over work on flu viruses, Asilomar has been cited as a shining example of scientists' ability to act responsibly when unfettered. Yet self-censorship was only the beginning of that chapter.


Around the time of the Asilomar meeting, local officials in US cities — notably Cambridge in Massachusetts, but also Ann Arbor in Michigan, Bloomington in Indiana, Berkeley and San Diego in California and Madison in Wisconsin — also became concerned about the potential risks of recombinant-DNA research. Rather than rely on the scientists to police themselves or wait for the US National Institutes of Health to dispatch safety guidelines, the mayor of Cambridge, Alfred Vellucci, convened public hearings on the topic. He even threatened to ban all recombinant-DNA research within city limits unless researchers first worked with multiple local stakeholders to iron out mutually agreeable safeguards6.

Some molecular biologists at the local powerhouse institutions — Harvard University and the Massachusetts Institute of Technology — grew frustrated by the imposition. Following four weeks of debate, and after the city had imposed a three-month moratorium on all recombinant-DNA research, the Cambridge Experimentation Review Board was established in 1976 to consider the public-health effects of the research. The board's members included physicians, academics from other disciplines and members of the public. They met twice weekly for five months, arranged public debates and considered testimony from dozens of experts and non-specialists — including Nobel laureates, university and public officials, and concerned citizens.

Ultimately, the board crafted city legislation establishing a level of cooperation among researchers, policy-makers and concerned citizens that was unprecedented for a local effort. The bill — passed unanimously by the city council in 1977 — provided a model for other cities, and similar local negotiations unfolded elsewhere at around the same time6.

With regulatory uncertainties removed, Cambridge in particular became a magnet for investors, who built the new biotechnology industry on the back of recombinant-DNA research. Companies such as Biogen (now Biogen Idec), Genzyme and Millennium Pharmaceuticals were launched6.

Dual-use dilemma

What aspects of these two critical moments in history — the Asilomar meeting and the handling of post-war nuclear physics — are relevant to dual-use research in the life sciences today?

Arguably, despite the downsides, a certain clarity emerged from nuclear scientists' rules of conduct being written into national laws during the 1940s and 1950s. Grey areas over what was permissible and who had ultimate jurisdiction were kept to a minimum. Yet it is difficult to see how an approach modelled on the US Atomic Energy Commission, which oversaw the peacetime development of nuclear technology until its dissolution in 1975, would fit today's challenges for the life sciences.

Most of the sensitive research related to nuclear technologies has required hulking, factory-sized infrastructure that shows up in grainy satellite images. By contrast, much of the biological research that could have dual uses today — from studies of drugs that alter memories to genetic manipulations of viruses or bacteria — is conducted by groups of just a few people using bench-top equipment.

Nuclear research, such as that at Oak Ridge National Laboratory, is harder to hide than bench-top biology. Credit: BETTMANN/CORBIS

There are also important differences in patronage and jurisdiction. The huge facilities required for nuclear projects, such as the Oak Ridge National Laboratory in Tennessee, mean that most high-risk work has been conducted under the auspices of a national government. Such research has been underwritten almost exclusively by federal funds, and thus is naturally subject to government oversight and control.

Today, many biologists are funded by multinational corporations or philanthropic organizations in addition to federal grants. This has blurred matters of jurisdiction, oversight and ownership of scientific results7. In 2005, for example, the Proceedings of the National Academy of Sciences published an article8 that analysed the possible effects of botulinum toxin contamination of the US milk supply. The journal's decision to publish was explicitly opposed by the US Department of Health and Human Services9. In their paper, the researchers acknowledged funding from Stanford University's business school but none from federal sources. The journal's editors concluded that federal overseers did not have sufficient jurisdiction to block the article's publication.

Some policy statements, such as a 2007 report from the US National Research Council10, have harked back to the Asilomar meeting and called for renewed self-regulation in the life sciences. Such reports acknowledge that internal institutional review boards have often failed to assess dual-use bioscience adequately in recent years, and have pushed for more education of life scientists “on the basic principles of risk-based biosafety and biosecurity review”10. Recognizing the need for some government input and leadership, some reports have suggested that bodies such as the US National Science Advisory Board for Biosecurity (NSABB) provide advice.


In the 1970s, however, workable systems of oversight and security emerged from intensive engagement with a broad range of legitimate stakeholders. The recent H5N1 debacle in particular raises questions about whether self-regulation with guidance from a government body is all that different from self-censorship alone.

Initially, the NSABB voted for the US federal government to block publication of portions of Fouchier's and Kawaoka's papers. A few months later, the group reversed its decision, supporting full publication. Several experts testified before the US Senate that they feared the NSABB had stacked the deck by soliciting input predominantly from the very scientists who had conducted the work in question (see go.nature.com/wocmeh).

Amid the drama and passions roused by the H5N1 saga this year, it is worth remembering that few physical scientists in the 1940s and 1950s argued that self-censorship would suffice — despite the seemingly draconian approach of the Atomic Energy Commission. Indeed, when asked why they had continued to work on the atomic bomb after Germany had been defeated, most of the physicists at Los Alamos admitted to having been caught up in the momentum of the project. In the 1970s, biologists were not relied on to regulate matters of much broader concern on their own, despite their much-heralded initial efforts to self-govern. Moreover, the involvement of many other players did not quash scientific research and communication, but helped molecular biology and biotechnology to thrive.