NEWS

US officials revisit rules for disclosing risky disease experiments

An expert panel is considering how much to reveal about a largely secret review process of 'gain-of-function' research.


Two researchers in a high-security BSL-4 laboratory. Making pathogens more dangerous can help researchers prepare for pandemics. Credit: Anna Schroll/Fotogloria/UIG via Getty

US disease researchers are pushing the government to be more transparent about federally funded research that involves making pathogens more deadly or more transmissible.

Several disease researchers who attended a recent meeting to discuss transparency around such studies say the US government should offer a public explanation when it approves such ‘gain-of-function’ experiments, disclose who made the decision to fund them and make a broad public announcement when a study begins. Others argued that greater transparency could make it harder to approve necessary research.

The debate over how much to disclose about such work is revving up because the government is preparing to revisit the rules that guide gain-of-function research, especially how such work is communicated to the public.

Researchers studying viruses in the lab sometimes deliberately make them more dangerous to help prepare better responses to outbreaks that might occur naturally. In 2017, the federal government began requiring that any National Institutes of Health (NIH) grant proposals involving gain-of-function research undergo a review by an expert panel to evaluate the risk of such work against the potential gains. But the names of the expert-panel members are not publicly available, nor are its reviews of study proposals.

“We’re not trying to say the policy is wrong, we’re trying to say the policy is ambiguous,” says Marc Lipsitch, an epidemiologist at the Harvard T.H. Chan School of Public Health in Boston, Massachusetts, and one of the researchers calling for greater transparency around such work.

He was one of several researchers who attended a meeting on 23–24 January of the National Science Advisory Board for Biosecurity (NSABB), an independent panel that advises the NIH’s parent, the Department of Health and Human Services (HHS). The NSABB is reviewing current guidelines for sharing information on gain-of-function research at the request of the NIH and the White House.

The discussion is the latest chapter in a long-standing debate about the value of potentially dangerous biological research. In 2014, after a series of accidents involving mishandled pathogens at the US Centers for Disease Control and Prevention, the NIH announced that it would stop funding gain-of-function research into certain viruses — including influenza, severe acute respiratory syndrome (SARS) and Middle East respiratory syndrome (MERS) — that have the potential to unleash a pandemic or epidemic if they escaped from the lab. Some researchers said the broad ban threatened necessary flu-surveillance and vaccine research.

Surprising news

The government reversed course in January 2017 after the NSABB concluded that very few such experiments posed a risk to public safety. The NIH lifted its ban on funding gain-of-function research in December 2017, after the HHS and the White House developed a system for vetting proposed experiments — creating the expert panel in the process.

The related debate over how much to disclose about such research reignited in 2019 following media reports that the government had approved two gain-of-function experiments.

Those studies “were announced via a reporter from Science getting a tip and calling the government”, Tom Inglesby, director of the Center for Health Security at Johns Hopkins Bloomberg School of Public Health in Baltimore, Maryland, said at the meeting on 23 January. “That doesn’t seem like a good process.” He urged the committee to reconsider the need for such work, arguing that its benefits are not “widely proven”.

Carrie Wolinetz, the NIH’s associate director for science policy, said that the agency posted the proposals online, just as it does for other research that it funds. But she conceded that there was “room for discussion on how much is disclosed [and] timing of disclosure”.

Lipsitch said that the information released about the latest set of experiments mirrored the disclosure typically required for government-funded peer-reviewed work. But he argues that discussions around potential pandemic pathogens require even greater disclosure, akin to an environmental-impact assessment. Like Inglesby, he says that the government should not have funded certain gain-of-function studies — such as experiments in 2012 (ref. 1) that created variants of the avian flu virus that could travel between ferrets breathing the same air.

Another model for disclosure is the process used by the Food and Drug Administration, which makes information about reviewers and the reasoning behind their decisions public, said Luciana Borio, a former biodefence director at the White House National Security Council.

Chilling effects

Other experts argued against some of the suggestions for increasing transparency, saying that they could be counterproductive.

Christian Hassell, senior science adviser to the HHS Office of the Assistant Secretary for Preparedness and Response, said that naming the members of the expert review panel could make serving on it less attractive. “As much as it would be good to publicize the individual names, as suggested, if that chills anyone from being willing to serve on that committee, that would be detrimental,” he said.

Hassell wouldn’t disclose the panel’s size, but said that its members were “very experienced, very actively involved in research”.

And Kenneth Bernard, an NSABB member who advised former president George W. Bush on biodefence, said that releasing too much information could give other nations the wrong impression about the nature of the research the United States is backing. “Defending against biothreats by developing countermeasures looks a lot like an offensive bioweapons programme” to some countries, he said.

Bernard added that opening up review of proposals to a larger group of people, as Inglesby and Lipsitch have recommended, could risk stalling projects indefinitely.

“If you open up a review process to a public review, that includes ethicists and security people and scientists, there’s a 100 percent chance nothing will get approved — if it’s a serious enough discussion about a serious enough” potential pandemic pathogen, Bernard said. “Science can’t work that way.”

References

  1. Imai, M. et al. Nature 486, 420–428 (2012).

