While the US Centers for Disease Control and Prevention (CDC) was investigating an accident involving anthrax at a lab on its Atlanta, Georgia, campus in June, the agency's director, Thomas Frieden, got a nasty surprise. Another accident, this time involving the deadly H5N1 avian influenza virus, had been discovered at a CDC laboratory six weeks earlier but had not been reported at the time. Frieden was angry, and rightly so. But this was not just a one-off: biosafety experts contend that many such incidents in secure labs worldwide go unreported.

The CDC accidents raised many justified concerns, but they also prompted some undue worry in the media and political grandstanding. The risks posed by pathogens kept in high-biocontainment labs need to be kept in perspective. Many such agents are poorly transmissible, so they pose mostly local threats, along with the risk that they could be stolen and used in bioterrorism. Few are highly transmissible and capable of sparking epidemics of global significance.

But some pathogens do pose such risks. By July 2003, a sustained public-health effort had probably stopped the SARS (severe acute respiratory syndrome) virus from causing a pandemic. Yet in the months that followed, lab accidents infected researchers in Singapore and Taiwan. And in 2004, the virus was accidentally released from a lab in China, infecting a researcher; it then spread to her mother, who died, and to a nurse. A pandemic could well have resulted.

If laboratory staff and public health are to be protected, accidents must be reported in full, and the long-standing lack of progress on this front must end. As a News article on page 515 reports, many accidents result not from a lack of physical barriers or regulations, but from the absence of a strong biosafety culture in labs and their oversight bodies.

A key part of such a culture is timely knowledge of all accidents and their causes. That way, organizations everywhere can quickly take on board the lessons learned. The International Federation of Biosafety Associations, among others, has proposed the creation of an international system for sharing such information confidentially, but the meagre funding needed has not been forthcoming.

A confidential system would be a start, and deserves support, but it is not enough. Regulatory and oversight bodies throughout the world should require the reporting of all serious accidents and near misses in biocontainment labs, and in particular those that occur in labs with the highest biosafety levels. Timely incident reports should also be made available on public websites — as many nuclear regulators require of power plants — perhaps with an option for sharing details and more-sensitive information confidentially.

Researchers must be given incentives to report accidents. A strong biosafety culture would clearly communicate and enforce the rules of play. Negligence should be disciplined, but researchers who have accidents while acting in good faith should not be penalized unfairly. Some of the current media and political reaction to the CDC accidents, and some of the calls for disciplinary action against the researchers involved, are unhelpful and potentially unjustified. On 22 July, Michael Farrell resigned as head of the CDC's Bioterror Rapid Response and Advanced Technology Laboratory in Atlanta, and other heads may roll, too.

As one biosafety expert told Nature, the current criticism of the people involved means that most researchers would probably now think twice about reporting an accident. This blame game is unhelpful. What is more important, and in everyone’s interests, is to prevent future accidents. And that requires full data on accidents and why they happen.