Biosafety-level-3 protection at the US Army's Dugway Proving Ground, Utah. Credit: Douglas C. Pizac/AP/PA Images

Two months ago, the US Department of Defense froze operations at nine biodefence laboratories where work is done on dangerous pathogens. Inspectors had discovered live anthrax outside a containment area at the US Army's Dugway Proving Ground — a facility in Utah that tests defence systems against biological and chemical weapons.

The discovery at Dugway is the latest of several concerning finds. In June 2014, workers at a US Centers for Disease Control and Prevention (CDC) biosafety-level-3 laboratory in Atlanta, Georgia, sent anthrax samples to three other laboratories on the same campus. The samples were supposed to have been sterilized, but a series of lapses meant that 41 people were potentially exposed to live bacteria (ref. 1). Then in May this year, an investigation revealed that for several years, staff at Dugway had been improperly sterilizing anthrax samples, and that live spores may have been sent to 52 laboratories in the United States, Canada, Australia and South Korea.

These mishaps — which are by no means unique to anthrax — are worrying on two levels. First, the handling of dangerous pathogens within a controlled environment is one of the easier biological risks to contain. Much harder is ensuring that basic biological research that is known to be potentially dangerous, or that turns out to be so, is carried out safely. Second, it is only going to get harder to ensure the safe and secure use of organisms and their products — whether in basic research or in detecting and preventing the development of biological weapons.

Relatively inexpensive and easy-to-use tools and approaches are greatly expanding the possibilities for genetic engineering, including for would-be terrorists. Among them are the gene-editing technique CRISPR/Cas9, and the use of gene drives — where the biased inheritance of particular genes alters entire populations. Meanwhile, myriad developments are undermining existing approaches to non-proliferation. These include: the sale of equipment and materials over the Internet; the accessibility of computing power; and the rise of the open-science movement.

What are the prospects for managing the more intractable risks globally if measures to ensure the safe handling of dangerous pathogens are failing at the best-equipped facilities in the country with the most advanced biotechnology in the world? The anthrax incidents occurred despite the use of extensive legislation, protocols and procedures. The problem with the CDC, the US Department of Defense and the many labs around the world that follow their lead is not a lack of knowledge or training, or even a lack of engineering resources. It is the lack of a safety culture.

Most laboratories handling potentially dangerous biological materials are stuck in compliance mode. To prevent human and environmental catastrophes, and the shutdown of important research, that mindset must be transformed.

I never thought I'd write this, but I believe that it is time for experts who advise on biosafety and biosecurity to learn from specialists in nuclear security. I define biosafety as the prevention of the accidental release of potentially harmful organisms or their products, and biosecurity as the prevention of the deliberate release of such agents for nefarious purposes. Leaders in these areas include the CDC, the World Health Organization (WHO), and Public Health England in the United Kingdom.

Outside the box

The reluctance of those of us in biosecurity and biosafety to learn from the nuclear industry stems from the fact that many of the practices in nuclear security and safety are not transferable to biology. For instance, monitoring the amounts of material entering and leaving a complex makes little sense when a tiny sample can be grown into militarily significant quantities of a hazardous agent. And expensive security measures — guns, gates, guards and cameras — make sense at nuclear power plants, of which there are only a few hundred worldwide. They are not feasible at the much greater number of labs and hospitals dealing with hazardous biological agents. Moreover, hospital accident-and-emergency buildings and procedures are designed to get patients inside as quickly as possible, not keep them out. And progress within public health and research depends on transparency and open collaboration.

What those working with biologicals can learn from practitioners in the nuclear industry — as well as from those in the US Navy, offshore oil drilling, airlines and utilities — is a culture of safety. In all these areas, best practice focuses on preventing failure rather than on maximizing output. The result is what is called a 'high-reliability organization' (HRO).

Workplace injuries plummeted when metals manufacturer Alcoa overhauled its approach to safety. Credit: Jim Sugar/Corbis

HROs feature the following five characteristics (ref. 2). First, everyone within the organization constantly asks, 'What can go wrong and how do we prevent it?' Second, workers are sensitive to any deviation from the norm, such as an unexpected change in the temperature of the reactor core in the case of a nuclear power plant, and learn to ascertain which variances can snowball into catastrophic failure. Third, systems are designed to be resilient so that if they do fail, they do so with minimal damage and recovery can be quick. Fourth, workers recognize that the operating environment is complex and changeable, and that mindlessly following standard procedures without paying attention to what else is going on in the environment can be dangerous. Lastly, expertise is valued over seniority, with the recognition that it may be the newest or most junior member of a team who spots a problem or who knows best how to fix it.

In HROs, safety is not 'for them' but 'for each and every one of us', and is seen as an investment rather than a short-term cost. Workers are encouraged to hold each other accountable and to report red flags, such as a change in behaviour that might make a colleague more prone to mistakes. Mishaps and near misses too are reported without fear of blame, and mistakes are analysed to learn how to prevent them from recurring. Finally, the process is one of continual improvement: attention to safety does not stop just because certain targets have been met.

In biosecurity and biosafety, the CDC is widely seen as the global gold standard. The CDC's handbook Biosafety in Microbiological and Biomedical Laboratories has become the reference for laboratories worldwide. Other resources that it provides (posters, training videos, data and information), along with documents from the WHO, are used as core reference materials, even in the most remote labs. Yet the world's exemplars in the handling of the most dangerous pathogens, and therefore the multitude of public and private organizations that follow them, are stuck in a very different culture from that of HROs.

From the CDC to diagnostic and basic-research laboratories worldwide, the emphasis is on ticking boxes and on following rules set by outside authorities, such as the Department of Health and Human Services in the United States or the relevant agencies in the European Union. Safety is generally seen as an inconvenience that detracts from the main task at hand. It is delegated to biosafety officers, and after-the-fact indicators of problems, such as the number of accidents, are the predominant metric, with the implicit aim being to ensure that spills, infections and so on are kept below targets with minimal effort.

A recent illustration of problems caused by the rote following of rules is the handling of an Ebola patient by staff at Texas Health Presbyterian Hospital in Dallas in 2014, where two nurses contracted the disease. Having never dealt with a suspected Ebola case before, staff checked the CDC website for information on the correct personal protective equipment (PPE) to wear. Unfortunately, that website described PPE more suitable for handling Ebola samples in a laboratory. The PPE the hospital workers initially used left areas of their faces and necks exposed.

Failure to consider context and all the links in the chain can similarly undermine the value of spending millions of dollars on building and operating containment labs throughout the world. A recent inspection at a major diagnostic lab for animal diseases in Afghanistan, for instance, revealed that standard operating procedures (SOPs) for 'safe' operation, copied from Western labs, were being followed to the letter, including one for the handling of biological waste. The waste was being bagged up pending incineration. But because there was no budget for petrol for the incinerator, the bags were simply being stored, undermining many of the preceding biosafety procedures.

Safety first

Organizations that have successfully implemented a culture of safety have often treated the introduction of a new way of doing things as a business project, akin to a move to a new software platform. Experts in the offshore oil industry have likened the process to moving from directing one play to another (ref. 3). One must deal with a new script (the vision), a new stage set and scenery (the facilities, equipment and technology used in operations), new stage directions (processes and procedures), new roles (job descriptions), new contracts (hires), and new rehearsals (training and commissioning).

For example, the metals manufacturer Alcoa, based in New York, launched a safety drive in the late 1980s using such techniques, and over a ten-year period saw the average rate of lost workdays (due to work-related injuries) drop from 1.86 to just 0.18 per 100 work years (ref. 4). As well as this willingness to start afresh, three other steps are crucial.

People working in the nuclear industry are encouraged to ask 'What can go wrong?' Credit: Benoit Tessier/Reuters/Corbis

Provide leadership, funds, time and commitment. The process starts with senior management laying out what safety means for their particular organization. All layers of the organization are then involved in identifying what facilities, equipment and practices need to be changed. Lastly, a master plan is drawn up to realize the vision.

In some cases, considerable sums will be needed initially. Yet such investments can quickly pay off. Alcoa, for instance, jump-started its safety programme by spending US$3 million over two months to fix unlit passageways in its plants. But based on the US Department of Labor's Accident Cost Calculator, the reduction in time lost due to accidents saved Alcoa around $51.5 million annually.
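
The article does not spell out how the $51.5-million figure was derived. As a rough illustration only, the sketch below shows the kind of back-of-the-envelope arithmetic an accident-cost calculator performs: lost workdays avoided, multiplied by an assumed cost per day plus indirect costs. The workforce size, cost per lost workday and indirect-cost multiplier are hypothetical placeholders, not figures from the US Department of Labor's tool.

```python
# Hypothetical back-of-the-envelope estimate of annual savings from fewer
# lost workdays. The before/after rates come from the article; the workforce
# size, cost per lost workday and indirect-cost multiplier are assumptions.

WORKFORCE = 60_000          # assumed number of employees
RATE_BEFORE = 1.86          # lost workdays per 100 work-years (start of drive)
RATE_AFTER = 0.18           # lost workdays per 100 work-years (ten years later)
COST_PER_LOST_DAY = 1_500   # assumed direct cost (US$) of one lost workday
INDIRECT_MULTIPLIER = 2.0   # assumed ratio of indirect to direct costs

def annual_lost_days(rate_per_100_work_years: float, workers: int) -> float:
    """Convert a per-100-work-year rate into lost workdays per year."""
    return rate_per_100_work_years * workers / 100

saved_days = (annual_lost_days(RATE_BEFORE, WORKFORCE)
              - annual_lost_days(RATE_AFTER, WORKFORCE))
savings = saved_days * COST_PER_LOST_DAY * (1 + INDIRECT_MULTIPLIER)

print(f"Lost workdays avoided per year: {saved_days:,.0f}")
print(f"Estimated annual savings: ${savings:,.0f}")
```

Reproducing the reported figure would require Alcoa's actual workforce and cost assumptions, which the article does not give; the point of the exercise is simply that even modest reductions in lost workdays translate into large recurring savings.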

Make safety matter to everyone. People will care about safety at their organization if their immediate bosses and those at the top frequently talk about it and back their talk with actions. If other achievements, such as efficiency or the output of journal papers, are rewarded ahead of safety — as is the case in most basic-research labs — people will pay less heed to it.

Those who ignore new safety rules must face significant sanctions. People who refuse to adapt should lose their positions. Various tools can aid managers on this front. For instance, workers can be required to obtain certification before being allowed to perform potentially hazardous tasks.

Exploit peer accountability. Most managers of staff who have employment protection, such as tenured professors or civil servants, cannot hire and fire, or give or withhold bonuses. Fortunately, cash does not seem to be a key motivator when it comes to safety.

A 2010 study of 1,600 safety professionals across different industries found that people's expectations of their peers seem to be the most important influence on workplace behaviour — ahead even of management's expectations (ref. 5). And recognition of a job well done can be more motivating than a bonus. For instance, the Gallup Organization, based in Washington DC, has surveyed more than 4 million workers worldwide and found that employees who are recognized for their achievements have better safety records and fewer accidents on the job.

Leverage leadership

In 2011, the Joint Commission, a non-profit organization that accredits and certifies hospitals in the United States, started promoting HRO approaches in hospitals throughout the country (ref. 6; see 'Follow the leader'). This followed several serious medical errors, such as surgeons operating on the wrong side of the brain in three patients in one year at Rhode Island Hospital in Providence. In the hospitals, the actual procedural changes — anything from more-stringent processes for infection control to improved systems for record checking — vary from place to place, but the aim is always to minimize the chances of something going wrong.

'Follow the leader' chart. Sources: M. T. Zubrow et al. Jt Comm. J. Qual. Patient Saf. 34, 187–191 (2008); AHRQ, USDHHS

The CDC is the obvious candidate to pick up the torch and prove that the HRO approach also works in laboratory settings. It has the resources. And where the CDC leads, others follow; if they do not, they risk not being able to acquire funding, collaborate with those in other laboratories or obtain contracts from corporations that demand compliance with best practice.

Changing the culture of such a large entity will be difficult. But proof of concept could be achieved first in one unit, such as the Bioterrorism Rapid Response and Advanced Technology Laboratory (BRRAT). Because BRRAT is on the CDC's main campus in Atlanta, top managers from across the organization could more easily be engaged in the process of organizational change. Approaches used at BRRAT could then be rolled out to the entire organization.

The HRO approach will be especially valuable for those facing uncertainty. Researchers experimenting with gene drives and CRISPR/Cas9 have no SOPs to follow. Asking 'What could go wrong?', 'How could this science be misused?' and 'How can we prevent that from happening?' will embed biosafety and biosecurity considerations into study programmes from the outset.

Although research will always have an element of the unknown, under the HRO model, workers are encouraged to constantly monitor outcomes against expectations and to make adjustments on the basis of new evidence. In other words, an HRO approach means reappraising biosafety and biosecurity plans as understanding increases.

The biological-research community is capable of taking this road: people working on gene drives, for instance, are actively debating potential safety and security issues (ref. 7). But HRO principles need to be adopted much more widely.

Failure to do so could greatly harm society, agriculture and the environment. Take, for instance, the 2007 release of foot-and-mouth disease virus from the Pirbright Institute, an animal-health research centre in Woking, UK. Inadequate sterilization of biological waste, broken waste pipes and unsealed and overflowing manhole covers led to more than 2,000 sheep and cows being slaughtered, at a cost of $200 million.

Moreover, failure to be seen to be conducting biology in a safe, responsible and ethical manner undermines the public's support for promising technologies and approaches. That government and public anxieties can quickly block research has been demonstrated repeatedly.

Take the nearly eight years of restrictions on human embryonic stem-cell research in the United States, instituted by President George W. Bush in 2001. Or the year-long voluntary moratorium called in 2012 on 'gain-of-function' experiments involving the highly pathogenic avian H5N1 influenza virus (ref. 8). Here, researchers used genetic engineering to enhance the transmissibility of such flu viruses in mammals, as part of efforts to identify changes that might allow the viruses to spread between humans. The research is aimed at predicting which strains we shall need vaccines against in the near future.

Biology must move forward on safety and security. Let's not reinvent the wheel, but learn from those doing safety better.