Pathogen safety in federal labs

Over the past few weeks, revelations of potentially dangerous errors in US federal labs handling pathogens have placed health and safety high on the national agenda.  In June, the US Centers for Disease Control and Prevention (CDC) announced that as many as 75 of its staff may have been exposed to anthrax due to safety lapses at one of its labs.  At the beginning of July, vials of smallpox virus were found in an unsecured room at the National Institutes of Health (NIH).  Then earlier this week came the revelation that the same room held over 300 vials containing pathogens such as dengue virus, influenza, and the bacterium that causes Q fever.

Congressional hearing

The CDC anthrax incident prompted a hearing by the US House of Representatives Committee on Energy and Commerce Subcommittee on Oversight and Investigations this week.  At the hearing, CDC director Tom Frieden acknowledged a “broader problem of unsafe practices at the agency”.  But as an article in today’s New York Times points out, similar issues are not confined to the CDC, or to federal labs.

Broader safety issues

The issue at hand is one of safe practices around research on pathogens.  But the emerging pattern of lax safety procedures is indicative of a broader culture of complacency in research labs.

Most researchers I know are committed to ensuring a safe environment in their labs.  Yet there is often a disconnect between intent and practice.  To an outsider on a walk-through, many academic labs – and not a few government labs – can appear a minefield of dangerous incidents waiting to happen.

Researchers will often argue that they know what they are doing, and that what looks dangerous to a non-expert really isn’t as risky as it seems.  And yet incidents like the ones above happen.

Dangerous contempt

It may be that the old adage of familiarity breeding contempt is truer than we would like to believe in research labs.  Certainly, the fields of chemistry and physics (two disciplines I work across) have a long if dubious cultural history of near-misses in the lab being the mark of a “real” scientist.  And some researchers still believe that behavior which looks risky to outsiders but which they are “in control of” marks them out as belonging to an elite group of experts.  There’s a certain professional identity, pride and confidence that comes with working with substances that could kill or maim you if you weren’t an expert.

A culture of complacency

This culture of complacency within research labs is underlined by the current National Academies study on establishing and promoting a culture of safety in academic laboratory research.  At the study’s kick-off, information was presented from a survey of academic lab safety that concluded:

The overwhelming majority of respondents reported that their labs allow people to do experiments while alone, which is a very elementary violation. Only 7% said that this never happens in their labs, 35% called it a daily occurrence, and 80% said it occurs at least weekly. Only 46% of those who say their work requires a lab coat report wearing one at all times. And 40% of the respondents with supervisors reported that they—the supervisors—fail to “regularly check their performance in terms of safety.” Almost 10 percent of workers in small labs (labs with fewer than 11 workers) reported that “no individual [is] specifically responsible for lab safety.”

Supporting this assessment, an open letter from three companies, published in C&E News on May 13, 2013, noted:

Occupational Safety & Health Administration statistics demonstrate that researchers are 11 times more likely to get hurt in an academic lab than in an industrial lab. There have been serious accidents in academic labs in recent years—including fatalities—that could have been prevented with the proper use of protective equipment and safer laboratory procedures.

Blatant contempt for safe practices in academic laboratories is, thankfully, increasingly rare in my experience.  Yet there remains a residual elitism in some laboratories that resists safety procedures imposed by others.

A gathering storm

While the current focus is on labs handling pathogens, I cannot see public concerns over laboratory safety stopping there.  Most academic researchers have war stories about near misses in their labs or the labs of people they know.  As these become open to public scrutiny, there’s likely to be increasing pressure to change cultures, habits and practices.

It’s a gathering storm that scientists in academia should be preparing for – whether they handle pathogens or not.