It was the week before Christmas, and D.A. Henderson was alarmed about germs. He isn’t easily rattled: Dr. Henderson led the successful worldwide effort to eradicate smallpox in the 1970s, and he directed the U.S. Office of Public Health Emergency Preparedness after the deadly anthrax letter attacks and the destruction of the World Trade Center in 2001. But recently not just one but two laboratories had engineered the virus known as bird flu to make it easily transmissible—through the air, among mammals—and that was a scary development.
“Compared to plague or to anthrax, this one has a potential for disaster that dwarfs all others,” says Dr. Henderson, now a distinguished scholar at the Center for Biosecurity of the University of Pittsburgh Medical Center. “Given our flu-vaccine capacity, which is limited, this could be a catastrophe if it gets out.” The experiments shouldn’t have been done, in his view, and—partly because they could give terrorists a blueprint for making a more deadly form of H5N1 avian-influenza virus—they certainly shouldn’t be published.
Now they won’t be, at least not completely. In an unprecedented move that same week, a federal advisory panel on biosecurity echoed Dr. Henderson’s concerns and recommended that papers describing the findings be partly censored, stripped of crucial details about the methods used to make the viruses. The authors and the journals set to publish the work reluctantly agreed. But far from quelling the controversy, the decision has ignited a fierce debate: Can publishing such work save millions of lives by speeding development of drugs and vaccines, or will withholding it save even more lives from a killer?
“The science wasn’t all crucial, and we did not want to provide a road map for something bad,” explains the acting chair of the biosecurity panel, Paul Keim, a professor of biology at Northern Arizona University. “It’s a stupid ruling,” counters Maria S. Salvato, a professor at the Institute of Human Virology at the University of Maryland School of Medicine, arguing that the method was already out. “This is the kind of research that lets you develop countermeasures. Plus, graduate students, of whom there are many in these labs, have already communicated all the details by e-mail.”
The near-publication has also brought out general critics of the federal panel, the National Science Advisory Board for Biosecurity, and the voluntary self-policing approach that it embraces instead of regulation. “These lines of work are sufficiently dangerous that they call for independent oversight,” says John D. Steinbruner, a professor of public policy at the University of Maryland at College Park; he wants an international licensing system for researchers who work with dangerous pathogens.
Members of the board, on the other hand, argue that more government control will slow down science and medicine so much that drugs and vaccines won’t be developed before a disease hits. “If you take this research and restrict it to some kind of Army or defense lab, the costs will go up and the quality will go down,” says Mr. Keim, the man whose lab identified the anthrax strain behind the 2001 letters. “University people are smarter and quicker because they don’t have as many restrictions.”
Even within the board, the episode has brought to the surface worries that existing biosafety committees at academic institutions are not qualified to evaluate this kind of risk, and that the work is not scrutinized closely enough until it nears publication, by which point it is too late.
Risky Business
The panel Mr. Keim leads has no enforcement authority but renders advice to the National Institutes of Health, the agency that financed the two bird-flu projects. One was led by Ron Fouchier, a virologist at the Erasmus Medical Center, in the Netherlands, and the other by Yoshihiro Kawaoka, a virologist at the University of Wisconsin at Madison. Both showed that, contrary to many scientists’ beliefs, the virus—which in nature infects birds—can jump species with just a few genetic changes, which makes it a greater threat to people.
Science, which was to publish Mr. Fouchier’s work, and Nature, which was to publish Mr. Kawaoka’s, agreed to alter the articles after the NIH, acting on the biosecurity panel’s recommendation, asked them to; both are now working on redacted versions. One condition: that the NIH establish a repository for the complete genetic information and make it available to researchers with a legitimate interest in it. Just how legitimate interest will be judged is still being worked out.
Some observers believe this middle course, alerting researchers that mammal-to-mammal transmission is possible, is the right one. “The message is more important than the details,” says Robert G. Webster, a noted influenza researcher at St. Jude Children’s Research Hospital who advised the federal panel on this case. He also says the studies “would not allow you to make vaccines faster. It takes a considerable amount of time to make them, and the papers did not say anything about vaccines.”
On a longer time scale, however, Mr. Webster does think the papers can aid in the development of drugs. He points to advances in medical knowledge after the 2005 publication of the complete genetic code for the 1918 flu virus, a plague that killed as many as 50 million people. After injecting monkeys with the reconstructed virus, scientists learned its lethality stemmed from a “cytokine storm.” Cytokines are part of the immune system’s initial response to a virus, but in this case the response was a toxic overreaction. “It was the storm, not the virus, that killed the monkeys,” he says. Efforts are now under way to develop drugs to calm the storm without destroying all cytokines, because the body needs them to live.
The paper on the 1918 flu virus is one of the few that were flagged over the years by the biosecurity board because of their potential for malevolent use, but until now the board has always come down on the side of publication and the benefits of communicating the results to the scientific community. Other work that aroused concern included a paper showing how bacteria release botulinum neurotoxin (the biology had never been fully explained) and one on synthesizing a laboratory version of a SARS-like virus (the inability to grow SARS in the lab had hampered study of it).
The panel was created in the years after the World Trade Center attacks and the anthrax letters heightened fears about terrorists’ use of technology. It was chartered after a 2004 report from the National Academy of Sciences, “Biotechnology Research in an Age of Terrorism,” described what researchers now call “the seven deadly sins,” or kinds of experiments that should prompt scientists to proceed with great caution. They include, among others, experiments that would demonstrate ways to make a vaccine ineffective, create resistance to antibiotics, or make a pathogen such as a virus more transmissible between species.
Proceeding with caution, but avoiding censorship and regulation, has been what the biosecurity advisory board has advocated. Mr. Keim’s concerns about the negative effects of government or military control are very real, says a board member who would know, David R. Franz. He commanded the U.S. Army Medical Research Institute of Infectious Diseases in the 1990s and is today the chief biological scientist at MRIGlobal, a research institute. “I think academic labs do react more quickly than military labs,” he says, because military installations are focused on making specific products, not understanding the basics of diseases. “One of the most useful things the NSABB has done is to stop overregulation of the scientific enterprise. We come down on the side of education and transparency.”
To that end, the biosecurity board suggests a pipeline approach in which scientists and institutions evaluate risks and benefits at every stage: project conception, grant application, experimentation, and all the way through the manuscript-preparation process. “Early in the experiment, you need to think what you’ll need to do if it turns out badly,” says Mr. Keim. “Actually, I think you need to ask: Does the experiment need to be done?”
Trouble With Oversight
But critics charge that individual scientists are not capable of that kind of self-examination—they are too close to their own work. “These experiments need more than two eyes on them,” says Richard H. Ebright, a professor of chemistry who works on antibacterial drugs at Rutgers University and the Howard Hughes Medical Institute. “We have mandatory regulation of human-subjects research, for example. So work that might harm one or a few people, perhaps psychologically, is subject to review. But work in biology that could harm millions of people isn’t subject to anything.” That makes no sense, he says.
“The NSABB position against management is ideological, and the consequences are too large to indulge in silly arguments,” agrees Maryland’s Steinbruner, co-author of a monograph, Controlling Dangerous Pathogens, that champions a mandatory review system.
In Mr. Steinbruner’s plan, there are two levels of oversight. Experiments with what he calls “moderately” dangerous pathogens, like work that increases the virulence of equine encephalitis viruses, would be examined by a national board. But for “anything that might create a pathogen that’s more dangerous than what might appear in nature, we need an international system,” he says. That review body could be modeled after the World Health Organization’s smallpox board, which meets twice a year to review applications to work on anything related to that virus, he notes. Mr. Steinbruner’s board would meet more often.
For both systems, he says, scientists would have to be licensed to work in that particular area, and they would have to submit regular reports about their work. “We need to know who you are and what you are doing,” he says.
The smallpox model may not be so easily applied to other diseases, says Dr. Henderson, who worked with the smallpox board. The virus was scary but limited, which made it simpler to monitor who was working with it. “There was a smallpox virus, not many of them,” he says. “But with something like H5N1, for instance, there are many, many naturally occurring strains. Keeping track of the labs using it is logistically hard. It’s really a herd of cats.”
But even though he is not a fan of widespread regulation of academic scientists, he does think that local university panels modeled after human-subjects review committees could work. “It’s a useful idea, and it might be the next step,” he says. It’s an extension of scientist-driven oversight that already exists and avoids the heavy-handedness of some central regulatory body.
There are, in fact, existing university boards that approve projects with dangerous pathogens, called institutional biosafety committees, but many people worry they are not right for the biosecurity job. At a federal biosecurity board meeting in early December, Jeffrey F. Miller, a professor of microbiology at the University of California at Los Angeles, cautioned that biosafety groups focus on things like lab procedures and plans to contain a pathogen if it gets out, not whether communicating the details of an experiment could aid terrorists. “This is biosecurity, not biosafety. Those committees might not be the best ones to deal with it,” said Mr. Miller, a federal board member.
The trouble is that scientists simply are not in a position to know when a bit of information might be the final piece in a terrorist’s puzzle, adds Robert Floyd, director general of the Australian Safeguards and Non-Proliferation Office. “When I was a practicing biologist, I thought I knew what risk was. When I moved into government, I found out how little I knew. And that worries me,” he says.
Institutions have to add people with expertise in security to their biosafety committees, he and others say. Further, those committees need to reach out to the intelligence and defense communities for advice. Security agencies, in particular, Mr. Floyd says, have better information on the technology and expertise of terrorist organizations, and that could change a scientist’s view about what is wise to publish. But scientists won’t know if they don’t ask.
Too Little, Too Late
Another emerging problem is that, under the current process, security considerations too often come too late, when a paper is submitted to a journal. “I’m very frustrated that it gets all the way to my end before something happens,” says Lynn W. Enquist, a federal board member and editor in chief of the Journal of Virology, which sees many papers on pathogens that carry potential risks as well as benefits.
Mr. Enquist, a professor of molecular biology at Princeton University, says that journals catch these papers because they have a system in place to alert editors. As part of the paperwork that goes along with submitting a manuscript, an author has to check a box if that work could be what is now called “dual use,” or good science that could be turned to bad purposes. Mr. Enquist also says he has a copy of the “seven deadly sins” from the National Academy of Sciences report sitting on his desk.
If he sees the checked box, or sees that an experiment has increased the pathogenicity of a virus or bacterium, or sees a name from a list of “select agents,” things like Ebola virus or ricin that have been tagged by the U.S. Department of Health and Human Services or the Department of Agriculture as threats to public safety, he alerts the publication director of the American Society for Microbiology, which publishes his journal. They have the option of convening a discussion of the paper with the editors of the other 10 journals owned by the society, to establish the benefits and risks of publishing. “I think this takes between 10 and 15 percent of my time as an editor,” Mr. Enquist says. “And we’ve never had a paper that we’ve ultimately rejected” for these reasons.
“But to be honest, it’s not really my job as a journal editor to be the ultimate safekeeper of these things,” he says. “These evaluations should happen way upfront, with the scientist.” Both Mr. Keim and Mr. Miller agree. Publication is essential to science, Mr. Miller adds, the key to replicating and verifying the accuracy of an experiment, so there are powerful reasons not to restrict it. Adding to the difficulty of holding something back is that “these tend not to be black or white decisions. They tend to cluster in the gray,” he says. A middle course, the option of publishing but omitting some details, as the board suggested for the H5N1 papers, is only now being explored.
Mr. Miller says that while there are not a lot of these double-edged papers, in recent years more of them have been appearing in journal editors’ in-boxes, and so the pressure to take additional measures is mounting. If the NIH comes up with a repository system for the dangerous details that doesn’t make legitimate scientists feel like they are being locked out—and doesn’t create arguments over who is and is not a legitimate scientist—that may provide an outlet.
That’s a position that would suit Mr. Floyd. Advisory committees are all well and good, he says, “but if something gets ‘out,’ then it won’t do you a lot of good to say, Well, we had a committee.”