Criteria for guiding funding decisions for certain H5N1 gain-of-function research proposals at the U.S. Department of Health and Human Services
• The virus anticipated to be generated could be produced through a natural evolutionary process.
• The research addresses a scientific question with high significance to public health.
• There are no feasible alternative methods to address the same scientific question in a manner that poses less risk than does the proposed approach.
• Biosafety risks to laboratory workers and the public can be sufficiently mitigated and managed.
• Biosecurity risks can be sufficiently mitigated and managed.
• The research information is anticipated to be broadly shared in order to realize its potential benefits to global health.
• The research will be supported through funding mechanisms that facilitate appropriate oversight of the conduct and communication of the research.
Ebola is not the virus that keeps Marc Lipsitch up at night.
Lipsitch, a Harvard epidemiologist who grew up in Atlanta, is on a mission to eradicate human-engineered strains of deadly pathogens such as the H5N1 “bird flu.” Those strains exist only in a handful of labs, where they have been genetically altered to make the virus more contagious.
H5N1, which first infected humans in 1997 in China, has killed about 60 percent of the almost 700 people who have been diagnosed with it. Nearly all of them got sick through contact with infected birds; in nature, H5N1 does not pass easily from person to person.
If it acquires that ability in the wild before scientists have developed effective vaccines and treatments, many millions of people are likely to die.
In the past few years, a handful of virologists have experimented with making H5N1 transmissible between ferrets, a species that reacts to flu much as humans do. More recently they have announced studies on another bird flu, H7N9.
The architects of those experiments, known as “gain of function” research, say that by learning which mutations make the virus more transmissible, they can help avert a pandemic.
On the flip side, Lipsitch and many other scientists argue that there’s a small but real risk that the researchers themselves could set off the very catastrophe they hope to prevent. As though on cue, recent news of lab accidents at the U.S. Centers for Disease Control and Prevention has driven home the point that even top-tier labs sometimes make mistakes.
This week, the leading U.S. practitioner of gain-of-function research, Yoshihiro Kawaoka of the University of Wisconsin, defended his work and his lab to The Atlanta Journal-Constitution. “The University of Wisconsin-Madison has provided significant support staff for biosafety and biosecurity, the facility has been designed specifically for these studies, and we have a strong ongoing training program with internal controls,” he said in an email.
Lipsitch doesn’t doubt it, but that doesn’t assuage his fears. “A very low probability of an accident may be OK if one or a few people could be infected,” Lipsitch said. “Here, there’s the potential for infecting millions or billions of people if this virus gets out. This is a whole new level of risk.”
‘This is not an abstract question’
With Lipsitch leading the charge, dozens of scientists issued a call two weeks ago to curtail gain-of-function experiments on viruses that could cause pandemics. Since the public is at risk, they argue, there should be a wide-open global discussion such as the 1975 Asilomar conference that set the parameters for research on recombinant DNA.
Until the broader public has a chance to consider the calculated odds of a man-made pandemic, “you’re doing an experiment on a global population that hasn’t been asked for informed consent,” said Harvard immunologist Barry Bloom.
It’s not a question of whether lab workers are careful. “You’d have to be an idiot” to be careless with something like H5N1, said Emory University biologist Bruce Levin, who supervised Lipsitch when he did postdoctoral work at Emory.
Nevertheless, Levin said, “there is a history of lab accidents. This is not an abstract question.”
Until recently, though, most of the debate has been abstract, without benefit of probabilities. But then Lipsitch, whom Bloom calls “a hero,” did something few had done before: He pulled together existing data on the frequency of lab accidents and calculated the likelihood of a lab-bred flu strain sparking a pandemic.
A survey of the scientific literature shows that at least five labs worldwide are engaged in gain-of-function studies on various flu strains. Lipsitch and his co-author, Alison P. Galvani of Yale, calculated that if 10 labs performed such work for 10 years, there would be a 20 percent chance of a lab worker becoming infected. They calculated the odds of that one infection becoming an outbreak as between 5 and 60 percent.
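The arithmetic behind that 20 percent figure can be sketched as a simple cumulative-probability calculation. In the snippet below, the per-lab-year infection probability is an assumed illustrative value chosen to be consistent with the total the article reports; it is not a number taken from the Lipsitch–Galvani paper itself:

```python
# Sketch of the cumulative-risk arithmetic behind the reported estimate.
# ASSUMPTION: a 0.2 percent chance of one laboratory-acquired infection
# per lab-year, treated as independent across lab-years.
per_lab_year_risk = 0.002
lab_years = 10 * 10  # 10 labs working for 10 years

# Probability that at least one infection occurs over all lab-years.
cumulative_risk = 1 - (1 - per_lab_year_risk) ** lab_years
print(f"{cumulative_risk:.0%}")  # roughly 18 percent, on the order of the reported 20 percent
```

The same compounding logic explains why even a very small per-experiment risk can become substantial once many labs run such experiments for many years.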
Proponents of gain-of-function studies have criticized Lipsitch’s methodology and rejected his findings. But his work, in conjunction with the CDC’s missteps, has put the issue front and center among people who combat deadly diseases.
Starting in animals, leaping to humans
Like many deadly viruses, including Ebola, flu strains originate in other species. Ebola and the precursor of HIV infect other primates, such as monkeys and apes. Various flus prefer birds or pigs.
Over time, some of those viruses make the jump from their animal hosts to humans. At that point, most still don’t have the ability to spread efficiently from person to person. The ones that mutate and acquire that ability have the potential to sweep through human populations.
Ebola and HIV spread inefficiently, requiring contact with an infected person’s blood or other bodily fluids. Flu viruses can spread like wildfire, because they also travel through the air.
The goal of science and medicine is to identify emerging flu threats and develop vaccines and treatments before a major outbreak occurs. With a virus as deadly as H5N1, the stakes are immense, and the urgency is palpable.
Ultimately, everyone on both sides of the present debate wants the same thing: to prevent millions of people from suffering and dying.
Kawaoka and his allies believe that understanding the genetics of mammal-to-mammal transmission offers the best hope of keeping tabs on H5N1 and developing a vaccine against it. “Our research is important for pandemic preparedness,” he said in his email. “From my point of view, it would not be ethical to do nothing and just wait for the next pandemic to happen.”
But Lipsitch and his allies believe that other kinds of research offer not only safer but more effective ways to thwart the virus. Vaccines were part of the public health arsenal long before genetics was a science, they note.
“I’m not calling for a reduction in virus research,” Lipsitch said. “I’m calling for better virus research.”
Shock waves
Gain-of-function research on bird flu first drew widespread attention in 2011, when Kawaoka and Ron Fouchier, a researcher in the Netherlands, announced a round of discoveries on H5N1.
The news set off shock waves. A national biosecurity committee recommended that the bulk of their research not be published, for fear that bioterrorists would use it for evil ends.
The committee later reversed course, but the scientists agreed to a voluntary moratorium on gain-of-function research while the issue was discussed. The federal government hosted high-profile panel discussions among scientists in 2012 to debate the matter, and holds this up as an example of public involvement.
The National Institute of Allergy and Infectious Diseases established a process for regulating what it dubbed “dual-use research of concern” – dual-use meaning it can be used for either good or evil.
It’s up to the research institutions themselves to help identify which research should trigger the added oversight. That system, established in 2012, got one of its first tests at Wisconsin, where Kawaoka had a project to create a virus with elements of the 1918 “Spanish flu” that killed 50 million people worldwide.
As later reported by the journal Nature, the Wisconsin review panel determined that the experiment did not meet the federal threshold for special oversight. The government disagreed and required the university to write up a risk-mitigation plan, which it did.
Tom Jeffries, who’s also at Wisconsin, believes the oversight system itself is flawed. “There are inherent conflicts of interest in any funding review panel such as the institutional biosafety committee at the University of Wisconsin or any at any other university,” he said.
‘We can afford to go slower’
That conflict might be especially intense when the researcher in question is a science star, which Kawaoka assuredly is. In 2006, when he was wooed by another institution, Wisconsin’s governor lobbied him to stay, and the university built him an $11.4 million lab.
Rebecca Moritz, who manages regulatory oversight of Wisconsin’s research on virulent pathogens, was a consultant to the university’s 2012 review committee. She rejects any suggestion that the safety review was less than stringent or objective.
“I think one thing the public doesn’t understand is that one misstep by a researcher at an institution could affect the work of every single researcher at an institution,” she said.
But Lipsitch said the ability to win grants and generate headlines does carry weight, including influencing where research gets published. Publication in a top journal is more likely if the research is controversial, he said.
By the same token, he doesn’t think his numbers should be the basis for any decision to limit gain-of-function research. He’s too deeply invested in one side of the issue.
“We don’t think we are the people to make the final calculations,” he said. But he’s insistent that the risk and the benefits should be quantified as far as possible, and policy should rest on those calculations.
Yes, he said, that will take time. But, given the stakes, “we can afford to go slower in this instance.”