Scientists working on viruses have, in some cases, developed viruses even more dangerous than the originals, in experiments called “gain-of-function.”
Gain-of-function research is just what it sounds like: the virus being studied is given a new or enhanced function, such as an increased ability to transmit from one host to another, or increased virulence. Scientists can then study the altered virus in depth, for example to determine how the disease changes as it becomes more virulent.
In one gain-of-function study, scientists worked on avian influenza, or bird flu, a particularly lethal virus that first showed up in humans in 1997.
The bird flu virus originated in the bird population but had jumped to humans: people became infected through contact with infected birds. Thankfully, in nature it almost never moved from human to human, so although it had a high mortality rate, it didn’t spread very quickly.
A few years ago, however, this bird flu virus was altered in a lab. The engineered virus could then spread from ferret to ferret, and because ferrets transmit flu much the way people do, the implications of an accidental escape into the human population suddenly broadened.
Then, there were several high-profile lab accidents.
In 2014, the US government stopped all federal funding of gain-of-function studies, including work on influenza, SARS, and Middle East respiratory syndrome (MERS).
The experiments are back.
The government has reopened funding for the development and study of genetically engineered viruses.
Should it have?
the big picture
When a virus “jumps” from, say, a bird host to a human, a huge immune response is triggered to fight the new invader, which can cause inflammation in the lungs, fluid buildup, and pneumonia. In 1918, the Spanish flu killed between 50 and 100 million people, and it did so faster than any other illness in history. It particularly targeted young adults, possibly because older patients had been exposed to a similar virus earlier in life, giving them some immunity the younger adults lacked.
Gaining some immunity could save the world from an even worse pandemic. Our best bet for an acquired immunity is through vaccines, and vaccines are developed through research, some of which is gain-of-function.
Some experts believe that the biggest threat facing our planet is a pandemic for which we are not prepared. The World Health Organization writes, “The world will face another influenza pandemic – the only thing we don’t know is when it will hit and how severe it will be.”
The 1918 Spanish flu killed only a few percent of the people it infected – and that was before modern medicine, modern clinics, and early-detection lab tests. Still, 50 to 100 million people died. Severe acute respiratory syndrome (SARS) killed about 10% of those infected – a far higher fatality rate than the Spanish flu’s – but, thankfully, public health measures stopped the spread of the virus, and fewer than a thousand people died.
If some new virus jumps from bird (or pig or whatever) to people, our public health system should be able to control it. But what if it’s genetically engineered to overcome those efforts? Why take the risk of introducing something into our world that is worse than anything nature has thrown at us?
response to crisis
To respond to a pandemic, we need to know how a virus is transmitted, how to identify and model its spread, and how to create a vaccine – all of which depends on research into the genetic makeup of viruses and virus-host interactions.
Some researchers say that not only are gain-of-function experiments not necessary, they’re not even the best way to study pathogens.
For example, Marc Lipsitch of Harvard and Alison P. Galvani of Yale write,
“Alternative approaches would not only be safer but would also be more effective at improving surveillance and vaccine design, the two purported benefits of gain-of-function experiments to create novel, mammalian-transmissible influenza strains.”
review of GOF experiments for safety
A panel of experts now evaluates any government-funded research project of this kind, weighing its biosafety risks (what could happen if an accident occurred), how those risks can be minimized, and whether the possible benefits outweigh them.
This panel of “experts” is supposed to provide oversight, but the process isn’t transparent, so who knows how thorough the reviews are, or whether the reviewers are being influenced by market forces? Two gain-of-function experiments were recently green-lighted for funding, and the only reason we know about them is that a reporter found out.
how about those accidents
And let’s not forget: accidents happen all the time in labs, even at the CDC (Centers for Disease Control and Prevention). In 2014, dozens of workers in a CDC lab were exposed to anthrax. In another incident, the CDC shipped a deadly flu virus to a laboratory that had asked for a non-deadly strain. And at the National Institutes of Health, decades-old vials of smallpox virus were found in a freezer where they weren’t supposed to be.
That’s a little concerning.
According to lab inspectors working across the country, more than 200 incidents of loss or release of potential bioweapon agents are reported each year.
What if an agent is intentionally released? How does any information gained offset the dangers of having superbugs in the hands of terrorists?
When Chernobyl melted down, we didn’t stop using nuclear power, right? We need to put better protocols in place, make labs safer, rather than shutting down research.
But maybe we SHOULD have shut down nuclear power after Chernobyl??