Boston University sparks headlines by creating more lethal version of omicron Covid-19 variant

21 Oct 2022 4:00 PM IST

Researchers at Boston University sparked alarming headlines this week by creating a more lethal version of the omicron COVID-19 variant. At the heart of the uproar is the fact that the researchers didn't have any obligation to inform anyone beyond an internal review board about what they were doing. Some officials at the National Institutes of Health only heard about the research through the media.

Another recent development could prove even more concerning: Nature reported last week on plans for 40 new virology labs around the world. Known as BSL-4 labs and designed to deal with the most dangerous pathogens, they're being built in India, Singapore, the Philippines, Kazakhstan, and Russia, among other countries. The ostensible aim is to make us safer, but even before this current pandemic, some virologists saw these BSL-4 labs as a problem: the germ equivalent of nuclear proliferation.

The bottom line is that the speed of scientific research has to be balanced with public safety.

The issue of lab safety has become politicised during the COVID-19 pandemic, as the political Right has been more likely to favour the possibility that SARS-CoV-2 originated from a lab accident in China. (Only truly fringe conspiracy theorists think it came from a deliberate leak.) People on the Left have been more likely to insist that the virus jumped from bats to humans, perhaps via another animal. So far, I don't think there has been enough evidence to tell us definitively where it came from. But regardless of COVID-19's true origin, the best way to prevent the next pandemic is to increase precautions surrounding all potential avenues, whether that's wet markets, bat guano collection, or research labs.

In the case of the BU researchers, there seems to be a grey area about how much detail they were obligated to report to government funding agencies. Even if they followed existing guidelines to the letter, though, we need clearer rules for researchers and stronger oversight to make sure the risks inherent in live-virus research don't outweigh the potential benefits.

The debate has also put 'gain of function' research back in an unflattering spotlight. That term isn't well defined, but it generally refers to research that alters viruses to change what they're capable of doing. Such experiments have been extremely controversial, including an endeavour to create bird flu viruses that can be transmitted between mammals, attempts to alter bat coronaviruses to infect human cells, and experiments aimed at finding new possible iterations of SARS-CoV-2. But 'gain of function' might also describe techniques that use altered viruses to deliver gene therapy to treat cancer and hereditary diseases. With such a broad definition, it's not feasible or in the public's best interest to ban all gain-of-function research.

One solution could be to have an outside body, such as the Office of Science and Technology Assessment, judge whether experiments using live viruses are safe enough. That's something Rutgers University biologist Richard Ebright suggested to me last year. That way, independent experts could weigh the risks and benefits of research with public safety as the overriding goal.

It's possible that more oversight could slow down valuable research. Where does necessary transparency end, and micromanaging begin?

The best we can do is find a balance between research speed, public safety, and transparency when dealing with the modification of live viruses. More oversight here won't necessarily bog down our understanding of the current pandemic. Many experiments can be done with so-called pseudoviruses, which use key structures from real viruses but lack the ability to replicate. These were important in the work done quickly to understand the omicron variant when it emerged in South Africa last year, work that probably saved many lives by showing that the mRNA vaccines could still protect against this variant if people got a booster shot.

Unexpected things can go wrong when scientists work with dangerous viruses and bacteria. Accidents, and even deliberate leaks, have happened in the past.

Purdue University virologist David Sanders once told me that he'd been on a team inspecting a lab called Vector in Siberia, where there had been a 2004 Ebola leak that killed a worker and a suspected 1977 leak of a previously extinct flu strain, which subsequently spread worldwide. The movie and book The Hot Zone are based on a true story about a deadly relative of Ebola cropping up in a primate facility in Virginia in 1989.

Or consider the anthrax attacks that took place in 2001 in the wake of September 11. The US biodefense community assumed they must be the work of foreign terrorists, but it turned out they were carried out by an American scientist who worked in a high-security lab.

Blind trust in scientists isn't being 'pro-science'. Scientists can have motives other than the public's best interest, including producing high-impact publications to further their careers. Sometimes, even with the best of intentions, they make mistakes.
