World War C


by Sanjay Gupta


  A question we now have to consider is this: Can COVID hide out in the body and continue to inflict damage? Can it persist long after the acute phase of illness has resolved? And could this be the cause of chronic symptoms in a subset of COVID long-haulers? Unfortunately, documenting persistent COVID infection isn’t as easy as a repeat throat swab PCR or a simple blood test. Polymerase chain reaction, or PCR, is the laboratory technique used to amplify and detect genetic material from a specific organism, such as a virus. The PCR test to detect an active COVID infection in an individual is the “gold standard” for diagnosing the disease because it’s the most accurate and reliable test, but you have to collect enough of a specimen for the test to work. It often takes invasive and painstaking measures to confirm a virus’s ongoing presence, especially if it’s hiding out in unsuspecting cells and tissues after the acute phase of infection. Such detective work is usually not available to patients outside a research setting. In many cases, we don’t have proof that the virus is persistent—but we also don’t have proof that it’s not. Absence of evidence is not evidence of absence.
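A back-of-the-envelope sketch helps explain why PCR is so sensitive. Each thermal cycle roughly doubles the number of copies of the target sequence, so even a handful of viral genetic fragments becomes billions of copies after thirty or so cycles. This is a toy model, not a lab protocol; the function name and the idealized efficiency value are my own illustration.

```python
# Toy model of PCR amplification: each cycle multiplies the number of
# target copies by (1 + efficiency). An efficiency of 1.0 means perfect
# doubling; real assays run somewhat below that ideal.
def pcr_copies(start_copies: int, cycles: int, efficiency: float = 1.0) -> float:
    return start_copies * (1 + efficiency) ** cycles

# Ten starting copies after 30 ideal doubling cycles:
print(f"{pcr_copies(10, 30):.3e}")  # roughly 1.074e+10 copies
```

The exponential growth is the whole point: a sample too sparse for any direct chemical test becomes overwhelmingly detectable after enough cycles, which is also why the number of cycles needed to see a signal (the “cycle threshold”) hints at how much virus was in the original specimen.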

  Studies are underway to understand how some viruses, including coronaviruses, can become persistent infections, in some cases lingering long after presumed recovery from the acute phase. The hypothesis that coronaviruses could linger in the body was suggested as far back as 1979.30 And it remains to be fully documented how and where COVID can hide in places far from the respiratory system.31 This highlights an important possibility: If the virus can remain in some people, its persistence may help explain the chronic symptoms some patients experience after acute COVID, the active phase of infection.

  One case in particular that garnered attention in scientific circles and became a case report published in the New England Journal of Medicine involved a forty-five-year-old man with a severe, rare autoimmune disorder.32 His condition had him on multiple drugs that suppressed his immune system when he contracted COVID in spring 2020. After a five-day stay at Brigham and Women’s Hospital in Boston, he was discharged and quarantined alone at home for the next several months, during which he was readmitted to the hospital multiple times for recurrences of his infection. He was treated with many courses of antiviral medications and once with an experimental antibody drug.

  Every time he thought he was free of the virus, he wound up back in the hospital, and he ultimately died from COVID after a grueling 154 days. The report that features the case calls attention to the obvious concern about people who harbor high levels of virus for months. The authors write: “Although most immunocompromised persons effectively clear [COVID] infection, this case highlights the potential for persistent infection and accelerated viral evolution associated with an immunocompromised state.”33

  One thing is for sure: If a pathogen wanted to infect as many humans as possible, its doorman would be the ACE2 receptor, the COVID co-conspirator. This Bonnie and Clyde duo is perfectly paired to spread to as many people as possible across the planet and ravage as many bodies as possible. “The virus acts like no pathogen humanity has ever seen,” wrote one group of scientists for the journal Science.34 “Its ferocity is breathtaking and humbling.”35

  Tony Fauci described COVID’s near-perfect adaptation to humans to me as if speaking from the virus’s wise perspective:

  Not only am I going to infect you, but I’m going to make sure that many of you don’t have symptoms. And I’m going to make sure that those of you who don’t have symptoms are going to account for 50 percent of the transmissibility. The people who are young and who are healthy, who don’t get symptoms. I’m going to use them to spread as much as I possibly can because they’re going to be well—they’re not going to get sick and they’re going to be infecting all their friends in this superspreader event. But I’m going to look for the vulnerables and the vulnerables are the elderly and those with underlying conditions. And if I kill the vulnerable, I’m not going to eliminate the population, so I always have a lot of people that I can still infect.… Now, that sounds crazy, but that’s the metaphor that when you deal with infectious diseases, you say, Damn! This virus, it’s such a bad evil virus because it’s doing things in such a nefarious way. It’s using transmissibility among otherwise healthy people and vulnerability of people who are going to wind up dying. It’s a bad, bad virus.

  On top of all that, people who cannot clear the virus can become breeding grounds for mutant strains, including individuals who never knew they were infected to begin with. Although the scientists studying the immunocompromised man with severe illness first wondered if he was merely acquiring new strains of the virus and becoming reinfected, they eventually determined that the same strain had been evolving over time in his body, acquiring a cluster of new mutations at an alarming speed. The man did make some antibodies in a feeble immune response, but his level of resistance was too low to clear the virus and just enough to put pressure on it. The virus was allowed to live in an environment where it had to change in order to survive. Such scenarios are unusual, but they raise important questions we must address if we are to gain control of the variants and get ahead of the virus’s attempt to thrive—which brings us to the topic of vaccines and their power to prevent disease and end pandemics.

  In this 1802 political cartoon, an etching by Charles Williams (1797–1830), vaccination is depicted as a diseased, cow-like horned monster being fed baskets of infants and excreting them. SOURCE: THE WELLCOME COLLECTION, LONDON.

  CHAPTER 4 Cows

  Three hours after the virus’s code was published in January 2020, scientists around the world went to work to develop diagnostic tests and vaccines. Not a single case had been confirmed yet in the United States when Fauci ordered his team to get going on a vaccine. “The decision that we made on January the 10th to go all out and develop a vaccine may have been the best decision that I’ve ever made with regard to an intervention as the director of the institute,” Fauci told me. It was a gamble, because an all-hands-on-deck approach to the vaccine development was going to be costly and there was no way of knowing at the time what would transpire. A “pandemic” had yet to be declared.

  Although the skies looked clear, potential scenarios were playing in the back of Fauci’s mind. The possibility that this would go bad, even though it wasn’t going bad yet, was too hard to ignore. Fauci remembers the moment he trained his institute’s efforts on a COVID vaccine. “I just turned to our people and said, ‘Let me worry about the money. Just go and do it.’ And boy, was that the right decision.” By January 15, Fauci and his team were collaborating with Moderna on vaccine development. Sixty-three days later, they went into the phase 1 trial. “Just as we were going into the phase one trial,” Fauci recalled, “count sixty days from January 10th, bingo, you’re in the explosion in New York.”

  On a cold Sunday night in November 2020, Fauci sat bundled up on his deck, having a (physically distanced) drink with a friend when the call came. Albert Bourla, the CEO of Pfizer, was on the other end of the line. Pfizer had commenced its clinical trials in early May on a vaccine in development with BioNTech that, like Moderna’s, was based on the same technology. “Tony,” Bourla said, “are you sitting down?”

  Bourla had the results from the phase 3 trials that had been taking place for months after the first two phases. Bourla told him it was “amazing.” I have known Dr. Fauci for twenty years, and he has always loved doing impersonations, and now he laughed as he mimicked Bourla giving him the news in a lyrical Greek accent. “Tony, it’s more than 90 percent effective at reducing the risk of developing severe COVID!” Keep in mind that the FDA had set expectations earlier, saying it would consider 50 percent efficacy worthy of authorization. In any given year, the seasonal flu shot is 40 to 60 percent effective. For a scientist like Fauci, it was a deeply emotional, cathartic moment.

  A week later, Moderna divulged similar results.1 Perhaps the most stunning thing about these two vaccines, which both rely on new mRNA technology, is how well they were shown to work across all age, racial, and ethnic groups. It’s like having PPE, personal protective equipment, at a cellular level; it turns the human body into an internal vaccine factory.

  If you were to ask a molecular biologist about these new mRNA-based vaccines, she would tell you that Cinderella has gone to the ball.2 Always a beautiful technology, mRNA had largely been passed over until this moment, when it finally got a chance to shine brightly. Ever since we figured out the code of life, DNA, more than half a century ago, we’ve decoded a lot about the human body and the way DNA programs how we function and carry out our biological business. But research into DNA has not yielded actual defenses against disease. It is RNA research that has produced those. Not only are RNA-based vaccines being considered for all sorts of other diseases now, some of which have yielded to no other approach, but other pharmaceutical uses of RNA technology are now coming into their own as well, as I will describe later. And while you may fret and worry at the fact that these new vaccines are just that—“new”—they are born from decades of study. They take vaccine technology from an analog to a digital space.

  In molecular biology, the discovery of DNA’s double helix led to a universal truth: We owe our lives to the way genetic sequences come together to code for proteins. And those proteins’ characteristic shape stems from how various amino acids link together and fold. It’s like language: Letters form words, which then form sentences whose meaning depends on how those words come together in a unique sequence and ultimately tell a story. The transfer of information from the cell’s genomic library to its active physical interpretation (i.e., protein) depends on a particular form of RNA that acts as a translator of sorts. The gene sequence is first copied from DNA to RNA; that RNA “transcript,” or record of instructions from the DNA, is then edited to form a molecule called messenger RNA (mRNA), which then goes on to make proteins—the ultimate product that supports and proliferates life.
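The letters-to-words-to-story flow described above can be sketched in a few lines of code. This is a toy illustration: the function names are my own, and the codon table below is a tiny (but real) subset of the standard genetic code, which maps each three-letter mRNA “word” to an amino acid.

```python
# A small, real subset of the standard genetic code:
# each three-letter mRNA codon names one amino acid (or a stop signal).
CODON_TABLE = {
    "AUG": "Met",   # also the "start" codon
    "UUU": "Phe",
    "GGC": "Gly",
    "AAA": "Lys",
    "UAA": "STOP",  # end of the protein-coding message
}

def transcribe(dna: str) -> str:
    """Copy the DNA coding strand into mRNA by swapping T for U."""
    return dna.upper().replace("T", "U")

def translate(mrna: str) -> list[str]:
    """Read the mRNA three letters at a time until a stop codon appears."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        amino = CODON_TABLE[mrna[i:i + 3]]
        if amino == "STOP":
            break
        protein.append(amino)
    return protein

dna = "ATGTTTGGCAAATAA"       # a made-up coding strand
mrna = transcribe(dna)
print(mrna)                    # AUGUUUGGCAAAUAA
print(translate(mrna))         # ['Met', 'Phe', 'Gly', 'Lys']
```

The real machinery is vastly more elaborate (splicing, ribosomes, transfer RNAs), but the information flow is exactly this: a DNA sequence is transcribed into an mRNA message, and that message is read codon by codon into a chain of amino acids.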

  In the twenty-first century, with horrendous diseases like smallpox and polio long gone or disappearing, people may fail to appreciate the power of vaccines. But vaccines may have done more good for humanity than any other medical advance in history. And because billions of people take them, they are also the most studied. “Vaccination is the most powerful gift of science from modern medicine,” Dr. Bob Redfield emphasizes whenever he speaks publicly. And as Dr. Paul Offit, co-inventor of the rotavirus vaccine, shared with me, “We have largely eliminated the memory of many diseases.” And without that memory, we easily underestimate or dismiss risks—until and unless we encounter them firsthand. The disappearance of vivid personal memories of polio, whooping cough, diphtheria, mumps, and measles has likely contributed to the rise of anti-vaccination sentiment in spite of the well-documented danger of these diseases.

  Smallpox was among the most fearsome diseases we have ever encountered and dates back tens of thousands of years, when it would routinely decimate large populations throughout Africa, China, and Europe. It may have been responsible for the death of the Egyptian pharaoh Ramses V, whose mummified head reveals classic smallpox scars. Later, smallpox scourged Western Europe, wiping out more people than the Black Plague, and landed in the United States along with European settlers. A century or so before anyone had conceived of vaccination, doctors in the seventeenth century found that scratching a bit of fresh material, or pus, from a smallpox pustule into the skin of an uninfected person with a sharp lancet would provide some protection against the illness. This was termed inoculation, from the Latin inoculare, meaning “graft.”

  When the English aristocrat and poet Lady Mary Wortley Montagu contracted smallpox in the early 1700s, she survived but was left severely disfigured. Her brother died from the disease. After she ordered that her children be inoculated, people who heard about the noble family’s use of the technique started to think more positively about the concept of inoculation, though it would not be scientifically validated for about another half century.

  As is the case with many other groundbreaking discoveries, the smallpox vaccine, the first and most powerful of all, arose from a serendipitous and monumental observation. In the late 1700s, a small-town country doctor, Edward Jenner, noticed that farmers and milkmaids exposed to cowpox never seemed to suffer from smallpox during its frequent outbreaks. The milkmaids would retain their beautiful, blemish-free complexions after a brief bout with the milder illness, unlike those who either died from smallpox or suffered mightily and had a pockmarked face to show for it. Jenner began investigating whether these workers were being naturally vaccinated (vacca means “cow” in Latin) by exposure to the cowpox virus, which somehow provided protection against the smallpox virus.

  Poxviruses are known to infect many animals, and cowpox was a common disease among cattle at the time of Jenner’s observations, but it produced much milder symptoms than smallpox, its more deadly relative. In 1796, Sarah Nelms, a young dairymaid, went to Jenner with cowpox lesions on her hands. After noticing the pustules were on the part of Sarah’s hands she used to milk cows, Jenner inquired about the health of the animals. In fact, Sarah told him, a cow named Blossom had recently been infected with cowpox. Back then, there was no requirement to get approval from an independent review board before experimenting on a human subject to test a theory. So Jenner obtained some of the material from Sarah’s pockmarked hand and scratched it into the arm of an eight-year-old boy named James Phipps, the son of his gardener.

  About a week later, Phipps developed temporary symptoms that included chills, a fever, some generalized discomfort, and loss of appetite. Two months later, Jenner conducted a risky human challenge trial: he purposely exposed the young child to smallpox material. Keep in mind, smallpox was a known deadly infection at the time, and no one was certain this would work or that the boy wouldn’t succumb to the illness. I can only imagine the wave of relief when the boy stayed well and Jenner concluded that his subject was protected from the deadly smallpox. Still, his idea of vaccination, scratching a small amount of cowpox virus into healthy individuals, was not an easy sell initially. But eventually people accepted the pricks in the arm—a vaccine that was itself a living virus named vaccinia and was delivered via a bifurcated needle. Most people born before 1972 have the telltale roundish, semi-sunken scar on their upper arm to show for it.

  Unlike more modern vaccines, the smallpox pricks carried such a high viral load, injected just below the skin’s surface, that a local infection of vaccinia would occur, followed by the scar that could be up to an inch in diameter. After decades of worldwide immunization campaigns, routine smallpox vaccination ended in the United States in 1972, just a few years after I was born; in 1977 the world’s last natural case of smallpox occurred in Somalia, and in 1980 the WHO declared smallpox eradicated worldwide.3

  The story didn’t quite end there, however. In recent years, rumors about the original vaccine coming from horses rather than cows have caused scientists to rethink the centuries-old story. Jenner himself had suspected that cowpox originated from horsepox and sometimes used material directly obtained from horses to inoculate against smallpox.

  Not a lot is known about horsepox, and the virus seems to have become extinct, although it may still be circulating in an unknown reservoir. Studies mapping its genome show it to be very similar to some old vaccinia strains, bolstering the hypothesis that the vaccine could have been derived from horses. In a letter to the editor published in the New England Journal of Medicine in 2017, researchers said they discovered vials of smallpox vaccines from the nineteenth century that contained the horsepox virus.4 And to add another layer of puzzlement, both horsepox and cowpox may originally have been rodent poxviruses that only occasionally infected livestock. At least one company today is revisiting a live horsepox virus to develop a COVID vaccine, modifying it to target the COVID spike protein.

  Dr. Larry Brilliant, a visionary epidemiologist, technologist, and philanthropist, had the privilege of seeing the last case of smallpox in the world during his crusade to end the “speckled monster” scourge in the 1970s while working in collaboration with the WHO. He is one of our generation’s most decorated and celebrated public health experts with a sharp eye on ending pandemics as CEO of Pandefense Advisory and chair of the advisory board of the nongovernmental organization Ending Pandemics. Over the years, he has become a friend, and our communication is often in Hindi, which he knows quite well from all the time he spent in India. While he didn’t choose his last name, I tell him that it suits him very well. It was while working in India, the last place on Earth where smallpox persisted, that Brilliant came across a young girl named Rahima Banu who had contracted the virus in October of 1975 at the age of two and survived. She was the last case in an unbroken chain of transmission of killer smallpox that went all the way back to Pharaoh Ramses and beyond, probably 10,000 years.5 “Billions of people died of smallpox,” Brilliant reminds me. At one stage of the virus’s treacherous march across Europe, it was the single biggest cause of death, killing 400,000 every year. In the Americas, it ravaged Native Americans and led to the collapse of entire cultures. It slaughtered around 30 percent of those who contracted it, leaving a third of survivors blind and almost all who did not die scarred for life. Medical historians have even suggested that we owe part of our longevity today—a doubling of life expectancy between 1920 and 2020—to the smallpox vaccine and eradication of the menace through activism and vaccination campaigns.6

 
