Miracle Cure
Another way of applying the arithmetic of failure rates is to use it to predict the number of promising “new molecular entities” that will actually prove out as therapeutically useful. A pharmaceutical company that identifies a thousand compounds with some potential for the treatment of a disease like Alzheimer’s, for example, and knows that the failure rate in preliminary testing is somewhere between 95 and 99 percent, can guess that between ten and fifty compounds might survive to the next round. Moreover, if the failure rate were constant, the cumulative likelihood of success would increase over time: the longer you spend looking for the next miracle drug, the better the odds that you will have found it.
The relevance for the riskiness of drug development is fairly clear. If pharmaceutical research were characterized by a constant failure rate—even a very high one—it might be expensive, but not particularly risky: At any given moment, the probability of a successful outcome would be known. If, on the other hand, the failure rate were fundamentally variable, then years might be spent on completely fruitless searching.
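The arithmetic is simple enough to sketch. Below is a minimal illustration in Python; the thousand-compound pool and the 95 to 99 percent failure rates are the figures quoted above, while everything else is assumed for the sake of the example.

```python
# Back-of-envelope arithmetic for the screening example above.
# The 1,000-compound pool and the 95-99 percent failure rates come
# from the text; the rest is purely illustrative.

candidates = 1_000

for failure_rate in (0.95, 0.99):
    survivors = candidates * (1 - failure_rate)
    print(f"failure rate {failure_rate:.0%}: ~{survivors:.0f} compounds survive")
# failure rate 95%: ~50 compounds survive
# failure rate 99%: ~10 compounds survive

# With a CONSTANT per-compound failure rate p, the chance that at least
# one of n candidates succeeds is knowable in advance: 1 - p**n.  It
# climbs steadily as the search widens, which is why a constant-rate
# search is expensive but not especially risky.
p = 0.99
for n in (10, 100, 1_000):
    print(f"after {n:>5} compounds: P(at least one success) = {1 - p**n:.3f}")
# after    10 compounds: P(at least one success) = 0.096
# after   100 compounds: P(at least one success) = 0.634
# after  1000 compounds: P(at least one success) = 1.000
```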
Drug development, from proof of concept (sometimes called “phase 0”) through phase 3 clinical trials, has never exhibited anything resembling a constant failure rate; it has always been not merely expensive but genuinely risky, and by most accounts it has grown riskier over time. Many close observers of the phenomenon have argued that this is a reason for transferring the risk of pharmaceutical innovation to society at large, by increasing government support both for basic biomedical research and for testing the products of that research.
There’s no intrinsic reason why government agencies (or not-for-profit institutions like universities) couldn’t fund, or even manage, the phased testing system that Frances Kelsey developed at the FDA more than fifty years ago. Their involvement, however, would not fundamentally alter the relationship between risks and rewards. For more than a century, society has farmed out the risk of pharmaceutical development, testing, and manufacturing to the institutions willing to undertake it, and they are willing only when the potential rewards are large. Inveighing against pharmaceutical company greed just camouflages this unavoidable truth.
—
The machine of pharmaceutical innovation—one that wouldn’t exist, would never even have been built, but for the antibiotic revolution—is decidedly imperfect: full of inefficiencies and side effects, and incredibly costly to run and maintain. Paul Ehrlich’s side chains, Dorothy Crowfoot Hodgkin’s X-ray crystallography, and even Norman Heatley’s jury-rigged extraction apparatus were all products of motivated intellectual effort in search of some reward. The motivations haven’t always been particularly noble; Pasteur’s hatred of Germany in the wake of the Franco-Prussian War and Ernst Chain’s lust for a Nobel Prize come to mind. Only a very jaundiced observer, though, would think the bargain wholly bad. Winston Churchill famously observed, “It’s been said that democracy is the worst form of Government except for all those other forms that have been tried from time to time. . . .”
The pharmaceutical industry is rightly criticized for spending millions to acquire knowledge that it then uses to find more expensive treatments for existing conditions, or even to “medicalize” conditions in order to create a market for a new treatment.
On the other hand, consider HAART.
Though the disease was first recognized in 1981 (and given the name AIDS a year later), the virus responsible wasn’t identified until 1983. Even before then, Burroughs Wellcome’s U.S. subsidiary, with its long history of investigating obscure diseases, was researching the new and horrifying syndrome that killed its hosts by destroying their immune systems, exposing them to hitherto rare conditions like the cancer known as Kaposi’s sarcoma. Even more relevant was the company’s experience with retroviruses, the viral family to which HIV belongs. In 1983, one of the company’s biochemists, Jane Rideout, started investigating the chemical properties of an antibacterial compound known as azidothymidine: AZT for short.
Simultaneously, other researchers at Burroughs Wellcome had pioneered a dramatically different way of testing new molecules for effectiveness; instead of Selman Waksman’s time-honored trial-and-error method of testing large numbers of promising chemical compounds, the new technique—which would win a Nobel Prize for its inventors—required identifying a chemical that the target pathogen needs to reproduce, and replacing it with an analogue that attracts the pathogen while sabotaging its reproduction. Rideout realized that AZT was a near-perfect analogue for a chemical HIV requires to reproduce . . . and Burroughs Wellcome agreed. Only three years after the human immunodeficiency virus was first identified—a pace that recalls the original antibiotic revolution—the Burroughs Wellcome company introduced AZT as the first effective anti-AIDS drug.* A decade later, when Merck received FDA approval for the antiviral drug indinavir, and a separate set of approvals was granted to the drugs known as NRTIs (nucleoside reverse transcriptase inhibitors), the combination therapy known as HAART, for highly active antiretroviral therapy, transformed HIV from a death sentence into a chronic, and treatable, condition.
HIV-positive patients are scarcely alone in their debt to pharmaceutical innovation. Tens, perhaps hundreds, of millions of victims of a thousand diseases from leukemia to river blindness are alive and thriving entirely because of a drug breakthrough. For them, and especially for the literally uncountable number of people whose bacterial infections, from strep throat to typhus to anthrax, were cured by a ten-day regimen of antibiotics, the bargain probably seems extraordinarily one-sided. Like Anne Miller and Patricia Thomas, they were, and are, living, breathing evidence that Joseph Lister’s dream came true.
EPILOGUE
“The Adaptability of the Chemist”
Many of the institutions at the forefront of research on infectious disease are as prominent today as they were during the years of the antibiotic revolution, and even before. While the Lister Institute of Preventive Medicine is now a charity employing fewer than half a dozen full-time staff members, it continues to fund research and disburse prize fellowships, some of them very substantial. The Robert Koch Institute is an agency of the Federal Republic of Germany, still performing significant research, though with more emphasis on epidemiology than pharmacology or biochemistry. The Institut Pasteur remains one of the world’s preeminent research institutions, with more than a hundred labs and a thousand scientists working throughout the world on basic biochemistry, virology, and a huge number of other subjects.
Oxford’s Sir William Dunn School of Pathology continues to occupy the cutting edge of research into human health and disease, publishing papers on everything from T-cell activation to intercellular signaling to—yes—bacterial pathogenesis. And St. Mary’s Hospital still runs a Centre for Infection Prevention and Management (and is custodian for Alexander Fleming’s laboratory, now a museum).
Peoria’s Northern Lab—renamed the National Center for Agricultural Utilization Research in 1990, but still known to almost everyone as the Northern Lab—remains committed to improving the quantity and quality of agricultural produce, while also maintaining the Agricultural Research Service’s Microbial Culture Collection: nearly a hundred thousand strains of actinomycetes, yeasts, and molds that are made available to research institutions throughout the world. In 1965, the Rockefeller Institute for Medical Research became Rockefeller University, but didn’t miss a step in any other sense; twelve Nobel Prizes in Physiology or Medicine were awarded to its researchers over the next fifty years.
The laboratories built by Abbott, Squibb, Merck, Eli Lilly, and others in the 1930s continue as factories of innovation, though the drug industry has experienced a vertigo-inducing series of corporate makeovers since the beginning of the antibiotic revolution. Some of the original companies recruited by A. N. Richards and the OSRD in 1943 are still recognizable, though each is enormously larger. Merck, which merged in 1953 with the Philadelphia-based Sharp & Dohme and subsequently acquired half a dozen other competitors, including Schering-Plough, now produces revenue in excess of $40 billion annually. Pfizer is even bigger, a company with sales of more than $50 billion. Eli Lilly is a $23-billion company. The combination of Bristol-Myers and Squibb, which merged in 1989, weighs in at nearly $20 billion, as does Abbott Laboratories.
Others are no longer going concerns, run onto the rocks by waves of the “creative destruction” that the Austrian economist Joseph Schumpeter called the defining characteristic of capitalism. In 1988, Eastman Kodak acquired Sterling Winthrop, a member of the original penicillin project and the discoverer of the first quinolone antibiotics. It was then broken apart and sold, in pieces: to the French pharmaceutical company Sanofi; to the British firm SmithKline Beecham (a successor to the original Beecham’s Pills, now known as GlaxoSmithKline); and to the revived German giant, Bayer, which, as a result, finally reacquired the rights to the name “Bayer Aspirin.” Earlier, in 1974, Bayer had acquired Cutter Laboratories. Parke-Davis, America’s biggest pharmaceutical company through the 1950s, never achieved that status again; in 1976, it was acquired by Warner-Lambert. When Pfizer acquired Warner-Lambert in 2000, though, the most valuable asset in the transaction was a discovery made in Parke-Davis’s labs: the cholesterol reducer atorvastatin, which, as Lipitor, became the most profitable drug of all time. Hoechst AG, where Emil Behring and Paul Ehrlich made history, was reconstituted after the Second World War. In 1999, it merged with Rhône-Poulenc (the onetime employer of German pharmacology’s nemesis, Ernest Fourneau). In 2004, the combined company—briefly Aventis—was acquired by Sanofi.
It is difficult to calculate, with all those acquisitions and divestitures, just how large the enterprises born of the original penicillin project became. A reasonable guess is that they deliver to their shareholders somewhere north of $40 billion in operating income annually. What they don’t deliver much of is antibiotics. Though Pfizer still makes four antibacterial compounds, in 2011 the company closed its dedicated antibiotics research lab in Connecticut. Roche, Bristol-Myers Squibb, and Eli Lilly—all charter members of the penicillin project—no longer make any antibiotics at all. Neither does Johnson & Johnson, the largest pharmaceutical company in the world.
The primary reason is that it’s extraordinarily difficult to find new antibiotics. After sixty years, almost every antibiotic that remains on pharmacy shelves still uses one of a very limited number of methods for attacking pathogens—disrupting bacterial DNA, weakening bacterial cell walls, inhibiting the enzymes used by bacteria to synthesize proteins—that were used by the original beta-lactams, macrolides, and tetracyclines. The successors to penicillin and erythromycin are more effective and less toxic than the versions that started the antibiotic revolution in the 1940s, but they’re refined versions of a seventy-year-old biochemical technology. Molecules aimed at new targets, such as drugs that disrupt bacterial DNA synthesis (by, for example, inhibiting the enzyme that allows DNA to unwind without breaking), are regularly tested. A few have made it all the way into clinical trials.
In addition, almost all of the newly discovered molecules that show some antibacterial potential have the tyrothricin problem: They’re just as toxic to humans as they are to pathogens, which places something of a ceiling on their appeal. It’s because antibiotic-resistant infections have become so dangerous, and so ubiquitous, that a drug like colistin, first isolated in 1949 but so toxic to kidneys and the nervous system that it never came into wide use, is now a last-resort treatment for resistant Gram-negative infections. When a patient is at risk of death from an infection that resists every safer antibiotic, the risk of kidney failure appears less daunting.
The genomic revolution, by identifying which genes are the blueprints for essential proteins, could, in theory, have empowered medicinal chemists to target only the genes necessary for bacterial survival while leaving their mammalian counterparts untouched. It’s easy to see why this seemed so promising; knowing every aspect of a particular bacterium’s genetic makeup—what it eats, how it reproduces—would surely produce true “magic bullets.”
The promise of genomic antibacterials remains unfulfilled. The first bacterial genome was sequenced in 1995—Haemophilus influenzae, the likely killer of George Washington in 1799—and thousands of genes were identified shortly thereafter as potential antibacterial targets, because they produced proteins essential to bacterial survival. Dozens of pharmaceutical companies evaluated them, exposing the gene targets to literally hundreds of thousands of molecular compounds (GlaxoSmithKline alone, between 1995 and 2001, assayed nearly half a million). Seven years and more than $100 million later, fewer than half a dozen compounds even qualified as “hits”: potential “lead molecules.” Given historical rates of attrition, the number that might make it even to the next step—as a “development candidate”—is statistically indistinguishable from zero.
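To get a feel for why “statistically indistinguishable from zero” is barely an exaggeration, it helps to push the numbers just quoted through the funnel. The sketch below assumes a hypothetical hit-to-development-candidate rate, since the text gives no historical figure; the screening counts are the ones cited above.

```python
# The genomic screening funnel, using the figures quoted above.
# The hit-to-development-candidate rate is a HYPOTHETICAL stand-in
# for the "historical rates of attrition" mentioned in the text.

compounds_screened = 500_000   # GlaxoSmithKline alone, 1995-2001
hits = 5                       # "fewer than half a dozen" qualified as hits

hit_rate = hits / compounds_screened
print(f"hit rate: {hit_rate:.4%}")     # 0.0010%, one hit per 100,000 assays

assumed_candidate_rate = 0.05  # hypothetical: 1 hit in 20 becomes a candidate
expected_candidates = hits * assumed_candidate_rate
print(f"expected development candidates: {expected_candidates:.2f}")  # 0.25
# Fewer than one expected candidate, before clinical attrition even begins.
```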
Even if it weren’t so difficult to find new antibiotics, their very nature would make them a suboptimal long-term investment for an industry that must allocate limited resources among a wide range of alternatives. Antibiotics were, in a sense, victims of their own dramatic effectiveness. A drug that does its job in ten days can’t possibly compete for institutional resources with one that will be taken every day for a lifetime. The managers and shareholders of companies like Merck, Pfizer, and Eli Lilly didn’t require very sophisticated arithmetic to see a greater potential return from drugs that treated chronic ailments rather than acute infections, and they invested their research and validation assets accordingly. From 1962, when George Lesher of Sterling Winthrop* discovered the first of the quinolone antibiotics, until 2000, not a single new class of antibacterial drugs appeared. Between 2011 and 2013, the FDA approved only three new molecular entities that might combat bacterial pathogens. And the cost of developing a new antibiotic is higher, relative to its likely return, than the cost of a new drug to treat depression, or cancer, or hypertension.
What are those costs? The most widely cited method for calculating the cost of drug development estimated, in 2003, that the average out-of-pocket cost of a drug at the moment it received marketing approval from the FDA was more than $400 million. Another calculation, made in 2011 and no less controversial, came up with a median R & D cost of $43.4 million.
There are two primary reasons for the huge difference between the estimates, each of which is regularly used as a bludgeon in the never-ending debate over drug prices. First, the lower number fails to account for any costs incurred prior to the submission of the drug for the first stage of FDA approval; this “phase 0” stage, in which thousands of potential molecules are screened for evidence of antibacterial activity, and hundreds are extracted in quantities sufficient for testing, can take more than five years, and is responsible for fully 30 percent of the $400 million figure. Second, the lowball calculation estimates only the R & D costs directly attributable to the new and approved drug. Since most of the R & D budget of a large pharmaceutical firm is spent on drugs that never make it to market, failing to account for those dollars somewhere is a fairly significant bit of financial sleight of hand. And a number that estimates total R & D costs without any basic research costs because—as the author of the original paper wrote—“there is no reasonable estimate available” doesn’t encourage much faith in its precision, either.
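A back-of-envelope decomposition shows how the same pipeline can yield both numbers. The $400 million total and the 30 percent phase-0 share come from the paragraph above; the share of the remainder spent on failed drugs is an assumption chosen purely for illustration, not a figure from either study.

```python
# Rough, purely illustrative reconciliation of the two estimates.
# Inputs from the text: the ~$400M total and the 30% phase-0 share.
# ASSUMED for illustration: how much of the rest is spent on failures.

total_per_approval = 400.0            # $M, the higher (2003) estimate
phase0 = 0.30 * total_per_approval    # screening costs the low estimate omits
remainder = total_per_approval - phase0

assumed_failure_share = 0.85          # hypothetical: most R&D goes to failures
direct_cost = remainder * (1 - assumed_failure_share)

print(f"phase 0 (excluded by low estimate):  ${phase0:.0f}M")       # $120M
print(f"directly attributable to the winner: ${direct_cost:.0f}M")  # $42M
```

Count only the last line, and a forty-odd-million-dollar figure appears; count everything, and the total is an order of magnitude higher.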
This doesn’t mean that the figures provided by the drug companies themselves are disinterested, and therefore reliably accurate. The big pharmaceutical corporations are regularly accused of profiteering; of overcharging for lifesaving medications; of financing favorable research and suppressing negative results; of producing drugs to treat conditions that are virtually nonexistent, or new and expensive versions of drugs no better than the ones already on offer. Companies with that sort of image problem have every reason to magnify the size of their research budgets, if only to slow their descent in public esteem.
However, the commitment of pharmaceutical companies to research isn’t just a PR strategy. Pfizer’s audited spending on research and development exceeds $11 billion a year. If it really cost less than $50 million to bring a new drug to market—if the much-cited $43.4 million figure were accurate—this would suggest that Pfizer alone should be launching 250 new drugs annually.
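That implication takes one line to verify (the figures are the ones quoted in the text):

```python
# Dividing Pfizer's audited R&D budget by the low per-drug estimate.
pfizer_rd_budget = 11_000.0   # $M per year, from the text
claimed_cost = 43.4           # $M per approved drug, the 2011 estimate

print(f"implied new drugs per year: {pfizer_rd_budget / claimed_cost:.0f}")
# implied new drugs per year: 253
```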
In 2012, the best year for new drug approvals since 1996, the FDA approved a total of thirty-seven “new molecular entities” for the entire pharmaceutical industry.
None of them were antibiotics.
—
If you leave the Smithsonian Museum of American History, the custodian of Anne Miller’s world-historic medical chart, and walk two blocks west along Constitution Avenue until you reach Fifteenth Street, then turn right and proceed for half a mile until you reach Pennsylvania Avenue, you’ll find yourself facing the White House. There, on July 9, 2012, seventy years after penicillin pulled Mrs. Miller back from the brink of death, President Barack Obama signed into law Senate Bill 1387: the Food and Drug Administration Safety and Innovation Act. The new law, yet another amendment to the original 1938 Food, Drug, and Cosmetic Act, included dozens of provisions regarding everything from a new protocol for the approval of medical devices, to the authorization of user fees for generic and prescription pharmaceuticals, to protection of the global supply chain for finished drugs.
Less well publicized, but almost certainly as important, the law incorporated a series of provisions known collectively by the acronym GAIN: Generating Antibiotic Incentives Now. Intended to increase the likelihood that pharmaceutical companies would invest in the development of antibiotics for serious or life-threatening conditions, GAIN offered fast-track approval for compounds that promised to combat such infections, and a five-year extension of market exclusivity.
The reason for GAIN and similar proposals was simple and terrifying. In the United States alone, antibiotic-resistant bacteria now infect two million people annually. More than twenty thousand of them die. Alexander Fleming’s observation in his 1945 Nobel Lecture—“It is not difficult to make microbes resistant to penicillin in the laboratory by exposing them to concentrations not sufficient to kill them”—had been an understatement of massive proportions. Infectious disease specialists today look back at the early days of antibiotic resistance with a kind of nostalgic fondness. How much easier to deal with bacteria that produce a single enzyme that inactivates penicillin* than with a hospital full of patients infected with MRSA (for methicillin-resistant S. aureus), which laughs not just at penicillin but at cephalosporins, ampicillin, and every other beta-lactam antibiotic? Or with XDR TB (extensively drug-resistant tuberculosis), a bacillus unaffected by isoniazid and rifampin, by the more recent agents called fluoroquinolones, and by at least one of the second-line drugs capreomycin, kanamycin, or amikacin?