The Danger Within Us

by Jeanne Lenzer


  At the time, it was not uncommon for doctors to supplement their income as farmers or pharmacists. Although doctors’ incomes rose to $5,000 ($68,000 in 2015 dollars) by 1929, their pay was still relatively modest: a quarter of doctors in 1929 earned less than $2,300. To put that in context, the average income in the US at this time was $1,800, and a train engineer could earn $4,700.87

  Health insurance was first offered in the 1930s, mostly to cover hospital care.86 It spread slowly. As recently as 1940, just 3 percent of the US population had health insurance. Outpatient coverage came even later. In 1950, only 12 percent of all healthcare costs were covered by insurance, and total costs for a doctor’s visit remained low, at just three to five dollars per visit ($26 to $44 in 2015 dollars).88 Millionaire doctors were yet to be minted: the average doctor in 1940 was still solidly middle class, earning just over twice the income of an average worker. By 1949, the average doctor earned $11,058, and the median family income was $3,400.89

  Then, in the 1960s, researchers such as Eugene Braunwald began to take advantage of advances in electronics, computers, and biotechnology to invent new procedures, devices, and drugs. Over the subsequent few decades, during the “golden age” of medicine, new medical technologies poured onto the market. Medicine was transformed by CAT scans, MRIs, and PET scans, which could make visible every millimeter of the human body. The market was flooded with new implantable devices, including cardiac stents, designed to open blocked coronary arteries; cochlear implants, which could help deaf people hear; implantable cardioverter defibrillators, which could correct dangerous heart rhythms; and deep-brain stimulators, to help people with Parkinson’s disease. But these technologies came with eye-popping price tags that only a handful of wealthy individuals could afford.

  Medicare would change all that.

  This second historical development was set in motion in 1965, when President Lyndon Johnson signed Medicare into law. Medicare served two main political and economic purposes: it provided a hedge against growing demand for a national health program by offering coverage to seniors rather than to the entire population. It also opened up a vast new market for costly healthcare technologies, transforming senior citizens, regardless of their personal financial circumstances, into relatively affluent medical consumers.

  Initially, the American Medical Association (AMA), representing US physicians, had vigorously opposed Medicare as “socialized medicine.”90 As passage of the law became increasingly unavoidable, lobbyists for the AMA, which contributed campaign money to politicians, pressured Congress and won several concessions to ensure that doctors would get a big cut of the goodies. Paul Starr, author of the Pulitzer Prize–winning book The Social Transformation of American Medicine, tracks the role the AMA played in shaping Medicare for the financial benefit of doctors.91 The provisions inserted into Medicare by the AMA delivered more than one Trojan horse to the public.

  In her book Overtreated: Why Too Much Medicine Is Making Us Sicker and Poorer, Shannon Brownlee, senior vice president of the Lown Institute, in Brookline, Massachusetts, explains the complex, far-reaching impact of the advent of Medicare on the US healthcare system. Some of the results, she notes, were unquestionably positive: “One of the great social programs of the twentieth century, Medicare not only made healthcare available to millions of elderly citizens, it also spurred the desegregation of hospitals in the South—and brought down infant mortality rates among African Americans as a result.” But not all the changes Medicare effected were for the public good. “For all their worries about socialized medicine imperiling their livelihoods,” Brownlee says, doctors would in the end “reap a bonanza from Medicare.”90

  One AMA provision ensured that doctors would be paid on a fee-for-service basis for whatever they said was their “usual and customary” fee. This arrangement proved a windfall for doctors and led to dramatic increases in prices for healthcare services. Fee-for-service payments provided financial incentives for doctors to do more things to patients because they were paid more for doing more. More tests. More surgeries. More procedures. More pills. Of course, this incentive had always been there, but now the scale was several orders of magnitude greater. Since doctors could bill Medicare at far higher prices than they could reasonably expect the average patient to pay out of pocket, doctors began to claim ever-escalating “usual and customary” fees. Conversely, the kinds of nonquantifiable, impossible-to-charge-for services that patients often need and want most, such as the availability of a doctor who will spend time listening to them and carefully discussing options, were not rewarded by the Medicare system.

  Another feature of Medicare that contributed to the spectacular rise in national healthcare expenditures was that it specifically excluded a cap on payments.90, 91 Unlike a single-payer system, in which government budgets a certain amount of money for healthcare, just as it does for defense, education, and policing, Medicare was infinitely expandable. Whatever doctors charged and no matter how often they submitted charges, Medicare paid.91

  Although some changes in reimbursement schemes for Medicare payments, including price caps, would be instituted many years later, the initial cap exclusion opened the door to soaring profits for doctors and hospitals. Routine hospital, intensive-care unit, and coronary-care unit stays costing tens and even hundreds of thousands of dollars were now possible. Suddenly healthcare prices were soaring far beyond what the average Joe or Jane without Medicare could afford. This in turn led to a surge in demand for private and worker-based health insurance, because those without Medicare could no longer afford medical treatment. By 1986, just twenty years after the passage of Medicare, fully 84 percent of the population was covered by health insurance.92

  Another provision of Medicare resulted in cost-plus payments to hospitals: whatever price hospitals set as their “costs,” the government reimbursed, plus an additional fee on top. This, too, ensured a rising price tag, because hospitals were paid more if they charged more. Brownlee has bemoaned the arrangement, saying it allowed hospitals to make out “just like the Beltway bandit military suppliers.”

  Because the demand for hospitalization is largely under the control of doctors, admission rates don’t necessarily reflect the needs of patients but often are driven by hospitals’ financial need to fill their beds. As a result, cost-plus payments meant that hospital charges rose at a spectacular rate. Brownlee describes the financial cycle this way: “Build a new wing, fill it up, your costs rise, and therefore your reimbursements and profits rise. Build build build! And of course you need more hospital-based nursing and physician labor to staff all those beds, and so our hospital-centric system grew and grew and grew—at the expense of community and home-based care. House calls went the way of the bleeding cup and the leech.”90

  The evolving economic system affected healthcare in other ways, many of them unpredictable. With the rise of multiple insurers, doctors had to deal with endless insurance forms as well as the varying standards of care required by third parties. Consequently, administrative costs began to soar until they came to consume about 20 percent of healthcare expenditures. This led many doctors to sell their practices to large practice groups that could handle the flood of paperwork. Banding together into groups also allowed doctors to pool the capital resources needed to pay for expensive equipment, such as CT scanners, which in 2012 had an average list price of $1.2 million.93

  Over time, many doctors became disenchanted with practice groups and their incessant meetings and disagreements about how to handle the business side of medicine, leading many to sign on as employees of large groups or hospitals that span entire towns, states, and, in some instances, nations. Thus the long-term impact of Medicare was both a vast expansion of national spending on healthcare services and the transformation of healthcare from a cottage industry made up of thousands of small independent providers to a corporatized business dominated by big companies.

  The third key development was the Bayh-Dole Act, passed in 1980. This law, intended to promote innovation through “technology transfer,” allowed universities to patent the products of their federally funded research for the first time. Bayh-Dole institutionalized financial incentives that encouraged scientific researchers at universities to focus more on discoveries with major commercial potential than on those with long-term, purely scientific, or public-health value.94 The monetary incentives set in motion by Bayh-Dole have led to a number of scandals involving falsified, exaggerated, or distorted medical research claims (more on those scandals later).

  Thanks in large part to these three factors—the rise of technologically based medical care, the passage of Medicare along with the rise of private health insurance, and the Bayh-Dole Act, which rewarded commercial research at universities—the medical-industrial complex was born. It is a behemoth made up of interlocking organizational interests: hospitals, insurers, professional medical associations, pharmaceutical companies, device manufacturers, research institutions, medical journals, electronic-medical-record developers, and many more. All these parties are economically dependent, ultimately, on the existence of end products for sale: medical treatments, drugs, surgeries, and devices, from insulin to pacemakers to life-support machines. Without those products, many of the insurers, the hospitals, the manufacturers, and even the doctors themselves would have no market, no income, and no reason for existing. Thus the logic of the marketplace makes it all but inevitable that this network of interested individuals and organizations must, consciously or unconsciously, devote much of its time, energy, and financial resources to promoting more sales of its products—more drugs, more medical procedures, more tests, more surgeries, more medical devices, and so on.

  From the 1960s through 1980, advocates of free-market healthcare supported each of these developments on the grounds that market forces increase efficiency and put downward pressure on spending. Those predictions proved wrong. In 1950, US health expenditures accounted for just 4.6 percent of the gross domestic product. By 2009, they consumed 17 percent of the GDP.87, 95 Healthcare spending continued to accelerate from there, and by 2015 expenditures reached $3.2 trillion. One of every five dollars in the US now goes to pay for medical expenses.

  Today, for the first time in history, doctors, hospital CEOs, and insurance executives can become multimillionaires instead of being compensated as teachers and engineers are. In a capitalist society like the one in the US, the availability of unlimited profits is a powerful motivating force. The vast sums of money that began to pour into healthcare in the late 1960s inevitably attracted big corporations, ambitious entrepreneurs, and savvy investors looking for profitable opportunities. And just as inevitably, they began to influence the motivations and expectations of healthcare providers, even people who’d originally entered the field for noble humanitarian reasons. Companies and their employees may be forced to adopt the same ethically questionable methods and incentives their competitors use, lest they fail in the marketplace, making it difficult for even the most well-intentioned people to act in ways consistent with their values. And for some individuals, seeing colleagues and industry rivals become wealthy doing the same kind of work they do can encourage the choice to get in on the action.

  One of the most powerful and rapidly growing participants in this burgeoning economic network is the medical device industry. Generating estimated revenues of more than $136 billion in the US in 2014, the implantable-device industry is even more lucrative for many of its component companies than the highly profitable pharmaceutical industry.96, 97 Operating-profit margins for the largest companies include 30.0 percent for Zimmer Biomet (artificial joints), 28.6 percent for Medtronic, 25.8 percent for St. Jude Medical, and 23.1 percent for Stryker (orthopedic implants).97 Johnson & Johnson’s device division squirreled away $7.2 billion in profits in 2012, while Medtronic pulled down $3.6 billion that year.98 Ultra-high prices help fuel those giant profit margins: Medtronic charges approximately $19,000 for a neurostimulator used to treat back pain, four times what it costs to manufacture the device, and hospitals pay $4,500 to $7,500 for a popular type of hip implant that costs $350 to manufacture.98 Hospitals are also asked to sign confidentiality agreements obliging them not to reveal their purchase prices.

  Implantable devices have become fixtures of the healthcare landscape. No one knows how many millions of Americans are walking around today with medical devices implanted in their bodies. Unlike drugs, which are labeled, dated, and tracked, medical devices have no comparable tracking system. An effort to correct the problem with unique device identifiers (UDIs) has been under way for well over a decade, and although most devices will have to carry bar codes indicating the UDI by 2020, there is currently no mechanism or requirement to incorporate UDIs into patients’ electronic health records, thus undermining their central purposes: to allow tracking of device performance and to rapidly notify implanted patients of device problems or recalls.99, 100

  Millions of Americans are implanted with devices annually: half a million have stents placed in their coronary arteries each year, while nearly a million have artificial hips, knees, and shoulders implanted. In 2002, one expert calculated that one of every ten Americans was implanted with a medical device. If that 10 percent rate still held today, thirty-two million Americans would have implanted devices in their bodies and would be profoundly affected (for better or worse) by the way the FDA approves, monitors, and manages those devices. With the rapid increase of implantable devices in the past decade, however, the number of Americans with an implanted device is probably substantially higher than thirty-two million. A 2011 review found that 6.7 million individuals were implanted annually with the top eleven implanted devices—meaning that in just over ten years, some seventy million individuals were implanted. And of course, as profit-seeking corporations, device makers use a portion of their huge profits to mount marketing and advertising campaigns designed to convince doctors and ordinary citizens that even more medical devices are needed.

  The allure and remarkable benefit of some devices is undeniable. John Calhoun, a sixty-four-year-old man with a pacemaker, says he’s delighted to have the device inside him. “I have to take pills every day for my high blood pressure, and it reminds me every day that I have a problem. Besides, it’s a hassle. But the pacemaker is just there.” He taps the spot under his collarbone where a small bulge gives away the presence of the pacemaker. “I don’t have to think about it. And it’s saved my life.”

  Certain devices have changed many lives for the better. Artificial lenses used in cataract surgery have restored vision to millions who would otherwise have been functionally blind. Pacemakers keep hearts beating. And artificial hip joints have allowed previously bedridden and wheelchair-bound patients to walk again.

  But sadly, countless patients have also been betrayed by the companies that manufactured the devices implanted in their bodies and by the regulatory authorities that were supposed to ensure that these devices would help rather than harm them. And just as we don’t know how many people are currently implanted with medical devices, we don’t know how many people die each year from complications caused by medical devices. An estimate by the Brookings Institution pegged the number at three thousand.101 But the actual number could be much higher. In 2015, approximately sixteen thousand deaths associated with medical devices were reported to the FDA. And a Government Accountability Office analysis found that 99 percent of device-related “adverse events” are never reported to the FDA and that the “more serious the event, the less likely it was to be reported.”102 If, per the GAO analysis, those sixteen thousand reported deaths represent just 1 percent of the true total, medical devices could have been associated with as many as 1.6 million deaths in 2015. Even if only 1 to 10 percent of those deaths were caused by a device, that means between 16,000 and 160,000 people may have been killed by devices, making medical devices one of the leading causes of death in the US.

  So why is the Brookings estimate three thousand while the FDA’s own database reveals approximately sixteen thousand reported deaths in 2015? It’s a mystery. The Brookings report was produced under a cooperative agreement between the FDA and the Engelberg Center for Healthcare Reform at the Brookings Institution. Former and current FDA experts, including former FDA commissioner Mark McClellan, authored the report, which according to Brookings took a good deal of expertise and “countless hours” of research to compile. But neither the FDA nor Brookings was able to explain how they arrived at the estimate of three thousand deaths, which was reported by the New York Times, Modern Healthcare, and other outlets.

  In an e-mail to me dated August 29, 2016, a spokesperson for the FDA stated, “The report does not provide a citation for its mention of the 50,000 serious injuries and 3,000 deaths, so it is unclear how these numbers were derived.”

  Brookings is funded in part by drug and device manufacturers (for example, in 2015, Genentech, which develops both drugs and devices, contributed between $500,000 and $999,999 to Brookings). The planning board for the report included representatives of device makers such as Medtronic, United BioSource, and ReVision Optics. Conspicuously absent from the planning board were members of watchdog groups such as the Public Citizen Health Research Group and the National Center for Health Research.101

 
