
Best Care Anywhere


by Phillip Longman


  I was similarly shocked at how little the various specialists involved in her care seemed to consult with one another, or to keep up to date on the results of tests. In one emotionally devastating meeting, for example, the discussion began with various members of Robin’s “team” optimistically discussing her prospects for reconstructive surgery. Robin and I were both thrilled that the lumpectomy was an apparent success and that her chemotherapy seemed to be working to contain the cancer. But well into the meeting, one doctor began to fidget, finally asking if anyone had looked at the results of a recent liver scan. The team quickly departed, leaving Robin and me in an empty examining room for 30 or 40 minutes. Eventually, a grim-faced oncologist returned. The cancer had metastasized to her liver. It looked as if she was terminal.

  As I said, I never blamed her doctors for her death, but seeds of doubt sprouted in my mind about the system in which they were operating. Most of the doctors were sympathetic enough, and all were highly credentialed. But there seemed to be little attention given to managing information and coordinating care. It was as if, upon arriving at an airline gate, you were informed that the airline had lost track of the plane, couldn’t find its passenger manifest, and couldn’t say if it had passed its last inspection. At any given time, Robin’s medical records and test results seemed to be scattered in paper files kept by different departments. If any one doctor played the role of pilot, much less air traffic controller, I had no idea who he or she was.

  The experience of Robin’s treatment set off unsettling questions in my mind, though I tried to suppress them. Who was in charge of quality control? Why did everything seem to be done on the fly? Why did almost every routine process—doctor visits, lab tests, chemotherapy sessions—seem to involve interminable waits or changes in plan? In a normal business, such as an airline, being perpetually late and having to shift plans constantly are sure signs that its processes are breaking down and that something bad is waiting to happen. Nor could I offer Robin any comfort when she received the news that she had an estimated seventeen days to live and would have to go home from the hospital to die. A doctor had changed his mind, without telling us, about when he would share the results of Robin’s latest tests. And so she received this death sentence while alone in the hospital, with no one to talk to about it for hours.

  Then there were all the logistical and insurance issues. When was someone going to change her IV? When could our two-year-old son visit her? How long could she stay in the hospital after she had been declared terminal? How could one arrange for home hospice care, what did it cost, and who would pay? I came away feeling that no patient should ever enter a hospital without having some kind of full-time advocate—a caring, calm, and shrewd relative or friend at least, preferably with medical training and a law degree—to help navigate all the potential perils. And I wondered why the American health-care system, or at least this one prestigious corner of it, had come to be like this.

  A short time after Robin died, I read in the newspaper that the Institute of Medicine had issued a landmark report estimating that up to 98,000 Americans were killed every year in hospitals as a result of medical errors—a toll that exceeded that of AIDS, breast cancer, or even motor vehicle accidents. The article also put it another way: it was like three jumbo jets crashing every other day and killing all on board. I was shocked but, upon reflection, not incredulous.1

  Indentured Servitude

  Another reason I was eager to accept Fortune’s assignment was that the American health-care crisis seemed finally to be coming to a head. As long ago as 1970, the editors of Fortune had put out a special issue on medical care, declaring it “on the brink of chaos.” BusinessWeek, that same year, had a cover story on American health care titled “$60 Billion Crisis.” But health care by now was close to a $2 trillion crisis, and that didn’t even count all the indirect costs it was imposing on the economy and Americans’ pursuit of happiness.

  One of those indirect costs that I, along with millions of other Americans, had experienced firsthand was finding myself trapped in a job by my need for insurance. Shortly after Robin’s cancer was diagnosed, U.S. News went through a management shake-up. The editor who had hired me was summarily fired, and I found myself on the losing side of a regime change. The jig was up, and it was time for me to go.

  Although I had several tempting offers, I had to stay and tough it out as best I could because I could not risk changing insurance plans with Robin’s preexisting condition. As it turned out, I was fortunate to be able to keep my job for as long as I had to, and I’m very grateful to all involved for that. But the experience sensitized me to how many Americans are stuck in place year after year—unable to start a new business, go back to school, or even take time off to care for a loved one—just because of the way we finance our health-care system.

  I was also aware, of course, of the many familiar trend lines that were making our health-care system unsustainable and that have since grown much worse. Every year, the cost of health care rises faster than the economy grows, with results that are as predictable as they are depressing.

  Because of its soaring price, we see millions of workers forced to forgo raises and to assume more and more of the cost of their health care, even if they are still lucky enough to have group insurance. We see once-proud corporations like General Motors made wards of the state and forced to downsize in large part because of their ruinous liabilities for employee and retiree health-care benefits. We see state and local governments raising taxes, laying off teachers and firemen, neglecting roads, and cutting mass transit as expenditures for Medicaid and other health expenses relentlessly crowd out other budget priorities. We see the federal government going deeper and deeper in debt and having its credit downgraded as it tries to cover the exploding cost of publicly financed health-care programs. And we see Social Security, Medicare, and the rest of America’s social safety net becoming imperiled by our failure to “bend the cost curve” on health care even as childhood poverty explodes and the baby boomers—my generation—begin experiencing the infirmities and chronic illnesses of old age.

  At current growth rates, health-care spending is projected to consume anywhere from 119 percent to 142 percent of the entire real increase in U.S. per capita income over the next seventy-five years, sucking trillions of dollars away from other vital purposes.2 Can health-care spending at that level even begin to ameliorate the ill health it would cause? Such a price would necessarily reduce, as it is reducing today, the amount of time and money left for educating children, fighting poverty, investing in green technology, fostering community, and relieving all the other socioeconomic determinants of illness.

  Health Care’s Declining Pace of Progress

  A final reason I was eager to take on Fortune’s assignment was a little-known but diabolical fact I had stumbled upon shortly after Robin died. The more I thought about it, the more alarming and outrageous it seemed to me. I discovered it after reading a study by the Federal Reserve that calculated how many hours, in different eras, the average American worker had to be on the job to make enough money to purchase various big-ticket items.

  The study showcased the example of cars. Back in 1955, for instance, the average worker had to labor 1,638 hours to earn enough to buy a brand new Ford Fairlane. By 1997, the average American worker earned enough in just 1,365 hours to buy a brand new Ford Taurus, which, unlike the Fairlane, came with such standard features as air conditioning, airbags, cruise control, and power windows, steering, and brakes; and it got much better mileage. According to the study, a similar pattern of improving quality at lower real cost is true of nearly every big-ticket item for sale in the American economy.3

  But what, I wondered, would happen if one included the cost of health care, which the study did not? It’s a simple calculation, and when I did the math, the results were as devastating as they were revealing. If you’ve ever wondered how the nation’s per capita GDP could grow year after year over the last generation without most Americans feeling any richer, here’s a big part of the explanation.

  Let’s travel back to 1964, for example. Most Americans were feeling prosperous. Suburbia was burgeoning. Record numbers of American youth were becoming the first in their families to go to college. Intellectuals complained about the miseries of “The Affluent Society.” Yet the average American worker took home only $2.53 an hour. How does that square? You can’t just say that a dollar went farther in those days, because as we’ve just seen, the real cost of cars and just about every other consumer item has actually declined since that era. But there is a ready explanation. While workers in the 1960s had to put in many more hours on the job to purchase items like televisions, cars, or a ride in an airplane, they hardly had to work at all to cover the cost of health care. At the time, health-care spending in the United States was just $197 per person per year. This low cost meant that with a mere 78 hours of labor (or by the end of the second work week in January, for those working full time), the average worker earned enough to cover the per capita cost of health care, including that of all children and retirees.
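
  The arithmetic behind that figure is easy to check from the two numbers just cited:

\[
\frac{\$197 \text{ per person per year}}{\$2.53 \text{ per hour}} \approx 78 \text{ hours of work,}
\]

or just under two 40-hour weeks.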

  By contrast, in 2007, despite massive improvements in productivity outside the health-care sector, the average worker had to put in 411 hours before earning enough to cover the average per capita burden of medical expenses, which by then had risen to over $6,300. Put another way, in that year, it was well into March before the average American, working a 40-hour week, earned enough to pay the health-care sector’s growing claim on personal output.
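
  The same check works for 2007, taking the round $6,300 figure. The implied average wage is a little over $15 an hour, and 411 hours of full-time work does indeed run well into March:

\[
\frac{\$6{,}300}{411 \text{ hours}} \approx \$15.33 \text{ per hour}, \qquad \frac{411 \text{ hours}}{40 \text{ hours per week}} \approx 10.3 \text{ weeks.}
\]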

  Given current trends in wages and health-care spending, by 2054, the average American worker will need to devote 2,970 hours a year to cover the cost of health care. That would mean working at least 8 hours a day, every day of the year, from January to December, with all of life’s needs outside of health care somehow financed by still more exertion. So much for the Affluent Society. Obviously, something big is going to give.
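
  To see what 2,970 hours a year would mean in practice:

\[
\frac{2{,}970 \text{ hours}}{365 \text{ days}} \approx 8.1 \text{ hours per day},
\]

a full shift every single day of the year, with nothing yet earned for food, housing, or anything else.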

  It gets worse when you think about it. What kind of health care did Americans get back in 1964 for just $197? For those too young to remember that era, health care back then was far from primitive. A strong memory I have from childhood is that of my maternal grandfather explaining to me, sometime in the mid-’60s, how he could not in good conscience continue practicing medicine because it had become too sophisticated, complicated, and fast paced for him to follow any longer. He had graduated from the University of Michigan’s medical school in 1927 as part of a new generation of doctors whose training was rigorous, competitive, and grounded in science, and he had gone on to enjoy a distinguished career in medicine. But by the mid-1960s he felt out of his depth.

  The operations performed in that era included open-heart surgery, the implanting of pacemakers, and neurosurgery for the treatment of Parkinson’s disease and other neurological disorders. Electrocardiograms were in common use, and doctors had long since learned how to use defibrillation to jump-start stalled hearts. My own mother almost died of a misdiagnosed appendicitis in the 1960s, but once the right diagnosis was made, she was easily saved by an appendectomy, which by then had become a routine operation.

  Thanks to the increasing use of kidney dialysis machines, death rates from kidney disease were also plunging. Anesthesia no longer just meant knocking patients out with ether; it included local anesthesia, pain management, resuscitation, oxygen therapy, and the use of mechanical ventilators to avoid lung complications in patients recovering from major surgery. The polio vaccine was fully developed, and tuberculosis was nearly vanquished, as were such devastating child killers as diphtheria and whooping cough. Wonder drugs like penicillin and other new antibiotics had caused the death rate from pneumonia and other infectious diseases to plummet.

  In all the time I spent growing up in the 1960s, I knew only one classmate who died from a childhood disease, and I knew none who lost a parent to illness. Then, as now, cancer patients were treated with prolonged chemotherapy, the development of which had been generously supported by government funding since the mid-1950s. Doctors did not yet have PET scans or MRIs, but X-rays served much the same purpose, and they, too, required expensive equipment and highly trained personnel.

  The quality of doctors was also very high. Long gone were the quacks who had typified American medicine at the beginning of the century. By the 1960s, even elderly doctors, such as my maternal grandfather, had undergone medical training as prolonged and exacting as that received by doctors today, including a minimum of four years of medical school and at least one year of postgraduate internship. Though most physicians may not have lived up to the performance of such celebrated television doctors of the era as Marcus Welby, MD, and Dr. Kildare, polling data clearly show that health-care leaders in that era enjoyed a reputation for probity and professionalism that is long gone today.4

  Hospitals in the 1960s also offered levels of service as high as, and in some ways higher than, they typically do now. Private and semiprivate rooms were already the norm. In the 1960s, patients were also allowed to stay in the hospital much longer than today, which of course cost money. Not all of those extra days were medically necessary, but it would have been considered malpractice to send home patients who still required infusions or oxygen tanks, as is routinely done today. Nor would hospitals simply send terminally ill patients home to die with a stash of morphine and some counseling from a social worker, which, as I discovered with Robin, is too often the present-day meaning of home “hospice” care.

  A Mexican acquaintance, who used to work in a store near our house, couldn’t believe it when I told him, shortly after Robin died, the reason he hadn’t seen me in two weeks. I explained that, with no hospice beds available at nearby institutions, I’d been holed up with my mother at home, with barely time to eat or sleep, trying to tend to Robin in her final days while at the same time looking after a very upset and angry two-year-old who couldn’t bear to watch his mother slowly die before his eyes. “In Mexico,” the man said, shaking his head, “we never send people home to die.”

  Hospitals in the 1960s also routinely bore the cost of providing a calm, safe place for alcoholics to detox and for the emotionally distraught to have their “nervous breakdowns”—services that are today usually offered on an “outpatient” basis. Again, providing these services in the hospital was not always the most cost-effective option, but these services were part of what Americans got from the health-care system for just $197 per year, or just two weeks’ labor.

  So why does the average American now have to work beyond mid-March of every year to earn the per capita cost of health care? Where is all this money going? And what improvements in health is it buying? Here the facts get even more outrageous. Yes, many individuals today owe their lives to treatments that were unavailable a generation ago. These notably include timely treatments in emergency rooms, which were still uncommon in the 1960s. (Though these days, emergency room treatment may not be so timely; if you arrive at the ER on a Friday night with a nonlethal condition, you may wait through the weekend to be treated.) We’ve also become much better at keeping underweight babies alive. Elective treatments like cataract surgery have improved the quality of many people’s lives. And yes, since the passage of Medicare and Medicaid in 1965, the poor and elderly have far better access to health care than they previously did.

  But for the population as a whole, the results in improved health and life expectancy are astonishingly modest. The rate of improvement in life expectancy, for example, actually slowed substantially after the explosion in health-care spending that began in the 1960s. Between 1900 and 1960, life expectancy at birth in the United States increased by an average of 0.64 percent per year. From 1960 to 2004, however, that rate of improvement fell by more than 60 percent, to just 0.24 percent per year.5 Moreover, the gains in life expectancy that have been achieved over the last forty years have come largely from broad social and technological trends, not strictly from medical interventions.
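
  The size of that slowdown follows directly from the two rates:

\[
\frac{0.64 - 0.24}{0.64} \approx 0.62,
\]

a decline of roughly 62 percent in the annual rate of improvement.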

  For example, today’s Americans smoke far less, drive far safer cars, run much less risk of being injured on the job, and are much less likely to be shot accidentally, to cite just four major nonmedical sources of increased longevity. From 1960 to 2002, the age-adjusted death rate from unintentional injuries, such as car wrecks, firearm accidents, and on-the-job accidents, declined by 42 percent. Medicine can take credit for some of the increase in longevity over the last generation, but at least half of the improvement comes from nonmedical factors, such as mandatory airbags, gun locks, and the great shift of the workforce away from farms, factories, and mines into less hazardous, service-sector work. Epidemiologist John P. Bunker, a world-recognized authority on the determinants of health and longevity, estimates that only about 50 percent of the seven years of increased life expectancy at birth since 1950 is attributable to medical care.6

  The same period saw an astonishing increase in the cost and volume of medical care. According to Harvard health-care economist David M. Cutler, in 1960 the average American aged sixty-five or older consumed an inflation-adjusted $11,495 in health care during his or her remaining lifetime. By 2000, that number had jumped to $147,054. Yet despite this nearly thirteenfold increase in health-care spending per senior, the resulting gain in life expectancy was a mere 1.7 years.7 Measured by its “rate of return,” or the extra years of human life produced per health-care dollar spent, American medicine is amazingly unproductive and inefficient.
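
  A back-of-the-envelope sense of that rate of return, using Cutler’s figures and crediting, generously, the entire gain to the added spending:

\[
\frac{\$147{,}054 - \$11{,}495}{1.7 \text{ years}} \approx \$80{,}000 \text{ per added year of life expectancy.}
\]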

  Nor can we say that the increasing cost of health care just reflects the aging of the population. As the baby boomers slam into old age, population aging will indeed become a source of increased demand for health care. But over the last thirty or forty years, the percentage of population over sixty-five has grown only modestly, and there is broad consensus among researchers that population aging has so far been a minor factor in driving health-care costs.

 
