Best Care Anywhere
A key to building enough political support to close such institutions was negotiating an unusual agreement with Clinton’s Office of Management and Budget. Under the agreement, any money Kizer managed to save by closing hospitals wouldn’t simply go back to the Treasury, as under the normal rules of federal bureaucracy, but could be used by the VHA for other purposes, such as building new outpatient clinics, expanding VistA, or ensuring that every VHA patient was assigned a primary care physician. This allowed VHA employees, veterans, and other interest groups to see that much more was going on under Kizer’s leadership than just ruthless downsizing.
Another key for cutting through political gridlock was Kizer’s decision to decentralize, reducing the authority at VHA’s central headquarters in Washington. As part of this plan, he created a series of twenty-two regional administrative districts, most of them crossing state boundaries and vested with as much power as possible in areas such as budgets and policy making. One practical advantage was simply to put VHA managers closer to those they managed and thereby create more accountability. But the measure was also politically shrewd.
For example, as chairman of the Senate’s Veterans Committee, U.S. senator John D. Rockefeller IV had considerable leverage over veterans’ issues and also had a particularly contentious relationship with Kizer. The senator found his state of West Virginia divided into five regional districts, all of which fell partly in other states. Because of the state’s mountainous terrain, people there have always been far more likely to travel to a neighboring state than to cross the state in search of care. This plan worked well for West Virginia’s veterans. But the administrative change meant that Rockefeller needed far more cooperation from veterans and politicians in other states if he wanted to save or tinker with some particular VA facility within his state.
Regionalizing the VHA power structure had other advantages as well. “It’s easier to have that dialogue with real people in the community,” says Kizer, “than it is with a congressional committee, where everyone wants to stand up for the flag and ‘do something’ for veterans, and you’ve got C-SPAN there hovering.” Decentralization combined with the VHA’s state-of-the-art information systems also meant that it became possible to hold regional administrators accountable for a wide range of performance measures, including how well they coordinated physicians, hospitals, and medical care services for a defined population within their administrative regions.
In his original blueprint for transforming the VHA, titled “Vision for Change,” Kizer wrote:
In an integrated health-care system, physicians, hospitals, and all other components share the risks and rewards and support one another. In doing so, they blend their talents and pool their resources; they focus on delivering “best value” care. To be successful, the integrated health-care system requires management of total costs, a focus on populations rather than individuals, and a data-driven, process-focused customer orientation.7
Kizer’s vision went far beyond any integration done by HMOs and other “managed care” private-sector providers, who quickly discovered that they most often lacked a “business case” for improving quality. In contrast to even the largest HMOs, the VHA could count on a relatively stable population of patients, which in turn gave it a built-in case for pursuing quality. Take, for example, the choice of drugs it uses. Many of those drugs, such as statins, which help lower cholesterol, bring about only long-term benefits to most patients—specifically, a reduced chance of one day suffering from heart attack or stroke. An HMO in which patients are constantly churning has no real financial interest in whether the particular statins it prescribes are the most effective.
For health-care providers who lack long-term relationships with their patients, even the question of whether a drug may eventually turn out to have long-term safety problems is not an urgent concern—so long as it has been approved by the Food and Drug Administration—because by the time patients begin to experience any long-term complications, they will have long since moved on to other health plans. Because of the churning of patients that occurs in nearly every American health-care system other than the VA, decision making tends to be dominated by short-term financial costs rather than by long-term benefits to patients’ health.
Workhorse Drugs
Realizing the unique incentives the VA had to maximize its patients’ health, Kizer set up an elaborate drug review process to establish what is known as a “formulary” of recommended drug therapies. Field investigations by VA physicians and pharmacists compared the effectiveness of new drugs with current therapies, considered any safety concerns, and decided whether the VA should include these new drugs in its formulary.
One result was that the VA would sometimes pay for pricey drugs that typically were not covered by other health-care plans, such as an expensive but effective compound used in the treatment of schizophrenia and high-quality statins used to treat high cholesterol. “If you know you’re going to have your patients for five years, ten years, fifteen years, or life,” explains Kizer, “there are both good economic and health reasons why you would want to use these more expensive drugs. You have a population of patients who are at high risk for sclerotic heart disease, and you’ve got them for life. You make a different decision about what’s on your drug formulary than you might if you knew you only had them for a year or two.”
After evaluating the safety and effectiveness of different competing therapies, the VA typically settles on a few “workhorse” drugs—such as the statin simvastatin to treat high cholesterol—that become part of the VA’s standard medical protocol. This exercise in evidence-based medicine not only brings health benefits to patients but also has the effect of further leveraging the VA’s already considerable purchasing power over drug companies, thus allowing it to negotiate deep discounts even on the highest-quality drugs.
Predictably, many drug companies hate the power the VA has over them. They fund studies claiming to find some inadequacy in its formulary, with the usual complaint being that the VA does not include enough “new and improved” drugs. One such study, for example, published by the drug industry–funded Manhattan Institute, purported to find a two-month decline in life expectancy among VA patients because the VA formulary included a lower fraction of new drugs than those typically in use by the rest of the health-care sector.8
Yet the independent and prestigious Institute of Medicine debunked the claim, finding that “the VA National Formulary is not overly restrictive.”9 As the millions of Americans who took Vioxx and other COX-2 inhibitors have learned all too painfully in recent years: just because the Food and Drug Administration approves a drug doesn’t mean it is a superior therapy, or even a safe one. It only means that in some short-run trials, usually financed by the manufacturer, the new drug proved more effective than a placebo.
According to William Korchik, a VA doctor who has participated in the VA’s drug review process, another big benefit of this policy over the years has been avoiding dangerous drugs.
We took a tough stand on the [COX-2] inhibitors by not putting the drug on our national formulary and requiring prescribers to complete a risk assessment tool on each patient before a COX-2 inhibitor could be provided.… Predictably, we were criticized up and down about our restrictiveness. But now I can say we were appropriately restrictive because there was not data [proving their safety].10
By 1998, Kizer’s shake-up of the VHA’s operating system was already earning him management guru status. His story appeared that year in Straight from the CEO: The World’s Top Business Leaders Reveal Ideas That Every Manager Can Use. Yet the revolution he helped set in motion at the VA was only beginning, even as the rest of the U.S. health-care system fell deeper into crisis.
SIX
Safety First
Everyone understands that a good health-care system needs highly trained, committed professionals. They should know a lot about biochemistry, anatomy, cellular and molecular immunology, and other details about how the human body works—and have the academic credentials to prove it. But these days, if you get sick with a serious illness, chances are you’ll see many doctors, including different specialists. Three-quarters of Medicare spending goes to patients with five or more chronic conditions, who see an average of fourteen different physicians annually.1 Therefore, how well these doctors communicate with one another and work as a team becomes critical. “Forgetfulness is such a constant problem in the system,” says Donald Berwick of the Institute for Healthcare Improvement. “It doesn’t remember you. Doesn’t remember that you were here and here and then there. It doesn’t remember your story.”
Are all your doctors working from the same medical record and making legible entries? Do they have a system to make sure they don’t collectively wind up prescribing dangerous combinations of drugs? Is any one of them going to take responsibility for coordinating your care so that, for example, you don’t leave the hospital without appropriate follow-up medication and the knowledge of how and when to take it? Just about anyone who’s had a serious illness or tried to be an advocate for a sick loved one knows that all too often the answer is no.
And it’s not just doctors who define the quality of your health care. All kinds of other people are also involved—nurses, pharmacists, lab technicians, orderlies, and even custodians. Any one of these people could easily kill you if they perform their duties incorrectly or if some aspect of the job is not managed with safety in mind. Modern hospitals may not produce catastrophic failures that kill thousands of people at a time. But doctors, nurses, and hospital technicians routinely deal with very dangerous technologies and powerful drugs that do kill thousands of Americans every year, albeit usually one at a time. Even a job such as changing a bedpan, if not done right, can spread deadly infection throughout a hospital. These jobs are all part of a system of care, and if the system lacks cohesion and quality control, many people will be injured and many will die.
Just how many? Nobody knows for sure, of course. One problem is a culture of cover-up that pervades health care. All individuals involved in medicine face a very real likelihood of being sued or punished by their superiors if they admit to even trivial mistakes. Given the can-do ethos of medicine, personal shame also causes many doctors and nurses to obscure mishaps or mistakes.
Then again, many of the accidents that occur in medicine go unrecognized by all involved. The elderly patient slips into dementia and eventually a coma; no one realizes that the proximate cause of her death was a pharmacist who misread a doctor’s scribbled prescription. Another elderly patient succumbs to pneumonia. No one realizes that the proximate cause of the infection was an orderly who neglected to wash his hands.
But there is no doubt the number of medical mistakes is very high. In 1999, the Institute of Medicine (IOM) issued a groundbreaking study, titled To Err Is Human, that still haunts health-care professionals. Hospital medical records revealed that up to 98,000 people die of medical errors in American hospitals each year.2 Subsequent findings suggest that the study may have substantially underestimated the magnitude of the problem. For example, hospital-acquired infections alone, most of which are preventable, account for an additional 100,000 deaths per year.3 According to a more recent IOM report, hospital patients in the United States experience an average of at least one medication error, such as receiving the wrong drug or wrong dosage, every day they stay in the hospital.4
On top of this are all the errors of omission. For example, there is little controversy over the best way to treat diabetes; it starts with keeping close track of a patient’s blood sugar levels. Yet, if you have diabetes, your chances are only one in four that your health-care system will actually monitor your blood sugar levels or teach you how to do it. According to a RAND Corporation study, this oversight causes an estimated 2,600 diabetics to go blind every year and another 29,000 to experience kidney failure.5
All told, according to the same RAND study, Americans receive appropriate care from their doctor only about half of the time. The results are deadly. In addition to the 98,000 killed by medical errors in hospitals and the 90,000 deaths caused by hospital infections, another 126,000 die from doctors’ failures to observe evidence-based protocols for just four common conditions: hypertension, heart attack, pneumonia, and colorectal cancer.
Why does this extraordinary loss of life go on year after year? The short answer is that, with the large exception of the veterans health-care system, few health-care providers are integrated or cohesive enough in their management and operations to promote safety and evidence-based medicine systematically.
In health care, as in all realms of life, the root cause of most accidents is not that some single person or even group of persons made a mistake, though they may have. Instead, the root cause is almost always a lack of any system or process for preventing human error or negligence. So, for example, a nurse may inadvertently kill a patient by administering a dose of potassium chloride concentrate thinking that it is liquid Tylenol or saline solution. But the root cause of the mistake was not one person’s lack of diligence, though the nurse may have been tired and distracted at the time. The root cause was the fact that both bottles were made by the same manufacturer and looked alike, and there was no system for preventing a nurse from confusing them. Because of this, firing the nurse won’t prevent the accident from happening again, or even reduce its chances by much. Some other nurse will eventually also be tired and distracted and will make the same mistake until there is some systematic fix that prevents it.
Full Disclosure
Long before studies like To Err Is Human began to appear, the veterans health system under Ken Kizer had begun to attack safety issues systematically. His first step was to convince his boss, the late VA Secretary Jesse Brown, that the VA should adopt a policy of full disclosure of any medical errors. Without this policy, Kizer argued, he could not get VA doctors and other personnel to see the scope of the problem or enlist them in creating a culture of safety.
The idea carried obvious political risk. No other health-care provider in the United States disclosed its mistakes. Kizer recalls Brown’s warning to him: “If this goes south, and politically it doesn’t work out, you’re the first casualty.” Kizer accepted those terms and announced the policy. Starting in 1997, the VA began maintaining a Patient Safety Event Registry. Reporting of medical mistakes became mandatory. At the same time, the VA promised medical personnel that it was looking for systematic solutions to safety problems, not seeking to fix blame on individuals except in the most egregious cases. The good news was a thirtyfold increase in the number of medical mistakes and adverse events that got reported. The bad news was that those numbers, despite evidence of continued underreporting, added up to appalling totals.
According to a report released by the VA’s medical inspector, the veterans health system committed 2,927 medical errors leading to 710 deaths between June 1997 and December 1998 alone. In addition to medication errors, the problems described by the report included surgery on the wrong body part or the wrong patient, errors in blood transfusions, patient abuse, improper insertion of catheters or feeding tubes, and a variety of other therapeutic misadventures. Other errors included losing track of 113 patients—who turned up in hospitals and nursing homes in which they were not listed—and failing to prevent 277 patient suicides.6
It took a while for the press to catch wind of this report, though it was a matter of public record. Kizer remembers going to Kinko’s one weekend to make a copy for the New York Times’s Robert Pear, who had called him at home to ask about its existence. Predictably, once Pear broke the story, it generated tough headlines around the country. “VA Finds Deadly Errors at Hospitals,” trumpeted the Orlando Sentinel. Under the headline “Killer Hospitals” the Detroit News editorialized: “Congress should disband the veterans health system and hand its beneficiaries vouchers or tax credits to purchase their own health care.”
But many news outlets realized the broader context. The Institute of Medicine’s well-publicized To Err Is Human report, released shortly before, gave every reason to believe that the rate of medical errors was even worse throughout the rest of the American health-care system. The VA was at least admitting to its own mistakes, and even more impressively, doing something about them. A positive New York Times editorial quoted the nation’s top expert in health-care safety, Dr. Donald Berwick: “The Veterans Health Administration has made a more serious commitment to improving health safety than any other large system in the country.”7
Lessons from Challenger
Among the signs of that commitment was Kizer’s recruitment of former air force flight surgeon, astronaut, and NASA accident investigator James P. Bagian to head up a new national center for patient safety based in Ann Arbor, Michigan. Bagian, who supervised NASA’s investigation of the 1986 Challenger space shuttle disaster, brought with him the view that safety can only be achieved by creating systems that are, as he puts it, “fault tolerant.” O-rings and other tiny parts will fail; the challenge is to find out why and when they do, and to engineer changes that minimize the consequences. Similarly, some managers will always be tempted to discount safety concerns; the challenge is to build management processes that put the burden of proof on those who argue that a flight is safe to launch rather than on those who have doubts. At the same time, “close calls” and “near misses” will happen far more than actual accidents. The challenge here is to make sure there are processes by which as many of these close calls as possible get reported and analyzed so that their root causes can be determined and future catastrophes avoided.