Complications

by Atul Gawande


  That weekend she had gone back home to Hartford, Connecticut, to attend a wedding. (She had moved to Boston with some girlfriends the year before, after graduating from Ithaca College, and landed work planning conferences for a downtown law firm.) The wedding had been grand and she had kicked off her shoes and danced the whole night. The morning after, however, she woke up with her left foot feeling sore. She had a week-old blister on the top of her foot from some cruddy sandals she had worn, and now the skin surrounding the blister was red and puffy. She didn’t think too much of this at first. When she showed her foot to her father, he said he thought it looked like a bee sting or maybe like she’d gotten stepped on dancing the night before. By late that afternoon, however, riding back to Boston with her boyfriend, “my foot really began killing me,” she said. The redness spread, and during the night she got chills and sweats and a fever of one hundred and three degrees. She took ibuprofen every few hours, which got her temperature down but did nothing for the mounting pain. By morning, the redness reached halfway up her calf, and her foot had swelled to the point that she could barely fit it into a sneaker.

  Eleanor hobbled in on her roommate’s shoulder to see her internist that afternoon and was diagnosed with a cellulitis. Cellulitis is your garden-variety skin infection, the result of perfectly ordinary bacteria in the environment getting past the barrier of your skin (through a cut, a puncture wound, a blister, whatever) and proliferating within it. Your skin becomes red, hot, swollen, and painful; you feel sick; fevers are common; and the infection can spread along your skin readily—precisely the findings Eleanor had. The doctor got an X ray to make sure the bone underneath was not infected. Satisfied that it was not, she gave Eleanor a dose of intravenous antibiotics in the office, a tetanus shot, and a prescription for a week’s worth of antibiotic pills. This was generally sufficient treatment for a cellulitis, but not always, the doctor warned. Using an indelible black marker, she traced the border of the redness on Eleanor’s calf. If the redness should extend beyond this line, the doctor instructed, she should call. And, regardless, she should return the next day for the infection to be checked.

  The next morning, Eleanor said—this morning—she woke up with the rash beyond the black line, a portion stretching to her thigh, and the pain worse than ever. She phoned the doctor, who told her to go to the emergency room. She’d need to be admitted to the hospital for a full course of intravenous antibiotic treatment, the doctor explained.

  I asked Eleanor if she had had any pus or drainage from her leg. No. Any ulcers open up in her skin? No. A foul smell or blackening of her skin? No. Any more fevers? Not since two days ago. I let the data roll around in my head. Everything was going for a cellulitis. But something was pricking at me, making me alert.

  I asked Eleanor if I could see the rash. She pulled back the sheet. The right leg looked fine. The left leg was red—a beefy, uniform, angry red—from her forefoot, across her ankle, up her calf, past the black ink line from the day before, to her knee, with a further tongue of crimson extending to the inside of her thigh. The border was sharp. The skin was hot and tender to the touch. The blister on the top of her foot was tiny. Around it the skin was slightly bruised. Her toes were uninvolved, and she wiggled them for me without difficulty. She had a harder time moving the foot itself—it was thick with edema up through the ankle. She had normal sensation and pulses throughout her leg. She had no ulcers or pus.

  Objectively, the rash had the exact appearance of a cellulitis, something antibiotics would take care of. But another possibility lodged in my mind now, one that scared the hell out of me. It was not for logical reasons, though. And I knew this perfectly well.

  Decisions in medicine are supposed to rest on concrete observations and hard evidence. But just a few weeks before, I had taken care of a patient I could not erase from my mind. He was a healthy fifty-eight-year-old man who had had three or four days of increasing pain in the left side of his chest, under his arm, where he had an abrasion from a fall. (For reasons of confidentiality, some identifying details have been changed.) He went to a community hospital near his home to get it checked out. He was found to have a small and very ordinary skin rash on his chest and was sent home with antibiotic pills for cellulitis. That night the rash spread eight inches. The following morning he spiked a fever of one hundred and two degrees. By the time he returned to the emergency room, the skin involved had become numb and widely blistered. Shortly after, he went into shock. He was transferred to my hospital and we quickly took him to the OR.

  He didn’t have a cellulitis but instead an extremely rare and horrendously lethal type of infection known as necrotizing fasciitis (fashee-EYE-tiss). The tabloids have called it a disease of “flesh-eating bacteria” and the term is not an exaggeration. Opening the skin, we found a massive infection, far worse than what appeared from the outside. All the muscles of the left side of his chest, going around to his back, up to his shoulder, and down to his abdomen, had turned gray and soft and foul with invading bacteria and had to be removed. That first day in the OR, we had had to take even the muscles between his ribs, a procedure called a birdcage thoracotomy. The next day we had to remove his arm. For a while, we actually thought we had saved him. His fevers went away and the plastic surgeons reconstructed his chest and abdominal wall with transfers of muscle and sheets of Gore-Tex. One by one, however, his kidneys, lungs, liver, and heart went into failure, and then he died. It was among the most awful cases I have ever been involved in.

  What we know about necrotizing fasciitis is this: it is highly aggressive and rapidly invasive. It kills up to 70 percent of the people who get it. No known antibiotic will stop it. The most common bacterium involved is group A Streptococcus (and, in fact, the final cultures from our patient’s tissue grew out precisely this). It is an organism that usually causes little more than a strep throat, but in certain strains it has evolved the ability to do far worse. No one knows where these strains come from. As with a cellulitis, they are understood to enter through breaks in the skin. The break can be as large as a surgical incision or as slight as an abrasion. (People have been documented to have gotten the disease from a rug burn, a bug bite, a friendly punch in the arm, a paper cut, a blood draw, a toothpick injury, and chicken pox lesions. In many the entry point is never found at all.) Unlike with a cellulitis, the bacteria invade not only skin but also deep underneath, advancing rapidly along the outer sheaths of muscle (the fascia) and consuming whatever soft tissue (fat, muscle, nerves, connective tissue) they find. Survival is possible only with early and radical excisional surgery, often requiring amputation. To succeed, however, it must be done early. By the time signs of deep invasion are obvious—such as shock, loss of sensation, widespread blistering of the skin—the person is usually unsalvageable.

  Standing at Eleanor’s bedside, bent over examining her leg, I felt a little foolish considering the diagnosis—it was a bit like thinking the Ebola virus had walked into the ER. True, in the early stages, a necrotizing fasciitis can look just like a cellulitis, presenting with the same redness, swelling, fever, and high white blood cell count. But there is an old saying taught in medical school: if you hear hoofbeats in Texas, think horses, not zebras. Only about a thousand cases of necrotizing fasciitis occur in the entire United States each year, mainly in the elderly and chronically ill—and well over three million cases of cellulitis. What’s more, Eleanor’s fever had gone away; she didn’t look unusually ill; and I knew I was letting myself be swayed by a single, recent, anecdotal case. If there were a simple test to tell the two diagnoses apart, that would have been one thing. But there is none. The only way is to go to the operating room, open the skin, and look—not something you want to propose arbitrarily.

  Yet here I was. I couldn’t help it. I was thinking it.

  I pulled the sheets back over Eleanor’s legs. “I’ll be back in a minute,” I said. I went to a phone well out of her earshot and paged Thaddeus Studdert, the general surgeon on call. He called back from the OR and I quickly outlined the facts of the case. I told him the rash was probably just a cellulitis. But then I told him there was still one other possibility that I couldn’t get out of my head: a necrotizing fasciitis.

  The line went silent for a beat.

  “Are you serious?” he said.

  “Yes,” I said, trying not to hedge. I heard an epithet muttered. He’d be right up, he said.

  As I hung up the phone, Eleanor’s father, a brown-and-gray-haired man in his fifties, came around with a sandwich and soda for her. He had been with her all day, having driven up from Hartford, but when I was seeing her, it turned out, he had been out getting her lunch. Catching sight of the food, I jumped to tell him not to let her eat or drink “just yet,” and with that the cat began crawling out of the bag. It was not the best way to introduce myself. He was immediately taken aback, recognizing that an empty stomach is what we require for patients going to surgery. I tried to smooth matters over, saying that holding off was merely “routine procedure” until we had finished our evaluation. Nonetheless, Eleanor and her father looked on with new dread when Studdert arrived in his scrubs and operating hat to see her.

  He had her tell her story again and then uncovered her leg to examine it. He didn’t seem too impressed. Talking by ourselves, he told me that the rash looked to him only “like a bad cellulitis.” But could he say for sure that it was not necrotizing fasciitis? He could not. It is a reality of medicine that choosing to not do something—to not order a test, to not give an antibiotic, to not take a patient to the operating room—is far harder than choosing to do it. Once a possibility has been put in your mind—especially one as horrible as necrotizing fasciitis—the possibility does not easily go away.

  Studdert sat down on the edge of her bed. He told Eleanor and her dad that her story, symptoms, and exam all fit with cellulitis and that that was what she most likely had. But there was another, very rare possibility, and, in a quiet and gentle voice, he went on to explain the unquiet and ungentle effects of necrotizing fasciitis. He told them of the “flesh-eating bacteria,” the troublingly high death rate, the resistance to treatment by antibiotics alone. “I think it is unlikely you have it,” he told Eleanor. “I’d put the chances”—he was guessing here—“at well under five percent.” But, he went on, “without a biopsy, we cannot rule it out.” He paused for a moment to let her and her father absorb this. Then he started to explain what the procedure involved—how he would take an inch or so of skin plus underlying tissue from the top of her foot, and perhaps from higher up on her leg, and then have a pathologist immediately look at the samples under the microscope.

  Eleanor went rigid. “This is crazy,” she said. “This doesn’t make any sense.” She looked frantic, like someone drowning. “Why don’t we just wait and see how the antibiotics go?” Studdert explained that this was a disease that you cannot sit on, that you had to catch it early to have any chance of treating it. Eleanor just shook her head and looked down at her covers.

  Studdert and I both turned to her father to see what he might have to say. He had been silent to this point, standing beside her, his brow knitted, hands gripped behind him, tense, like a man trying to stay upright on a pitching boat. He asked about specifics—how long a biopsy would take (fifteen minutes), what the risks were (a deep wound infection was the biggest one, ironically), whether the scars would go away (no), when it would be done if it were done (within the hour). More gingerly, he asked what would happen if the biopsy were positive for the disease. Studdert repeated that he thought the chances were less than 5 percent. But if she had it, he said, we’d have to “remove all the infected tissue.” He hesitated before going on. “This can mean an amputation,” he said. Eleanor began to cry. “I don’t want to do this, Dad.” Mr. Bratton swallowed hard, his gaze fixed somewhere miles beyond us.

  In recent years, we in medicine have discovered how discouragingly often we turn out to do wrong by patients. For one thing, where the knowledge of the right thing to do exists, we still too frequently fail to do it. Plain old mistakes of execution are not uncommon, and we have only begun to recognize the systemic frailties, technological faults, and human inadequacies that cause them, let alone how to reduce them. Furthermore, important knowledge has simply not made its way far enough into practice. Among patients recognized as having heart attacks, for example, it is now known that an aspirin alone will save lives and that even more can be saved with the immediate use of a thrombolytic—a clot-dissolving drug. A quarter of those who should get an aspirin do not, however; and half who should get a thrombolytic do not. Overall, physician compliance with various evidence-based guidelines ranges from over 80 percent of patients in some parts of the country to less than 20 percent in others. Much of medicine still lacks the basic organization and commitment to make sure we do what we know to do.

  But spend almost any amount of time with doctors and patients, and you will find that the larger, starker, and more painful difficulty is the still abundant uncertainty that exists over what should be done in many situations. The gray zones in medicine are considerable, and every day we confront situations like Eleanor’s—ones in which clear scientific evidence of what to do is missing and yet choices must be made. Exactly which patients with pneumonia, for example, should be hospitalized and which ones sent home? Which back pains treated by surgery and which by conservative measures alone? Which patients with a rash taken to surgery and which just observed on antibiotics? For many cases, the answers can be obvious. But for many others, we simply do not know. Expert panels asked to review actual medical decisions have found that in a quarter of hysterectomy cases, a third of operations to put tubes in children’s ears, and a third of pacemaker insertions (to pick just three examples), the science did not exist to say whether the procedures would help those particular patients or not.

  In the absence of algorithms and evidence about what to do, you learn in medicine to make decisions by feel. You count on experience and judgment. And it is hard not to be troubled by this.

  A couple weeks before seeing Eleanor, I had seen an arthritic and rather elderly woman (she was born before Woodrow Wilson was president) who had come in complaining of a searing abdominal pain that radiated into her back. I learned that she had recently been found to have an aortic aneurysm in her abdomen and instantly my alarm bells went off. Examining her gingerly, I could feel the aneurysm, a throbbing and tender mass just deep to her abdominal muscles. She was stable, but it was on the verge of rupturing, I was convinced. The vascular surgeon I called in agreed. We told the woman that immediate surgery was the only option to save her. We warned her, however, that it was a big surgery, with a long recovery in intensive care and probably in a nursing home afterward (she still lived independently), a high risk that her kidneys would not make it, and a minimum 10 to 20 percent chance of death. She did not know what to do. We left her with her family to think on the decision, and then I returned fifteen minutes later. She said she would not go ahead with surgery. She just wanted to go home. She had lived a long life, she said. Her health had long been failing. She had drawn up her will and was already measuring her remaining days in coffee spoons. Her family was devastated, but she was steady-voiced and constant. I wrote out a pain medication prescription for her son to fill for her, and half an hour later she left, understanding full well that she would die. I kept her son’s number and, when a couple weeks had passed, called him at home to hear how he had weathered the aftermath. His mother, however, answered the telephone herself. I stammered a hello and asked how she was doing. She was doing well, she said, thank you. A year later, I learned, she was still alive and living on her own.

  Three decades of neuropsychology research have shown us numerous ways in which human judgment, like memory and hearing, is prone to systematic mistakes. The mind overestimates vivid dangers, falls into ruts, and manages multiple pieces of data poorly. It is swayed unduly by desire and emotion and even the time of day. It is affected by the order in which information is presented and how problems are framed. And if we doctors believed that, with all our training and experience, we escape such fallibilities, the notion was dashed when researchers put us under the microscope.

  A variety of studies have shown physician judgment to have these same distortions. One, for example, from the Medical College of Virginia, found that doctors ordering blood cultures for patients with fever overestimated the probability of infection by four- to tenfold. Moreover, the highest overestimates came from the doctors who had recently seen other patients with a blood infection. Another, from the University of Wisconsin, found evidence of a Lake Wobegon effect (“Lake Wobegon: where the women are strong, the men are good-looking, and all the children are above average”): the vast majority of surgeons believed the mortality rate for their own patients to be lower than the average. A study from Ohio University and Case Western Reserve Medical School examined not just the accuracy but also the confidence of physicians’ judgments—and found no connection between them. Doctors with high confidence in a judgment they made proved no more accurate than doctors with low confidence.

  David Eddy, a physician and expert on clinical decision making, reviewed the evidence in an unflinching series of articles published over a decade ago in the Journal of the American Medical Association. And his conclusion was damning. “The plain fact is,” he wrote, “that many decisions made by physicians appear to be arbitrary—highly variable, with no obvious explanation. The very disturbing implication is that this arbitrariness represents, for at least some patients, suboptimal or even harmful care.”

  But in the face of uncertainty, what other than judgment does a physician have—or a patient have, for that matter? Months after seeing Eleanor that spring afternoon, I spoke with her father about the events that had unfolded.
