Attending


by Ronald Epstein


  PURSUING DOUBT

  If curiosity is its own reward, one would expect that people—including doctors—would display it more often. Yet being curious also takes people outside their comfort zones because it has to do with increasing—not removing—uncertainty. It goes counter to the human tendency to oblige reality to fit our preconceived notions. Being curious involves being aware that the situation is not as tidy as it might seem. This sense of doubt is unsettling for many doctors and patients, especially when the stakes are high.15 Patients undergoing surgery for cancer want to hear—and surgeons want to say—“I got it all.” Patients and doctors want a diagnosis to be definitive, beyond doubt, and a treatment to be the best available. Sometimes we have that degree of certainty, but more often that certainty is elusive—provisional, incomplete, or evolving. While a biopsy can prove that you have cancer, it cannot tell you exactly how long you will live or if you will be the one who will benefit from treatment or the one who won’t. Being mindful means feeling uncertainty, not turning away from that feeling of uncertainty and not clinging to the negative emotions that arise.16

  Curiosity not only draws attention to the outside world, but also to one’s inner experience: “Am I tired? Am I too sure of myself? Am I in a hurry? What’s new here?” Even the way clinicians interact with patients can help them be curious: “I’m wondering—have I addressed what’s really important to you? Am I missing something? Is there something more that you want to tell me?” I make a habit of asking myself “What’s interesting about this patient?” and “What’s still unknown?” as a simple (but not foolproof) way of avoiding self-deception and premature closure.

  Curiosity is not only good for patient care; it is also good for health professionals. By enriching their connections with their patients and feeling more effective—on their game—they experience more vitality in their work.17 Recent research suggests that there’s a feedback loop between curiosity on the one hand, and anxiety, defensiveness, and rigidity on the other. For years, psychologists have known (mostly from research in educational settings) that when we’re less anxious, defensive, and rigid, curiosity flourishes. New research suggests that it also works the other way: the more curious you are, the less anxious, defensive, and rigid you’ll be when under psychological stress. Psychologist Todd Kashdan conducted an interesting set of experiments in which participants were asked to think about their death, imagining what a terminal illness and dying might be like. As predicted by “terror management theory” (and common sense), subjects became defensive; they tried to push away death-related thoughts and tended to cling to familiar beliefs, people, and surroundings. The researchers also measured attentiveness and curiosity, both elements of mindfulness. People who were attentive and curious were less anxious and defensive than those who were equally attentive but had a less curious disposition.18

  Some people have personalities that seek novelty; they tend to be adventurous and less risk-averse. They see new challenges as exciting rather than terrifying. These tendencies manifest early in childhood, leading to speculation that brain chemistry, genetics, and social environment might all contribute.19 Curiosity is associated with release of the neurotransmitter dopamine, activating intrinsic reward circuits in the brain.20 These dopaminergic systems are triggered by novel experiences, especially sensory experiences, and magnified if the experiences are surprising and associated with some risk (think about adolescents here!). Because dopamine release makes one feel good, curiosity persists even in the absence of tangible external rewards. People who score high on psychological markers of curiosity—in particular, those who have high “openness”—are biochemically different from their peers. Their dopamine receptors are more numerous and their genetic controlling mechanisms are different.21 The propensity to be curious is, to some degree, encoded in our genes.22

  Curiosity is not merely genetic; it grows in nurturing social environments. Children’s “exploration behaviors”—analogous to the more critical and nuanced curiosity of adults—are expressed to a greater degree among children whose emotional attachment to their parents and other adults is secure. Children raised in a supportive environment feel safer taking small risks and exploring the unknown than children who have experienced less nurturance. For them, curiosity leads to a sense of vividness—children feel that their world “comes alive” and provides a sense of fulfillment and happiness.23 They want to explore further, widening their reach. In contrast, those raised in abusive or neglectful environments tend to adopt a more fearful, anxious, or avoidant attachment style and cling to the familiar.24 They don’t explore or examine the world around them. They’re afraid of rejection and failure. When they become adults, the same factors hold true. Like children, adults are more curious in supportive environments—ones that promote inquiry and in which they can safely share their doubts, discoveries, and mishaps.25 Yet, clinicians tend to rate their work environments as not particularly supportive. They often rely on the relationships that they develop outside the workplace for support.26

  While in the past the social/cognitive environment and genetics were seen as opposing explanations for human behavior, the relatively new—and exciting—fields of behavioral and social epigenetics have made clear that the social environment affects the ways in which one’s genetic predispositions are actually expressed.27 If the social environment is safe and supportive, the genes that encode dopamine receptors are turned on. With more receptors, the sense of intrinsic reward from discovery and curiosity is greater. Conversely, if the environment is abusive or inconsistent, the same genes will be turned off. Put simply, genes affect psychological states and social interactions, and also the reverse—these same genes are regulated by the internal psychological environment as well as the social milieu. Even those who might have a low “natural” tendency toward curiosity become more curious if placed in a supportive environment with strong healthy relationships and encouragement to reflect and be self-aware.

  Curiosity is part of the social capital of medicine. Just like young children, medical practitioners who are more curious feel a greater sense of vividness and vitality. They are more satisfied with their work, more engaged with their patients, and do a better job of treating illnesses. Entertain, for a moment, the radical thought that health care institutions could actually support healthy learning environments. Clinicians would be more motivated; they’d inquire more deeply about patients’ illnesses and distress, form more meaningful relationships, and have a greater sense of self-confidence—all of which would promote greater quality of care.

  4

  Beginner’s Mind

  The Zen of Doctoring

  When I was nineteen, I spent three months at the San Francisco Zen Center. I had read a book, Zen Mind, Beginner’s Mind, written by Shunryu Suzuki Roshi, the founding abbot of the center. Shortly thereafter I applied to be a “guest student.” In the book, Suzuki Roshi describes in simple language the core principles of mindful living and meditation practice.1 It’s one of my “stranded on a desert island” books—each time I pick up the small volume, I find new wisdom. Suzuki Roshi said, “In the beginner’s mind the possibilities are many, in the expert’s they are few.” By this he meant that expertise can lead you to deep insights, but can also lead your mind away from its true nature—curious, open, creative.2 At that time I was a beginner, so I found this reassuring. But beginner’s mind is even more important for those with some claim to expertise.

  During one of my medical school rotations, a classmate was assigned a patient with hairy-cell leukemia, a disease that was fascinating to the physicians caring for her because the genetic basis of the disease had recently been discovered. (It’s called “hairy” because of the appearance of the cells under the microscope.) She was considered a “great case.” Despite our exquisite understanding of her illness, the treatments available provided no guarantees of a cure. On rounds with my classmate and our supervising residents, I saw a frail woman, bedridden, without family or get-well cards in her room. No one seemed to be addressing either her pain or her isolation. It didn’t take much—my classmate took a moment to mention to the clinical team that the patient seemed uncomfortable and alone. Once alerted, the team provided a different kind of attention, focused on comfort and dignity; they ordered pain medications and arranged for a chaplain to come to the bedside. My classmate’s supervisor, clearly more expert than he, saw her as a “great case”; it took a beginner’s eyes to see her as a “suffering person.”

  Now that I am the senior member of clinical teams, I value medical students’ input more than ever. Often the medical student on the team is the one who asks the key question—something as simple as “Why are you doing that?” The naïve (and sometimes annoying!) questions of a bright medical student can profoundly alter an experienced clinician’s point of view. Recently a medical student asked me whether our social worker could help with transportation for an elderly patient with diabetes. Until he asked, I hadn’t considered why the patient had missed so many appointments; it turned out she had no reliable way of getting to the office, and that’s why her diabetes was out of control. In retrospect it seems so obvious.

  When Suzuki Roshi talked about beginner’s mind, he was talking to (or about) beginners, but his message was even more important for experts. Experts tenaciously hold on to their expertise. They conflate their competence and experience with mastery. After all, we have worked hard to become the experts that we are, and suggesting that this hard-earned expertise should be set aside is a radical notion. But experts don’t always see how their expertise can limit their understanding. In the view of the Dreyfus brothers, professors at the University of California, Berkeley, who developed a model of expertise, experts know the answers, but only masters know the important questions. Experts revel in what they know, and masters revel in what they don’t.3

  Expertise can lead doctors to assume that they know things that they cannot. For example, doctors often feel that they are able to assess patients’ pain accurately. Patients’ accounts (backed up by research) suggest otherwise: we don’t really know much about our patients’ distress unless we ask. Physicians’ estimates about their patients’ level of pain are often no better than chance, meaning that we often provide inadequate (or unnecessarily excessive) pain medication. Psychologist Cleve Shields studies communication between patients and physicians. He read transcripts of audio-recorded patient-physician office visits in which there was some discussion of pain. Physicians who used more “certainty words”—words that connoted that they were sure of themselves, beyond doubt—asked patients fewer questions about their pain: what it was like, what helped and what didn’t. The doctors’ presumptuousness got in the way of good care.4 Doctors, as they go through training, often get worse at understanding patients’ subjective experience of illness; the doctors’ expertise blinds them to patients’ experience of suffering and their empathy declines.5 They privilege objective information about patients over subjective information from patients. They are more likely to treat patients as diagnoses, as objects. Neuroimaging studies suggest that physicians are less emotionally reactive to seeing patients in pain than the general population—a good thing because it keeps them from becoming overwhelmed, a bad thing because sometimes they objectify patients and distance themselves too much.6

  Beginner’s mind uncouples expertise from one’s present experience. It is a cultivated naïveté, an intentional setting aside of the knowledge and preconceived notions that one has gained from books, journals, teachers, and past experiences to see the situation with new eyes. I think of it as putting my “expert self” on an imaginary shelf for a moment, easily within reach and readily available, but enough out of the way so that it doesn’t become an encumbrance to a more intuitive and holistic way of being. I think, “What does this patient need most today?” Then I seek the evidence to justify or refute my initial impressions. Simply setting my expert self aside helps me to consider new possibilities.

  Johann Sebastian Bach is reported to have said, “The problem is not finding [melodies], it’s—when getting up in the morning and out of bed—not stepping on them.”7 Bach was a consummate expert, perhaps the greatest composer who ever lived, and was continually creative and inventive within a tradition that had strict rules of composition. However, to be creative, he had to set aside some of his preconceived notions of what music could be to produce something new, fresh, and not formulaic. The same was true of other great composers who created new musical languages: Monteverdi, Beethoven, Wagner, Schönberg, and Cage. Similarly, in medicine, beginner’s mind liberates intuition; intuition can then inform my understanding, taking into account my prior ideas, successes, and failures yet remaining unfettered by them. Relying on expertise alone might produce a Dittersdorf,8 but hardly a Bach; in medicine, it might produce someone who seems to have all the answers but doesn’t ask the right questions.

  A FLAG IN THE WIND

  Gary, a friend of mine, was diagnosed with bladder cancer several years ago—the slow-growing curable kind, fortunately. He had cystoscopic surgery and was sent home with a catheter. After a few days, the catheter was removed, but he got into trouble—he had intense pain when he tried to urinate and developed urinary retention. The catheter was reinserted for a few days and then removed again—on a Thursday afternoon. On Friday, he started having abdominal pain that grew in intensity during the day. His urologist’s office was closed, so he went to the busy emergency room of a well-respected California hospital. The physician noted that Gary had little urine output, so he started an IV, perhaps assuming that Gary was dehydrated. He signed out to another physician, who noted that Gary was still not urinating much, so he increased the rate of the IV. They continued the IV drip overnight. In distress, Gary’s wife called me early the next morning. Gary was in agonizing pain. I spoke to the nurse on his unit, and I insisted that he be seen by the urology resident. Before the resident arrived, though, the nurse checked Gary’s abdomen and said, “Oh my God, your bladder is about to burst.” She placed a new catheter, draining two liters of urine from Gary’s bladder. He eventually did get well, but he spent two additional days in the hospital before going home because he developed a fever due to a (preventable) kidney infection.

  In one sense, this case defies the imagination—how could well-qualified doctors and nurses persist on an erroneous path when an alternative and logical explanation—recurrent urinary retention—would be perfectly obvious to someone with no medical training? It’s a striking example of cognitive rigidity—the resistance to changing one’s thinking or beliefs, a tenacious adherence to one view of reality. Every clinician can think of a time when he or she fell into a trap like this. In medicine, manifestations of illness are polysemous—they can be interpreted in many ways—creating a field of cognitive traps into which clinicians routinely fall. When harried, clinicians are more likely to stop thinking when they find the first, and not necessarily the best, of multiple interpretations of a situation. Cognitive scientists call this “search satisfaction.”9 Their interpretation then ossifies into a rigid—yet flimsy—“truth.”

  Novelist F. Scott Fitzgerald once said that “the test of a first-rate intelligence is the ability to hold two opposed ideas in mind at the same time and still retain the ability to function.”10 With the same contexts, scenes, and characters, one could write two completely different plays. Here, the doctors caring for Gary didn’t consider the possibility of two different story lines—the dehydration story and the urinary retention story. They settled on one of the two and considered it fact, even though the dehydration story was a poor fit. Having once committed to a viewpoint, clinicians can be extraordinarily unwilling to consider another, even if it is a better fit; changing one’s mind is a source of shame, rather than a source of wonder. Gary’s situation is even more remarkable because all indications suggested that the treatment was not producing the desired effect, even from the beginning—it would be odd for someone who was dehydrated not to urinate after two liters of extra fluid had been pumped into his system. The cognitive rigidity of one clinician was contagious, practically becoming a shared delusion among several health professionals. One definition of insanity is doing the same thing over and over and expecting different results.11

  Part of the problem is the fast-paced environment of clinical medicine. Clinicians feel under pressure to come to a diagnosis and treatment plan and move on to the next patient. Part of that pressure is internal, though—something about the quick thinking is exciting for physicians. Clinicians need some way of alerting themselves to the possibility that their understanding is provisional and incomplete, that their expertise can lead them astray. They need a trigger to help them slow down when they should.

  The fast pace of medicine is only part of the problem. It takes effort to hold two opposed ideas—to consider that a patient can be both a great case and a suffering person, to see the relevance of both the patient’s experience and your own diagnostic formulations. The ability to tolerate—and even embrace—ambiguity is central to being a good diagnostician. Master clinicians see that seemingly contradictory perspectives might offer explanations for an evolving situation; they have the cognitive flexibility to let go of ideas when those ideas are no longer useful. They can see how an illness is caused by a virus and by the failure of the body’s immune system—and thus consider a wider range of treatment options. They can see that a patient who doesn’t take his diabetes medicine regularly is both “noncompliant” and “struggling to do the best he can”—and in that way they can mobilize support while also encouraging the patient to do a better job. Mindful clinicians can feel confident while retaining some doubt. Just the other day I had to ask a colleague a simple question about a newborn (I don’t have many in my practice anymore). It was almost embarrassing: it was something any intern would know, and I knew it, but I needed to make sure I had it right. It takes humility to recruit additional expertise.

 
