Black Box Thinking

by Matthew Syed


  But he wasn’t getting through. The surgeon continued with the operation, the patient’s symptoms returned, and Pronovost had to deliver another dose of epinephrine. Again he explained to the surgeon that the latex was endangering the patient, but once again the surgeon disagreed. This was a medical issue, not a surgical one. Pronovost was more qualified to express an opinion. But the surgeon was in charge—and he wasn’t budging.

  By this time, with the argument escalating, the junior doctor in the room and the nurses were pale-faced. Pronovost was now certain that this was a latex allergy, given the second adverse reaction, and that if the surgeon didn’t change gloves the patient would die, possibly within minutes. So he changed tack, trying to nudge the discussion away from the threat to the surgeon’s status and on to the basic calculation that would surely resolve the argument once and for all.

  “Let’s think through this situation,” he said gently. “If I’m wrong you will waste five minutes changing gloves. If you are wrong the patient dies. Do you really think this risk-benefit ratio warrants you not changing your gloves?”

  At this point, you might imagine that the surgeon would be forced to accept the logic of the situation. Surely he could not persist. But the theory of cognitive dissonance offers a different possibility. The risk-benefit ratio was not about weighing the life of a patient against the few moments it would have taken to change gloves. Rather, it was about weighing the life of a patient against the prestige of a surgeon whose entire self-esteem was constructed upon the cultural assumption of his own infallibility.

  The weighing exercise wasn’t even close. The surgeon became more entrenched, more certain of his own judgment; he scarcely even considered the calculation that Pronovost had suggested. “You’re wrong,” the surgeon said. “This is clearly not an allergic reaction, so I’m not changing my gloves.”

  This could have been the end of it, and normally it would have been. After all, the surgeon is in charge. You are not supposed to challenge his judgment. But Pronovost, who had lost his own father to medical error and had chosen to devote his life to patient safety, stuck to his guns. He instructed the nurse to telephone the dean and the president of Johns Hopkins Hospital so that they could overrule the surgeon.

  The atmosphere in the operating room was now one of stunned silence. The nurse picked up the phone, but hesitated, looking at the two men. She was unsure what to do. Even now the life of the patient hung by a thread. Further contact with the latex gloves could prove fatal. “Page them now,” Pronovost said firmly. “This patient is having a latex allergy. I cannot allow her to die because we did not change gloves.”

  Only as the phone was being dialed did the surgeon finally budge. He swore, dropped his gloves, and strode out to change them. The tension finally began to abate.

  Once the operation was over, tests confirmed what Pronovost had suspected all along: the patient had a latex allergy. If the surgeon had got his own way, as he would have done 99.9 percent of the time, she would almost certainly have died.

  And this reveals the inextricable link between the lack of progress in key areas of our world and the absence of learning from failure. The context is health care, but the lessons extend far wider.

  Think of it this way: doctors are sometimes oblivious to their mistakes because they have already reframed them. They are not dishonest people; they are often unaware of the reframing exercise because it is largely subconscious. If there were independent investigations into adverse events, these mistakes would be picked up during the “black box” analysis, and doctors would be challenged on them and learn from them. But proper independent investigation is almost nonexistent. Moreover, such investigations as do take place generally rely on information provided by professionals, which is often withheld in a culture that stigmatizes error.

  This means that doctors make the same mistakes again and again, while growing in the mistaken conviction that they are infallible. This, in turn, increases the cognitive dissonance associated with mistakes, tightening the noose still further. Admitting to error becomes so threatening that in some cases surgeons (decent, honorable people) would rather risk killing a patient than admit they might be wrong. The renowned physician David Hilfiker put it this way:

  Doctors hide their mistakes from patients, from other doctors, even from themselves . . . The drastic consequences of our mistakes, the repeated opportunities to make them, the uncertainty about our culpability, and the professional denial that mistakes happen all work together to create an intolerable dilemma for the physician. We see the horror of our mistakes, yet we cannot deal with their enormous emotional impact.16

  Now consider one final study into the scale of evasion in health care. What we haven’t yet done is try to break the numbers down into their component parts. Who is involved in the most cover-ups? Is it nurses, the junior members of staff? Or is it the doctors, the senior members, the ones with the prestigious educations and the responsibility to lead the industry forward?

  It will not surprise you to hear that it is the latter. Intelligence and seniority, when allied to cognitive dissonance and ego, form one of the most formidable barriers to progress in the world today. In one study across twenty-six acute-care hospitals in the United States, registered nurses filed nearly half of the error reports; physicians filed less than 2 percent.17

  If Peter Pronovost hadn’t been in the operating room on the day when the patient was reacting adversely to the latex surgical gloves, it isn’t just one patient who would have died. The deeper tragedy is that nobody would have learned from it. The failure would have been reframed: the blame would have been pinned on the patient’s unusual symptoms, rather than on the surgeon’s refusal to change his gloves. It would have left the surgeon free to make the same mistake again.

  Today, Pronovost is arguably the most influential doctor in American health care. His crusading work on medical error has saved thousands of lives. He has been awarded a MacArthur Fellowship, otherwise known as a “genius grant.” In 2008 he was named one of Time magazine’s 100 most influential people in the world. But back in that operating room, he was still a junior clinician. Even now he acknowledges that saving the life of the patient was a close call. He has said:

  The patient was fortunate because I was already gaining a reputation as a safety leader. That gave me the courage to speak up . . . What if I was just starting out in my career? Would I have taken such a risk? Perhaps not. If the patient had died, it would have been blamed primarily on her allergy, not the surgeon. Similar dramas play out day after day in hospitals across the country. How many patients have been harmed or died as a result? Will we ever really know?18

  Chapter 6

  Reforming Criminal Justice

  I

  Trofim Lysenko was a dark-haired, bright-eyed biologist. He came from peasant stock in the west of what would become the Soviet Union and was spotted by the political leaders of the Communist revolution in the 1920s, when he claimed to have found a way to enhance crop yields.1

  The technique was not as successful as Lysenko claimed, but the young scientist was ambitious and politically savvy. Over a period of ten years he gradually moved up the academic ranks. In 1934, he was appointed to the Lenin All-Union Academy of Agricultural Sciences.

  It was then that he took a major gamble. In the early twentieth century, the science of genetics, based on the work of Gregor Mendel, a Moravian friar and scientist, was just beginning to take off. It proposed that heredity was encoded in small units called genes and could be described using statistical rules. Lysenko became an outspoken critic of this new theory, positioning himself against a rising tide of scientific opinion.

  Lysenko was not stupid. He calculated that this stance would endear him further to the political elite. Marxism was based on the idea that human nature is malleable. Genetics, which held that certain traits are passed down from generation to generation, seemed like a threat to this doctrine. Lysenko started to defend a different idea: the notion that traits acquired during one’s lifetime could be passed on. It is sometimes called Lamarckism, after Jean-Baptiste Lamarck, the original proponent of the theory.

  Scientific ideas should succeed or fail according to rational argument and evidence: science is about data rather than dogma. But Lysenko realized that he couldn’t silence the geneticists through argument alone. Thousands of scientists up and down the country were excited by the new genetic approach. They sincerely believed that it had intellectual merit and that it should, therefore, be pursued. And they had data to back up their beliefs.

  So Lysenko tried a different approach: instead of engaging in debate, he tried to shut it down. He called upon Stalin to outlaw the new theory of genetics. Stalin agreed, not because genetics had been proved wanting scientifically, but because it didn’t tally with Communist ideology. Together they declared genetics “a bourgeois perversion.” The ideas of Lamarck, on the other hand, were given the Communist seal of approval.

  Those who dissented from the Party line were ruthlessly persecuted. Many geneticists were executed or sent to labor camps; those killed included Israel Agol, Solomon Levit, Grigorii Levitskii, Georgii Karpechenko, and Georgii Nadson. Nikolai Vavilov, one of the most eminent Soviet scientists, was arrested in 1940 and died in prison in 1943. All genetic research was forbidden, and at scientific meetings around the country geneticists were condemned and dismissed.

  Lysenko had silenced his critics and pretty much guaranteed that his own ideas would triumph. But this “success” had a familiar sting in its tail. By protecting his ideas from dissent, he had deprived them of a valuable thing: the possibility of failure. He proposed all sorts of techniques to improve crop yields, but nobody tested them, for fear of persecution. Science had effectively been detached, by political decree, from the feedback mechanism of falsification.

  The results were devastating. Before the rise of Lysenko, Russian biology had been flourishing. Dmitry Ivanovsky discovered plant viruses in 1892. Ivan Pavlov won the Nobel Prize for Medicine in 1904 for his work on digestion. Ilya Mechnikov won the Nobel Prize in 1908 for his theories on the cellular response to infection. In 1927, Nikolai Koltsov proposed that inherited characteristics are carried on double-stranded giant molecules, anticipating the double-helix structure of DNA.

  By the end of the purges, however, Russian science had been decimated. As Valery Soyfer, a Russian scientist persecuted during the Lysenko era, put it: “The progress of science was slowed or stopped, and millions of university and high school students received a distorted education.”2 This produced a ripple effect on the quality of life for millions of Russians, not least because the agricultural techniques proposed by Lysenko were often ineffective. This is what happens when ideas are not allowed to fail.

  For Communist China, which had also embraced Lysenko’s ideas, the results were, in many ways, even more catastrophic. Lysenko had publicly come out in favor of a technique of close planting of crop seeds in order to increase output. The theory was that plants of the same species would not compete with each other for nutrients.

  This fitted in with Marxist and Maoist ideas about organisms from the same class living in harmony rather than in competition. “With company, they grow easy,” Mao told colleagues. “When they grow together, they will be comfortable.” The Chinese leader drew up an eight-point Lysenko-inspired blueprint for the Great Leap Forward, and persecuted Western-trained scientists and geneticists with the same kind of ferocity as in the Soviet Union.3

  The theory of close planting should have been put to the test. It should have been subject to possible failure. Instead it was adopted on ideological grounds. “In Southern China, a density of 1.5 million seedlings per 2.5 acres was usually the norm,” Jasper Becker writes in Hungry Ghosts: Mao’s Secret Famine. “But in 1958, peasants were ordered to plant 6.5 million per 2.5 acres.”

  Too late, it was discovered that the seeds did indeed compete with each other, stunting growth and damaging yields. This contributed to one of the worst disasters in Chinese history, a famine whose full scale has even now not been revealed: historians estimate that between 20 and 43 million people died.

  • • •

  The Lysenko affair is rightly regarded as one of the most scandalous episodes in the history of science. It has been the subject of dozens of books (including Valery Soyfer’s magisterial Lysenko and the Tragedy of Soviet Science) and hundreds of journal articles, and it is familiar to almost all researchers. It serves as a stark warning about the dangers of protecting ideas from the possibility of failure.

  Yet a different and more subtle form of the Lysenko tendency exists in the world today. Ideas and beliefs of all kinds are protected from failure, but not by a totalitarian state. Instead they are protected from failure by us.

  Cognitive dissonance doesn’t leave a paper trail. There are no documents that can be pointed to when we reframe inconvenient truths. There is no violence perpetrated by the state or anyone else. It is a process of self-deception. And this can have devastating effects, not least on those who were the subject of chapter 4: the wrongly convicted.

  And this brings us back to the DNA exoneration era. We have seen that these cases were difficult for the police and prosecutors to accept. But to close this section, let us explore these graphic failures in the criminal justice system and see what they tell us about how the system should be reformed to prevent them from ever happening again.

  The answer, it turns out, starts with creating a system that is sensitive to the inherent flaws in human memory.*

  II

  Neil deGrasse Tyson is an eminent astrophysicist, popular science writer, and media personality. He has eighteen honorary doctorates and was once voted the sexiest astrophysicist in the world. He is also a prolific public speaker. Many of his performances are on YouTube.

  For many years after 9/11, Tyson told a particular story about George W. Bush. The former president had made a speech in the days after the attack on the twin towers. Tyson quoted Bush as saying in this speech: “Our God is the God who named the stars.”4

  To Tyson this was a destructive thing for the president to say. He felt that Bush was seeking to divide Christians and Muslims in the aftermath of an attack by Islamic extremists. The insinuation was that Christians, not Muslims, believed in the true God, given that He had named the stars.

  As Tyson put it: “George Bush, within a week of [the attacks], gave us a speech attempting to distinguish ‘we’ from ‘they.’ And who are ‘they’? These were the Muslim fundamentalists . . . And how does he do it? He says . . . ‘Our God is the God who named the stars.’”

  But Tyson wasn’t finished. Bush was not just being bigoted, he said, but also inaccurate. Tyson went on to reveal that two thirds of named stars actually have Arabic names, having been catalogued by Muslim scholars. “I don’t think Bush knew this,” Tyson said. “That would confound the point he was making.”

  Tyson’s speech was highly effective. It mesmerized audiences and made an acute political point. It also positioned Bush as an irresponsible president, using a tragedy to divide Americans at a moment of great sensitivity. But there was a small problem. When a journalist from the Federalist website went looking for the Bush quote, he couldn’t find it. He searched the TV and newspaper archives for the statements of the president after 9/11, but the “stars quote” didn’t seem to be there.5

  When Tyson was contacted, he was adamant that he could remember Bush making the statement. “I have explicit memory of those words being spoken by the President,” he said. “I reacted on the spot, making a note for possible later reference in my public discourse. Odd that nobody seems to be able to find the quote anywhere.”

  But no matter how hard journalists looked for it, they couldn’t find it. The only speech that Bush had made in the aftermath of the attacks had been very different from the one highlighted by Tyson. “The enemy of America is not our many Muslim friends,” Bush said. “It is not our many Arab friends. Our enemy is a radical network of terrorists and every government that supports them.” This was conciliatory, and as for stars, he didn’t mention them at all.

  Only later did researchers uncover a quote in which Bush did mention stars, but it wasn’t made after 9/11; it was spoken in the aftermath of the Space Shuttle Columbia disaster in 2003. “The same creator who names the stars also knows the names of the seven souls we mourn today,” Bush said.

  Needless to say, this put an entirely different gloss on the quote, and made something of a mockery of Tyson’s interpretation. This was a president offering words of comfort and hope for the families of those who had died in the Columbia tragedy—and he was making no contrast with Islam.

  But Tyson was nothing if not insistent. He said that he had a clear memory of Bush saying the words after 9/11. For a while, he wouldn’t budge. Only after weeks of being asked to find a scrap of evidence for the original quote did he finally issue a retraction. “I here publicly apologize to the President for casting his quote in the context of contrasting religions rather than as a poetic reference to the lost souls of Columbia,”6 he said.

  The post-9/11 “stars” speech by George W. Bush never happened.

  This episode is revealing because it shows that even practicing scientists are suckers for the seemingly infallible power of memory. When we remember seeing something, it feels as if we are accessing a videotape of a real, tangible, rock-solid event. It feels as if it must have happened. And when people question our memory, it is natural to get irate.

 
