The Icepick Surgeon


by Sam Kean


  Eventually one chemist reported Dookhan to his supervisor. To his frustration, the supervisor pooh-poohed him. Maybe Dookhan rushed things sometimes, the supervisor conceded, but she’d been under a lot of strain at home, which probably clouded her judgment. Besides, given the new requirements to testify, the dreaded backlog was growing bigger every month, and the lab couldn’t afford to lose its superwoman now. The suspicious chemist reported his concerns to the local scientific union as well, but he got no further there. The union’s lawyer allegedly told him to back off, lest he ruin a young female scientist’s career. In sum, both the boss and the union gave Dookhan a pass.

  Still, Dookhan now had official accusations against her. One sloppy mistake, and her career would be over.

  As mentioned, the lab had a walk-in evidence safe to store drugs awaiting testing, and there were strict protocols about signing samples in and out. As she grew more cavalier, Dookhan started taking samples without bothering to sign them out, a breach of chain-of-custody rules. She finally got caught one day in June 2011 with ninety unsigned samples. She then tried to cover up her blunder by, again, forging a coworker’s initials in a logbook. Unfortunately, the coworker hadn’t been at the lab on the day in question. When confronted with the logbook, and asked if she’d violated the rules, Dookhan got slippery, saying, “I can see why you’d think that.”

  Even then, Dookhan’s bosses didn’t punish her. In fact, they did everything they could to hush up the chain-of-custody breach. In December, however, the Massachusetts governor’s office got wind of the breach and assigned the state’s inspector general to investigate. In the course of the investigation, several other lax practices at the lab came to light, including poor security and inadequate training for new chemists. (Later investigations would find even more alarming problems, including stray pills from old cases just lying around the lab. One supervisor had several test tubes in his desk drawers; one was labeled 1983.) By the summer of 2012, the state police, fearing for the integrity of their evidence, assumed control of the lab. Two days after the takeover, Dookhan’s fellow chemists spilled their suspicions about her to their new overseers.

  By that point, Dookhan had already resigned from the lab, given the seriousness of the chain-of-custody violations. But she had yet to face any consequences for dry-labbing tens of thousands of samples, until two detectives knocked on her door in late August 2012.

  They sat down with Dookhan in her living room to chat, and at first she denied everything. But the detectives had come prepared, and they laid the forged logbooks and calibration reports in front of her. At this point Dookhan said, “I got the work done, but not properly. I didn’t follow the procedures, and that was wrong.” In other words, she admitted violating some technical rules, but claimed that her science stood up.

  Mid-interview, Dookhan’s husband came home and pulled her into another room. He asked her if she needed a lawyer, and she assured him everything was fine—another lie. She then returned to the living room and continued the interview.

  When the detectives asked her whether she’d ever dry-labbed, Dookhan got slippery again. What do you think that term means? she asked. When they explained, she denied it: “I would never falsify, because it’s someone’s life on the line.” The detectives responded with more evidence. As mentioned, Dookhan would sometimes guess that a drug was, say, cocaine, only for a subsequent machine test to find heroin or another substance. In that case, she’d sneak some cocaine from a different sample and resubmit that for more machine testing, to “confirm” her first claim. Well, in several cases the detectives had dug up the original sample again to retest it—and determined that it was heroin after all. It was damning proof that she’d forged the results.

  Tears were soon quivering in Dookhan’s eyes. She tried to downplay her fraud, insisting that she’d dry-labbed only a few times. When the detectives pressed further, she finally broke down. “I messed up,” she said. “I messed up bad.”

  Dookhan eventually pleaded guilty to twenty-seven counts of perjury, tampering with evidence, and obstruction of justice. Her confession also plunged the entire legal system of Massachusetts into chaos. Because Dookhan couldn’t remember which samples she’d dry-labbed and which she’d actually tested, all 36,000 cases she’d worked on during her career were now suspect. The state legislature had to allocate $30 million to deal with the fallout; one legal advocacy group estimated it would take sixteen paralegals a full year of work just to notify all the affected people, much less get them into court. Appeals began flooding in, and Massachusetts courts ultimately overturned 21,587 convictions, the largest such action in U.S. history.

  Drug analyst Annie Dookhan after her arrest for one of the most widespread frauds in science history. Upon being caught, she wept, “I messed up. I messed up bad.” (Courtesy of the Boston Herald.)

  The dismissals must have been sweet revenge for the likes of the cashew-crack perp, who knew that the lab’s superwoman was crooked all along. (People on the streets of Boston began to speak of being “Dookhaned.”) But there were other issues here as well.

  However you feel about America’s never-ending war on drugs—and all the fairly harmless people caught in its dragnet—at least some of those 21,587 defendants were violent offenders. Thanks to Dookhan, they suddenly went free. At least 600 convicts were released from jail or had charges dismissed, and 84 of them marched right back out and committed more crimes. One of them murdered someone in a drug deal gone south. Another was arrested on weapons charges. Upon being caught, he laughed: “I just got out thanks to Annie Dookhan. I love that lady.”

  In November 2013, a judge sentenced Dookhan to three to five years in prison. For comparison, trafficking a single ounce of heroin carried a sentence of seven years. Considering the scale of her misdeeds, the paltriness of the sentence frustrated many. “You walk away feeling this is really inadequate,” a state legislator said. “Three to five years is not adequate.” Indeed, Dookhan didn’t even serve three years, walking out of prison a free woman in April 2016.

  Annie Dookhan is hardly the only forensic scientist to be busted for wrongdoing. In the past twenty years, similar scandals have erupted in Florida, Minnesota, Montana, New Jersey, New York, North Carolina, Oklahoma, Oregon, South Carolina, Texas, and West Virginia. Sadly, the string of incidents includes the distortion or withholding of forensic evidence in at least three death-penalty cases.

  Incompetence has been an ongoing issue as well. Crime labs have been caught leaving evidence under leaky roofs or in unsecured hallways. One lab was run by police officers who got most of their scientific training through Wikipedia. Agonizingly, Massachusetts got burned a second time shortly after Dookhan’s arrest. A chemist in the state’s Amherst lab was caught dipping into samples of meth, cocaine, ketamine, and ecstasy at work and getting high while running tests. She also smoked crack in the courthouse bathroom before testifying.

  Dookhan’s fraud nevertheless stands out for its audacity and scope. In some ways, it’s hard to believe she got away with her crimes for so long. In other ways, it’s no surprise at all. Our culture puts scientists on a pedestal: We like thinking there are people out there who value probity and truth above all else. We want to believe them. And scientists themselves get bamboozled by their colleagues as easily as anyone. Remember, Dookhan’s supervisors received warnings about her, but they were slow to take meaningful action. Professional magicians, in fact, have said that scientists are often easier to fool than regular folks, because they have an outsized confidence in their own intelligence and objectivity. The Dookhans of the world simply exploit this fact.

  To be sure, the vast majority of scientists deserve our trust. But no matter how you slice it, scientific fraud isn’t rare. Hundreds of scientific papers get retracted every year, and while firm numbers are elusive, something like half of them are retracted due to fraud or other misconduct. Even big-name scientists transgress. Again, it’s unfair to condemn people from the past for failing to meet today’s standards, but historians have noted that Galileo, Newton, Bernoulli, Dalton, Mendel, and more all manipulated experiments and/or fudged data in ways that would have gotten them fired from any self-respecting lab today.

  Fraud and other misdeeds erode public trust and damage science’s greatest asset—its reputation. Unfortunately, as our society becomes more technical and scientific, these issues will only get worse: exciting new scientific ventures will also present new opportunities to do each other wrong. But all isn’t lost. As we’ll see in the conclusion, there are real, proven ways to curb such abuse.

  Footnotes

  1 Courtroom lawyers are familiar with the “CSI effect”—the unreasonable expectations that laypeople have for forensic science, due to pop culture. But they split on whether the CSI effect helps the defense or the prosecution. Some laypeople believe that forensic science is infallible: they’re awed by it, and they take whatever the experts say as gospel. This would play into the prosecution’s hands. Then again, because the technicians on CSI get perfect results every time, some jurors are disappointed when real-life scientists can’t match that precision, and they dismiss the results as worthless. This attitude would favor the defense. (Then there are the merely ignorant. A presiding judge once overheard a juror complaining that the police in a certain case “didn’t even dust the lawn for fingerprints.”)

  2 In the interest of saving space, I’ll avoid ranting about the Melendez-Diaz ruling here. But you can visit samkean.com/books/the-icepick-surgeon/extras/notes for my argument.

  Conclusion

  New scientific breakthroughs almost always introduce new ethical dilemmas, and current technologies are no exception. What new ways to kill people will space exploration enable? Who will suffer most when cheap genetic engineering floods the world? What sorts of mischief could advanced artificial intelligence unleash? (For some answers to these questions, see the appendix.) The upside to putting on black mustaches and scheming up hypothetical crimes is that the very act of imagining them can help us anticipate and prevent those crimes in the future. There are things we can do immediately as well—strategies to promote ethical science in the here and now and avoid wading into the moral morasses we’ve encountered throughout this book.

  First and foremost, as basic as this sounds, scientists should strive to keep ethics in mind when designing experiments. This doesn’t need to be preachy or onerous. Even a simple prompt goes a long way, as one psychology study demonstrated in 2012.

  In the study, volunteers solved math problems for money; the better their score, the more cash. Then the real experiment started. The psychologists told the volunteers that they had to fill out a tax form to report their winnings; they could also apply for reimbursement for travel expenses using a second form. To encourage honesty, the volunteers had to sign a box on each form stating that they’d reported all information on it accurately. But not all the forms were created equal. In half the cases, the signature box was at the top, meaning that the volunteers had to swear to their honesty before filling in any data. In the other half, the signature box was at the bottom, and was filled in last. Guess which layout prompted more lying? Those who signed last, after filling everything out, were twice as likely to underreport winnings and overreport travel expenses.

  A similar trend held in a real-world experiment. This time the psychologists partnered with an insurance company that offered pay-as-you-go rates; basically, the fewer miles driven, the lower the premiums. The psychologists wanted to measure how honestly people reported their mileage on the forms, and once again, half the people signed at the top, half at the bottom. Those who signed at the bottom reported 2,500 fewer miles driven per car, a difference of 10 percent.

  Overall, the psychologists argued, having ethics in mind at the beginning of a task caused people to behave more honestly and checked their impulse to fudge things. (This probably explains why courts swear in witnesses before testimony, not after.) Moreover, after we lie, it’s already too late to fix things in some sense. We’re very good at rationalizing our own bad behavior by deploying the mental tricks we’ve seen throughout this book—using euphemisms to mask the truth, canceling out bad deeds with good ones, comparing ourselves favorably to people who do worse things, and so on. Signing forms at the end also enabled laziness. You might feel a genuine pang for having lied, but you’d also have to go back and change all your answers now—and really, who would bother? However cynical it sounds, one important part of ethics is making it convenient for people to be ethical.

  Now, obviously signing some little box won’t just magically eliminate all scientific sins. (What would it even say? I hereby swear not to do something so creepy and abusive that someone will write a whole book chapter about it someday, so help me god.) And no one can ever stop truly malicious people. But in most cases, with most people, having ethics in mind from the beginning prompts reflection and decreases the chances of disaster.

  To this end, the Nobel Prize–winning psychologist Daniel Kahneman has promoted the idea of “premortems.” In the more familiar postmortems, you examine some event after the fact to see what went wrong. In premortems, you brainstorm about what could go awry, and do so before starting. How, specifically, could this whole project turn into a debacle? Studies have shown that even ten minutes of reflection helps dispel groupthink and gives people a chance to voice doubts. Some groups even deliberately assign people the job of raising objections—a devil’s advocate role—to ensure at least some dissent. Along those same lines, scientists can overcome their blind spots by gathering input from a truly diverse group of people, who might raise flags that they missed. This includes people of different ethnicities, genders, and sexual orientations, of course, but also those who grew up in non-democracies or rural areas, those from blue-collar households, and those who are religious. The more diversity of thought, the better.

  Another way to keep ethics in mind is (ahem) to read science history. Hearing some provost honk, “Be ethical!” is one thing. It’s another to immerse yourself in stories about transgressions and actually feel the gut-punch of the bad deeds. That’s why stories are so powerful—they stick. We also have to be honest that good intentions aren’t a shield. John Cutler had the best of intentions in Guatemala, to find ways of stopping syphilis and gonorrhea. He still infected people wantonly with STDs and killed several. John Money had the best of intentions in promoting the blank-slate theory of human sexuality, to increase tolerance for marginalized groups. He still ruined David Reimer’s life. Walter Freeman had the best of intentions in spreading psychosurgery, to provide relief for desperate asylum inmates. He still lobotomized thousands who didn’t need it. We all know what the road to hell was paved with.

  At the same time—and this is perhaps the hardest thing of all—it’s important not to paint a Cutler or Money or Freeman as a monster, because it’s all too easy to dismiss monsters as irrelevant. (I’m no monster, so I don’t need to worry.) If we’re honest with ourselves, any one of us might have fallen into similar traps. Maybe not in the specific cases above, and maybe not as egregiously. But somewhere, in some way, we too might have done something unethical. Honestly admitting this is the best vigilance we have. As Carl Jung said, an evil person lurks inside all of us, and only if we recognize that fact can we hope to tame them.

  Many people blithely assume that smarter people are more enlightened and ethical; if anything, the evidence runs the other way, since smart people assume they’re smart enough to elude capture. To revive the car analogy, having smarts is like having a huge engine with lots of raw horsepower. You might get to your destination faster, but if the steering (i.e., your morals) is out of whack, the chances of a spectacular wreck jump significantly. Morals also help us navigate life, and prevent us from heading down certain dangerous roads in the first place.

  The crimes detailed in this book shouldn’t undermine the incredible work that scientists do, day in and day out, in labs across the world. The vast majority are lovely, selfless people, and our society would be immensely poorer without them—both materially and spiritually, considering all the wonders they’ve revealed. But scientists are still people. Like the chemist Harry Gold, they get drawn into conspiracies and betray friends. Like the pirate William Dampier, they grow obsessed with their research and look the other way at atrocities. Like the paleontologists Marsh and Cope, they try to sabotage their rivals and end up destroying themselves.

  Albert Einstein once said, “Most people say that it is the intellect which makes a great scientist. They are wrong: it is character.” I admit that when I first read that quote long ago, I scoffed. Who cares if a scientist is kindhearted or whatever? Discoveries—that’s what matters. After writing this book, however, I get it. On one level, science is a collection of facts about the world, and adding to that collection does require discoveries. But science is also something larger. It’s a mindset, a process, a way of reasoning about the world that allows us to expose wishful thinking and biases and replace them with deeper, more reliable truths. Considering how vast the world is, there’s no way to check every reported experiment yourself and personally verify it. At some point, you have to trust other people’s claims—which means those people need to be honorable, need to be worthy of trust. Moreover, science is an inherently social process. Results cannot be kept secret; they have to be verified by the wider community, or science simply doesn’t work. And given what a deeply social process science is, acts that damage society by shortchanging human rights or ignoring human dignity will almost always cost you in the end—by destroying people’s trust in science and even undermining the very conditions that make science possible.

 
