
The Folly of Fools: The Logic of Deceit and Self-Deception in Human Life


by Robert Trivers


  But there is one striking difference between space and aviation disasters. In the United States, aviation disasters are immediately and intensively studied by teams of experts on twenty-four-hour notice in an institution designed to be insulated from outside interference, the National Transportation Safety Board. The NTSB generally does a superb job and publicizes its findings quickly. It almost always discerns key causes and then makes appropriate recommendations, which appear to have helped reduce the accident rate steadily for some thirty years, so that flying is, by far, the safest form of travel. I know of only one case of a delayed report (about three years) and this was because of interference on the international level, when Egypt fought the truth to the bitter end.

  By contrast, NASA’s accidents are investigated by a committee appointed to study only a specific disaster, with no particular expertise, and sometimes with a preordained and expressed goal to exonerate NASA. Study of one disaster does not prevent another, even when it has many of the same causes identified in the first case. Of course, safety corners can more easily be cut when only the lives of a few astronauts are at stake, instead of the great flying public, including airline personnel.

  Aviation disasters usually result from multiple causes, one of which may be self-deception on the part of one or more key actors. When the actors number more than one, we can also study processes of group self-deception. A relatively simple example of this is the crash of Air Florida Flight 90 in 1982, in which both pilot and copilot appear to have unconsciously “conspired” to produce the disaster.

  AIR FLORIDA FLIGHT 90—DOOMED BY SELF-DECEPTION?

  On the afternoon of January 13, 1982, Air Florida Flight 90 took off from Washington, D.C.’s National Airport in a blinding snowstorm on its way to Tampa, Florida. It never made it out of D.C., instead slamming into a bridge and landing in the Potomac River—seventy-four people died, and five survivors were fished out of the back of the plane. Perhaps because one of those who died was an old friend of mine from Harvard (Robert Silberglied), I was listening with unusual interest when soon thereafter the evening news played the audiotape of the cockpit conversation during takeoff. The copilot was flying the plane, and you could hear the fear in his voice as he also performed the role the pilot should have been playing, namely reading the instrument panel. Here is how it went:

  Ten seconds after starting down the runway, the copilot responds to instrument readings that suggest the plane is traveling faster than it really is: “God, look at that thing!” Four seconds later: “That doesn’t seem right, does it?” Three seconds later: “Ah, that’s not right.” Two seconds later: “Well . . .”

  Then the pilot, in a confident voice, offers a rationalization for the false reading: “Yes, it is, there’s 80,” apparently referring to an airspeed of 80 knots. This fails to satisfy the copilot, who says, “Naw, I don’t think that’s right.” Nine seconds later, he wavers: “Ah, maybe it is.” That is the last we hear from the copilot until a second before the crash when he says, “Larry, we’re going down, Larry,” and Larry says, “I know.”

  And what was Larry doing all this time? Except for the rationalization mentioned above, he only started talking once the mistake had been made and the plane was past the point of no return—indeed when the device warning of a stall started to sound. He then appeared to be talking to the plane (“Forward, forward.” Three seconds later: “We only want five hundred.” Two seconds later: “Come on, forward.” Three seconds: “Forward.” Two seconds: “Just barely climb.”). Within three more seconds, they were both dead.

  What is striking here is that moments before we have a human disaster that will claim seventy-four human lives, including both primary actors, we have an apparent pattern of reality evasion on the part of one key actor (the pilot) and insufficient resistance on the part of the other. On top of this, typical roles were reversed, each playing the other’s: pilot (ostensibly) as copilot and vice versa. Why was the copilot reading the contradictory panel readings while the pilot was only offering a rationalization? Why did the copilot speak while it mattered, but the pilot started talking only when it was too late?

  The first thing to find out is whether these differences are specific to the final moments or whether we can find evidence of similar behavior in the past. The answer is clear. In the final forty-five minutes of discussion between the two prior to takeoff, a clear dichotomy emerges. The copilot is reality-oriented; the pilot is not. Consider their discussion of snow on the wings, a critical variable. Pilot: “I got a little on mine.” Copilot: “This one’s got about a quarter to half inch on it all the way.” There were equal amounts of snow on both wings, but the pilot gave an imprecise and diminutive estimate, while the copilot gave an exact description.

  And here is perhaps the most important exchange of all, one that occurred seven minutes before takeoff. Copilot: “Boy, this is a losing battle here on trying to de-ice those things. It gives you a false sense of security is all that it does” (!!). Pilot: “This, ah, satisfies the Feds.” Copilot: “Yeah—as good and crisp as the air is and no heavier than we are, I’d . . . ” Here is the critical moment in which the copilot timidly advanced his takeoff strategy, which presumably was to floor it—exactly the right strategy—but the pilot cut him off midsentence and said, “Right there is where the icing truck, they oughta have two of them, pull right.” The pilot and copilot then explored a fantasy together on how the plane should be deiced just before takeoff.

  Note that the copilot began with a true statement—they had a false sense of security based on a de-icing that did not work. The pilot noted that this satisfies the higher-ups but then switched the discussion to the way the system should work. Though not without its long-term value, this rather distracts from the problem at hand—and at exactly the moment when the copilot suggests his countermove. But he tried again. Copilot: “Slushy runway, do you want me to do anything special for this or just go for it?” Pilot: “Unless you got something special you would like to do.” No help at all.

  The transcript suggests how easily the disaster could have been averted. Imagine that the earlier conversation about snow on the wings and slushy conditions underfoot had induced a spirit of caution in both parties. How easy it would have been for the pilot to say that they should go all-out but be prepared to abort if they felt their speed was insufficient.

  A famous geologist once surveyed this story and commented: “You correctly blame the pilot for the crash, but maybe you do not bring out clearly enough that it was the complete insensitivity to the copilot’s doubts, and to his veiled and timid pleas for help, that was the root of all this trouble. The pilot, with much more experience, just sat there completely unaware and without any realization that the copilot was desperately asking for friendly advice and professional help. Even if he (the pilot) had gruffly grunted, ‘If you can’t handle it, turn it over to me,’ such a response would have probably shot enough adrenaline into the copilot so that he either would have flown the mission successfully or aborted it without incident.” It is this dreadful, veiled indecision that seems to seal the disaster: the copilot tentative, uncertain, questioning, as indeed he should be, yet trying to hide it, and ending up dead in the Potomac.

  The geologist went on to say that in his limited experience in mountain rescue work and in abandoned mines, the people who lead others into trouble are the hale and hearty, insensitive jocks trying to show off. “They cannot perceive that a companion is so terrified he is about to ‘freeze’ to the side of the cliff—and for very good reasons!” They in turn freeze and are often the most difficult to rescue. In the case of Flight 90, it was not just the wings that froze, but the copilot as well, and then so did the pilot, who ended up talking to the airplane.

  Earlier decisions infused with similar effects contributed to the disaster. The pilot authorized “reverse thrust” to power the airplane out of its departure place. It was ineffective in this role but apparently pushed the ice and snow to the forward edge of the wing, where they would do the most damage, and at the same time blocked a key filter that would now register a higher ground speed than was in fact obtained. The pilot has been separately described as overconfident and inattentive to safety details. The presumed benefit in daily life of his style is the appearance of greater self-confidence and the success that this sometimes brings, especially in interactions with others.

  It is interesting that the pilot/copilot configuration in Flight 90 (copilot at the helm) is actually the safer of the two. Even though on average the pilot is flying about half the time, more than 80 percent of all accidents occur when he is doing so (in the United States, 1978–1990). Likewise, many more accidents occur when the pilot and copilot are flying for the first time together (45 percent of all accidents, while safe flights have this degree of unfamiliarity only 5 percent of the time). The notion is that the copilot is even less likely to challenge mistakes of the pilot than vice versa, and especially if the two are unfamiliar with each other. In our case, the pilot is completely unconscious, so he is not challenging anyone. The copilot is actually challenging himself but, getting no encouragement from the pilot, he lapses back into ineptitude.

  Consider now an interesting case from a different culture. Fatal accident rates for Korean Air between 1988 and 1998 were about seventeen times higher than for a typical US carrier, so high that Delta and Air France suspended their flying partnership with Korean Air, the US Army forbade its troops from flying with the airline, and Canada considered denying it landing rights. An outside group of consultants was brought in to evaluate the problem and concluded, among other factors, that Korea, a society relatively high in hierarchy and power dominance, was not preparing its copilots to act assertively enough. Several accidents could have been averted if the relatively conscious copilot had felt able to communicate effectively with the pilot to correct his errors. The culture in the cockpit was perhaps symbolized when a pilot backhanded a copilot across the face for a minor error, a climate that does not readily invite copilots to take strong stands against pilot mistakes. The consultants argued for emphasizing copilot independence and assertion. Even the insistence on better mastery of English—itself critical to communicating with ground control—improved equality in the cockpit, since English lacks the built-in hierarchical biases to which Koreans respond readily when speaking Korean. In any case, since the intervention, Korean Air has had a spotless safety record. The key point is that hierarchy may impede information flow—two are in the cockpit, but with sufficient dominance, it is actually only one.

  A similar problem was uncovered in hospitals where patients contract new infections during surgery, many of which turn out to be fatal and could be prevented by simply insisting that the surgeon wash his (or occasionally, her) hands. A steep hierarchy—with the surgeon unchallenged at the top and the nurses carrying out orders at the bottom—was found to be the key factor. The surgeon practiced self-deception, denied the danger of not washing his hands, and used his seniority to silence any voices raised in protest. The solution was very simple. Empower nurses to halt an operation if the surgeon had not washed his hands properly (until then, 65 percent had failed to do so). Rates of death from newly contracted infections have plummeted wherever this has been introduced.

  DISASTER 37,000 FEET ABOVE THE AMAZON

  Another striking case of pilot error occurred high above the Amazon in Brazil at 5:01 p.m. on September 29, 2006. A small private jet flying at the wrong altitude clipped a Boeing 737 (Gol Flight 1907) from underneath, sending it into a horrifying forty-two-second nosedive to the jungle below, killing all 154 people aboard. The small American executive jet, though damaged, landed safely at a nearby airport with its nine people alive. Again, the pilot of the small jet seemed less conscious than his copilot when the disaster was upon them, but neither was paying attention when the fatal error was made, nor for a long time afterward.

  The key facts are not in doubt. The large commercial jet was doing everything it was supposed to do. It was flying at the correct altitude and orientation (on autopilot); its Brazilian pilots were awake, alert, and in regular contact with their flight controllers. In addition, they were fully familiar with the plane they were flying and spoke the local language. The only mistake these pilots made was getting out of bed that morning. By contrast, the American crew was flying a plane of this kind for the first time. They were using the flight itself to master flying the craft by trial and error as they went along. Although they had had limited simulation training on this kind of airplane, they did not know how to read the instrument panel and, as they put it while in flight, were “still working out the kinks” on handling the flight management system. When attempting to do so, they could not compute time until arrival or weather ahead, much less notice whether their transponder was turned off, as soon enough it was. They tried to master the airplane display systems, toyed with a new digital camera, and planned the next day’s flight departure. They chatted with passengers wandering in and out of their cockpit. They did everything but pay attention to the task at hand—flying safely through airspace occupied by other airplanes.

  They were, in fact, flying at the wrong altitude, contradicting both normal convention (even numbers in their direction) and the flight plan they had submitted (36,000 feet for the Brasilia–Manaus leg of their trip). But their own error was compounded by that of the Brasilia controller who okayed their incorrect orientation. They had managed to turn off their transponder (or it had done so on its own), so they were flying invisible to other planes and were blind themselves—a transponder warns both oncoming craft of your presence and you of theirs—yet they were completely unaware of this. They were barely in contact with the flight controllers, and when they were, the pilots showed little evidence of language comprehension or of interest in verifying what they thought the controllers were saying (“I have no idea what the hell he said”). They had spoken disparagingly of Brazilians and of the tasks asked of them, such as landing at Manaus.

  Their flight plan was simplicity itself. They were to take off from near São Paulo on a direct leg to Brasilia at 37,000 feet; then they were to turn northwest toward Manaus at 36,000, since planes flying in the opposite direction would be coming at 37,000 feet. They then were to land at Manaus. Automatic pilots would attend to everything, and there was only one key step in the whole procedure: go down 1,000 feet when they made their turn high over Brasilia. This is precisely what the flight plan they submitted said they would do, it was the universal rule for flights in that direction, and it was assumed to be true by the flight bearing down on them from Manaus.

  It was not, however, what they did. Instead, as they made their turn, they were at that moment busying themselves with more distant matters—trying to calculate the landing distance at Manaus and their takeoff duties the next day. This was part of their larger absorption in trying to master a new plane and its technology. For the next twenty minutes, the mistake was not noticed by either the pilots or the Brazilian air controller who had okayed it, but by then the plane’s transponder was turned off and there was no longer clear evidence to ground control of who and where they were. There is no evidence of deception, only of joking around as if jockeying for status while being completely oblivious to the real problem at hand. This is a recurring theme in self-deception and human disasters: overconfidence and its companion, unconsciousness. Incidentally, it was the copilot who seems first to have realized what may have happened, and he took over flight of the plane, later apologizing repeatedly to the pilot for this act of self-assertion. He was also the first to deny the cause of the accident on arrival and provide a cover-up.

  In the example of Air Florida Flight 90, the pilot’s self-deception—and copilot’s insufficient strength in the face of it—cost them their lives. In the case of Gol Flight 1907, both pilots who caused the tragedy survived their gross carelessness while 154 innocents perished. This is a distressing feature of self-deception and large-scale disasters more generally: the perpetrators may not experience strong, nor indeed any, adverse selection. As we shall see, it was not mistakes by astronauts or their own self-deception that caused the Challenger and Columbia disasters but rather self-deception and mistakes by men and women with no direct survival consequences from their decisions. The same can be said for wars launched by those who will suffer no ill effects on their own immediate inclusive fitness (long-term may be another matter), whatever the outcome, even though their actions may unleash mortality a thousand times more intense in various unpredictable directions.

  ELDAR TAKES COMMAND—AEROFLOT FLIGHT 593

  It is hard to know how to classify the 1994 crash of Aeroflot Flight 593 from Moscow to Hong Kong, so absurd that its truth was covered up in Russia for months. The pilot was showing his children the cockpit and, against regulations, allowed each to sit in a seat and pretend to control the plane, which was actually on autopilot. His eleven-year-old daughter enjoyed the fantasy, but when his sixteen-year-old son, Eldar, took the controls, the teen promptly applied enough force to the steering wheel to deactivate most of the autopilot, allowing the plane to swerve at his whim.

 
