The Crash Detectives
The Tenerife accident gave Lauber’s work new energy, and in the years to come, cockpit resource management would be changed to “crew resource management,” in recognition that other flight personnel such as mechanics, flight attendants, dispatchers, and air traffic controllers had a role to play in safe flights. As a bonus, the acronym, CRM, remained the same.
“So many incidents in life as well as [in] other industries have broken down because of the ambiguity in communications,” said Christopher D. Wickens, a professor of psychology specializing in aviation human factors. “CRM is clearly one of the most important things that has developed in aviation over the past forty years.”
CRM was sometimes dismissed by those who called it a sissy notion, an “I’m okay, you’re okay,” touchy-feely exercise. Overall, resistance has subsided, though there remains a challenge in eliminating what Wickens calls the negative authority gradient, in which differences in rank and experience in the cockpit create communication difficulties.
The power divide still restrains lower-ranked or less-experienced pilots from calling errors to the attention of a senior pilot. In a report for NASA, Dismukes and fellow human factors scientist Ben Berman discovered that captains corrected copilots’ mistakes twice as often as first officers spoke up when they saw the captain err.
“The cockpit traditionally was a strict hierarchy; the junior pilot never asked questions. Part of CRM training is to create an environment that, when [the junior pilot] has information that’s critical to the flight, the captain will listen,” Wickens said. Drawing a parallel to how rank is disregarded in safety-critical situations in the military, Wickens explained, “Landing on aircraft carriers, a low-ranking person can be in charge of things because they have the information that everyone needs. Authority shifts dynamically.”
Pointing out errors can make for difficult conversations, and many inhibiting factors were in play on the night in 2009 when a Colgan Air turboprop crashed on approach to the airport in Buffalo, New York. Rebecca Shaw, the twenty-four-year-old first officer, had been flying for Colgan for one year. Marvin Renslow, the forty-seven-year-old captain, had four years with the company but only a hundred hours flying as a captain on the Bombardier Q400. While Shaw sniffled with a head cold and responded with a lot of “uh-huhs,” the captain kept up a nearly one-way dialogue, even on approach to the Buffalo airport. Whether the first officer considered the banter a distraction isn’t clear. She did seem worried about the difficult conditions in which they were flying: at night, in ice. A reading of the CVR suggests she was not inclined to assert herself. Even her apprehension about the ice was less than direct: “I’ve never seen icing conditions. I’ve never deiced. I’ve never seen any . . . I’ve never experienced any of that.” She continued: “I’d have, like, seen this much ice and thought, ‘Oh my gosh we were going to crash.’”
As the plane neared the airport, Renslow mishandled a stick shaker alert warning that the plane was flying too slowly, presumably because it had accumulated ice, though in actuality it had not. Stall protections on the plane pushed the nose down to gain airspeed, but the pilot pulled it back up, exacerbating the problem. The plane crashed into a house near the airport, killing everyone on board and one person on the ground.
It was an entirely different accident in terms of specifics, but a case of the same reticence to speak up, when Asiana Airlines Flight 214 landed short of the runway at San Francisco International Airport on a clear summer day in 2013. A series of misunderstandings about the way the automation worked meant that the flight was coming in too low and too slow, and the decision to go around and try the landing again came too late.
The plane hit a seawall at the edge of the runway bordering San Francisco Bay, slammed onto the ground, and pivoted up before hitting the runway a second time. Lee Kang-guk, the captain, in the left seat, had ten thousand total flight hours on other jets but just thirty-three on the Boeing 777. He was transitioning from the Airbus A320 narrow-body under the supervision of Capt. Lee Jung-min, who was in the right seat.
After the accident, Lee Kang-guk told investigators that he delayed initiating a go-around because he thought “only the instructor captain had the authority.”
How open pilots are to asserting themselves, pointing out the errors of superiors, or acknowledging their own fallibility is highly influenced by culture. In an analysis in the Journal of Air Transportation in 2000, Michael Engle wrote that “there were extreme cultural differences” about whether “junior crewmembers should question the actions of captains” depending on where in the world they were from.
Forty years after the push to improve cockpit interaction and imbue the entire flight crew with a sense of shared responsibility, it seems the techniques work better in societies where individuality is valued more than rank. CRM may need to evolve to take into consideration the vastly different standards people have about interpersonal communication in parts of the world where aviation is experiencing the strongest growth, as in Asia, the Middle East, and South America.
The crashes of Colgan 3407 and Asiana 214 also shine a light on a hydra of issues that arrived like stowaways on the digital airplane: automation, complexity, and complacency.
Evolution
In the early days of flight, the cockpit was a busy and crowded place. The crew complement on the Hawaii Clipper in 1938 consisted of captain, first officer, second officer, third officer, fourth officer, engineer, assistant engineer, and radio operator—eight people required to fly six passengers. Each new generation of airplane incorporated advances that performed tasks formerly accomplished by the pilot, only better and faster. To fly as Wright had done meant to operate the machine with one’s body and engage with one’s senses. Each new advance made piloting less physical and more cerebral.
A normal crew consisted of three when Robert Pearson got his first job as an airline pilot in 1957. He was a first officer on the DC-3 for Trans-Canada Air Lines, which would become Air Canada in 1965. After flying the four-engine British Vickers Viscount, the DC-9, and the Boeing 727, Pearson was a forty-seven-year-old captain in 1983 when Air Canada went out and bought the world’s most modern jetliner, the two-engine wide-body Boeing 767. This airplane was radically different because of the incorporation of technology that eliminated the need for a third pilot. The flight engineer (sometimes called the second officer) had been responsible for supervising the airplane’s fuel, hydraulics, pneumatics, and electrical systems. But with the computers on the 767, the plane could monitor itself and present all that information to the pilots in bright, graphic, easy-to-read flight management system monitors.
The Boeing 767 was one step ahead of Airbus, which was producing an even more radical airplane, the first-generation fly-by-wire airliner that would put a computer between the flight controls and the control surfaces and create a protective flight envelope outside of which the pilot could not fly.
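The idea of “a protective flight envelope outside of which the pilot could not fly” can be sketched in a few lines of code. The limits and function below are hypothetical, chosen only to illustrate a computer standing between the pilot’s input and the control surfaces; they are not the control laws of any actual aircraft.

```python
# Conceptual sketch only, not any manufacturer's actual control laws: in a
# fly-by-wire design, the pilot's inputs go to a computer, which limits the
# commanded attitude so the aircraft stays inside a predefined flight envelope.

MAX_BANK_DEG = 67        # illustrative envelope limits chosen for this example
MAX_PITCH_UP_DEG = 30
MAX_PITCH_DOWN_DEG = -15

def protected_command(requested_bank_deg, requested_pitch_deg):
    """Clamp the pilot's requested attitude to the protected envelope."""
    bank = max(-MAX_BANK_DEG, min(MAX_BANK_DEG, requested_bank_deg))
    pitch = max(MAX_PITCH_DOWN_DEG, min(MAX_PITCH_UP_DEG, requested_pitch_deg))
    return bank, pitch

# A pilot demanding a 90-degree bank and a 45-degree climb gets only what
# the envelope allows:
print(protected_command(90, 45))    # (67, 30)
```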
The digitization of flight started a new era, but were the airlines ready?
In February 1983, Pearson began a four-week course to qualify on the 767: two weeks of ground school and two weeks of flying the simulator. By April he was a captain. Sitting in the left seat, gazing at the array of gadgetry, he noted how many manual functions were now handled by the computer. “What did I know about computers? My experience with computers was using a Royal Bank of Canada ATM,” he said. He was about to have a near-catastrophic experience on the Boeing 767, the origin of which was in not understanding the basics of the new airplane’s technology.
On July 23, 1983, Pearson and First Officer Maurice Quintal were assigned to fly one of Air Canada’s new 767s from Montreal to Edmonton. Due to a series of misunderstandings, the ground crew calculated the amount of fuel to load on the airplane by converting fuel volume to pounds, which is how they filled the other airplanes in the Air Canada fleet. But the 767’s fuel system used kilograms. Since a pound is less than half a kilo, the error meant that only half the required fuel was pumped into the tanks for the four-and-a-half-hour transcontinental flight. The fuel quantity display was not working, so the crew manually entered the number 22,300 into the flight computer—without realizing that the plane’s computer would consider it 22,300 kilos, or twice as much fuel as it actually contained. With the crew thinking the aircraft had enough fuel for the journey and then some, the plane departed.
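The arithmetic behind the shortfall is simple enough to sketch. The snippet below is only an illustration of the pound-for-kilogram confusion as described above; the 22,300 figure comes from the account, the conversion factor is the standard one, and the real fueling calculation that day involved more steps than shown here.

```python
# Back-of-the-envelope sketch of the mix-up described above.

KG_PER_POUND = 0.4536

entered_value = 22_300                                  # number keyed into the flight computer
fuel_computer_assumed_kg = entered_value                # the computer read it as kilograms
fuel_actually_aboard_kg = entered_value * KG_PER_POUND  # but the tanks held pounds' worth, ~10,115 kg

print(f"Computer believed aboard: {fuel_computer_assumed_kg:,} kg")
print(f"Actually aboard:          {fuel_actually_aboard_kg:,.0f} kg")
print(f"Ratio: {fuel_actually_aboard_kg / fuel_computer_assumed_kg:.0%}")  # about 45 percent
```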
Neither the pilots nor the fuelers realized their error, and the 767 no longer had a flight engineer managing the system whose job it would have been to ensure that the plane had the correct amount of fuel for the journey. “If everyone is trained and the lines are drawn as to who is responsible for what, there’s no ambiguity,” said Rick Dion, an executive with Air Canada maintenance who was a passenger on the flight. “In this case it was sort of open-ended. We weren’t aware who was responsible for the final say on this fuel stuff.”
Flight 143 was flying at forty-one thousand feet, about one hundred miles short of Winnipeg, when the first engine ran out of fuel, followed closely by the second. Without the engines to generate power, the pilots lost their flight deck instruments. They were seventy-five miles from the nearest airport. The riveting story of how experience and teamwork saved the day follows in part 5 of this book. The lesson here comes from Pearson, who said he and others learned that day that they were unprepared for the monumental leap in technology—and this from a man who had literally flown into the jet age.
“Transitioning from the noncomputer age to the computer age was more difficult than transitioning from propeller planes to jets, and it wasn’t because they flew twice as high and twice as fast. It was all the big unknowns,” he said.
After years of accidents attributable to pilot error, automating some functions was intended to make flying more precise, more efficient, and of course safer. A look at the decline in the rate of air accidents since the arrival of the digital airplane shows the benefits. The number of crashes resulting in the loss of the airplane, known as a “hull loss,” has remained stable over the years, while the number of flights increased from half a million a year in 1960 to nearly thirty million in 2013. The third and fourth generation of automated airplanes, those with digital displays and computers that protect the airplane from maneuvers outside a predetermined range of safe flight parameters, are even more effective.
Automation’s downside is that it creates both complexity and complacency. The complexity can cause pilots to misunderstand what the airplane is doing or how it works. It was complexity that caused half a dozen Air Canada employees to be unable to calculate how much fuel to pump into Flight 143. It was the opacity of the system that led the pilots to think that by entering the amount of fuel they thought had been loaded into the tanks, they would get an accurate reading of the fuel available for their flight. Recognizing the mistake afterward, Pearson said he understood for the first time the expression “Garbage in, garbage out.”
Considering how automation can lead to confusion, it is a paradox that it can also contribute to crew obliviousness. With the L-1011 on autopilot, all three men on Eastern Flight 401 turned their attention away from the controls to work on changing a lightbulb. More recently, a Northwest Airlines flight from San Diego to Minneapolis made headlines around the world when the pilots got so wrapped up working on their laptops that they flew past their destination.
Flight 188 was one hundred fifty miles beyond Minneapolis International Airport with 144 passengers in October 2009 when a flight attendant called the pilots, curious to know why the plane had not begun its descent. For fifty-five minutes, the pilots had failed to acknowledge radio calls from air traffic control in Denver and Minneapolis or calls from the flight crew of another Northwest plane. To this day, people suggest that the pilots must have fallen asleep, because how else could they have missed hearing all the people calling them on the radio?
Robert Sumwalt was a member of the NTSB at the time. A former airline pilot, he was familiar with the troublesome issue of complacency. In 1997, Sumwalt and two others went through anonymous pilot reports and found that failure to adequately monitor what the airplane was doing was a factor in one-half to three-quarters of air safety events. Between 2005 and 2008 an airline industry group found sixteen cases similar to Northwest Flight 188, including one in which a captain returned from the bathroom and found the first officer engaged in a conversation with the flight attendant. The copilot’s back was to the instruments, so he did not notice that the autopilot had disconnected and the plane was in danger of stalling. After losing four thousand feet of altitude, the captain was able to recover control of the airplane.
When Flight 188 made headlines, then-FAA administrator Randy Babbitt got on the evening news and castigated the flight crew. He pointed out that the Northwest pilots were on their laptops doing work unrelated to the flight, a prohibited activity. “It doesn’t have anything to do with automation. Any opportunity for distraction doesn’t have any business in the cockpit. Your focus should be on flying the airplane.”
Tough talk sounds good, especially when stories such as that of Northwest Flight 188 get blasted all over the news, making air travelers nervous. Still, telling pilots to pay closer attention is too simple. It may not even be possible to give unrelenting focus to routine tasks, according to Missy Cummings, a systems engineer and director of Duke University’s Humans and Autonomy Lab. “The human mind craves stimulation,” she said. Failing to find it, the mind will wander.
Cummings, a former navy F-18 pilot, is a proponent of automation, and envisions a future with more of it, not less, if the problems identified by one of her former students at Massachusetts Institute of Technology can be resolved.
While working on her master’s at MIT, Christin Hart Mastracchio conducted a study showing that when automation reduces workload too much, vigilance suffers. “Boredom produces negative effects on morale, performance, and quality of work,” she found. Now an air force captain at Minot Air Force Base in North Dakota, Mastracchio is a pilot on the sixty-year-old, eight-engine B-52 Stratofortress, and automation is not her problem.
“The B-52 is on the opposite end of automation. It takes five people just to fly it,” she told me. “It takes all of us working together to control the monstrosity. You need to find a center point where you have the right amount of automation.”
On the day that Asiana 214’s Lee Kang-guk was making his first approach to the San Francisco airport while training to be a captain on the Boeing 777, an electronic navigational aid that would normally have been used was down for maintenance. Since July 6, 2013, was a clear, sunny day, this might not have been a problem for many pilots, but it’s the practice of some airlines, including Asiana, to use automation all the time. After the plane’s crash landing, Capt. Lee Kang-guk told investigators he’d found making a visual approach “very stressful.” It was “difficult to perform,” he said, in the absence of the electronic system that tracks a plane’s glide path.
Capt. Lee Kang-guk was an experienced captain on the Airbus, though it is important to remember that he had just thirty-three hours on the Boeing 777. On approach, he made a series of errors while trying to get the airplane on the glide path to the airport. He and the pilot supervising him, Capt. Lee Jung-min, never even discussed doing a go-around, despite company policy requiring one when a plane was not at the appropriate height or speed approaching five hundred feet. In one respect, Capt. Lee Jung-min, with twelve thousand hours, was like Lee Kang-guk: it was his first flight as a training captain.
All these factors and others played a part in the accident. In its report, the NTSB concluded, “More opportunity to manually fly the 777 during training” would help pilots perform better.
In the process of writing this book, I had the chance to listen to a familiar story told from a new perspective. The tale begins the week prior to the historic flight of Orville and Wilbur Wright. Samuel Langley, the head of the Smithsonian Institution, had been given a government grant of $50,000 (equivalent to $1.2 million in 2016) to develop a powered airplane. Throughout the summer of 1903 he had been tinkering with this one-man contraption called the Great Aerodrome. It was to be catapulted from a track mounted on a houseboat in the Potomac River, but prototype flights had not gone well.
On December 8, 1903, with Langley’s assistant, test pilot Charles Manly, on board, the Great Aerodrome was pushed from the top of the boat, but it never became airborne. It collapsed and fell into the icy waters of the river. Discouraged, Langley gave up. Sixty-nine at the time, he may never have expected to live long enough to see man fly. Yet nine days later, the Wright Brothers made history with a twelve-second controlled flight at Kitty Hawk in North Carolina. That’s the story I knew.
The goose bump–inspiring and brilliant postscript was presented to me by John Flach, a professor and chair of the Department of Psychology at Wright State University in Dayton, Ohio: “Christine, the Wright Brothers learned that for a plane to work, it had to put control in the hands of humans. That’s a metaphor.”
It is a metaphor appropriate for aviation’s first century. But what about the second? Flying has gone from the days of the Wright Brothers controlling the plane by shifting their weight to pilots who sit at keyboards typing instructions that command a complex system of computers. For a while the debate has been over who or what does the job better, the human or the machine. What is emerging is that each does the job differently.
“The computer is a rule-based system,” Flach told me. “What it means to be reasonable and human is to break the rules. A computer will continue to do its computing while the building burns around it. A human will adapt to the situation.”