In this case, the users' story fits into a set of stories that the technician has heard many times, and so the problem is easy for him to solve. From his accumulated experience, he knows at once what to do. Over time, the particular stories that the technician has come across blend into a common repository of expertise, so that he may not be able to recall the times he has seen this particular pattern before. He sizes up the situation, recognizes a familiar pattern, and knows immediately how to proceed, without any explicit reasoning by analogy. Through such pattern recognition, the technician lessens the burden of coping with new events.5
Other repair problems may be trickier. For instance, recall the story in Chapter One about the copier with the E053 error message, which might mean either a problem in the 24-volt interlock power supply or a shorted dicorotron. In that case, the E053 story may not by itself solve the technician's problem, but it sets the technician on a path that may lead to a solution without wasting time on the wrong diagnosis.
An extensive source of knowledge-sharing stories is Car Talk, the weekly U.S. National Public Radio show (it is also available on the Web at www.cartalk.com). On the show, people ring in with car problems that they're having. The two hosts, brothers Tom and Ray Magliozzi, open the discussion with some jokey questions about the listener's name or the place where they live (“Is that Amy with a ‘y’ or ‘i-e’?”) and in the process, we learn something of the caller's context. Then the hosts move on to the problem that the caller has with the car. The discussion is along the lines of: “So your ‘69 Toyota is making what kind of a noise? … Is it tch-tch-tch or is it tcho-tcho-tcho? … Oh, I see…. And your boyfriend worked on the car? Aha! Now we're getting to the heart of the matter!” The hosts have deep knowledge of cars, as well as of people. The discussion is freewheeling and funny, but focused on solving the problem at hand. The hosts match the story of each caller with stories from their own experience, and in the process they come up with the most likely explanation of what's bothering the callers and their cars. Like most other environments where knowledge is being exchanged, the discussion is edgy, curious, insightful, and lively.6
Everyone takes part in exchanges of knowledge-sharing stories countless times every day. In telling the stories, we are sometimes teaching ourselves what we know and think; sometimes we are telling them to others. The stories, when heard, can become part of the listeners' lived experience. Through the acquisition of this new experience, existing thoughts and beliefs can evolve. This is how we learn, and this is why the transmission of knowledge is largely made up of storytelling.7
When we encapsulate our experience in a story, we include some details from the actual experience, sometimes embellish it with potentially fictional details, and leave out much of the experience altogether. This process is called leveling and sharpening. We do this so that the story doesn't take as long to tell as the original experience took to live, and so that we can give a coherent account of the experience to our listeners. Each time we tell a story, we level and sharpen it in different ways to meet the current context. As the story changes, so our memory of the underlying events changes.8
Easy to overlook, the knowledge-sharing story is the workhorse of narrative—unashamedly unentertaining but eternally useful.
Telling the Knowledge-Sharing Story
Stories focus on anomalies—events that go counter to expectations. When everything goes as you expect—the sun comes up, spring follows winter, the airplane works flawlessly—there's no story. The regular recurring events of our existence are simply the way things are. They are unremarkable. To have the basis for a story, we need something unusual, something different, something out of the ordinary, something strange.
It has been so since time immemorial. The distant puff of smoke might mean a forest fire. The faint roar of a lion might mean an attack on the village. Tiny deviations from the norm attract our attention so we can take preventive action before it's too late. Paying attention to apparent anomalies is one of the reasons that we have survived as a species.
Most of the anomalies that we notice are potential bad news of one kind or another. The stories cited at the start of this chapter were about photocopiers that had broken down or cars that needed repairing. It is when a problem arises that there is something to tell a story about.
Weak signals are a fertile area for knowledge-sharing stories. We can learn a great deal from stories about near misses—for example, two airplanes that almost collided in midair or the terrorist who almost slipped through security. In such cases, there's still time to learn. If we understand the root cause, we may be able to avoid the mistake in future. If we don't pay attention to these weak signals, we may encounter a real disaster—the plane crashes or the bomb explodes—and then it's too late to learn. We may no longer be alive. All that can be done is for the survivors to send in detectives to try to figure out what went wrong and how to prevent it in future.
If we pay attention to minor anomalies, we may be able to prevent them from escalating into large disasters. Peter Senge illustrates the exponential growth of problems with the following fable (paraphrased here):
One morning a farmer observed that a lily pad had sprung up on his pond. The following day there were two lily pads, and on the third day there were four. Since they did not seem to be doing any harm, he took no action, and the number of lily pads continued to double every day. He didn't notice what was happening until the twenty-eighth day, and on the twenty-ninth day he saw that the pond was half covered. He thought about what to do, but it was too late: on the thirtieth day, the pond was completely covered.9
According to the report of the 9/11 Commission, the United States suffered a series of smaller-scale terrorist attacks by al-Qaeda prior to 9/11, so that the attack on that day should have been “a shock but not a surprise.” Here are some of the early warnings:
In February 1993, a group led by Ramzi Yousef tried to bring down the World Trade Center with a truck bomb.
In November 1995, a car bomb exploded outside the office of the U.S. program manager for the Saudi National Guard in Riyadh, killing five Americans and two others.
In June 1996, a truck bomb demolished the Khobar Towers apartment complex in Dhahran, Saudi Arabia, killing nineteen U.S. servicemen and wounding hundreds.
In August 1998, al-Qaeda carried out near-simultaneous truck bomb attacks on the U.S. embassies in Nairobi, Kenya, and Dar es Salaam, Tanzania. The attacks killed 224 people, including 12 Americans, and wounded thousands more.
In October 2000, an al-Qaeda team in Aden, Yemen, used a motorboat to blow a hole in the side of a destroyer, the U.S.S. Cole, killing seventeen American sailors.
The 9/11 attacks on the World Trade Center and the Pentagon were far more elaborate, precise, and destructive than any of these earlier assaults. But by September 2001, the executive branch of the U.S. government, the Congress, the news media, and the American public had received warning that Islamist terrorists meant to kill Americans in large numbers.10
Positive Stories Can Work
It is frequently said that people learn more from failures than from successes. It's also true that people do learn from stories with a positive tone. For instance, Gary Klein tells the following story in Sources of Power:11
In 1996, a physician called Norman Berlinger had to deliver a baby that was diagnosed in the womb as having a large cystic hygroma on the side of his neck. The sonogram suggested that the hygroma had grown inside the neck, wrapping around the trachea, with the implication that the infant would die shortly after delivery because his air passage was blocked. Berlinger's strategy was to pierce the trachea and insert a breathing tube into it.
Upon delivery, the infant gave a cry, suggesting a clear breathing passage. But then the passage sealed up. The infant could not even grunt. Berlinger remembered an earlier situation, when he had been called in to operate on a young man who had run his snowmobile into a strand of barbed wire strung above the ground to discourage trespassers. The wire had jumbled the victim's neck tissue into sausage-like chunks. On that occasion, when Berlinger arrived, he found that the emergency technician had already inserted a breathing tube, and Berlinger had wondered how this was done. The technician later explained that he stuck the tube where he saw bubbles. Bubbles meant air coming out.
So in the delivery room, Berlinger looked into the mouth of the infant for bubbles. All he saw was a mass of yellow cysts, completely obscuring the air passage. No bubbles. Berlinger placed his palm on the infant's chest and pressed down to force the last bit of air out of the infant's lungs. He saw a few tiny bubbles of saliva between some of the cysts and maneuvered the tube into that area. His laryngoscope had a miniature light on its tip, and Berlinger was able to guide the tube past the vocal cords and into the trachea. The infant quickly changed color from blue to a reassuring pink. The procedure had worked.
Just as Berlinger learned from the positive story about looking for the bubbles, human beings have always learned from positive stories. They learned from the positive story that rubbing two sticks together would cause fire. They also learned from the story of the Wright brothers in 1904 that a heavier-than-air machine could fly. In the 1950s, they learned that cheap photographic copying on plain paper was possible. In the 1990s, they learned from stories about the Web that cheap global communication was a reality.
These were all positive anomalies at the time they occurred. Today they are no longer anomalies. Making a fire, flying a plane, making a photocopy, and communicating through the Web have become so commonplace that they are no longer fit subjects for stories unless some additional element is present.
The point is not that we don't learn from positive stories; rather, it is that at any time, negative anomalies far outnumber positive ones. Hence we learn more often from failures than from successes. Positive teaching stories are few compared to negative ones. But what they can teach is immensely valuable.
Knowledge-Sharing Stories Aren't Inherently Interesting
An unusual feature of knowledge-sharing stories is that they don't necessarily follow the principles of the well-made story that Aristotle described more than two thousand years ago in his Poetics. The well-made story has a beginning, a middle, and an end; it has characters and a plot that combines a reversal and a recognition. The storyteller visualizes the action and feels with the characters so that listeners immerse themselves in the world of the story.
Knowledge-sharing stories tend to be about issues and difficulties and how they were dealt with and why the course of action solved the problem. They don't necessarily have a protagonist, that is, a hero or heroine, or even a recognizable plot, let alone a turning point and a recognition.
Think back to the story that began this chapter: the malfunctioning Xerox copier. Or look at the story in Chapter One about the copier with an E053 error code. These are stories in the broad sense of events linked in some kind of causal sequence, but they don't follow the traditional pattern of a well-made story. As a result, they aren't inherently interesting because the human implications aren't made explicit.
To make the sequence of events interesting, you need to graft something scary or exciting onto it, say by adding a hero who undertakes a journey—as is done in business school case studies with varying degrees of success. Klein's Sources of Power gives a brilliantly successful example. Like the E053 story, it features a generally reliable piece of equipment that occasionally gives misleading information. But this time, the story is interesting:
A nurse in a neonatal intensive care unit has been providing primary care for a baby in the isolette next to the baby described here. She has noticed this baby having subtle color changes over a period of several hours…. Then in a matter of seconds, the baby turns blue-black … his heart rate drops but then levels out and holds steady at eighty beats per minute.
She knows immediately that he has suffered a pneumopericardium. Air has filled the sac that surrounds the heart and turned it into a balloon…. The heart is essentially paralyzed. She knows he will die within minutes if the air around his heart is not released. She knows this because she has seen it happen once before, to a baby who was her patient. That baby had died.
Meanwhile, the baby's primary nurse is yelling for X-ray and for a doctor to come and puncture the baby's chest wall. She figures that the baby's lung has collapsed, a common event for babies who are on ventilators, and, besides, the heart monitor continues to show a steady eighty beats per minute. The nurse who first spotted the problem tries to correct her—“It's the heart; there's no heartbeat”—while the team around her continues to point to the heart monitor. She pushes their hands away from the baby and screams for quiet as she listens through the stethoscope for a heartbeat. There is none, and she begins doing compressions on the baby's chest. The chief neonatologist appears, and she turns to him, slaps a syringe in his hand, and says, “It's a pneumopericardium. I know it. Stick the heart.” The X-ray technician calls from across the room that she is right: the baby's pericardium is filled with air. The physician releases the air, and the baby's life is saved.
Afterward, the team talks about why the monitor had fooled them. They realize that the monitor is designed to record electrical events, and it had continued to pick up the electrical impulses generated by the heart. The monitor can record the electrical impulse but cannot show whether the heart is actually beating to circulate blood through the body.12
By embedding the equipment story in a human context and turning it into a traditional well-told story, the narrator makes it not only interesting but moving. In Klein's telling, the raw knowledge-sharing story about misleading information from a piece of equipment has been transformed into a gripping story about the human dynamics of dealing with a life-and-death situation in a hospital. The audience sees the story through the eyes of a heroine who not only knows what to do but also has the courage to contradict the nurse who is providing primary care for the baby in question and to issue a blunt ultimatum to the higher-status neonatologist. And it helps that the story has a happy ending. The focus is on the baby who was saved, not on the baby who died. As a result, the story has a positive tone. We learn not only about the functioning of the misleading equipment but also about what it takes for a nurse to apply that understanding in the real-life tensions of an intensive care unit in a hospital.
By including details that bring the story within the listeners' frame of reference, the teller helps listeners imagine themselves in the story. If the situation is one the listeners have already faced in the past or may face in the future, it becomes personally relevant—and the more personally relevant the story becomes, the more likely it is to be indexed in memory, and the more likely they are to draw from it in future situations. The story about the pneumopericardium is thus easier for a nurse to remember than any list of pneumopericardium symptoms.
Note also that considerable storytelling skill has gone into producing this succinct but gripping story. Klein trains his people for months before they are able to spot potential stories and retell them effectively.13 If you simply gather stories in the field, you will tend to get stories like the Xerox repair stories: bland, useful, and uninteresting—except to those whose lives and livelihoods may depend on the content.
For Knowledge Stories, Include an Explanation
While a knowledge-sharing story often lacks the elements of a well-told story, it also has something that the traditional story lacks—an explanation. Human knowledge, including scientific knowledge, consists primarily of explanations—generic stories that explain the causal relations between a set of phenomena. Facts can be looked up, and predictions alone are merely useful for testing alternative explanations. Explanations provide understanding that allows us to comprehend the past and grasp how the future will unfold.14
Without an explanation, a story about something that has happened is mere information. Thus, the users' story of the malfunctioning copier that opened this chapter was in itself mere information because the users didn't know what was causing the symptoms. The story becomes knowledge once the technician completes the diagnosis and adds the explanation: the reversing roll needs to be repaired.
To build an explanation, define the beginning state, the end state, and the causal factors, and then assemble an action sequence that ties the elements together. Test the sequence for coherence (Do the steps follow from each other?), applicability (Does it account for the end state?), and completeness (Does it pick up everything important?). If it passes these tests, it becomes the explanation of what happened. If the beginning state is a frequently occurring one, it may also become a mental model on which to base future actions.15