Human Error

by James Reason


  The Psychopathology of Everyday Life first appeared as a separate volume in 1904 and was subsequently enlarged over the next 20 years. It became one of Freud’s most popular books, running into many editions. It is said that Freud first became aware that he was famous when he discovered his cabin steward reading this book on his voyage to America in 1909. The core ideas and many of the best examples were later expressed more succinctly (and more convincingly) in the Introductory Lectures on Psychoanalysis, given at the University of Vienna between 1915 and 1917 and later published (Freud, 1922). Thereafter, the phrase ‘a Freudian slip’ entered the language, and the view that errors were unconsciously determined became firmly lodged, for good and ill, in scientific and popular belief (see Reason & Mycielska, 1982, for a further discussion of the Freudian slip).

  1.3. Meringer and speech errors

  Although Paul (1880) was probably the first linguist to appreciate the significance of speech errors as clues to the covert mechanisms underlying speech utterance, it was Freud’s compatriot, Meringer, who undertook the first major study (Meringer & Mayer, 1895). By 1908, he had assembled a corpus of over 8,000 slips of the tongue and pen, a collection that is still being picked over by contemporary researchers (see Fromkin, 1973, 1980). One reason why this corpus is still in use is that Meringer was extremely scrupulous about his methods of data collection and was at great pains to avoid selectional bias. His relentless questionings of speakers whose tongues slipped in his presence became so notorious at the University of Vienna that rooms would empty at his approach.

  1.4. James, Münsterberg and Jastrow

  Three important developments occurred in the United States during this turn-of-the-century period. In 1890, after 12 years of labour, William James completed The Principles of Psychology. Not only did these two volumes provide some unparalleled descriptions of everyday cognitive failings, they also contained, in the chapters on habit, memory and will, nearly all the necessary elements of a theory of human error. James himself did not formally attempt such an enterprise, but his thinking has had a profound influence on almost everyone who has. While preachers traditionally take their texts from the scriptures, contemporary psychologists, especially those with an interest in the cognitive failings of everyday life, frequently look for theirs in the writings of William James. One text in particular will see extensive service throughout this book: “Habit diminishes the conscious attention with which our acts are performed” (James, 1890, p. 114).

  In a characteristically generous and farsighted manoeuvre, James contrived to have Hugo Münsterberg, originally a somewhat undervalued student of Wundt’s at Leipzig, appointed as professor of psychology at Harvard, while he himself reverted to his earlier status as professor of philosophy. Among Münsterberg’s many projects at Harvard was an investigation of the unreliability of eye-witness testimony. His book, On the Witness Stand: Essays on Psychology and Crime (1908), remains an early classic in this now active field of research. It is also interesting to note that research into ‘the psychology of testimony’ was of sufficient interest to warrant regular literature surveys in the early issues of the Psychological Bulletin (see Whipple, 1910, 1911).

  In 1905, Joseph Jastrow, professor of psychology at the University of Wisconsin, published an analysis of some 300 ‘lapses of consciousness’ collected from his students. This was the first systematic attempt to investigate slips of action (as distinct from slips of the tongue) and stressed the necessity of some kind of attentional intervention in order to prevent action sequences from deviating along habitual but unintended routes (see Reason, 1984a; Norman, 1981). Throughout the remainder of his long working life, Jastrow continued to be fascinated with human error and especially with those factors that lead to and sustain wishful rather than wise thinking.

  1.5. The Gestalt tradition

  The next major influence upon the study of error resulted from the work of Max Wertheimer, Wolfgang Köhler and Kurt Koffka, from whose exchange of ideas at Frankfurt in 1912 emerged the new Gestalt psychology. Following the example of William James, these three young men were driven by an evangelical urge to rescue psychology from the Wundtian domination of elemental sensations, feelings and images and from the associative ties that bound these ‘atoms’ into mental ‘compounds’. They strove to replace this ‘mental chemistry’ with a concern for phenomenal wholes. Psychological phenomena, they argued, were always more than the sum of their constituent parts. Parts did not determine wholes, but were determined by them.

  Their first demonstrations of this thesis involved various perceptual phenomena. Sense impressions, they maintained, are not passive photographic impressions; rather, they are actively shaped by the observer’s interests and by his or her innate tendency to make the parts of the figure fit a uniform whole. In particular, small irregularities are overlooked. As a result, perceived objects conform more to the Gestalt principles of ‘good figure’ than the sense data alone would justify. Later, these principles were extended into the field of memory and gave rise to specific predictions about the ways in which remembered figures become distorted with time.

  Although it was not their primary aim, the Gestalt psychologists were among the first to formulate a testable theory regarding human error mechanisms. While its specific predictions were not always borne out (Woodworth, 1938; Riley, 1962; Baddeley, 1968) and the physiological basis of the theory found little support, the influence of the Gestalt psychologists upon contemporary thinking about cognitive function has been immense. To a considerable extent, this impact resulted from the work of two later individuals: Kurt Lewin, in Germany and later the United States, and Sir Frederic Bartlett at Cambridge. Each carried the Gestalt tradition into somewhat different spheres: Lewin into the fields of action, decision making, motivation and personality (and, through his students Zeigarnik and Luchins, into memory and problem solving, respectively) and Bartlett into the study of remembering, thinking and skilled performance.

  1.6. The neuropsychologists: Lashley and Head

  One of the few positive outcomes of the First World War was the boost it gave to the description and assessment of psychological disorders associated with various types of brain damage. Two leading figures of this period were Karl Lashley in the United States, and Sir Henry Head in Britain.

  An enduring question in psychology is how we achieve the relatively automatic performance characteristic of skilled or highly practised behaviour. William James (1890) argued for ‘response-chaining’: the nonattentional control of habitual sequences by behavioural feedback. Once established, each movement in the sequence is triggered by the kinaesthetic feedback generated by the preceding movement. The only conscious involvement necessary is starting the sequence; the remaining actions are under peripheral or feedback control.

  In 1917, Lashley made some clinical observations of a soldier whose spinal injury had resulted in almost complete anaesthesia for one leg. Although not perfectly coordinated, this man’s leg movements showed normal accuracy in both direction and extent. Lashley argued that since the spinal injury had effectively removed proprioceptive feedback, the control of these movements must reside centrally in the brain, and that these preprogrammed instructions were run off independently of feedback. This ‘centralist’ or feedforward theory of motor control has proved extremely influential in shaping contemporary views of the serial organisation of both skilled performance (see Keele, 1973) and its characteristic errors (Lashley, 1951).
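
  To make the contrast concrete, here is a minimal Python sketch. It is purely illustrative (the movement names, numbers and function names are invented for this example, not drawn from James or Lashley): under feedback control, removing sensory feedback breaks the chain after the first movement, whereas a feedforward program runs to completion regardless.

```python
# Contrast between James's response-chaining (feedback) view and
# Lashley's central motor program (feedforward) view.
# All names and values are hypothetical, for exposition only.

SEQUENCE = ["lift", "swing", "plant", "push_off"]

def feedback_control(sequence, feedback_available=True):
    """Each movement is triggered by kinaesthetic feedback from the
    previous one; conscious involvement is limited to initiation."""
    performed = []
    trigger = "start"                      # the consciously initiated start
    for movement in sequence:
        if trigger is None:                # no feedback, no trigger: chain halts
            break
        performed.append(movement)
        trigger = movement if feedback_available else None
    return performed

def feedforward_control(sequence):
    """Preprogrammed instructions are 'run off' centrally,
    independently of any peripheral feedback."""
    return list(sequence)

# Lashley's deafferented soldier: feedback removed, movement largely intact.
print(feedback_control(SEQUENCE, feedback_available=False))  # ['lift'] only
print(feedforward_control(SEQUENCE))                         # full sequence
```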

  Head (1920) was similarly concerned with the effects of traumatic deafferentation upon postural control. His observations led to the introduction of the now widely used concept of the ‘schema’ to explain how postural impressions are matched against some internalised model of body position. “Every recognisable [postural] change enters consciousness already charged with its relation to something that has gone before... For this combined standard, against which all subsequent changes of posture are measured before they enter consciousness, we propose the word schema” (Head, 1920, pp. 605-606). This notion was further developed by Bartlett (1932). Few concepts have proved so useful in explaining the occurrence of a wide variety of systematic error forms.

  1.7. Bartlett and ‘schemata’

  Bartlett (1932) invoked the notion of schema (schemata in the plural) to explain systematic errors that were apparent in the recall of pictorial and textual material. He found that reproductions made from memory were more regular, more meaningful and more conventionalised than the original stories or drawings. Odd or uncommon features of the to-be-remembered material were ‘banalized’ to render them more in keeping with the person’s expectations and habits of thought. Bartlett believed that his subjects were unconsciously attempting to relate the new material to established knowledge structures or schemata. In his enduring phrase, they manifested “effort after meaning”. These reconstructions were sometimes the result of conscious strategies, but more often they were unconscious processes.

  A schema was defined by Bartlett (1932, p. 201) as “an active organisation of past reactions, or of past experiences, which must always be supposed to be operating in any well-adapted organic response. That is, whenever there is any order or regularity of behaviour, a particular response is possible only because it is related to other similar responses which have been serially organised, yet which operate, not simply as individual members coming one after another, but as a unitary mass.”

  Bartlett emphasised three fundamental aspects of schemata: (a) that they were unconscious mental structures (“schema are active, without any awareness at all”), (b) that they were composed of old knowledge (“They are masses of organised past experiences.”), and (c) that long-term memory comprised active knowledge structures rather than passive images. Thus, schemata reconstructed rather than reproduced past experiences. And this process leads to certain predictable biases in remembering, due in large part to “the tendency to interpret presented material in accordance with the general character of earlier experience.”
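
  Bartlett offered no formal model, but the reconstructive idea is easy to caricature in code. The Python sketch below is a hypothetical rendering (the schema, its contents and the ‘familiar’ set are invented, not Bartlett’s own formalism): recall fills gaps with schema defaults and banalizes unfamiliar details toward expectations.

```python
# A toy 'restaurant visit' schema: organised past experience stored as
# default expectations. All contents are invented for this example.
restaurant_schema = {"greeted_by": "waiter", "read": "menu",
                     "ate": "meal", "paid": "bill"}

# Details this hypothetical rememberer has encountered often enough to keep.
familiar = {"waiter", "menu", "wine list", "meal", "bill"}

def recall(schema, episode):
    """Reconstruction, not reproduction: gaps are filled with schema
    defaults, and odd details are 'banalized' toward expectations."""
    memory = {}
    for slot, default in schema.items():
        detail = episode.get(slot, default)   # missing detail -> default
        memory[slot] = detail if detail in familiar else default
    return memory

episode = {"read": "wine list", "paid": "in seashells"}  # partial and odd
print(recall(restaurant_schema, episode))
# {'greeted_by': 'waiter', 'read': 'wine list', 'ate': 'meal', 'paid': 'bill'}
```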

  Like his Gestalt contemporaries, Bartlett emphatically rejected the earlier atomistic view of mental processes. In particular, he opposed the view, originating with Ebbinghaus, that each memory trace retained its own essential individuality. Since this was very much the orthodoxy of the thirties, Bartlett’s ideas fell largely upon stony ground. It was not until the mid-1970s that the schema notion was disinterred and achieved its current prominence in psychological theorising as a term to describe the higher-order, generic cognitive structures that underlie all aspects of human knowledge and skill.

  1.8. The doldrums

  With the end of the First World War, German psychology, once the dominant tradition, was either spent or scattered (happily, it is now restored). The centre of psychological influence moved to the United States, which had recently come under the sway of John B. Watson and the animal learning theorists. This marked the beginning of a 40-year ‘ice-age’ in which mentalism went underground or was pushed to the fringe, and notions like volition, consciousness, imagery, intention and purpose largely vanished from the prestigious journals.

  The upshot of this bloodless civil war—whose Fort Sumter was the publication of Watson’s Behaviourist Manifesto in 1913—was that psychology came to be ruled by two quite distinct and mutually hostile forces. The behaviourists dominated academic psychology, and the psychoanalysts exercised a powerful influence over the rest, particularly the lay mind and the popular press. Neither camp had much interest in the deviation of action from intention. The behaviourists would not admit of intention, at least not as something having any scientific worth, and the psychoanalysts denied the deviation, in keeping with Freud’s assertion that errors are symptomatic of some unconsciously held wish.

  With a few notable exceptions—Spearman, Lewin, Bartlett and the social psychologists—serious interest in the nature and origins of human error languished for nearly 20 years, until it was revived by both the needs and the ‘clever’ machines of the Second World War. These brought in their train the information-processing approach to human cognition. From the vantage point of the late 1980s, it is possible to distinguish two distinct variants of this general approach: the natural science and the cognitive science traditions.

  2. The natural science tradition

  2.1. Focused attention and ‘bottleneck’ theories

  A major theoretical concern of the 1950s and 1960s was the location of the ‘bottleneck’ in human information processing. At what stage did a parallel processing system, capable of handling several inputs at the same time, become transformed into a serial system through which only one set of signals could be handled at any given moment? This question presumed that humans act as a single communication channel of limited capacity at some point in the information-processing sequence.

  One group of investigators maintained that the selection occurs at an early perceptual stage (Broadbent, 1958; Treisman, 1969). Others, the ‘late-selection’ theorists, claimed that the bottleneck is located at the point where decisions are necessary to initiate a response (Deutsch & Deutsch, 1963; Norman, 1968; Keele, 1973). This could be either an overt motor response or a covert response concerned with storing material in long-term memory. There were also compromise theories that argued that there could be more than one bottleneck within the processing system (Kerr, 1973; Posner & Snyder, 1975).
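
  The rival architectures can be caricatured in a few lines of code. The Python sketch below is a deliberate simplification (the stage names and messages are invented; neither theory was ever stated this way): the two accounts differ only in whether semantic analysis comes before or after the point of selection.

```python
# Caricature of the bottleneck debate: where does the limited-capacity
# stage sit? Stage names and messages are invented for exposition.

def analyse_meaning(message):
    return f"meaning({message})"            # stand-in for semantic processing

def early_selection(channels, attended):
    """Broadbent-style: filter on physical features first; only the
    attended channel ever receives semantic analysis."""
    selected = channels[attended]           # select by ear/voice/location
    return analyse_meaning(selected)

def late_selection(channels, attended):
    """Deutsch & Deutsch-style: every channel is analysed for meaning in
    parallel; the bottleneck sits at response selection."""
    analysed = {name: analyse_meaning(msg) for name, msg in channels.items()}
    return analysed[attended]               # bottleneck only at the response

channels = {"left": "own name mentioned", "right": "shopping list"}
print(early_selection(channels, "right"))   # unattended meaning never computed
print(late_selection(channels, "right"))    # computed, then discarded
```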

  I will not review the current state of this rather arid debate (see Broadbent, 1982; Wickens, 1983). For our purpose, it is sufficient to note that both early and late selection theories still have their passionate advocates, though the weight of the evidence seems to favour the latter view. Needless to say, data exist that embarrass both camps.

  A number of pioneering studies carried out during the early years of the century (see Woodworth, 1938) showed that people are surprisingly good at focusing their attention upon a given task and ignoring irrelevant events in their immediate surroundings. Whereas these early experiments typically compared task performance with and without distractors, modern studies of focused attention have usually employed the simultaneous presentation of two or more sources of information and examined the subjects’ ability to process one of these selectively. A commonly used technique has been the dichotic listening task, in which two messages are presented by earphones to different ears and the person is required either to ‘shadow’ (to repeat every word on the attended channel) or to monitor (to detect particular target signals) the message stream emanating from a previously identified source. Other studies have employed brief presentations of visual stimuli organised in complex arrays or ‘overlapping’ videorecordings of two different activities (i.e., where both are visible on the screen at the same time).

  These modern studies confirmed the earlier findings: that people are indeed very good at processing one of two physically distinct concurrent sources of information, particularly when there is no ambiguity about the to-be-shadowed channel. But this selectivity is not perfect. Certain types of information from the nonselected channel can break through. These ‘breakthroughs’ depend upon both the physical and the semantic properties of the unattended message.

  With regard to physical properties, people can usually tell if the unattended message was a human voice or a noise, and if the former, whether the speaker was a man or a woman. They can also detect gross alterations in the physical character of the irrelevant source: sudden changes in the voice of the speaker, a switch from a voice to a tone or an isolated sound (see Moray, 1969). In addition, breakthrough rates increase as the spatial separation between the signal sources diminishes (Treisman, 1964).

  Other kinds of breakthrough are clearly determined by the content of the unattended message. The attentional mechanism appears to be tuned to detect certain types of signals, regardless of whether they are on the selected or nonselected channel. Common intrusions are the presence of one’s own name, the name of a recently visited country or the title of a book by an author known personally to the listener.

  Yet other breakthroughs can occur when words on the rejected channel fit into the context of what has just been processed on the selected channel. The implication is that the content of the selected message primes the attentional mechanism so as to make subsequent words of appropriate content more likely to receive conscious processing, even when they occur on the rejected channel. But this contextual tuning does not seem to work in reverse; the content of a rejected message has no effect on later performance (Broadbent, 1971).
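
  One way to tie these breakthrough findings together is in attenuation-style terms: unattended input is weakened rather than blocked, and items with permanently low thresholds (such as one’s own name) or thresholds temporarily lowered by context can still reach awareness. The Python sketch below is an illustrative rendering of that idea; the thresholds, weights and word lists are all invented.

```python
# Attenuation-style account of 'breakthrough' from the rejected channel.
# Thresholds, weights and word lists are invented for exposition.

AWARENESS_THRESHOLD = 1.0
SALIENT = {"own_name"}        # items with permanently lowered thresholds
ATTENUATION = 0.4             # unattended input is weakened, not blocked

def breaks_through(word, attended, context):
    """Does a word receive conscious processing?"""
    strength = 1.0 if attended else ATTENUATION
    if word in SALIENT:
        strength += 0.8       # e.g. hearing one's own name
    if word in context:
        strength += 0.7       # primed by the selected message's content
    return strength >= AWARENESS_THRESHOLD

context = {"bridge", "river"}  # just processed on the selected channel
for word in ["own_name", "bridge", "turnip"]:
    print(word, breaks_through(word, attended=False, context=context))
# own_name True, bridge True, turnip False
```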

  Finally, these selectivity experiments demonstrate that switching attention takes time. Selectivity is degraded with brief auditory messages presented dichotically. Performance is poor when people are given single pairs of words simultaneously, and are directed to attend to one ear. With longer messages, however, precueing is more advantageous. It seems likely that selection is most effective when the content of the attended message is coherently organised to permit the serial grouping of signals and some preview of future inputs (Kahneman, 1973). By the same token, time is also needed to redirect the focus of attention after a period of selective listening. Gopher and Kahneman (1971) showed that listeners are particularly susceptible to intrusions from the previously selected channel for some seconds after having been told to switch attention to the other ear. There are wide individual differences in proneness to such errors, and there is some evidence to show that characteristic switching rates, as measured by laboratory tests, are related to on-the-job proficiency and safety records of professional drivers (Kahneman, 1973).

 
