by James Reason
and priming, 101-3
schemata effect, 100-1
convergent memory searches, 110-12
corpus gathering, 13-14
‘creeping determinism,’ 91, 215
crime and intention, 7
‘cross-talk’ errors, 29
Davis-Besse nuclear power plant, 54, 237-8
decision making
computer model, 144-6
generic error-modelling system, 65-6
intelligent support systems, 237-9, 242
normative theories, 36-40
in productive systems, 203-4
declarative knowledge, 129
‘default assignments,’ 34
‘default values,’ 66
‘defence in depth’ principle, 179-80
DESSY studies, 91-2
detached intentions, 71
detectability of error (see also error detection), 60
developmental perspective, 79-81
diary records, 14
dichotic listening, 27-8
digital computers, 30
distributed processing models, 30-1
divergent memory searches, 110-12
divided attention, 28-9
‘do nothing’ method, 163
‘double-capture’ slips, 68-70
dual-task interference, 29
dynamic environmental simulation system (see also DESSY), 91-2
ecological interface design, 246-8
encysting, 93
environmental ‘capture,’ 71
epistemic awareness, 113-14
error classification, 10-12
error detection, 148-72
classification, 159
cognitive processes, 167-71
data-driven aspects, 159
and error type, 9-10, 158-61
feedback control, 149-50
forcing functions, 161-2
frequency biases, 170-1
Italian studies, 160-1
mistakes vs. slips, 60
in problem solving, 9-10, 157-66
Swedish studies, 158
error detection rates, 165-7
error forms, 13, 97-124
error management, 244-6
error prediction, 4
error probabilities, 222-3, 229
error proneness questionnaires, 239
error reduction, 217-50
error risk, 217-50
error tolerant systems, 233, 246
error types (see also knowledge-based errors; rule-based errors; skill-based errors)
and cognition, 12-13
error detection relationship, 166-7
execution failures, 8
experimental studies of error, 15-16
expertise and error type, 57-8
eye-witness testimony, 23
failure tokens, 205, 210
failure types, 205, 210
fault diagnosis training, 241-3
FAUST group, 244
feedback control
error detection, 149-50, 165
and knowledge-based mistakes, 165
production vs. safety, 203-4
safety operations, 209-10
and training errors, 245-6
feedback delay, 91-2
feedforward, 24-5
feelings of knowing, 113-14
focal working memory, 127-33
focused attention, 26-8
forcing functions, 161-2
‘fragment theory,’ 103
frames and schemata, 34
frequency biases
and cognitive underspecification, 103-10
and divergent memory, 111
error detection, 170-1
error form origin, 97
knowledge retrieval, 115-23
machine model of, 131-6
and schema concept, 123
‘frequency paradox,’ 104
Freudian slip, 21-2
frontal lobe damage, 108
fundamental attribution error
in institutions, 212-13
and irrationality, 41
fundamental surprise error, 212-14
‘fuzzy rule’ model, 44-6
‘gagging’ device, 163
GEMS (see also generic error-modelling system), 61-8
General Problem Solver (Newell and Simon), 42
general problem-solving rules, 78-9
generalisations and cognition, 39
generic error-modelling system (GEMS), 61-8
Gestalt tradition, 23-4
Ginna nuclear power plant, 55, 237
global workspace model (Baars), 47-8
groupthink syndrome, 41
habit intrusions
and blocked memory, 105-6
and error prediction, 5, 107
skill-based errors, 68-70
halo effect, 90
Herald of Free Enterprise, 193-4, 256
heterarchical models, 30
hindsight bias (see also ‘knew-it-all-along’ effect), 38, 91, 215
human cognitive reliability model, 225-6, 231
human error probabilities, 222-3, 229
human-interactive system, 175-6
human reliability analysis
peer review study, 230
qualitative criteria, 232-4
systematic comparisons, 231-2
techniques, 221-9
validity, 229-31
illusion of control, 91
illusions, Sully’s analysis, 20-1
illusory correlation, 90
immediate memory, 31-2
impossible accident, 216
inferences
functions, 116
knowledge retrieval, 116-19
INPO analysis, 186-7
instantiated schema, 35
intellectual emergency reaction, 93
intelligent decision aids, 237-9
intentional action, 7-10, 52, 100
intentions
algorithm for intentional action, 6
and error definition, 5-10
in fallible machine, 131-3
and schemata, 52, 100
skill-based errors, 71-2
varieties of, 5-6
interference errors, 72
InterLisp system, 164
intuitive physics, 82
involuntary actions, 7-10
irrationality, 41-2
Jars Test, 78
judgement, heuristics, 39-40
King’s Cross Underground disaster, 194, 257
‘knew-it-all-along’ effect (see also hindsight bias), 215
knowledge-based errors
assessment technique, 225-6
detection of, 157-68
detection rates, 166-8
error ratio, 58-9
and error types, 55-66
experts vs. novices, 58
failure modes, 86-96
feedback control, 57, 165
and forcing functions, 162
generic error-modelling system, 61-8
pathologies, 86-96
and problem configurations, 86-7
in problem solving, 65-6, 157-65
in Rasmussen’s framework, 42-4
reduction of, systems, 246-8
knowledge retrieval, 112-24
laboratory studies of error, 15-16
language errors, 79-80
lapses, see slips and lapses
‘late-selection’ theorists, 27
latent errors
case study analyses, 188-94
and systems disasters, 173-216
lexical errors, 155-6
limb apraxias, 109
line management, 204-5
linear thinking, 92-3
‘Lisp Debugger,’ 164
Lohhausen studies, 92-4
matching conditions (see also similarity biases), 77
means-ends analysis, 42
memory aids, 239-42
memory failures, 107
memory illusions, 20-1
memory organizing packets, 66
memory span, 31-2
Meringer’s influence, 22
Minsky’s influence, 34
mistakes
attentional focus, 56
definition, 9
detectability, 9-10, 60, 160
feedback availability, 165
forcing function mismatches, 162
and nature of error, 8-10, 53-61
and performance level, 56
monitoring failures, 63-4
motor response errors, 154-5
movement errors, 154-5
multichannel processors, 30-1
naive physics, 82
naturalistic methods, 13-14
neuropsychology, 24-5
Newell and Simon’s model, 42
nominal error probabilities, 223
Norman-Shallice model, 36
normative theories, 36-40
North Anna nuclear power plant, 237, 243
nuclear power plants
error risk, 217-50
latent errors, 173-216
OATS technique, 224-5
Oconee nuclear power plant, 227, 237
omission errors, 184-7, 240
omitted checks, 68-72
operator action trees (OATS), 224-5
optimising theories, 36-40
overattention, 63, 73-4
overconfidence, 89-90
Oyster Creek nuclear power plant, 54, 60, 76, 165, 237
parallel distributed processing, 46-7
parallel search processes, 117
pathogen metaphor, 197-9
pattern recognition, 34
perceptual errors, 154-5
perceptual illusions, 20-1
perceptual slips, 72
performance shaping factors, 222, 228
perseveration, 109, 155
persistence-forecasting, 40-1
personae, schema variant, 35
phonological errors, 155-6
phonological priming, 101-3
place-losing, 46
planning failures, 8, 12-13
plans, schema variant, 35
postural control
automatic correction of, 150-4
Head’s theory, 25
Prairie Island nuclear power plant, 237
predictable error, 3
primary memory, 31-2
priming, 101-3
prior intention, 6-7
probabilistic estimates, 4
probabilistic risk assessment (PRA), 219-21, 233-4
structure, 219-20
‘tree’ models, 219
problem behaviour graph, 42
problem solving
and automated systems, 182-4
cued error discovery, 162-3
error detection, 157-61, 166
error detection rates, 166
‘fuzzy rule’ model, 44-6
generic error-modelling system, 65-6
strategic and tactical aspects, 158
problem space, 42
productivity
and safety, decisions, 203
systems aspects, 199-201
program counter failures, 71
Prolog, 137, 141
prospective memory, 107
‘prostheses,’ 239, 241
prototypes, 35
Psychology of Everyday Things, The (Norman), 235-7
Psychopathology of Everyday Life, The (Freud), 22
q-morphisms, 74
qualitative predictions, 5
quasi-homomorphisms, 74
questionnaire studies, 14-15
Rasmussen’s framework, 42-4
Rasmussen’s survey, 184-5
rationalist theories, 36-40
rationality, 39
recency effect, 32
recurrent intrusions, 105-6
redundancy, 77, 170-1
relevance bias, 167
representativeness heuristic, 40, 91
resident pathogen metaphor, 197-9
resource theories, 28-9
rigidity and problem solving, 78
risk management, 234
Risoe National Laboratory, 246
Rouse’s model, 44-6
rule-based errors
action deficiencies, 83-6
assessment technique, 225-6
connectionism distinction, 46-7
detection of, 157-65
detection rates, 166-8
developmental perspective, 79-81
encoding deficiencies, 81-2
error ratio, 58-9
and error types, 55-66
expertise role, 57-8
failure modes, 74-86
feedforward control, 57
generic error-modelling system, 61-8
in problem solving, 44-6, 65-6, 157-65
Rasmussen’s framework, 42-4
reduction of, systems, 246-8
rule strength, 77
situational factors, 59-60
Rumelhart’s influence, 35
S-rules, 44-5, 55
safety, 206-7
feedback loops, 209-10
organisational reactions, 211-12
and production, decisions, 203
safety information system, 209-10
safety parameter display systems, 238
‘satisficing,’ 37-9
schemata
Bartlett’s theory, 25-6
contemporary theories, 33-6
general activators, 99-100
human error framework, 51-2, 66
specific activators, 99-100
schizophrenic language, 108-9
scripts
human error framework, 66
schema variant, 35
Seabrook nuclear power plant, 227
selective attention, 87
selective listening, 27-8
‘self-correct’ devices, 164
self-knowledge, 248
self-monitoring
error detection, 149
problem-solving errors, 157-61
skill-based errors, 149-57
self-report questionnaires, 14-15
semantic context, 181-3
semantic knowledge, 112-24
serial search processes, 117
SHARP technique, 229
short-term memory, 31-2
similarity biases
and cognition, 103-10
and convergent memory, 111
error detection, 170-1
error form origin, 97
knowledge retrieval, 114-23
machine model of, 130-6
and rationality, 39
simulators
error investigation, 16
training use, 244
situational factors, 59-60
Sizewell B nuclear reactor, 213, 230
skill-based errors
assessment technique, 225-6
and attention, 68-72
detection, 149-57, 167
detection rates, 166-8
and error ratio, 58-9
and error types, 55-65
failure modes, 68-74
feedforward control, 57
generic error-modelling system, 61-8
Rasmussen’s framework, 42-4
reduction of, 246-8
skill-rule-knowledge framework, 42-3
‘skip errors,’ 167
SLIM technique, 228-9, 230
SLIM-SAM software, 228
SLIM-SARAH software, 228
slips and lapses
attentional focus, 56, 68-70, 107
cognitive factors, 107
detectability, 9-10, 156-7, 160
feedforward control, 57
and forcing functions, 162
memory aid use, 239-41
naturalistic studies, 63, 65, 149
and nature of error, 8-9, 53-61
omitted checks, 68-70
and performance level, 56
slips of the tongue, 106, 149
specific problem-solving rules, 78-9
speech errors
correction rate, 166
detection and correction of, 155-6
Meringer’s study, 22, 155
‘spew hypothesis,’ 104
spoonerisms, 72, 156
standardisation, 237
statistical problem solving, 166
‘step-ladder’ model, 44
stereotypical reactions, 44
stress
and knowledge-based errors, 92
and training errors, 245
stress-vulnerability hypothesis, 15
strong-associate substitutions, 109
‘strong-but-wrong’ routines, 57-8
‘strong-but-wrong’ rules, 75-6
strong habit exclusion, 70
strong habit intrusion, 68-70
strong schema capture, 46
Subjective Expected Utility Theory, 37
success likelihood index methodology (SLIM), 228-31
Sully, James, 20-1
supervisory control, 175
automated systems, 175-82
definition, 175
temporal aspects, 182
symptom rules, 44-5, 55
systematic human action reliability procedure (SHARP), 229
systems failures
accident causation, 199-214
automation role, 174-84
case study analyses, 188-94
and latent errors, 173-216
resident pathogen metaphor, 197-9
T-rules, 44-5, 55
‘talk aloud’ technique, 160
task-interactive system, 175
Technique for Human Error Rate Prediction (THERP), 221-4, 237
technological factors, 174-82, 250
TESEO technique, 226-7, 231
test-wait-test-exit task, 73
text comprehension, 35
THERP technique, 221-4, 230-1
Three Mile Island, 54-5, 60, 164-5, 189-91, 213-14, 251
time-reliability techniques, 224-6
tip-of-the-tongue states, 105-6
Toledo Edison’s nuclear power plant, 181
topographic rules, 44-5, 55
training errors, 244-6
training issues, 241-6
transposition errors, 155-6
‘tree’ models, 219
unintended actions, 8-10
unitization hypothesis, 170-1
unsafe acts, 206-7
vagabonding, 93
variable error, 3-5
vigilance, 180
violations
classification, 195-6
cognitive mechanisms, 196-7
error distinction, 194-7
visual errors, 154-5
visual modality, 127-8
visual search task, 155
visuo-spatial scratch pad, 128
volition, 7-10
warning devices, 163
word frequency effect, 103-4
word length, 31-2
working memory
concept of, 32-3
machine model, 125-37, 144
‘workspace,’ 2
Zeebrugge disaster, 193-4