Handbook of Human Factors and Ergonomics
Third Edition
Edited by
Gavriel Salvendy
Purdue University
West Lafayette, Indiana
and
Tsinghua University
Beijing, People's Republic of China
WILEY
JOHN WILEY & SONS, INC.
No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means,
electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the
1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment
of the appropriate per-copy fee to the Copyright Clearance Center, 222 Rosewood Drive, Danvers, MA 01923, (978) 750-8400,
fax (978) 646-8600, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the
Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748-6011, fax (201) 748-6008,
or online at http://www.wiley.com/go/permission.
Disclaimer
The editor, authors, and the publisher have made every effort to provide accurate and complete information in the Handbook,
but the Handbook is not intended to serve as a replacement for professional advice. Any use of the information in this
Handbook is at the reader's discretion. The editor, authors, and the publisher specifically disclaim any and all liability arising
directly or indirectly from the use or application of any information contained in this Handbook. An appropriate professional
should be consulted regarding your specific situation.
For general information about our other products and services, please contact our Customer Care Department within the United
States at (800) 762-2974, outside the United States at (317) 572-3993 or fax (317) 572-4002.
Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in
electronic books. For more information about Wiley products, visit our web site at www.wiley.com.
10 9 8 7 6 5 4 3 2
CHAPTER 27
HUMAN ERROR
Joseph Sharit
University of Miami
Coral Gables, Florida
5.4 Human Error Data 737
6 INCIDENT REPORTING SYSTEMS 737
6.1 Design, Data Collection, and Management Considerations 737
6.2 Historical Antecedent 741
10.2 Methods for Investigating Human Error 754
11 TOWARD MINIMIZATION OF HUMAN ERROR AND THE CREATION OF SAFE SYSTEMS 756
REFERENCES 757
possible, due to the awkward posture the worker must assume; tactile cues are not detectable due to requirements for wearing protective clothing; and although present, auditory feedback from the switch's activation is not audible, due to high noise levels. Residual vapors originating from a rarely performed procedure during the previous shift ignite, resulting in an explosion. In the second case, a worker adapts the relatively rigid and unrealistic procedural requirements dictated in a written work procedure to demands that continually materialize in the form of shifting objectives, constraints on resources, and changes in production schedules. Management tacitly condones these procedural adaptations, in effect relying on the resourcefulness of the worker for ensuring that its goals are met (Section 8). However, when an unanticipated scenario causes the worker's adaptations to result in an accident, management is swift to renounce any support of actions in violation of work procedures.

In the first case the worker's action that led to the accident was unintentional; in the second case the worker's actions were intentional. In both cases the issue of whether the "actor" committed an error is debatable. One variant on the position that rejects the notion of human error would shift the blame for the adverse consequences from the actor to management or the designers. Latent management or latent designer errors (Section 7.3.1) would thus absolve the actor from human error in each of these cases. The worker, after all, was in the heat of the battle, performing "normal work," responding to the contextual features of the situation in reasonable, even skillful ways.

A second variant on this position would cast doubt on the process by which the attribution of error is made (Dekker, 2005). By virtue of having knowledge of events, especially bad events such as accidents, outside observers are able, perhaps even motivated, to invoke a backward series of rationalizations and logical connections that has neatly filtered out the subtle and complex situational details that are likely to be the basis for the perpetrating actions. Whether this process of establishing causality (Section 10) is due to convenience, or derives from the inability to determine or comprehend the perceptions and assessments made by the actor that interlace the more prominently observable events, the end result is a considerable underestimation of the influence of context. Even the workers themselves, if given the opportunity in each of these cases to examine or reflect upon their performance, may acknowledge their actions as errors, easily spotting all the poor decisions and improperly executed actions, when in reality, within the frames of reference at the time the behaviors occurred, their actions were in fact reasonable, and constituted "mostly normal work." The challenge, according to Dekker (2005), is "to understand how assessments and actions that from the outside look like errors become neutralized or normalized so that from the inside they appear unremarkable, routine, normal" (p. 75).

These views, which essentially deny the existence of human error (at least on the part of the actors), are appealing and to some extent justified. The issue, however, is not so much whether these views should be dismissed, but whether they should be embraced. The position taken here is that human error is a real phenomenon that has at its roots many of the same attentional processes and architectural features of memory that enable the human to adapt, abstract, infer, and create, but that also subject the human to various kinds of information-processing constraints that can provoke unintended or mistaken actions. Thus, although it may be convenient to explain unintended action slips (Section 3.2.5) such as the activation of an incorrect control or the selection of the wrong medication as rational responses in contexts characterized by pressures, conflicts, ambiguities, and fatigue, a closer inspection of the work context can, in theory, reveal the increased possibility for certain types of errors as compared to others. It is human fallibility, in all its guises, that infiltrates these contexts, and by failing to acknowledge the interplay between human fallibility and context (for instance, the tendency for a context to induce "capture" by the wrong control or the wrong medication) we are left with a shoddier picture of the context. Granted, the contextual details comprising dynamic work activities are difficult enough to establish, let alone their interplay with human fallibility. However, this fact attests only to the difficulty of predicting human error (Section 4), especially complex errors, not to the dismissal of its existence. Whereas rejecting the notion of human error may represent a gracious gesture toward the human's underlying disposition, it can also dangerously downplay aspects of human fallibility that need to be understood for implementing error reduction and error management strategies.

1.2 New Directions

Much of the practical knowledge that has been accumulated on human error in the last half century has derived primarily from industries requiring hazardous operations that are capable of producing catastrophic events. Not surprisingly, the textbook scenarios typically used for studying human error came from domains such as nuclear power, chemical processing, and aviation. With the publication of To Err Is Human (Kohn et al., 1999) came the revelation of shocking data that formally announced the new scourge in human error: medical error. According to this report, between 44,000 and 98,000 hospitalized patients die annually as a result of human error. These figures were extrapolated from studies that included the relatively well known Harvard Medical Practice Study in New York (Leape et al., 1991). Although these figures have been contested on the grounds that many of the patients whose deaths were attributed to medical error were predisposed to die due to the severity of their illnesses, there is also an opposing belief that these errors were underreported by as much as a factor of 10 (Cullen et al., 1995). If true, the number of preventable hospital deaths attributable to human error is staggering, even if adjustments are made for deaths that were likely to occur due to illness alone.
It also signals the need for heightened concern for the many mistakes in health care that are probably occurring outside hospital environments.

Fear of blame and retribution through litigation accounts for much of the underreporting in health care and reflects an industry that is still mired in the blame culture of traditional mid-twentieth-century American industry. What truly dissociates medical error from human error in other high-risk work domains is the belief by many people that they can assume the role of expert based on the experiences they, a family member, or a close friend have had, and that they have the right to hold an industry that is extracting high premiums for their services accountable for its actions. It remains to be seen if this attitude will carry over to other industries. In any case, medical error presents unique challenges, and although we do not intend to diminish the significance of human error in industries with relatively long-standing traditions for addressing the role of human error in safety, due emphasis will also be given to medical error.

2 DEFINING HUMAN ERROR

The presumption of human error generally occurs when various types of committed or omitted human actions appear, when viewed in retrospect, to be linked to undesirable consequences, although unwanted consequences do not necessarily imply the occurrence of human error. Following the distinctions proposed by Norman (1981) and Reason (1990), the term error usually applies only to those situations where there was an intention to perform some type of action, and would include cases where there was no prior intention to act. Thus, a very well practiced routine that is performed without any prior intention, such as swiping dirt from a tool, may constitute an error depending on the effect of that action. More typically, errors are associated with prior intentions to act, in which case two situations can be differentiated. If for whatever reason the actions did not proceed as planned, any unwanted consequences resulting from these actions would be attributed to an error arising from an unintentional action. In the case where the actions did proceed as intended but did not achieve their intended result, any unwanted outcomes stemming from these actions would be associated with an error resulting from intended but mistaken actions.

In each of these situations the common element is the occurrence of unwanted or adverse outcomes. Whether intended or not, negative outcomes need not be directly associated with these actions. Human error thus also subsumes actions whose unwanted outcomes may occur at much later points in time or following the interjection of many other actions by other people. It can also be argued that even if these actions did not result in adverse outcomes but had the potential to, they should be viewed as errors, in line with the current emphasis on near misses and the recognition that what separates many accidents from events with no visibly apparent negative consequences is chance alone. Acts of sabotage, although capable of bringing about adverse consequences, are not actions that deviate from expectations and thus do not constitute human error. Similarly, intentional violations of procedures, although also of great concern, are typically excluded from definitions of human error when the actions have gone as planned. For example, violations in rigid "ultrasafe and ultraregulated systems" are often required for effectively managing work constraints (Amalberti, 2001). However, when violations result in unforeseen and potentially hazardous conditions, these actions would constitute human error. Exploratory behavior under presumably protective or kind conditions as encountered in formal training programs or trial-and-error self-learning situations, which leads either to unintentional actions or mistaken actions, should also be dissociated from human error. This distinction highlights the need to acknowledge the role of error (indeed, even the need for encouraging errors) in adaptation and creativity and in the acquisition of knowledge and skill associated with learning.

The situation becomes more blurred when humans knowingly implement strategies in performance that will result in some degree of error, as when a supervisor encourages workers to adopt shortcuts that trade off accuracy for speed, or when the human reasons that the effort needed to eliminate the possibility of some types of errors may increase the likelihood of more harmful errors. As with procedural violations, if these strategies come off as intended, the actor would not consider the attendant negative outcomes as having resulted from human error. However, depending on the boundaries of acceptable outcomes established or perceived by external observers such as managers or the public, the human's actions may in fact be considered to be in error. Accordingly, a person's ability to provide a reasonable argument for behaviors that resulted in unwanted consequences does not necessarily exonerate the person from having committed an error. What of actions the person intends to commit that are normally associated with acceptable outcomes but which result in adverse outcomes? These would generally not be considered to be human error except perhaps by unforgiving stakeholders who are compelled to exact blame.

The lack of consensus in arriving at a satisfying definition of human error is troubling in that it can undermine efforts to identify, control, and mitigate errors across different work domains and organizations. In fact, some authors have abandoned the term human error altogether. Hollnagel (1993) prefers the term erroneous action to human error, which he defines as "an action which fails to produce the expected result and which therefore leads to an unwanted consequence" (p. 67). Dekker's (2005) view of errors as "ex post facto constructs rather than as objective, observed facts" (p. 67) is based on the accumulated evidence on hindsight bias (Section 10.1). Specifically, the predisposition for this bias has repeatedly demonstrated how observers, including people who may have been recent participants of the experiences being investigated, impose their knowledge (in the form of assumptions and facts), past experiences, and future intentions
[Figure 1: Modeling framework linking CONTEXT, ERROR, BARRIERS, and ADVERSE OUTCOME. Recoverable labels include administrative policies and organizational culture; time constraints; workload; flow of information; warnings and alarms; information presentation; technologies; procedures and access; interruptions and distractions; teamwork; novel and unanticipated events; hardware and software; workgroup culture; automation and lockouts; multiple and shifting objectives; knowledge demands; workspace and equipment design; procedures; reminders; and environmental conditions.]
to transform what was in fact inaccessible information at the time into neatly unfolding sequences of events and deterministic schemes that are capable of explaining any adverse consequence. These observer and hindsight biases presumably do not bring us any closer to understanding the experiences of the actor in the actual situation for whom there is no error: "the error only exists by virtue of the observer and his or her position on the outside of the stream of experience" (p. 66).

Although this view is enlightening in its ability to draw attention to the limitations of empiricist-based paradigms that underlie many human factors methods, it is also subject to some of the same criticisms that were raised in Section 1.1 in response to the current trend toward perspectives that negate the existence of human error. Understanding both human fallibility and the contexts in which humans must act keeps us on a pragmatic path capable of shaping design and safety-related interventions, even as we strive to find methods that can close the gaps between objective and reconstructed experiences. As we shall see in Section 4, the problems associated with defining human error can be partly overcome by shifting the emphasis to classification schemes that are capable of establishing links between human psychological processes and the manifestation of adverse outcomes across different work domains.

3 UNDERSTANDING HUMAN ERROR

3.1 A Modeling Framework: Human Fallibility, Context, and Barriers

Figure 1 presents a simple modeling framework for demonstrating how human error arises and can result in adverse outcomes. There are three major components in this model. The first component, human fallibility, addresses the fundamental sensory, cognitive, and motor limitations of humans that predispose them to error. The second component, context, refers to situational variables that can affect the way in which human fallibility becomes manifest. The third component, barriers, concerns the various ways in which human errors can be contained.

A number of general observations concerning this modeling framework are worth noting. First, human error is viewed as arising from an interplay between human fallibility and context. This is probably the most intuitive way for practitioners to understand the causality of human error. Interventions that minimize human dispositions to fallibility, for example by placing fewer
memory demands on the human, are helpful only to the extent that they do not create new contexts that can, in turn, create new opportunities for human fallibility to become manifest. Similarly, interventions intended to reduce the error-producing potential of work contexts, for instance, by introducing new protocols for communication, could unsuspectingly produce new ways in which human fallibility can exert itself. Second, the depiction of overlapping elements in the human fallibility and context components of the model (Figure 1) is intended to convey the interactive complexity that may exist among these factors. For example, memory constraints may result in the use of heuristics that, in certain contexts, may predispose the human to error; these same memory constraints may also produce misguided perceptions of risk likelihood. Similarly, training programs that dictate how work procedures should be implemented could lead to antagonistic work group cultures whose doctrines afford increased opportunities for operational errors.

Third, barriers capable of preventing the propagation of errors to adverse outcomes could also affect the context. This potential interplay between barriers and context is often ignored or misunderstood in evaluating a system's risk potential. Fourth, system states or conditions that result from errors can propagate into adverse outcomes such as accidents, but only if the gaps in existing barriers are aligned to expose such windows of opportunity (Reason, 1990). The likelihood that errors will penetrate these juxtaposed barriers, especially in high-risk work activities, is generally low and is the basis for the much larger number of near misses that are observed compared to events with serious consequences. Finally, this modeling framework is intended to encompass various perspectives on human error that have been proposed (CCPS, 1994), in particular, the human factors and ergonomics, cognitive engineering, and sociotechnical perspectives.

In the human factors perspective, error is the result of a mismatch between task demands and human mental and physical capabilities. Presumably this perspective allows only general predictions of human error to be made, primarily predictions of errors that are based on their external characteristics. For example, cluttered displays or interfaces that impose heavy demands on working memory are likely to overload perceptual and memory processes (Section 3.2) and thus possibly lead to the omission of actions or the confusion of one control with another. Guidelines that have been proposed for designing displays (Wickens et al., 2004) are offered as a means for diminishing mismatches between demands and capabilities and thus the potential for error. In contrast, the cognitive engineering perspective emphasizes detailed analysis of work contexts (Section 4) coupled with analysis of the human's intentions and goals. Although both the human factors and cognitive engineering perspectives on human error are very concerned with human information processing, cognitive engineering approaches attempt to derive more detailed information about how humans acquire and represent information and how they use it to guide actions. This emphasis provides a stronger basis for linking underlying cognitive processes with the external form of the error, and thus should lead to more effective classifications of human performance and human errors. As a simple illustration of the cognitive engineering perspective, Table 1 demonstrates how the same external expression of an error could derive from various underlying causes.

Table 1 Examples of Different Underlying Causes of the Same External Error Mode

Situation: A worker in a chemical processing plant closes valve B instead of nearby valve A, which is the required action as set out in the procedures. Although there are many possible causes of this error, consider the following five possible explanations.

1. The valves were close together and badly labeled. The worker was not familiar with the valves and therefore chose the wrong one.
   Possible cause: wrong identification compounded by lack of familiarity leading to wrong intention (once the wrong identification had occurred the worker intended to close the wrong valve).
2. The worker may have misheard instructions issued by the supervisor and thought that valve B was the required valve.
   Possible cause: communications failure giving rise to a mistaken intention.
3. Because of the close proximity of the valves, even though he intended to close valve A, he inadvertently operated valve B when he reached for the valves.
   Possible cause: correct intention but wrong execution of action.
4. The worker closed valve B very frequently as part of his everyday job. The operation of A was embedded within a long sequence of other operations that were similar to those normally associated with valve B. The worker knew that he had to close A in this case, but he was distracted by a colleague and reverted back to the strong habit of operating B.
   Possible cause: intrusion of a strong habit due to external distraction (correct intention but wrong execution).
5. The worker believed that valve A had to be closed. However, it was believed by the workforce that despite the operating instructions, closing B had an effect similar to closing A and in fact produced less disruption to downstream production.
   Possible cause: violation as a result of mistaken information and an informal company culture to concentrate on production rather than safety goals (wrong intention).

Source: Adapted from CCPS (1994). Copyright 1994 by the American Institute of Chemical Engineers, and reproduced by permission of AIChE.

Sociotechnical perspectives on human error focus on the potential impact of management policies and organizational culture on shaping the contexts within which people act. These "higher-order" contextual factors are capable of exacting considerable influence
[Figure 2: Generic model of human information processing, with perceptual encoding, central processing, and responding stages, attention resources, response execution, and feedback. (Adapted from Wickens et al., 2004.)]
on the designs of workplaces, operating procedures, training programs, job aids, and communication protocols, and can produce excessive workload demands by imposing multiple conflicting and shifting performance objectives and by exerting pressure to meet production goals, often at the expense of safety considerations. How work cultures (the cultures associated with those people responsible for producing products and services) become established is a complex phenomenon, and though numerous factors can play a role, the strongest influence is most likely to be the organizational climate (Section 9). Problematic work cultures are very resistant to change, and their remediation usually requires multiple interventions at both the management and operator levels over extended periods of time.

Although the human factors and ergonomics, cognitive engineering, and sociotechnical perspectives appear to suggest different approaches for predicting and analyzing human error, the study of human error will often require the collective consideration of these different perspectives. Human capabilities and limitations from a human factors and ergonomics perspective provide the fundamental basis for pursuing more rigorous cognitive engineering analyses of human error. Similarly, the cognitive engineering perspective, in its requirements for more detailed analyses of work contexts, would be remiss to exclude sociotechnical considerations (Chapter 10).

3.2 Human Fallibility

3.2.1 Human Information Processing

The basis for many human errors derives from fundamental limitations that exist in the human's sensory, cognitive, and motor processes (Chapter 5). These limitations are best understood by considering a generic model of human information processing (Wickens et al., 2004) that conceptualizes the existence of various processing resources for handling the flow and transformation of information (Figure 2).

According to this model, sensory information received by the body's various receptor cells gets stored in a system of sensory registers that has an enormous storage capacity. However, this information is available for further processing only briefly. Through the process of selective attention, subsets of this vast collection of information become designated for further processing in an early stage of information processing known as perception. Here, information can become meaningful through comparison with information in long-term memory (LTM), which may result in a response or the need for further processing in a short-term memory store referred to as working memory (WM). A good deal of our conscious effort is dedicated to WM activities such as visualizing, planning, evaluating, conceptualizing, and making decisions, and much of this WM activity depends on information that can be accessed from LTM. Rehearsal of information in WM enables it to be encoded into LTM; otherwise, it decays rapidly. WM also has relatively severe capacity constraints governing the amount of information that can be kept active. The current contention is that within WM there are separate storage systems for accommodating visual information in an analog spatial form or verbal information in an acoustical form, and an attentional control system for coordinating these two storage systems. Ultimately, the results of WM/LTM analysis can lead to a response (e.g., a motor action or decision), or to the revision of thoughts. Note that although this sequence of information processing is depicted in Figure 2 as flowing from left to right, in principle it can begin anywhere.
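The capacity and decay constraints just described can be made concrete with a toy sketch. The Python class below is not part of the chapter's framework; the four-item capacity, the decay rate, the availability threshold, and the rule that a single rehearsal consolidates an item into LTM are assumptions chosen only to illustrate how unrehearsed material is quickly lost from a capacity-limited working memory while rehearsed material can persist.

```python
import math

class ToyWorkingMemory:
    """Illustrative, non-calibrated model of WM capacity limits and decay."""

    def __init__(self, capacity=4, decay_rate=0.5):
        self.capacity = capacity      # assumed limit on concurrently active items
        self.decay_rate = decay_rate  # assumed exponential decay constant (per second)
        self.items = {}               # item -> activation level
        self.ltm = set()              # items consolidated into long-term memory

    def attend(self, item):
        """Selective attention brings an item into WM; if the store is full,
        the weakest item (ties broken by insertion order) is displaced."""
        if item not in self.items and len(self.items) >= self.capacity:
            weakest = min(self.items, key=self.items.get)
            del self.items[weakest]
        self.items[item] = 1.0

    def rehearse(self, item):
        """Rehearsal refreshes an item's activation and encodes it into LTM."""
        if item in self.items:
            self.items[item] = 1.0
            self.ltm.add(item)

    def elapse(self, seconds):
        """Without rehearsal, activation decays exponentially; items falling
        below a threshold are no longer available for WM analysis."""
        for item in list(self.items):
            self.items[item] *= math.exp(-self.decay_rate * seconds)
            if self.items[item] < 0.1:
                del self.items[item]

wm = ToyWorkingMemory()
for step in ["isolate pump", "close valve A", "tag breaker",
             "notify control room", "log entry"]:
    wm.attend(step)           # the fifth step displaces an earlier one
wm.rehearse("close valve A")  # only this step is rehearsed
wm.elapse(5)                  # a five-second distraction
print(sorted(wm.items), sorted(wm.ltm))
# Unrehearsed steps have decayed away; the rehearsed step survives in LTM.
```

Running the example shows the displacement and decay directly: after a brief distraction, only the rehearsed step remains reliably retrievable.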
With the exception of the system of sensory registers and LTM, the processing resources in this model may require attention. Often thought of as mental effort, attention is conceptualized here as a finite and flexible internal energy source under conscious control whose intensity can be modulated over time. Although attention can be distributed among the information-processing resources, fundamental limitations in attention constrain the capacities of these resources; that is, there is only so much information that can undergo perceptual coding or WM analysis. Focusing attention on one of these resources will, in many cases, handicap other resources. Thus if a North American rents a car with a manual transmission in Great Britain, the experience of driving on the left side of the road may require substantial allocation of attention to perceptual processing in order to avoid collisions with other drivers, perhaps at the expense of being able to smoothly navigate the stick shift (which is now located to the left of the driver), or at the expense of using WM resources to keep adequate track of one's route. Whatever attention is allocated to WM may be needed for working out the cognitive spatial transformations required for executing left-hand and right-hand turns.

Attention may also be focused almost exclusively on WM, as often occurs during intense problem solving or when planning activities. The ability to divide attention, which is the basis for time sharing, is often observed in people who may have learned to rapidly shift attention between tasks. This skill may require knowledge of the temporal and knowledge demands of the tasks and the possibility for one or more of the tasks having become automated in the sense that very little attention is needed for their performance. Various dichotomies within the information-processing system have been proposed, for example, between the visual and auditory modalities and between early (perceptual) versus later (central and response) processing (Figure 2), to account for how people are able, in time-sharing situations, to more effectively utilize their processing capacities (Wickens, 1984).

Many design implications arise from the errors that human sensory and motor limitations can cause or contribute to. Indeed, human factors studies are often preoccupied with deriving design guidelines for minimizing such errors. Knowledge concerning human limitations in contrast sensitivity, hearing, bandwidth in motor movement, and in sensing tactile feedback can be used to design visual displays, auditory alarms, manual control systems, and protective clothing (such as gloves that are worn in surgery) that are less likely to produce errors in detection and response. Much of the focus on human error, however, is on the role that cognitive processing plays. Even seemingly simple situations involving errors in visual processing may in fact be rooted in much more complex information processing, as illustrated in the following example.

3.2.2 Example: Medication Error

Consider the following prescription medication error, which actually occurred. A physician opted to change the order for 50 mg of a leukemia drug to 25 mg by putting a line through the zero and inserting a "2" in front of the "5." The resulting dose was perceived by the pharmacist as 250 mg and led to the death of a 14-year-old boy. The line that was meant to indicate a cross-out was not centered and turned out to be much closer to the right side of the circle (due to psychomotor variability; see Figure 1); thus, it could easily have been construed as just a badly written zero. Also, when one considers that perception relies on both bottom-up processing (where the stimulus pattern is decomposed into features) and top-down processing (where context and thus expectations are used for recognition), the possibility that a digit was crossed out may have countered expectations (i.e., it does not usually occur).

If one were to presume that the pharmacist had a high workload (and thus diminished resources for processing the prescription) and a relative lack of experience or knowledge concerning dosage ranges for this drug, it is easy to understand how this error can come about. The dynamics of the error can be put into a more complete perspective when potential barriers are considered, such as an automatic checking system that could have screened the order for a potentially harmful dosage or interactions with other drugs, or a procedure that would have required the physician to rewrite any order that had been altered. Even if these barriers were in place, which was not the case, there is a high likelihood that they would be bypassed. In fact, if such a procedure were to be imposed on physicians, routine violations would be expected given the contexts within which many physicians work.

3.2.3 Long-Term Memory and Its Implications for Human Error

LTM has been described as a parallel distributed architecture that is being reconfigured continuously through selective activation and inhibition of massively interconnected neuronal units (Rumelhart and McClelland, 1986). These reconfiguration processes occur within distinct modules that are responsible for different representations of information, such as mental images or sentence syntax. In the process of adapting to new stimuli or thoughts, the complex interactions between neuronal units that are produced give rise to the generalizations and rules that are so critical to human performance. With regard to the forms of knowledge stored in LTM, we usually distinguish between the general knowledge we have about the world, referred to as semantic memory, and knowledge about events, referred to as episodic memory.

When items of information, such as visual images, sounds, and thoughts based on existing knowledge, are processed in WM at the same time, they become associated with each other in LTM. The retrieval of this information from LTM will then depend on the strength of the individual items as well as the strengths of their associations with other items. Increased frequency and recency of activation are assumed to promote stronger (i.e., more stable) memory traces, which are otherwise subject to negative exponential decays.
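The preceding claims about frequency, recency, and exponential decay of memory traces can be illustrated numerically. The short Python sketch below is not the chapter's model; the decay constant, the additive strengthening rule, and the retrieval rule that sums item strength and cue-association strength are assumptions chosen only to show why frequently and recently activated items tend to dominate partial-matching retrieval.

```python
import math

DECAY = 0.8  # assumed decay constant per unit time; not an empirical value

def trace_strength(activation_times, now):
    """Each past activation contributes an exponentially decaying amount,
    so traces activated frequently and recently are stronger (more stable)."""
    return sum(math.exp(-DECAY * (now - t)) for t in activation_times)

def retrieval_score(item, cue, activations, associations, now):
    """Toy retrieval rule: item strength plus the strength of the
    item's association with the current cue (both decay the same way)."""
    return (trace_strength(activations.get(item, []), now)
            + trace_strength(associations.get((cue, item), []), now))

# Hypothetical traces: one interpretation has been activated often and
# recently, a rarer alternative only once, long ago.
activations = {"frequent diagnosis": [1, 3, 5, 7, 9], "rare diagnosis": [0.5]}
associations = {("ambiguous image feature", "frequent diagnosis"): [7, 9],
                ("ambiguous image feature", "rare diagnosis"): [0.5]}

now = 10
for item in ("frequent diagnosis", "rare diagnosis"):
    score = retrieval_score(item, "ambiguous image feature",
                            activations, associations, now)
    print(item, round(score, 3))
# The recently reinforced trace wins the partial match even in a case where
# the rarer alternative happens to be correct.
```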
Much of our basic knowledge about things can be thought of as being stored in the form of semantic networks that are implemented through parallel distributed architectures. Other knowledge representation schemes commonly invoked in the human factors literature are schemas and mental models. Schemas typically represent knowledge organized about a concept or topic. When they reflect processes or systems for which there are relationships between inputs and outputs that the human can mentally visualize and experiment with (i.e., "run," like a simulation program), the schemas are often referred to as mental models. The organization of knowledge in LTM as schemas or mental models is also likely based on semantic networks.

The constraints associated with LTM architecture can provide many insights into human fallibility and how this fallibility can interact with situational contexts to produce errors. For example, implicit in the existence of parallel associative networks is the ability to recall both items of information and patterns (i.e., associations) of information based on partial matching of this information with the contents of memory. Because the contexts within which humans operate often produce what Reason (1990) has termed cognitive underspecification, the implication is that at some point in the processing of information the specification of information may be incomplete. It may be incomplete due to perceptual processing constraints, WM constraints, or LTM (i.e., knowledge) limitations, or due to external constraints, as when there is little information available on the medical history of a patient undergoing emergency treatment or when piping and instrumentation diagrams have not been updated. LTM organization can overcome these limitations by retrieving some items of information that provide a match to the inputs, and thus enable an entire rule, by previous association with other items of information in LTM, to be activated. Unfortunately, that rule may not be appropriate for the particular situation. Similarly, for instance in the case of a radiologist who has recently encountered a large number of tumors of a particular type, the increased activation levels that are likely to be associated with this diagnosis may result in a greater tendency for arriving at this diagnosis in future situations.

3.2.4 Information Processing and Decision-Making Errors

Human decision making that is not guided by normative prescriptive models (Chapter 8) is an activity fraught with fallibility, especially in complex dynamic environments. As illustrated in Figure 3, human limitations in decision making can arise from a number of information-processing considerations (Figure 2) that directly or indirectly implicate LTM (Wickens et al., 2004). For example, if the information the human opts to select for WM activity, which may be guided by past experiences, is fuzzy or incomplete, intensive interpretation or integration of this information may be needed. Also, any hypotheses that the decision maker generates regarding this information will be highly dependent on information that can be retrieved from LTM, and their evaluation could require searching for additional information. Although any hypothesis for which adequate support is found can become the basis for an action, the possible candidate actions that would need to be evaluated in WM would first need to be retrieved from LTM. In addition, the possible outcomes associated with each action, the estimates of the likelihoods of
[Figure 3: Information-processing model of decision making, showing selective attention, perception, and working memory supporting diagnosis and choice under uncertainty, with possible outcomes retrieved from long-term memory, leading to action, outcome, and feedback. (Adapted from Wickens et al., 2004.)]
these outcomes, and the negative and positive implications of these actions would also require retrieval from LTM.

From an information-processing perspective, there are numerous factors that could constrain this decision-making process, particularly factors that could influence the amount or quality of information brought into WM and the retrieval of information from LTM. These constraints often lead to shortcuts in decision making, such as satisficing (Simon, 1966), whereby people opt for choices that are good enough for their purposes and adopt strategies for sampling information that they perceive to be most relevant. In general, the human's natural tendency to minimize cognitive effort (Sharit, 2003) opens the door to a wide variety of shortcuts or heuristics that are efficient and usually effective in negotiating environmental complexity, but under the right coincidence of circumstances can lead to ineffective choices or actions that become designated as errors. For example, with respect to the cues of information that we perceive, there is a tendency to overweight cues occurring earlier than later in time or that change over time. WM will only allow for a limited number of possible hypotheses, actions, or outcomes of actions to be evaluated, and LTM architecture will accommodate these limitations by making information that has been considered more frequently or recently (the "availability" heuristic) more readily available and by enabling its partial-matching capabilities to classify cues as more representative of a hypothesis than may be warranted. Many other heuristics (Wickens et al., 2004), such as confirmation bias (the tendency to consider confirming and not disconfirming evidence when evaluating hypotheses), cognitive fixation (remaining fixated on initial hypotheses and underutilizing subsequent information), and the tendency to judge an event as likely if its features are representative of its category (e.g., judging a person as having a particular occupation based on the person's appearance even though the likelihood of having that occupation is extremely low) derive primarily from a conservation of cognitive effort.

An enormous investment by the human in WM activities (i.e., an extensive commitment to functioning in an attentional mode) would be required to expose the biases that these heuristics can potentially induce. It is important to note, however, that to exclude the possibility that a human's situational assessments are in fact rational, explanations of human judgments and behaviors on the basis of cognitive biases require a sound understanding of the specific context (Fraser et al., 1992).

3.2.5 Levels of Human Performance and Dispositions for Errors

Considerations related to LTM architecture enable many different types of human errors to be accounted for by a few powerful principles. This few-to-many mapping between underlying memory mechanisms and different error types will be influenced by the nature of human performance, particularly on how information-processing resources (Figure 2) are used, other aspects of human fallibility (Section 3.2.6), and situational variables. A framework that distinguishes between skill-based, rule-based, and knowledge-based levels of performance, Rasmussen's SRK framework, emphasizes fundamentally different approaches to processing information and is thus particularly appealing for understanding the role of human performance in analyzing and predicting different types of human errors (Rasmussen, 1986).

Activities performed at the skill-based level are highly practiced routines that require little conscious attention. Referring to Figure 2, these activities map perception directly to actions, bypassing WM. Following an intention for action that could originate in WM or from environmental cues, the responses associated with the intended activity are so well integrated with the activity's sensory features that they are elicited in the form of highly automatic routines. Given the frequent repetitions of consistent mappings from sensory features to motor responses, the meaning imposed on perception by LTM can be thought of as hardwired to the human's motor response system.

The rule-based level of performance makes use of rules that have been established in LTM based on past experiences. WM is now a factor, as rules (of the if-then type) or schemas may be brought into play following the assessment of a situation or problem. More attention is thus required at this level of performance, and the partial matching characteristics of LTM can prove critical. When stored rules are not effective, as is often the case when new or challenging problems arise, the human is usually forced to devise plans that involve exploring and testing hypotheses, and must continuously refine the results of these efforts into a mental model or representation that can provide a satisfactory solution. At this knowledge-based level of performance heavy demands on information-processing resources are exacted, especially on WM, and performance is vulnerable to LTM architectural constraints to the extent that WM is dependent on LTM for problem solving.

In reality, many of the meaningful tasks that people perform represent mixtures of skill-, rule-, and knowledge-based levels of performance. Although performance at the skill-based level results in a significant economy in cognitive effort, the reduction in resources of attention comes at a risk. For example, consider a task other than the one that is intended that contains features that are similar to those of the intended task. If the alternative activity is frequently performed and therefore associated with skill-based automatic response patterns, all that is needed is a context that can distract the human from the intention and allow the human to be "captured" by the alternative (incorrect) task. This situation represents example 4 in Table 1 in the case of an inadvertent closure of a valve. In other situations the capture by a skill-based routine may result in the exclusion of an activity. For example, suppose that task A is performed infrequently and task B is performed routinely at the skill-based level. If the initial steps
are identical for both tasks but task A requires an additional step, this step is likely to be omitted during execution of the task. Untimely interruptions are often the basis for omissions at the skill-based level of performance. In some circumstances, interruptions or moments of inactivity during skill-based routines may instigate thinking about where one is in the sequence of steps. By directing attention to routines that are not designed to be examined, steps could be performed out of sequence (reversal errors) or be repeated (Reason, 1990).

Many of the errors that occur at the rule-based level involve inappropriate matching of either external cues or internally generated information with the conditional components of rules stored in LTM. Generally, conditional components of rules that have been satisfied on a frequent basis or that appear to closely match prevailing conditions are more likely to be activated. The prediction of errors at this level of performance will thus require knowing what other rules the human might consider, thus necessitating detailed knowledge not only about the task but also about the process (e.g., training or experience) by which the person acquired rule-based knowledge. Mistakes in applying rules generally involve the misapplication of rules with proven success or the application of bad rules (Reason, 1990). Mistakes in applying rules with proven success often occur when first exceptions are encountered. Consider the case of an endoscopist who relies on indirect visual information when performing a colonoscopy. Based on past experiences and available knowledge, the sighting of an anatomical landmark during the performance of this procedure may be interpreted to mean that the instrument is situated at a particular location within the colon, when in fact the presence of an anatomical deformity in this patient may render the physician's interpretation as incorrect (Cao and Milgram, 2000). These first exception errors often result in the decomposition of general rules into more specific rule forms and reflect the acquisition of expertise. General rules, however, usually have higher activation levels in LTM given their increased likelihood of encounter, and under contextual conditions involving high workload and time constraints, they will be the ones more likely to be invoked. Rule-based mistakes that occur by applying bad (e.g., inadvisable) rules are also not uncommon, as when a person who is motivated to achieve high production values associates particular work conditions with the opportunity for implementing shortcuts in operations.

At the knowledge-based level of performance, when needed associations or schemas are not available in LTM, control shifts primarily to intensive WM activities. This level of performance is often associated with large degrees of freedom that characterize how a human "moves through the problem space," and suggests a much greater repertory of behavioral responses and corresponding expressions of error. Contextual factors that include task characteristics and personal factors that include emotional state, risk attitude, and confidence in intuitive abilities can play a significant role in shaping the error modes, making these types of errors much harder to predict. It is at this level of performance that we observe undue weights given to perceptually salient cues or early data, confirmation bias, use of the availability and representative heuristics (especially for assessing relationships between causes and effects), underestimation and overestimation of the likelihood of events in response to observed data, vagabonding (darting from issue to issue, often not even realizing that issues are being revisited, with essentially no effective movement through the problem space), and encysting (overattention to a few details at the expense of other, perhaps more relevant information).

3.2.6 Other Aspects of Human Fallibility

There are many facets to human fallibility, and all have the potential to contribute to human error. For example, personality traits that reflect dispositions toward confidence, conscientiousness, and perseverance could influence both the possibility for errors and the nature of their expression at both the rule- and knowledge-based levels of performance, especially under stress. Overconfidence can lead to risk-taking behaviors and has been implicated as a contributory factor in a number of accidents.

Sleep deprivation and fatigue are forms of human fallibility whose manifestations are often regarded as contextual factors. In fact, in the maritime and commercial aviation industries, these conditions are often attributed to company or regulatory agency rules governing hours of operation and rest time. The effects of fatigue may be to regress skilled performers to the level of unskilled performers (CCPS, 1994) through widespread degradation of abilities that include decision making and judgment, memory, reaction time, and vigilance. NASA has determined that about 20% of incidents reported to its Aviation Safety Reporting System (Section 6.3), which asks pilots to report problems anonymously, are fatigue-related (Kaye, 1999a). On numerous occasions pilots have been found to fall asleep at the controls, although they usually wake up in time to make the landing.

Another facet of human fallibility with important implications for human error is situation awareness (Chapter 20), which refers to a person's understanding or mental model of the immediate environment (Endsley, 1995). As in the case of fatigue, situation awareness represents an aspect of human fallibility that can be heavily influenced by contextual factors. In principle, any factor that could disrupt a human's ability to acquire or perceive relevant data concerning the elements in the environment, or compromise one's ability to understand the importance of that data and relate the data to events that may be unfolding in the near future, presumably can degrade situation awareness. Comprehending the importance of the various types of information in the environment also implies the need for temporal awareness, the need to be aware of how much time tasks require and how much time is available for their performance (Grosjean and
Terrier, 1999). Thus, potentially many factors related to both human fallibility and context can influence situation awareness. Increased knowledge or expertise should allow for better overall assessments of situations, especially under conditions of high workload and time constraints, by enabling elements of the problem and their relationships to be identified and considered in ways that would be difficult for those who are less familiar with the problem. In contrast, poor display designs that make integration of data difficult can easily impair the process of assessing situations. In operations involving teamwork, situation awareness can become disrupted by virtue of the confusion created by the presence of too many persons being involved in activities.

Finally, numerous affective factors can corrupt a human's information-processing capabilities and thereby predispose the human to error. Personal crises could lead to distractions, and emotionally loaded information can lead to the substitution of relevant information with "information trash." Similarly, a human's susceptibility to panic reactions and fear can impair information-processing activities critical to human performance.

3.3 Context

Human actions are embedded in contexts and can only be described meaningfully in reference to the details of the context that accompanied and produced them (Dekker, 2005). The possibility for human fallibility to result in human error as well as the expression of that error will thus depend on the context in which task activities occur. Although the notion of a context is often taken as obvious, it is not easy to define, leading to commonly encountered alternative expressions, such as scenario, situation, situational context, situational details, contextual features, contextual dynamics, contextual factors, and work context. Designers of advanced computing applications often speak in terms of providing functionalities that are responsive to various user contexts. Building on a definition of context proposed by Dey (2001) in the domain of context-aware computer applications, context is defined as any information that can be used to characterize the situation of a person, place, or object, as well as the dynamic interactions among these entities. This definition of context would regard a process such as training as an entity derived from these interactions and would also encompass information concerning how situations are developing and the human's responses to these situations.

Figure 1 reveals some representative contextual factors. In this depiction, the presumption is that higher-order context-shaping factors can influence contextual factors that are more directly linked to human performance. Contexts ultimately derive from the characterization of these factors and their interactions. Analysis of the interplay of human fallibility and context as a basis for understanding human error will be beneficial to the extent that relevant contextual factors can be identified and analyzed in detail.

A number of quantitative approaches to human error assessment (Section 5) employ concepts that are related to context. For example, several of these approaches use performance-shaping factors (PSFs) to either modify the probability estimate assigned to an activity performed in error (Swain and Guttmann, 1983) or as the basis for the estimation of human error (Embrey et al., 1984). Any environmental, individual, organizational, or task-related factor that could influence human performance can, in principle, qualify as a PSF; thus PSFs appear to be related to contextual factors. These approaches, however, by virtue of emphasizing probabilities as opposed to possibilities for error, assume additive effects of PSFs on human performance rather than interactive effects. In contrast, implicit to the concept of a context is the interactive complexity among contextual factors. A sociotechnical method for quantifying human error referred to as STAHR (Phillips et al., 1990) is somewhat more consistent with the concept of context than approaches based on PSFs. This method utilizes a hierarchical network of influence diagrams to represent the effects of direct influences on human error, such as time pressure and quality of training, as well as the effects of less direct influences, such as organizational and policy issues, which project their influences through the more direct factors. However, while STAHR imposes a hierarchical constraint on influences, the concept of context implicit to Figure 1 imposes no such constraint, thus enabling influences to be represented as an unconstrained network (Figure 4).

Generally, the emphasis on predicting the possibility for error as opposed to the probability of error relaxes the assessments required of contextual factors. In making these assessments, some of the possible considerations could include the extent to which a contextual factor is present (i.e., the level of activation of a network node) and the extent to which it can influence other factors (i.e., the level of activation of a network arc), as illustrated in Figure 4. Temporal characteristics underlying these influences could also be included. Also, as conceptualized in Figure 1, contextual factors can be refined to any degree of detail, and practitioners and analysts would need to determine for specific task domains of interest the appropriate level of contextual analysis. For example, the introduction of new technology into activities involving teamwork (Section 7.3) would require the characterization of each person's role with respect to the technology as well as analysis of how team communication may become altered as a consequence of these new roles. Links to other contextual factors come to mind immediately. The creation of new tasks may result in fragmented jobs that impose higher workload demands and less reliable mental models, due to the difficulty in forming meaningful associations in memory. These factors, in turn, can affect adversely communication among team members. New training protocols that do not anticipate many of these influences may further predispose the human to error by directing attention away from important cues.
Figure 4 Influences between contextual factors representing part of a relevant work context, and their potential interplay with representative human fallibility factors. Activation levels of contextual factors are denoted by different degrees of shading, and their degrees of influence are denoted by arrow widths. Temporal characteristics associated with these influences could also be included. Human fallibility factors can affect contextual factors as well as their influences. Influences among human fallibility factors are not depicted.
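The unconstrained network of contextual factors sketched in Figure 4 can be given a very simple computational form. The following Python sketch is purely illustrative: the factor names, activation levels, and influence weights are hypothetical stand-ins rather than values from the chapter, and the crude one-step propagation is only meant to show how activation levels (node shading) and influence strengths (arrow widths) might be combined when reasoning about which factors most heavily load a given part of the context.

```python
# Illustrative sketch of an unconstrained influence network of contextual
# factors (cf. Figure 4). Names, activations, and weights are hypothetical
# examples, not data from the chapter.

# Activation level of each contextual factor (0 = absent, 1 = fully present)
activation = {
    "time_pressure": 0.8,
    "new_technology": 0.6,
    "fragmented_jobs": 0.5,
    "training_protocols": 0.3,
}

# Directed influences between factors: (source, target) -> weight (arrow width)
influence = {
    ("new_technology", "fragmented_jobs"): 0.7,
    ("fragmented_jobs", "time_pressure"): 0.4,
    ("training_protocols", "time_pressure"): -0.2,  # good training relieves pressure
}

def one_step_load(target: str) -> float:
    """Crude one-step estimate of how strongly the network loads a target factor."""
    total = activation.get(target, 0.0)
    for (src, dst), weight in influence.items():
        if dst == target:
            total += weight * activation.get(src, 0.0)
    return total

if __name__ == "__main__":
    for factor in activation:
        print(f"{factor}: combined load = {one_step_load(factor):.2f}")
```

A network of this kind deliberately imposes no hierarchy on the influences, in contrast to the layered structure assumed by STAHR.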
In some models of accident causation, the concept of a triggering event is used to draw attention to a "spark," such as a distraction or a pipe break, which sets off a chain of events (including human errors) that can ultimately lead to an accident. As defined here, a trigger represents just another contextual factor. Some triggering events, such as the random failure of a pump, may not necessarily have any contextual factors influencing them, whereas other triggers (e.g., a disruption in a work process resulting from a late discovery that a needed tool is absent) could very well be influenced by other contextual factors. Investigations of accidents (Section 10) are often aimed at exposing multiple chains of causal events and identifying critical paths whose disruption could have prevented the accident. Similarly, there may be nodes or paths within the network of contextual factors (Figure 4) that may be more responsive to human fallibility, and closer examination of these nodes and their links may inform the analyst of strategies for reducing human error or adverse events.

Finally, the possibility also exists for describing larger-scale work domain contexts that are capable of bringing about adverse outcomes through their interplay with human fallibility. In this regard, the views of Perrow (1999), which constitute a system theory of accidents, have received considerable attention. According to Perrow, the structural analysis of any system, whether technological, social, or political, reveals two loosely related concepts or dimensions, interactive complexity and coupling, whose sets of attributes govern the potential for adverse consequences. Interactive complexity can be categorized as either complex or linear and applies to all possible system components, including people, materials, procedures, equipment, design, and the environment. The relatively stronger presence of features such as reduced proximity in the spacing of system components, increased interconnectivity of subsystems, the potential for unintended or unfamiliar feedback loops, the existence of multiple and interacting controls (which can be administrative as well as technological), the presence of information that tends to be more indirect and incomplete, and the inability to easily substitute people in task activities predisposes systems toward being complex as opposed to linear. Complex interactions are more likely to be produced by complex systems than by linear systems, and because these interactions tend to be less perceptible and comprehensible, the human's responses to problems that occur in complex systems can often further increase the system's interactive complexity.

Most systems can also be characterized by their degree of coupling. Tightly coupled systems are much less tolerant of delays in system processes than are loosely coupled systems and are much more invariant to materials and operational sequences. Although each type of system has both advantages and disadvantages, loosely coupled systems provide more opportunities
for recovery from events with potentially adverse consequences, often through creative, flexible, and adaptive responses by people. To compensate for the fewer opportunities for recovery provided by tightly coupled systems, these systems generally require more built-in safety devices and redundancy than do loosely coupled systems.

Although Perrow's account of technological disasters focuses on the properties of systems themselves rather than on human error associated with design, operation, or management of these systems, many of the catastrophic accidents chronicled by Perrow do in fact concern interactions between technological, human factors, organizational, and sociocultural systems, and technical systems are in their own right economic, social, and political constructs. Thus, despite the virtue in his theory of dispelling such accidents as having resulted from human error, his model has been criticized for its marginalization of factors at the root of technological accidents (Evan and Manion, 2002). These criticisms, however, do not preclude the possibility of augmenting Perrow's model with additional perspectives on system processes that would endow it with the capability for providing a reasonably compelling basis for predisposing the human to error.

3.4 Barriers

Various methods exist for building in barriers to human error. For example, computer-interactive systems can force the user to correct an invalid entry prior to proceeding, provide warnings about actions that are potentially error inducing, and employ self-correction algorithms that attempt to infer the user's intentions. Unfortunately, each of these methods can also be breached, depending on the context in which it is used. Forcing functions can initiate a process of backtracking by the user that can lead to total confusion and thus more opportunity for error (Reason, 1990), and warnings can be ignored under high workloads.

The facilitation of errors by computer-interactive systems was found to occur in a study by Koppel et al. (2005) on the use of hospital computerized physician order-entry (CPOE) systems, contradicting widely held views that these systems significantly reduce medication prescribing errors. In this study, errors were grouped into two categories: (1) information errors arising from the fragmentation of data and the failure to integrate information across the various hospital information systems, and (2) human-machine interface flaws that fail to adequately consider the practitioner's behaviors in response to the constraints of the hospital's organizational work structure. An example of an error related to the first category is when the physician orders new medications or modifies existing medications. If current doses are not first discontinued, the medications may actually become increased or decreased, or be added on as duplicative or conflicting medication. Detection of these errors is hindered by flaws in the interface that may require 20 screens for viewing a single patient's medications. Complex organizational systems such as hospitals can make it extremely difficult for designers to anticipate the many contexts and associated problems that can arise from interactions with the systems that they design (Section 7.3.1). Although it may make more sense to have systems such as CPOEs monitored by practitioners and other workers for their error-inducing potential rather than have designers attempt to anticipate all the contexts associated with the use of these systems, this imposes the added burden of ensuring that mechanisms are in place for collecting the appropriate data, communicating this information to designers, and validating that the appropriate interventions have been incorporated.

Many of the electronic information devices (including CPOEs) that are currently in use in complex systems such as health care were implemented under the assumption that they would decrease the likelihood of human error. However, the benefits of reducing or even eliminating the possibility for certain types of errors often come at the risk of new errors, exemplifying how the introduction of barriers can create new windows of opportunity for errors through the alteration of existing contexts (Section 3.1). For example, in hospital systems the reliance on information in electronic form can disturb critical communication flows and is less likely than face-to-face communication to provide the cues and other information necessary for constructing appropriate models of patient problems.

One of the most frequently used barriers in industry, the written work procedure, is also one that is highly vulnerable to violation. Many of the procedures designed for high-hazard operations include warnings, contingencies (information on when and how to "back out" when dangerous conditions arise during operations), and other supporting features. To avoid the recurrence of past incidents, these procedures are updated continuously. Consequently, they grow in size and complexity to the point where they can contribute to information overload, increasing the possibility of missing or confusing important information (Reason, 1997). Procedures that disrupt the momentum of human actions are especially vulnerable to violation.

Humans themselves are quite adept at detecting and correcting many of the skill-based errors they make and are thus often relied upon to serve as barriers. Self-correction, however, implies two conditions: that the human depart from automated processing, even if only momentarily, and that the human invest attentional resources periodically to check whether the intentions are being met and that cues are available to alert one to deviation from intention (Reason, 1990). This would apply to both slips and omissions of actions. Redundancy in the form of cues presented in multiple modalities is a simple and very effective way of increasing a person's likelihood of detecting and correcting these types of errors. This strategy is illustrated in the case of the ampoule-swap error in hospital operating rooms (Levy et al., 2000). Many drug solutions are contained in ampoules that do not vary much in size and shape, often contain clear liquid solutions, and have few distinguishing features. If an anesthesiologist uses the wrong ampoule to
fill a syringe and inadvertently "swaps in" a risky drug such as potassium chloride, serious consequences could ensue. Contextual factors such as fatigue and distractions make it unreasonable to expect medical providers to invest the attentional resources necessary for averting these types of errors. Moreover, the use of warning signs on bins that store ampoules containing "risky solutions" is a poor solution to this problem, as it requires that the human maintain knowledge in the head (specifically, in WM), thus making this information vulnerable to memory loss resulting from delays or distractions between retrieving the ampoule and preparing the solution. The more reliable solution suggested by these investigators was to provide tactile cues on both the storage bins and the ampoules. For example, wrapping a rubber band around the ampoule following its removal from the bin provides an alerting cue in the form of tactile feedback prior to loading the ampoule into the syringe.

Not surprisingly, the human's error detection abilities are greatly reduced at the knowledge-based level of performance. Error detection in these more complex situations will depend on discovering that the wrong goal has been selected or recognizing that one's movement through the problem space is not consistent with the goal. In this regard, strategic errors (e.g., in goal definition) are expected to be much harder to discover than tactical errors (e.g., in choosing which subsystem to diagnose). Human error detection and recovery at the knowledge-based level of performance may in fact represent a highly evolved form of expertise. Interestingly, whereas knowledge-based errors decrease with increased expertise, skill-based errors increase. Also, experienced workers, as compared to beginners, tend to disregard a larger number of errors that have no work-related consequences, suggesting that with expertise comes the ability to apply higher-order criteria for regulating the work system, thus enabling the allocation of attention to errors to occur on a more selective basis (Amalberti, 2001).

A very common barrier to human error is having other people available for error detection. As with hardware components, human redundancy will usually lead to more reliable systems. However, successful human redundancy often requires that the other people be external to the operational situation, and thus possibly less subject to tendencies such as cognitive fixation. In a study of 99 simulated emergency scenarios involving nuclear power plant crews, Woods (1984) found that none of the errors involving wrong diagnosis of the system state were detected by the operators who made them and that only other people were able to detect a number of them. In contrast, half the errors categorized as slips (i.e., errors in execution of correct intentions) were detected by the operators who made them. These results also suggest that team members can often be subject to the same error-producing tendencies as individuals.

Barriers to human error need not always be present by design. As implied in Perrow's system theory of accidents (Section 3.3), a complex mixture of a system's properties can produce conditions that are conducive to human error as well as to its detection and correction. This phenomenon is routinely demonstrated in large-scale hospital systems, where one encounters an assortment of patient problem scenarios, a variety of health care services, complex flows of patient information across various media on a continual 24-hour basis, and a large variability in the skill levels of health care providers, who must often perform under conditions of overload and fatigue while being subjected to various administrative constraints. The complex interactions that arise under these circumstances provide multiple opportunities for human error, arising from missed or misunderstood information or confusion in following treatment protocols. Fortunately, there usually exist multiple layers of redundancy in the form of alternative materials (e.g., medications and equipment), treatment schedules, and health care workers to thwart the serious propagation of many of these errors. Thus, despite a number of constraints that exist in hospital systems, particularly in the provision of critical care, these systems are sufficiently loosely coupled to overcome many of the risks that arise in patient care, including those that are generated by virtue of discontinuities or gaps in treatment (Cook et al., 2000). However, even if adverse consequences are indeed averted in many of these cases, one must acknowledge the possibility that the quality of patient care may become significantly compromised in the process.

Finally, there is always the possibility that the perceived presence of barriers such as intelligent sensing systems and corrective devices may actually increase a person's risk-taking behavior. Adjusting risk-taking behavior to maintain a constant level of risk is in line with risk-homeostasis theory (Wilde, 1982). These adjustments presume that humans are reasonably good at estimating the magnitude of risk, which generally does not appear to be the case. Nonetheless, a disturbing implication of this theory is the possibility that interventions by organizations directed at improving the safety climate could, instead, result in work cultures that promote attitudes that are not conducive to safe operations.

3.5 Example: Wrong-Site Surgery

Wrong-site errors in health care encompass surgical procedures performed on a wrong part of the body, wrong side of the body, wrong person, or at the wrong level of a correctly identified anatomical site. The Joint Commission on Accreditation of Healthcare Organizations (JCAHO) considers wrong-site surgeries to be sentinel events that require immediate investigation and response. As of March 2000, JCAHO has reported wrong-site surgery to be the fourth most commonly reported sentinel event, following patient suicide, medication error, and operative or postoperative complications. It seems inconceivable that this type of error, which carries potentially devastating consequences, could become a common occurrence in organizations comprised of so many highly trained practitioners. While human fallibility, as always, plays
a fundamental role, its interplay with contextual factors and existing barriers suggests that these errors are more complex than they appear.

A common factor in wrong-site surgery is the involvement of multiple surgeons on a case. Each of the various physicians has a relatively narrow focus of attention (e.g., the cardiologist is focused on whether the heart can withstand surgery), which decreases the likelihood that the patient will be surrounded by health care providers who are knowledgeable about the case and thus limits the benefits of human redundancy. Another factor in wrong-site surgery is the need to perform multiple procedures during a single trip to the operating room. This factor provides the necessary distractions for a slip. The likelihood that a distraction could result in an unintended action is increased to the extent that the surgeon has "frequently" or "recently" performed surgeries at the unintended site or when patient care is transferred to another surgeon. Fatigue, sleep deprivation, and unusual patient characteristics such as massive obesity (which could alter the positioning of the patient) are also capable of promoting unintended actions by disrupting the surgeon's focused attention.

Presurgical procedures and problems with the way that team members communicate during an operation can also contribute to the occurrence of wrong-site errors. Ideally, an entire team should be required to verify that the correct patient and the correct limb have been prepared for surgery. However, when the surgical team fails to review the patient record or image data in the time period immediately prior to the surgery, memory concerning the correct surgical site can become flawed. Incomplete or inaccurate communication among surgical team members can also occur when some team members are excluded from participating in the site verification process, when team members exchange roles during the day of surgery, or when the entire team depends exclusively on the surgeon to identify the surgical site (the latter often occurs in work cultures that accept the surgeon's decision as final). Many of these communication problems become magnified under time constraints stemming from pressure from hospital administrators to speed things up.

A tactic that has recently received considerable attention is marking the operative site and involving the patient in the process. However, even this seemingly straightforward policy can be problematic. If surgeons were to employ their own marking techniques, such as "No" on the wrong limb or "Yes" on the proper site, confusion may occur to the point of increasing the likelihood of wrong-site surgery. Standardization is thus critical, and the recommended procedure is for the surgeon to initial the operative site. This barrier alone, however, is insufficient. For example, if the marked site is draped out of the surgeon's field of view and the surgeon does not recall whether the site was or was not marked, the possibility for error still exists. Thus, a verification checklist should also be in place that includes all documents referencing the intended procedure and site, informed consent, and direct observation of the marked operative site. Strict reliance on x-rays or the patient's chart can prove inadequate in cases where the data are incorrect or associated with the wrong patient, and patient involvement is not always possible, depending on the patient's condition.

Violation of these barriers is not uncommon. Some surgeons see signing as a waste of time and a practice that could contaminate the operative site. They are insistent that wrong-site surgery errors would not happen to them and that the focus of the medical profession should be on ridding itself of incompetent surgeons rather than instituting wide-reaching programs (Prager, 1998). This attitude, however, is not surprising in a profession that has relied largely on people avoiding mistakes rather than creating systems to minimize them. It also reflects a very traditional perspective on human error whereby the responsibility or blame for errors is placed solely on those who committed them, and it suggests that a culture shift among surgeons may be needed. To utilize the surgeon's time more efficiently, for hospitalized patients implementation might involve the operating surgeon initialing the intended operative site at the time consent is obtained, thus requiring that the physician be present during consent. The JCAHO has constructed a universal protocol for eliminating wrong-site surgery which ensures that the surgical site is marked while the patient is conscious and that there is a final pause and verification among all surgical team members to ensure that everyone is in agreement with the procedure. This protocol became effective in July 2004 for all JCAHO-accredited hospitals.

4 ERROR TAXONOMIES AND PREDICTING HUMAN ERROR

Many areas of scientific investigation use classification systems or taxonomies as a way of organizing knowledge about a subject matter. In the case of human error, the taxonomies that have been proposed have theoretical as well as practical value. The taxonomies that emphasize observable behaviors are primarily of practical value. They can be used retrospectively to gather data on trends that point to weaknesses in design, training, and operations, as well as prospectively, in conjunction with detailed analyses of tasks and situational contexts, to predict possible errors and to suggest countermeasures for detecting, minimizing, or eliminating these errors. Human error taxonomies can also be directed at specific tasks or operations. For example, a taxonomy could be developed for the purpose of characterizing all the various observable ways that a particular task can be performed incorrectly, analogous to the use of failure mode and effects analysis (Kumamoto and Henley, 1996) to identify a component's failure modes and their corresponding causes and effects. In the health care industry, the diversity of medical procedures and the variety of circumstances under which these procedures are performed may, in fact, call for highly specific error taxonomies.

For more cognitively complex tasks, it may be possible to classify errors according to stages of information processing (Figure 2), thereby differentiating
errors related to perception from errors related to failures in working memory. However, many of these errors of cognition can only be inferred from assumptions concerning the human's goals and observed behaviors, and to some extent from contextual factors. The characterization of performance as skill-, rule-, or knowledge-based (Section 3.2.5) has proven particularly useful in thinking about the ways in which information-processing failures can arise, in light of the distinctions in information-processing activities that are presumed to occur at each of these levels. Generally, taxonomies that focus on the cognitive or causal end of the error spectrum have the ability to propose types of errors that might occur under various circumstances and thus can shape or augment our understanding of human limitations in information processing.

A very simple error taxonomy that bears a long history (Sanders and McCormick, 1993) differentiates errors of omission (forgetting to do something) from errors of commission (doing something incorrectly). Errors of commission are often further categorized into errors related to sequence, timing, substitution, and actions not included in a person's current plans (Hollnagel, 1993). Sequence errors include actions that are repeated (which may result in restarting a process) or are reversed (which may result in jumping ahead in a sequence). Timing errors refer to actions that do not occur when they are required; thus they may occur prematurely or after some delay. Substitution errors refer to single actions or sets of actions that are performed in place of the expected action or action set. Errors involving the inclusion of additional actions are referred to as intrusions when they are capable of disrupting the planned sequence of actions. Disruptions can lead to capture by the sequence, branching to an incorrect sequence, or overshooting the action sequence beyond the satisfaction of its objective.

Figure 5 and Tables 2 to 4 illustrate several other error taxonomies. The flowchart in Figure 5 classifies different types of human errors that can occur under skill-, rule-, and knowledge-based levels of performance. This flowchart seeks to answer questions concerning how an error occurred. Similar flowcharts are provided by the author to address the more preliminary issue in the causal chain (i.e., why an error occurred) as well as the external manifestation of the error (i.e., what type of error occurred). Reason's (1990) taxonomy (Table 2) also exploits the distinctions among skill-, rule-, and knowledge-based levels of performance, but draws attention to how error modes related to skill-based slips and lapses differ from error modes related to rule- and knowledge-based mistakes. The taxonomies presented in Tables 3 and 4 demonstrate various schemes for classifying errors based on stages of information processing.

In addition to their usefulness for analyzing accidents for root causes (Section 10.2), error taxonomies that emphasize cognitive or causal factors have predictive value as well. Predicting human error, however, is a difficult matter. It may indeed be possible to construct highly controlled experimental tasks that "trap" people into particular types of skill-based slips and lapses and some forms of rule-based mistakes. However, the multidimensional complexity surrounding actual work situations and the uncertainty associated with the human's goals, intentions, and emotional and attentional states introduce many layers of guesswork into the process of establishing reliable mappings between human fallibility and context. In 1991, Senders and Moray stated: "To understand and predict errors ... usually requires a detailed task analysis" (p. 60). Nothing has changed since to diminish the validity of this assertion. In fact, the current emphasis on cognitive task analysis (CTA) techniques and our greater understanding of mechanisms underlying human error have probably made the process of predicting human error more laborious than ever, as it should be. Expectations of shortcuts are unreasonable; error prediction by its very nature should be a tedious process and will often be influenced by the choice of taxonomy.

Task analysis (TA), which is fundamental to error prediction, describes the human's involvement with a system in terms of task requirements, actions, and cognitive processes (Chapter 14). It can be used to provide a broad overview of task requirements (often useful during the preliminary stages of product design) or a highly detailed description of activities. These descriptions could include time constraints and activity time lines; sequential dependencies among activities; alternative plans for performing an operation; contingencies that may arise during the course of activities and options for handling these contingencies; the feedback available at each step of the process; characterizations of information flow between different subsystems; and descriptions of displays, controls, training, and interactions with other people. Tabular formats are often used to illustrate the various relationships between these factors and task activities. Many different TA methods exist (Kirwan and Ainsworth, 1992; Luczak, 1997; Shepherd, 2000), and identifying an appropriate method for a particular problem or work domain can be critical.

In CTA, the interest is in determining how the human conceptualizes tasks, recognizes critical information and patterns of cues, assesses situations, makes discriminations, and uses strategies for solving problems, forming judgments, and making decisions. Successful application of CTA for enhancing system performance will depend on a concurrent understanding of the cognitive processes underlying human performance in the work domain and the constraints on cognitive processing the work domain imposes (Vicente, 1999). In developing new systems, meeting this objective may require multiple, coordinated approaches. As Potter et al. (1998) have noted: "No one approach can capture the richness required for a comprehensive, insightful CTA" (p. 395).

As with TA, many different CTA techniques are presently available (Hollnagel, 2003). TA and CTA, however, should not be viewed as mutually exclusive enterprises; in fact, the case could be made that TA methods that incorporate CTA represent "good" task analyses. As anticipating the precise time and
mode of error is generally unrealistic, the use of TA techniques should be directed at uncovering the possibility for errors and prioritizing these possibilities. Given what we can surmise about human fallibility, the contexts within which human activities occur, and the barriers that may be in place, the relevant questions are then as follows: What kinds of actions by people are possible or even reasonable that would, by one's definition, constitute errors? What are the possible consequences of these errors? What kinds of barriers do these errors and their consequences call for? Depending on whether the analysis is to be applied to a product or process that is still in the conceptual stages, to a newly implemented process, or to an existing process, broad applications of TA techniques that may include mock-ups, walkthroughs, simulations, interviews, and direct observations are needed to identify the relevant contextual elements. In-depth task analyses that incorporate CTA techniques could then provide the details necessary for evaluating the various possibilities for interplay between context and human fallibility (Sharit, 1998).

Figure 5 Decision flow diagram for analyzing an event into one of 13 types of human error. (From Rasmussen, 1982; copyright 1982, with permission from Elsevier.)

Table 2 Human Error Modes Associated with Rasmussen's SRK Framework
Skill-Based Performance
  Inattention: double-capture slips; omissions following interruptions; reduced intentionality; perceptual confusions; interference errors
  Overattention: omissions; repetitions; reversals
Rule-Based Performance
  Misapplication of good rules: first exceptions; countersigns and nonsigns; informational overload; rule strength; general rules; redundancy; rigidity
  Application of bad rules: encoding deficiencies; action deficiencies (wrong rules, inelegant rules, inadvisable rules)
Knowledge-Based Performance
  Selectivity; workspace limitations; out of sight, out of mind; confirmation bias; overconfidence; biased reviewing; illusory correlation; halo effects; problems with causality; problems with complexity; problems with delayed feedback; insufficient consideration of processes in time; difficulties with exponential developments; thinking in causal series and not causal nets; thematic vagabonding; encysting
Source: Reason (1990).

Table 3 External Error Modes Classified According to Stages of Human Information Processing
1. Activation/detection
   1.1 Fails to detect signal/cue
   1.2 Incomplete/partial detection
   1.3 Ignores signal
   1.4 Signal absent
   1.5 Fails to detect deterioration of situation
2. Observation/data collection
   2.1 Insufficient information gathered
   2.2 Confusing information gathered
   2.3 Monitoring/observation omitted
3. Identification of system state
   3.1 Plant-state-identification failure
   3.2 Incomplete-state identification
   3.3 Incorrect-state identification
4. Interpretation
   4.1 Incorrect interpretation
   4.2 Incomplete interpretation
   4.3 Problem solving (other)
5. Evaluation
   5.1 Judgment error
   5.2 Problem-solving error (evaluation)
   5.3 Fails to define criteria
   5.4 Fails to carry out evaluation
6. Goal selection and task definition
   6.1 Fails to define goal/task
   6.2 Defines incomplete goal/task
   6.3 Defines incorrect or inappropriate goal/task
7. Procedure selection
   7.1 Selects wrong procedure
   7.2 Procedure inadequately formulated/shortcut invoked
   7.3 Procedure contains rule violation
   7.4 Fails to select or identify procedure
8. Procedure execution
   8.1 Too early/late
   8.2 Too much/little
   8.3 Wrong sequence
   8.4 Repeated action
   8.5 Substitution/intrusion error
   8.6 Orientation/misalignment error
   8.7 Right action on wrong object
   8.8 Wrong action on right object
   8.9 Check omitted
   8.10 Check fails/wrong check
   8.11 Check mistimed
   8.12 Communication error
   8.13 Act performed wrongly
   8.14 Part of act performed
   8.15 Forgets isolated act at end of task
   8.16 Accidental timing with other event/circumstance
   8.17 Latent error prevents execution
   8.18 Action omitted
   8.19 Information not obtained/transmitted
   8.20 Wrong information obtained/transmitted
   8.21 Other
Source: Kirwan (1994).
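To make the prospective use of an external-error-mode taxonomy such as Table 3 concrete, the following minimal Python sketch screens a few task steps (paraphrased from the hierarchical task analysis in Table 5 below) against a small subset of the taxonomy. The mapping of steps to information-processing stages is invented for illustration; in practice it would come from a detailed task analysis rather than from a simple lookup like this.

```python
# Minimal sketch: screening task steps against a subset of the external
# error modes in Table 3. Step names are paraphrased from Table 5; the
# stage assignments are hypothetical and only illustrate the mechanics.

ERROR_MODES = {
    "1.1": "Fails to detect signal/cue",
    "2.1": "Insufficient information gathered",
    "8.1": "Too early/late",
    "8.9": "Check omitted",
}

# For each step, the analyst records which stages of information
# processing are involved (matched against the leading digit of each code).
task_steps = {
    "Verify tanker is empty": {"1", "2", "8"},
    "Enter tanker target weight": {"8"},
    "Monitor tanker filling operation": {"1", "2"},
}

def candidate_errors(stages: set[str]) -> list[str]:
    """Return the taxonomy entries whose stage matches the step's stages."""
    return [f"{code} {label}" for code, label in ERROR_MODES.items()
            if code.split(".")[0] in stages]

for step, stages in task_steps.items():
    print(step)
    for err in candidate_errors(stages):
        print("  -", err)
```

The output of such a screening is only a list of candidate error modes; judging which of them are plausible, and what their consequences and countermeasures would be, remains an analyst's task.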
Table 4 Human Error Classification Scheme
1. Observation of system state
   - Improper rechecking of correct readings
   - Erroneous interpretation of correct readings
   - Incorrect readings of appropriate state variables
   - Failure to observe sufficient number of variables
   - Observation of inappropriate state variables
   - Failure to observe any state variables
2. Choice of hypothesis
   - Hypothesis could not cause the values of the state variables observed
   - Much more likely causes should be considered first
   - Very costly place to start
   - Hypothesis does not functionally relate to the variables observed
3. Testing of hypothesis
   - Stopped before reaching a conclusion
   - Reached wrong conclusion
   - Considered and discarded correct conclusion
   - Hypothesis not tested
4. Choice of goal
   - Insufficient specification of goal
   - Choice of counterproductive or nonproductive goal
   - Goal not chosen
5. Choice of procedure
   - Choice would not fully achieve goal
   - Choice would achieve incorrect goal
   - Choice unnecessary for achieving goal
   - Procedure not chosen
6. Execution of procedure
   - Required step omitted
   - Unnecessary repetition of required step
   - Unnecessary step added
   - Steps executed in wrong order
   - Step executed too early or too late
   - Control in wrong position or range
   - Stopped before procedure complete
   - Unrelated inappropriate step executed
Source: Rouse and Rouse (1983).

Table 5 Part of a Hierarchical Task Analysis Associated with Filling a Chlorine Tanker
0. Fill tanker with chlorine.
   Plan: Do tasks 1 to 5 in order.
1. Park tanker and check documents (not analyzed).
2. Prepare tanker for filling.
   Plan: Do 2.1 or 2.2 in any order, then do 2.3 to 2.5 in order.
   2.1 Verify tanker is empty.
       Plan: Do in order: 2.1.1 Open test valve. 2.1.2 Test for Cl2. 2.1.3 Close test valve.
   2.2 Check weight of tanker.
   2.3 Enter tanker target weight.
   2.4 Prepare fill line.
       Plan: Do in order: 2.4.1 Vent and purge line. 2.4.2 Ensure main Cl2 valve is closed.
   2.5 Connect main Cl2 fill line.
3. Initiate and monitor tanker filling operation.
   Plan: Do in order:
   3.1 Initiate filling operation.
       Plan: Do in order: 3.1.1 Open supply line valves. 3.1.2 Ensure tanker is filling with chlorine.
   3.2 Monitor tanker filling operation.
       Plan: Do 3.2.1; do 3.2.2 every 20 minutes; on initial weight alarm, do 3.2.3 and 3.2.4; on final weight alarm, do 3.2.5 and 3.2.6.
       3.2.1 Remain within earshot while tanker is filling.
       3.2.2 Check road tanker.
       3.2.3 Attend tanker during last filling of 2 or 3 tons.
       3.2.4 Cancel initial weight alarm and remain at controls.
       3.2.5 Cancel final weight alarm.
       3.2.6 Close supply valve A when target weight is reached.
4. Terminate filling and release tanker.
   4.1 Stop filling operation.
       Plan: Do in order: 4.1.1 Close supply valve B. 4.1.2 Clear lines. 4.1.3 Close tanker valve.
   4.2 Disconnect tanker.
       Plan: Repeat 4.2.1 five times, then do 4.2.2 to 4.2.4 in order.
       4.2.1 Vent and purge lines.
       4.2.2 Remove instrument air from valves.
       4.2.3 Secure blocking device on valves.
       4.2.4 Break tanker connections.
   4.3 Store hoses.
   4.4 Secure tanker.
       Plan: Do in order: 4.4.1 Check valves for leakage. 4.4.2 Secure lag-in nuts. 4.4.3 Close and secure dome.
   4.5 Secure panel (not analyzed).
5. Document and report (not analyzed).
Source: CCPS (1994). Copyright 1994 by the American Institute of Chemical Engineers, and reproduced by permission of AIChE.

Even when applied at relatively superficial levels, TA techniques are well suited for identifying mismatches between demands imposed by the work context and the human's capabilities for meeting these demands. Although hypothesizing specific error forms will become more difficult at this level of analysis, windows of opportunity for error can still be readily exposed that, in and of themselves, can suggest countermeasures capable of reducing risk potential. For example, these analyses may determine that there is insufficient time to input information accurately into a computer-based documentation system, that the design of displays is likely to evoke control responses that are contraindicated, or that sources of information on which high-risk decisions are based contain incomplete or ambiguous information. This coarser approach to predicting errors or error-inducing conditions that derives from analyzing demand-capability mismatches can also highlight contextual and cognitive considerations that can form the basis for a more focused application of TA and CTA techniques.

Table 5 depicts a portion of a type of TA known as a hierarchical task analysis (HTA) that was developed
for analyzing the task of filling a storage tank with chlorine from a tank truck. The primary purpose of this TA was to identify potential human errors that could contribute to a major flammable release resulting either from a spill during unloading of the truck or from a tank rupture. Table 6 illustrates the use of this HTA for predicting external error modes. The error taxonomy shown in Table 3 can easily be adapted for predicting the types of errors listed in Table 6. This taxonomy can also be linked to more underlying psychological mechanisms, allowing errors with identical or similar external manifestations to be distinguished and thus adding considerable depth to the understanding of potential errors predicted from the TA. As discussed in Section 5.4, this ability not only results in more accurate quantification of error data but also provides the basis for more effective error-reduction strategies. An example of such a scheme is the human error identification in systems technique (HEIST), which classifies external error modes according to the eight stages of human information processing listed in Table 3. The first column in a HEIST table consists of a code whose initial letter(s) refers to one of these eight stages. The next letter in the code refers to one of six general PSFs: time (T), interface (I), training/experience/familiarity (E), procedures (P), task organization (O), and task complexity (C). The external error modes are then linked to underlying psychological error mechanisms (PEMs). Many of these mechanisms are consistent with the failure modes that appear in Reason's error taxonomy (Table 2).

Table 7 presents an extract from a HEIST table corresponding to two of the eight stages of human information processing listed in Table 3: activation/detection and observation/data collection. For these two stages of information processing, more detailed explanations of the PEMs listed in the HEIST table may be found in Table 8. A complete HEIST table and the corresponding listing of PEMs can be found in Kirwan (1994).

On a final note, task analysts contending with complex systems will often need to consider various properties of the wider system or subsystem in which human activities take place. As Shepherd (2000) has stated: "Any task analysis method which purports to serve practical ends needs to be carried out beneath a general umbrella of systems thinking" (p. 11). There are a variety of ways in which systems can be characterized or decomposed (Sharit, 1997), and for any particular system these various descriptions could lead to the consideration of different activities for analysis as well as different strategies for performing these analyses.

5 QUANTIFYING HUMAN ERROR

5.1 Historical Antecedents

Quantifying human error presumes that a probability can be attached to its occurrence. Is this a realistic endeavor? An objective assignment of probabilities to events requires that human error probability (HEP) be defined as a ratio of the number of observed occurrences of the error to the number of opportunities for that error to occur. Based on this definition, it can be argued that, with the exception of routine skill-based activities, estimates of HEPs are not easily attainable or likely to be accurate. Assuming that most organizations would be more interested in gauging the possibility for human error and understanding its causality and consequences, the more compelling question is: Why do we need a quantitative estimate?

The catalyst behind quantification of human error was the mandate for industries involved in high-hazard operations to perform probabilistic risk analyses (PRAs). Most industries that carry out such assessments, such as the chemical processing and nuclear power industries, are concerned about hazards arising from interactions among various system events, including hardware and software failures, environmental anomalies, and human errors that are capable of producing injuries, fatalities, disruptions to production, and plant and environmental damage. The two primary hazard analysis techniques that have become associated with PRAs are fault tree (FT) analysis and event tree (ET) analysis. The starting point for each of these methods is an undesirable event. Other hazard analysis techniques (CCPS, 1992) or methods based on expert opinion are often used to identify these events. FTs utilize Boolean logic models to depict the relationships among hardware, human, and environmental events that can lead to the undesirable top event. When FTs are used as a quantitative method, basic events (for which no further analysis of the cause is carried out) are assigned probabilities or occurrence rates, which are then propagated into a probability or rate measure associated with the top event (Dhillon and Singh, 1981). The contributions of each of the singular events to the top event can also be computed, making this technique very suitable for cost-benefit analyses that can be used as a basis for specifying design interventions. As a qualitative analysis tool, FTs can identify the various combinations of events (or cut sets) that could lead to the top event; for many applications this information is sufficiently revealing for satisfying safety objectives.

Whereas a FT represents a deductive, top-down decomposition of an undesirable event (such as a loss in electrical power), an ET corresponds to an inductive analysis that determines how this undesirable event can propagate. These trees are thus capable of depicting the various sequences of events that can become triggered by the initiating event, as well as the risks associated with each of these sequences. Figure 6 illustrates a simple ET consisting of two operator actions and two safety systems. When ETs are constructed to address only sequences of human actions in response to the initiating event, the ET is sometimes referred to as an operator action event tree (OAET). In OAETs, each branch of the tree represents either a success or an HEP associated with the required actions specified along the column headings. These trees can easily accommodate paths signifying recovery from previous errors.
Table 6 Use of the HTA in Table 5 for Predicting External Error Modes
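For independent basic events, the fault tree and event tree quantification described in Section 5.1 reduces to simple probability arithmetic: an OR gate combines input probabilities as 1 − Π(1 − p), an AND gate as Π p, and each event tree (or OAET) sequence multiplies the success or error probabilities along its branches. The Python sketch below uses made-up probabilities and is only a minimal illustration of that arithmetic, not a documented PRA procedure.

```python
# Minimal sketch of fault-tree gate arithmetic and an event-tree sequence,
# assuming independent basic events. All probabilities are made-up examples.
from math import prod

def or_gate(probs):
    """Top event occurs if any input event occurs."""
    return 1.0 - prod(1.0 - p for p in probs)

def and_gate(probs):
    """Top event occurs only if all input events occur."""
    return prod(probs)

# Hypothetical fault tree: pump fails OR (operator misses alarm AND backup valve fails)
p_top = or_gate([1e-3, and_gate([5e-2, 1e-2])])

# Hypothetical event tree sequence: initiating event, then the required operator
# action fails (HEP), then the safety system also fails -> accident sequence
freq_initiator = 0.1      # initiating events per year
hep_action = 5e-2         # human error probability for the required action
p_safety_fails = 1e-3
freq_accident = freq_initiator * hep_action * p_safety_fails

print(f"Top event probability: {p_top:.2e}")
print(f"Accident sequence frequency: {freq_accident:.2e} per year")
```

In a real PRA, the HEP entering such a calculation would itself be derived from an HRA method that accounts for the relevant PSFs, rather than assumed directly as in this sketch.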
AT1 Does the signal Action omitted Signal timing Alter system configuration to present signal
occur at the or performed deficiency, failure appropriately; generate hard copy to aid
appropriate time? eithertoo of prospective prospective memory; repeat signal until action
Could it be early or too memory has occurred.
delayed? late
A11 Could the signal Action omitted Signal failure Use diverselredundant signal sources; use a
source fail? or performed higher-reliability signal system; give training and
too late ensure that procedures incorporate
investigation checks on "no signal."
Al2 Can the signal be Action omitted Signal ignored Use diverse signal sources; ensure higher signal
perceived as reliability; retrain if signal is more reliable than it
unreliable? is perceived to be.
AI3 Is the signal a Action omitted, Signal-detection Prioritize signals; place signals in primary (and
strong one, and is or performed failure unobscured) location; use diverse signals; use
it in a prominent too late, or multiple-signal coding; give training in síqnal
location? Could wrong act priorities; make procedures cross-reference the
the sigrial be performed relevant signals; increase signal intensity.
contused with
another?
AI4 Ooes the signal rely Action ornitted Communication Provide physical backup/substitute signal; build
on oral or performed failure, lapse of required communications requirements into
communication? too late memory procedures.
AE1 Is the signal very . Action omitted Signal ignored Give training tor low-frequency events; ensure
rare? or performed (false alarm), diversity of signals; prioritize signals into a
too late stereotype hierarchy of several levels.
fixation
AE2 Ooes the operator Action omitted lnadéquate mental Training and procedures should be amended to
understand the or performed model ensure that significance is understood.
significance of too late
the signal?
AP1 Are proced ures Action omitted Incorrect mental Procedures must be rendered accurate, or at least
clear about or performed model made more precise; give training if judgment is
action following either too requíred on when to act.
the signal or the early or too
previous step, or late
when to start the .
task?
A01 Ooes actívation rely Actíon omitted Prospective Proceduralize task, noting calling conditions,
on prospective or pertormed memory failure timings of actions, etc.; utilize an interlock
memory (i.e., either too late system preventing task from occurring at
remembering to or too early undesirable times; provide a later cue;
do something at emphasize this aspect during training.
a future time, with
no specific cue or
signar at that later
time)?
A02 Will the operator Action omitted Lapse of rnemory, Training should prioritize signal importance;
have other duties orperforrhed memory failure, improve task organization tor crew; use memory
to perform too late signal-detection aíds; use a recurring signal; consider
concurrently? Are failure automation; utilize flexible crewing.
there likely to be
distractions?
Couldthe
operator become
incapacitated?
A03 Will the operator Action omitted Lapse of rnemory, Improve task and crew organization; use a
have a very high or pertormed other memory . recurring signal; consider automation; utilize
or low workload? either too late failure, flexible crewing; enhance signal salience.
or too early signal-detection
failure
Table 7 (continued)
Columns: Code; Error-Identifier Prompt; External Error Mode; System Cause/Psychological Error Mechanism; Error-Reduction Guidelines
A04 WiU it be clear who Action omitted Crew-coordination Emphasize task responsibility in training and task
must respond? or performed failure allocation among crew members; utilize team .
too late traininq.
AC1 Is the signal highly Aetion omitted, Cognitive overload, Simplify signal; autornate system response; give
complex? or wrong act inadequate adequate training in the nature of the signal;
performed mental model provide online, automated, diagnostic support;
either too late develop procedures that allow rapid analysis of
or too early the signar (e.q., use of ñowcharts),
AC2 ls the signar in Aetion omitted Confirmation bias, Procedures should ernphasize disconfirming as
conflict with the or wrong act signal ignored well as confirmatory signals; utilize a shift
current performed technical advisory in the shift structure; carry
diagnostic out problem-solving training and team training;
mindset? utilize diverse signals; implement autornatlon,
AC3 Could the signal be Aetion Farníltar-assoclaticn Training and procedures could involve display of
seen as part of a performed too shortcut/ signals embedded within mimics or other
different signal early or wronq stereotype representations showing their true contsxts or
set? Or is, in fact, aet performed takeover range of possible contexts; use fault-symptom
the signal part of matrix aids; etc.
a series of signals
to which the
operator needs to
respond?
OT1 Could the Failure to act, or Inadequate mental Procedure and training should specify the prionty
information or action model/ and timing of checks; present key information
check occur at performed inexperience/ centrally; utilize trend displays and predictor
the wrong time? either too late crew displays if possible; implement team training_
or too early, or coordination
wrong act failure
performed
011 Could important Aetion omitted Signal failure Use diverse signal sources; maintain backup
information be or performed power supplies for signals; have periodic
missing due to either too late manual checks; procedures should specify
instrument or too early, or action to be taken in event 01 signal failure;
failure? wrong act engineer automatic protection/action; use a
performed higher-reliability system.
012 Could information Aetion omitted Erroneous signal Use diverse signal sourees; procedures should
sources be or performed speeify cross-ehecking; design
erroneous? either too late system-self-integrity monitoring; use
or too early, or higher-reliability signals_
wrong act
performed
013 Could the operator Action omitted Mistakes Ensure unique eoding of displays,
seleet an or performed alternatives, eross-refereneed in procedures; enhance
incorrect but either too late spatial discriminability via coding; improve tralrunq.
similar or too early, or misorientation,
information wrong aet topoqraphic
source? performed misorientation
014 Is an information Action omitted Communication Use diverse signals from hardwired or softwired
source aceessed orperformed failure displays; ensure baekup human corroboration;
onty vía oral either too late design communícation protocola
communication? or too early, or
wrong act
performed
015 Are any information Action omitted Misinterpretation, Use task-bassd displays; design symptom-based
sources or performed mlstakes diagnostic aids; utilize diverse information
ambiguous? either too late alternatives sourees; ensure clarity 01 information displayed;
or too early, or utilize alarm conditioninq,
wrong act
performed
Table 7 (continued)
016 Is an information Action omitted Information Centralize key data; enhance data access; provide
source difficult or or performed assumed training on importance of verification of signals;
time-consuming too late, or enhance procedures.
to access? wrong act
performed
017 Is there an Action omitted Information Prioritize information displays (especially alarms);
abundance of or performed overload utilize overview mimics (VDU or hardwired); put
information in the too late training and procedural emphasis on
scenario, some of data-collection priorities and data management.
which is
irrelevant, or a
large part ot
which is
redundant?
OEl Could the operator Action omitted Confirmation bias, Provide training in diaqnostlc skills; enhance
focus on key or performed tunnel vision procedural structuring of diagnosis,
indication(s) too late, or emphasizing checks on disconfirrning evidence;
related to a wrong act implement a staff-technical-advisor role;
potential event performed present overview mimics of key parameters
while ignoring showing whether system integrity is improving
other information or worsening or adequate.
sources?
OE2 Could the operator Action omitted Thematic Provide training in fault diagnosis; provide team
interrogate too or performed vagabonding, training; put procedural emphasis on required
many information too late risk-recognition data collection time frames; implement
sources for too failure, high-Ievel indicators (alarms) of system-integrity
long, so that inadequate deterioration.
progress toward mental model
stating
identification or
action is not
achieved?
OE3 Could the operator Action omitted Need tor Provide procedural guidance on checks required,
fail to realize the or pertormed information not training, use of memory aids, use of
need to check a either too late prompted, attention-gaining devices (false alarms, central
particular or too early, or prospective displays, and messages)
source? Is there wrong act memory failure
an adequate cue performed
prompting the
operator?
OE4 Could the operator Action omitted Overconfidence, Provide training in diagnostic procedures and
terminate the or performed inadequate verification; provide procedural specification of
data collec- either too mental model, required checks, etc.; implement a
tion/observation early or too incorrect mental shift-technical-advisor role.
early? late, or wrong model, familiar-
act performed association
shortcut
OE5 Could the operator Action omitted Failure to consider Ensure trainíng for, as well as procedural noting
fail to recognize or performed speciaí of, specíal circumstance; STA; give local
that special eíther too late circumstances, warnings in the interface displays/controls.
circumstances or too early, or slip of memory,
apply? wrong act inadequate
performed mental model
OPl Could the operator Action omitted Rule violation, Provide training In use of procedures; involve
fail to follow the or wrong act risk-recog nition operator in development and verification of
procedures performed failure, produc- procedures.
entirely? tion-safety
conflíct,
satety-culture
defícíency
Table 7 (continued)
OP2 Could the operator Action omitted Forget isolated act, Ensure an ergonomic procedure design; utilize
forget ene or or performed slip of memory, tick-off sheets, place keeping aids, etc.; provide
more items in the either too place-Iosing error team training to emphasize checking by other
procedures? early or too team member(s).
late, or wrong
act performed
001 Will the operator Action omitted Lapse of memory, Training should prioritize signal importances;
(A02) have other duties or performed memory failure, devetop better task organization for crew; use
. to perform too late signal-detection memory aids; use a recurring signal; consider
. concurrently? Are failure automation; use flexible crewing.
there likely to be
distractions?
Could the
operator beco me
incapacitated?
002 Will the operator Action omitted Lapse of memory, Establish better task and crew organization; utilize
(A03) have a very high orperformed other rnernory a recurring signal; consider automation; use
or Jow workload? either too late failure, flexible crewlnq; enhance signal salience.
ortoo early slqnal-detectlon
failure
003 WiII it be olear who Action omitted Crew-coordination Improve training and task allocation among crew;
(A04) must respond? or perforrned failure provide team training.
too late
004 Could information Failure to act, or Crew-coordination Develop robust shift-handover procedures;
collected fail to wrong action failure training; provide team training across shift
be transmitted performed, or boundaries; develop robust and auditable
effectively across action data-recording systems (Iogs).
shift-handover performed
boundaries? either too late
or too early, or
an error of
quality (too
little or too
much)
OC1 Ooes the scenario Failure to act, or Cognitive overload Provide emergency-response training; design
involve multiple wrong action crash-shutdown facilities; use flexible crewing
events, thus performed, or strategies; implement shift-technlcal-advisor
causing a high action role; develop emergency operating procedures
levelof performed able to deal with múltiple transients; engineer
complexity or a eithertoo automatic information recording (trends, logs,
high workload? earty or too printouts); generate decision/dtaqnostic support
late facilities.
PRA applications, Fls and ETs are coinbined-each other system components, including other humans,
major column oí the ET can represent a top event may have a marked effect on the outcomes of
whose failure probability can be computed through the PRAs required developing methods for assessing
evaluation of a corresponding Fl' model (Figure 7). human reliability, thus establishing the field of human
Quantitative solutions to Fls or ETs that address reliability analysis (HRA). A variety of methods
only machine or material components are ultimately of HRA are currentIy available that range from
dictated by well-docurnented mathematical methods relatively quick assessment procedures to those that
for computing component reliability, either in terms involve detailed analyses (Kirwan, 1994). Almost al1
of the probability that the component or subsystem these methods rely on the idea of PSFs, discussed
functions normally each time it is used or in terms of earlier (Section 3:3); the methods differ, however, in
the probability that the component will not fail during how PSFs are used to generate HEPs for various
sorne prescribed time of use (Kapur and Lamberson, activities. To illustrate the different approaches to
1977). The realization that human interaction with deriving HEPs that these methods can take, two
HVMANERROR 733
Table 8 Psychological Error Mechanisms lor Two 01 the Stages 01 Inlormation Processing Presented in Table 7
ActivafionlDetecfion
1. Vígifance fai/ure: lapsa ot attention. Ergonomic design of interface to allow provision ot effective attention-gaining
measures; supervision and checking; task-organization optimization, so that the operators are not inactive for long
periods and are not isolated.
2. Cognitive/stimulus over/oad: too many signals present for the operator to cope with. Prioritization of signals (e.g.,
high-, medium-, and low-Ievel alarms); overview displays; decision-support systems; simpl1fication of signals;
flowchart procedures; simulator training; automation.
3. Stereotype fixation: operator fails to realize that situation has deviated from norm. Traininq and procedural emphasis
on range of possible symptoms/causes; fault-symptom matrix as a job aid; decision support system; shift technical
advisor/supervision.
4. Signal unrelíable: operator treats signal as false due to its unreliability. Improved signal reliability; diversity 01signals;
increased level of toferance on the part of the system, or delay in effects of error, which allows error detection and
correction (decreases "coupling"); training in consequences associated with incorrect false-alarm diagnosis.
5. Signal absenf: signal absent due to a maíntenanee/calíbration failure or a hardware/software error. Provide signal;
redundancy/diversity in signaling-design approach; pracedures/training to allow operator to recognize when signal
is absent.
6. Signa/-discriminatían fai/ure: operator fails to realize that the signal is different. Improved ergonomics in the interface
design; enhanced training and procedural support in the area of signal differentiation; supervision checking.
Observation/Data Collection
7. Attention fai/ure: lapse of attention.
8. Multiple signal coding: enhanced alarm salience; improved task organization with respeet to backup crew and rest
pauses.
9. Inaccurate recal!: operator remembers data incorrectly (usually, quantitative data). Nonreliance on memorized data,
which would necessitate better interface design - as data are received, they can either be acted on while still
present on a display (controls and displays are co-Iocated) or at feast be logged onto a "scratch pad"; sufficient
displays for presenting all information necessary for a decision/action simultaneously; printer usage; training in
nonreliance on memorized data.
10. Confirmatíon bias: operator only selects data that confirm given hypothesis and ignores other disconfirming data
sources. Problem-solving training; team training [including training in the need to question deeisions, and in the
ability of the team leader(s) to take constructive criticism]; shift technical advisor (diverse, highly qualified operator
who can "stand back" and consider alternative diagnoses), functional procedures: high-Ievel ínformation displays;
simulator training; high-Ievel alarms for system-integrity degradation; automatic protection.
11. Thematic vagapondíng: operator flits from datum to datum, never actually collating it meaningfully. Problem-solving
training; team training; simulator training; functional-procedure specification for decision-timing requirements;
high-Ievel alarms for system-integrity degradation.
12. Encystment: operator focuses exclusívely on only one data source. Problem-solving training; team training
[including training in the need to question decisions and in the ability of the team leader(s) to take constructive
criticism]; shift technical advisor; functional procedures; high-Ievef information dispfays; simulator training;
high-Ievel alarms for system-integrity degradation.
13. Stereotype fixation revisited: need for information is not prompted by either memory or procedures. Emergency
procedure enhancements, and emphasis of key symptoms and índicators to be checked; team training;
problem-solving training; alarm reprioritization; simulator training.
14. Crew-functioning problem: allocation 01 responsibility or priorities is unclear, with the result that data
collection/observation fails.
15. Cognitíve/stimulus overtoed: operator too busy, or being bombarded by signals, with the result that effective data
eollecfion/observation fails. See item 2. .
welI-known techniques that were originally developed these actions; and finalIy, these HEPs are aggregated to
for application in the nuclear power industry will be derive probabilities of task failure, whích reflect human
described. rel iability.
THERP is a highly systematic procedure. Its
5.2 THERP initial steps are directed at establishing which work
The technique for human error rate prediction, gen- activities will require emphasis and the time and
eraUy referred to as THERP, is detailed in a work skill requirements and concerns for human error
by Swain and Guttmann (1983) sponsored by the associated with these activities. Factors related to
tI.S. Nuclear Regulatory Commission. Its methodol- error detection and the potential for error recovery
ogy is driven by decomposition: Human tasks are are also determined. The results of these efforts are
first decomposed into clearly separable actions or sub- represented by a type of event tree referred to as a
tasks, HEP estimares are then assigned to each of probability tree. Each relevant subtask in a probability
---.---_ _
...
734 DESIGN FOR HEALTH, SAFETY, AND COMFORl'
F, = 0.0102
F.= 0.0024
Fs= 0.0097
Fa=0.0009
F.= 0.0009
Sr= 1- Fr FT= EF¡
= 0.9275 =0.0725
Figure 8 HRA event tree correspondínq to a nuclear power control room task that includes one recovery factor.
(FromKumamoto and Henley, 1996; © 2004 IEEE.)
events comprising the probability tree (Swain and to the occurrence of an error, co-workers potentially
Guttmann, 1983). capable of catching or discoveríng (in time) a fellow
The final steps of THERP consider the ways in worker's errors, and various types of scheduled walk-
which errors can be recovered and the kinds of design through inspections, As with conventional ETs, these
Ínterventionsthat can have the greatest impact on task recovery paths can easíly be represented in HRA
Successprobabílíty. Common recovery factors inelude probability trees (Figure 8). In the case of annunciators
thepresence of annunciators that can alert the operator or inspectors, the relevant failure limb is extended
736 DESIGN FOR HEALTH, SAFETY, AND COI\1FORT
into two additional limbs: one failure limb and one can reflect relatively low-Ievel actions that canoot be
success limbo Thc probability that the human responds further decomposed, as well as more broadly defined
successfully to the annunciator or that the inspector actions that encompass many of these lower-Ievel
spots the operator's error is then fed back into the actions. This increased f1exibility, however, comes
success path of the original tree. In the case of recovery at the expense of· a greatly reducecI emphasis on
by fellow team members, BHEPs are modified to t~sk analysis and an inereased relianee on subjec_
CHEPs by considering the degree of dependency tíve assessments.
between thc operator and one or more fellow workers SLIM assumes that the probability that a human
who are in a position to notice the error. The effects will carry out a particular task or aetion successfulIy
of recovery factors can be determined by repeating the depends on the combined effects of a number of
computations for total task faiIure. relevant PSFs. For each action under consideratian
In addition to considering error recovery factors, task domain experts are required to identify th~
the analyst can choose to perform sensitivity analysis, relevant set of PSFs; assess the relative importance
One approach to sensitivity analysis is to identify (or weights) of each of these PSFs with respect to
the most probable errors on the tree, propose design tbe likelihood of sorne potential error mode associated
. modifications corresponding to those task elements, with the action; and independent of this assessment,
estimate the degree to which the corresponding rate how good or bad each PSF actualIy is. Relative
HEPs would become reduced by virtue of these importance weights for the PSFs are derived by' asking
modifications, and evaluate the effect of these design each analyst to assign a weight of 100 to the mosr
interventions on the computation of the total task . important PSF, and tben assign weights ranging from
failure probability. The final step in THERP is to O to 100 to each of the remaining PSFs based on the
incorporate tbe results of tbe HRA into system risk importance of tbese PSFs relative to the one assigned
assessments such as PRAs. the value of 100. Normalized weights are derived by
An obvious deficíency of THERP is its inabil- dividing eaeh weight by the sum of the weights for
ity to handle human errors that have a more com- al! the PSFs. The judgcs then rate each PSF on each
plex cognitive basis. Despite attempts to embel1ish action or task, witb the lowest scale value indicating
THERP [e.g., through "sneak analysis" rnethods that that the PSF is as poor as it is likely to be under
may enable the analyst to identify decision making real operating conditions, and the highest scale value
errors (Hahn and deVries, 1991)], THERP's underIy- indicating that the PSF is as good as it is likely to
ing emphasis on decomposition and subsequent aggre- be in terms of promoting successful task performance.
gation of individual actions has been questioned. For The likelihood of success for each human action is
example, Hollnagel (1993) has argued that human reli- detcrmined by summing the product of the normalized
ability cannot be accounted for by considering "each weights and ratings for each PSF, resulting in numbers
.action on its own" but rather, by considering "actions (SUs) that represent a scale of success likelihood .
. as a whole sequence," and has developed an alternative The SUs are useful in their own right. For example,
approach to HRA based on a modeling framework for if the actions under consideration represent alternative
predicting cognitive reliability that can also be used modes of response in an emergency scenario, the
to support system risk assessments (Hollnagel, 1998). analyst may be interested in determining which types
Altbough THERP's inability to adequately address of responses are least or most likely to succeed.
more cognitively complex tasks and the underlying However, for the purpose of conducting PRAs, SLIM
causality of human error tends to cast it as shallow, converts the SUs to HEPs. An estimate of the HEP is
the insights concerning system operations acquired derived using the following relationship:
tbrough THERP's attention to detail ultimately are
likely to make it more usefu] than the numbers it probability of success =a x SU +b
malees available to quantitative risk assessments such
as PRAs. In this respect, THERP shares many of where HEP is 1- the probability of success. To derive
tbe characteristics of PRAs: The quantitative prod- the two eonstants in this equation, the probabilities of
ucts provided by PRAs are often considered to be success must be available for at least two tasks taken
less important than the ability for these risk assess- from the cluster of tasks for which the relevant set
ments to identify defieiencies in design, provide a of PSFs was identified. However, methods exist for
better understanding of interdependencies among sys- deriving HEPs even if information on such "reference"
tems and operations, and offer insights for improving . tasks is not available. Methods also exist for deriving
procedures and operator training (Kumamoto and Hen- upper and lower uncertainty bounds for these HEPs,
ley, 1996). which PRAs typically require.
Multiattribute utility decomposition (MAUD) pro-
5.3 SLIM-MAUO vides a user-friendly computer-interactive environment .
The success likelihood index methodology (SUM) rep- for implementing SUMo This feature ensures that
resents another procedure for deriving HEPs (Embrey many of the assumptions that are critical to the the-
et al., 1984). In contrast to THERP, SLIM allows oretical underpinnings of SLIM are met. Por exarnple,
the analyst to focus on any human action or task. MAUD can determine if the ratings for the vari-
Consequently, this method can provide inputs into ous PSFs by a given analyst are independent of one
PRAs at various system )eve)s; that is, tbe HEPs another and whether the relative importance weights
HUMAN ERROR 737
elicited for the PSFs are consistent with the analyst's 6 INCIDENT REPORTING SYSTEMS
preferences. In addition, MAUD provides procedures 6.1 Design, Data Collection, and Management
forassisting the expert in identifying the relevant PSFs. Considerations
Furtherdetails concerning SLIM-MAUD are provided Information systems aIlow extensive data to be col-
in Embrey et al. (l984) and Kirwan (1994). lected on incidents, accidents, and human errors, and
thus afford excellent opportunities for organizations
to learn. The distinction between accidents and inci-
5.4 Human Error Data dents varies among authors and government regulatory
agencies. Generally, accidents imply injury to persons
or reasonable damage to property, whereas incidents
As indicated in the discussion of THERP, fundamental usually involve the creation of hazardous conditions
data 00 HEPs can come from a variety of sources. that if not recovered could lead to an accident. Aed-
Ideally, HEP data should derive from the relevant dents and adverse events are terms that are often used
. operating experience or at least from similar industrial interchangeably, as are incident, near miss, and clase
experiences. However, as Kirwan (1994) notes, a callo
numberof problems are associated with collecting this Capturing information on near misses is particularly
type of quantitative HEP data. For example, many advantageous. Depending on the work domain, near
workers will be reluctant to report errors due to the misses may occur hundreds of times more often than
threat of reprisals, and mechanisms for investigating adverse events. If near misses are regarded as events
errors are often nonexistent. that did not result in accidents by virtue of chance fac-
Even if these problems could be overcome, there . tors alone, the contexts surrounding near misses should
are still other issues to contend with conceming the be highly predictive of accidents. The reporting of near
collection of useful HEP data. One problem is that misses, especially in the form of short event descrip-
errors that do not lead to a violation of a company's tions or detailed anecdotal reports, would then pro vide
technical specifications or that are recovered almost a potentially rieh set of data that could be used as
irnmediately will probably not be reported. Also, data a basis for proactive interventions. Moreover, fewer
on errors associated with very low probability events, barriers exist in reporting them (Barach and Small,
asin the execution of recovery procedures following an 2000). However, to anticipate hazardous scenarios and
accident, may not be sufficiently available to produce provide the proactive accident prevention function nec-
essary for enabling organizations to improve continu-
reliable estimates and thus often require simulator ously, incide/u reporting systems (IRSs) must be capa-
studies for their generation. Finally, error reports are ble of identifying the underlying causes of the reported
usually confined to the observable manifestations of an events.
error (the external error modes). Without knowledge The role of management is critical to the successful
of the underlying cognitive processes or psychological development and implementation of an IRS (CCPS,
mechanisms, errors that are in fact dissimilar (Table 1) 1994): Management not only allocates the resourees
may be aggregated. This would not only corrupt the for developing and maintaining the system but ean
HEP data but could also compromise error-reduction a1so influence the development of work cultures that
strategies. may be resistive to the deployment of IRSs. In par-
In a study covering over 70 incidents in the ticular, organizations that have instituted "blame cul-
British nuclear industry, ít was possible to compile tures" (Reason, 1997) are unlikely to advocate IRSs
data on external. error modes, PSFs, and psycho- that emphasize underlying causes of errors, and work-
logical error mechanisms, and to derive 34 differ- ers in tbese organizations are unlikely to volunteer
ent HEPs (Kirwan et al., 1990), suggesting the pos- information to these systems. Ultimately, manage-
sibility for collecting reasonably accurate operational- ment's attitudes concerning human error causation will
experience human error data. More typically, HEPs be reflected in the data that will be collected. The
are derived from other sources, including expert judg- adoption of a system-induced perspective on human
ments, laboratory experiments, and simulator studies. error that is consistent with Figure 1 would imply the
Table 9 presents examples of HEP data from several need for an information system that emphasizes the
of these sources. Additional data on HEPs that inelude collection of data on possible causal factors, ineluding
organizational and management policies responsible
upper and lower uncertainty bounds and the effects of for creating the latent conditions for errors. Data on
PSFs on nominal HEPs may be found in Swain and near misses would be viewed as indispensable for pro-
Guttmann (1983) and Gertman .and Blackman (1994). viding early warnings about how the interplay between
More recently, Kirwan (1999) has reported on the con- human fallibility and situational contexts can penetrate
struction of a HEP database in the UK referred to as barriers. System-based perspectives to human error are
CORE-DATA (computerized operator reliability and also conducive to a dynamic approach to data col-
error database) for supporting HRA activities. CORE- Iection-e-if the methodology is proving inadequate in
DATA currently contains a large number of HEPs; accounting for or anticipating human error, it will
its long-term objective is to apply its data to new probably be modified (Figure 9).
industrial contexts through the development of extrap- Worker acceptance of an IRS that relies on vol-
olation rules. untary reporting entails that the organization meet
738 DESIGN FOR HEALTH, SAFETY, AND COMFORT
Table 9 (confinuecl)
Error Probability.
output states (healthy or unnealthy), A premeiure diagnosis implied that the operator
identified the faulty unit without first havíng carried out enouqh tests conclusively to
determine which unit was faulty (irrespective of whether the premature diagnosis was
correct or not: the task cannot afford the operator to make premature guesses) .•
13. Failure to carry out a one-step calculation correctly 0.01
14. Failure to carry out a seven- to 13-step calculation correctly 0.27
Simulator-Derived Data
15. Emergency manual trip in a nuclear control room 0.2
Prior to a fault appearing, the operatar would be occupied with normal operations in a
simulated control room. Initially, when a fault appeared, the operator was expected to try to
control the fault, but it quickly became apparent that this was not possible, the operator
was required instead to shut down (trip) the plant, The faults in question comprised a
control-red runout, a blower failure, a gas-temperature rise, and a coolant-flow fault.
Tripping the plant required a single pushbutton activation. The fault rate in this scenario .
was 10 signals per hour (normally, it would have been on the order of 1 in 10,000 hours).
The operator had only 30 to 90 seconds to respond by tripping the reactor, during which
time the operator would have had to detect and diagnose the problem and then take action
almost tmmediately.
16. Omission of a procedural step in a nuclear control room 0.03
This HEP is based on a number of different scenarios, which were taced by shift teams in a
full-scope nuclear power plant (NPP) simulation in the United States. The shift teams, all of
whose members were being recertified as NPP operators, were required to deal with a
number 01 emergency scenarios.
17. Selection of wrong control (discrimination by label only) 0.002
This HEP, which was derived from a number of NPP simulator scenarios, was based on 20
incorrect (unrecovered) selections trom out of a total 01 11,490 opportunities tor control
selection.
18. Selection of wrong control (functionally grouped) 0.0002
As aboye, but this time the HEP is based on only four unrecovered errors out of 27,055
opportunities for error.
19. Equipment turned in wrong direction 0.00002
As aboye, based on the unrecovered errors again, and with equipment that does not violate
a population stereotype (i.e., with normal, expected turning conventions).
Source: Kirwan (1994).
three requirements: exact a minimal use of blame; investigation need to be eapable of addressing in detail
ensure freedom from the threat of reprisals, and pro- all considerations related to human fallibility, eontext,
vide feedback indicating that the system is being used and barriers that affect the incident, Thus, training
toaffeet positive ehanges that can benefit alI stakehold- may be required for recognizing that an incident has
ers, Accordingly, workers wouId probably not report in faet oecurred and for providing full descriptions
theoccurrence of accidental damage to an unforgiving of the event. Training would also be neeessary for
!llanagementand would discontinue voluntarily offer- ensuring that these data are input correctly into the
~nginformation on near misses if insights gained frorn information system and for verifying that the system's
~nterventionstrategies are not shared (CCPS, 1994). It knowledge base adequately supports the representation
IS therefore essential that reporters of information per- of this information. Analysts would need training
ceive IRSs as error management or learning tools and on applying the system's tools, including the use of
not as disciplinary instruments. any modeling frameworks for analyzing causality of
In addition to these fundamental requirements, two human error and on interpreting the results of these
other issues need to be considered. First, consistent applieation tools. They would also need training on
with user-centered design principIes (Nielsen, 1995), generating summary reports and recornmendations and
potential users of the systern should be involved in 00 making modifications to the system's database and
us design and implementation as they would with inferential tools if the input data imply the need for
any newly designed (or redesigned) product, although such adjustments. Aecess control would aIso need to
.for very large populations of potential users this be addressed. For eaeh category of system user (e.g.,
m~~ not be practica]. Second, effective training is manager, human factors analyst, employee) a reading
cnucaI to the systern's usefulness and usability. When áuthority (who is allowed to retrieve infonnation from
human errors, near misses, or ineidents oceur, the the system) and a writing authority (who is allowed to
people who are responsible for their reporting and update the database) need to be specified.
740 DESIGN FOR HEALTH, SAFETY, AND COMFOR:r
/
1
Data Collection
Modelof System Characteristics
f---+ Human Error 1-+ ................................................ _ ......................... 1+- Organization-wide
Safety Culture
~
Causation
• Types of data collected
• Method of collectíon,
storage, and
processing
• Interpretatíon
• Technical • Generation of generic Workforce
Methods 1-+ and specífic error- I+- Acceptance
• Training reductíon strategies and Support
• Implementation
• Effectiveness monitoring
• Feedback
1
Continuous Improvement
and Organizational Learning
...........................................................................................
• Safety
• Environmental impact
• Quality
• Production losses
1
• Regulators
• Shareholders
• General public
Figure 9 Data collection system tor error management. (Adapted from CCPS, 1994. Copyright 1994 by the American
Institute of Chemical Engineers, and reproduced by permíssion of AIChE.)
Data for input into IRSs can be of two types: quanti- inforrnation. Relevance will depend on how the sys-
tative data, which lend themselves more easily to cod- tem will be used. If the objective is to analyze statistics
ing and classification, and qualitative data in the forrn on accidents in order to assess trends, a limited set of
of free-text descriptions. Kjellén (2000) has specified .data on each accident or near miss would be sufficient
the basic requirements for a safety information system and the nature of these data can often be specified
in tenns of data collection, distribution and presen- in advance. However, suppose that the user is inter-
ested in querying the system regarding the degree to
tation of information, and overall information system
which new technology and cornmunication issues have
attributes. To meet data collection requirements, the been joint factors in incidents involving errors of omis-
input data need to be reliable (if the analysis were to be sion. In this case, the relevance will be decided by the
repeated, it should produce similar results), accurate, coverage. Generally, the inability to derive satísfactory
and provide adequate coverage (e.g., on organizational answers to specific questions will signal the need for
and human factors/ergonomics issues) needed for exer- modifications of the system.
cising efficient control. Foremost in the distribution In addition to relevance, the information should
and presentation of information is the need for relevant be comprehensible and easy to survey; otherwise,
paz
its use will be restricted to highly trained analysts, in $41.5 million in savings. Unquestionably, the ben-
prompting high-level management to view the sys- efits that can potential1y be accrued from constructive
tem with suspicion. Overall, the inforrnation system use of worker feedback can have a powerful impact on
should promote involvement between management an organization' s effectiveness and are the basis for the
and employees, thus fostering organizational learn- appeal of IRSs in industry.
ing. Finally, the system should be cost-efficient. As
in most cost-benefit analyses, costs wilI be much
easier to assess than benefits. Investment, operations, 6.3 The Aviation Safety Reporting System
and maintenance costs are relatively straightforward to The Aviation Safety Reporting System (ASRS) was
determine, as are potential benefits resulting from cost developed in 1976 by the Federal Aviation Adrninis-
reductions associated with the handling, storing, and
tration (FAA) in conjunction with the National Aero-
distribution oí various safety-related documents. Ben-
n~utics ~d Space Administration (NASA). Many sig-
efits associated with reductions in adverse outcomes
ruficant improvements in aviation practices have since
such as accidents, production delays, and reduced qual-
been attributed to the ASRS, and these improvements
ities are generally much more difficult to assess.
have largely accounted for the promotion and devel-
In searching the database, the user may restrict
opment of IRSs in other work domains (Table 10).
the search to events that meet criteria defined on
The ASRS's mission is threefold: to identify defi-
one of the standard four (nominal, ordinal, interval,
ciencies and discrepancies in the National Aviation
or ratio) scales of measurement (e.g., find all near
Systern (NAS), to support 'policy formulation and
misses involving workers with less than six months
of experience) or to events that inelude keywords in planning for the NAS, and to collect human perfor-
free-text descriptions (e.g., find a11 near misses of mance data and strengthen research in the aviation
radiation overexposure that resulted in disruptions to domain, AH pilots, air traffic controllers, flight atten-
production schedules). Data entercd and coded based dants, rnechanics, ground personnel, and other per-
on standard forms of measurement are relatively easy sonnel associated with aviation operations can sub-
to manage, whereas data that have been documented mit confidential reports if they have been involved
and stored in unstructured free-text descriptions may in or observed any incident or situation that could
~equíre intelligent software agents for analysis and have a potential effect on aviation safety. Preaddressed
interpretation. AH information searches, however, postage-free report forms are available online and are
afford the possibility for type 1 errors (wanted data submitted to the ASRS vía the U.S. Postal Service,
!hat are not found) and type II errors (unwanted data However, unlike other systems, the ASRS presently is
identified as hits), not equipped to handle online submissions of informa-
tion. The ASRS database can be queried by accessing
6.2 Historical Antecedent its Internet site (http://asrs.arc.nasa.gov), and is also
available on CD~ROM.
An .idea related to IRSs, that of the modest sug-
gesnon box, has .been around for hundreds of years. ASRS reports are processed in two stages by a
Both IRSs and suggestion programs are tools designed group of analysts composed of experienced pilots and
t? capture problem-related data from interested par- air traffic controllers. In the first stage, each report is
hes regarding the operations of an organization, and read by at least two analysts who identify incidents
are deployed by organizations in order to Iearn about and situations requiring immediate attention. Alerting
~nd improve themselves. One of the earliest sugges- rnessages are then drafted and sent to the appropriate
.110nprograms, implemented by the British Navy in group. In the second stage, analysts c1assify the reports
1770 (Robinson and Stern , 1998), was motivated by and assess causes of the incident. Their analyses
the recognition that persons within the organization and the information contained in the reports are then
should have a way of speaking out without fear of incorporated into the ASRS database. The database
reprisals. The first suggestion box was implemented in consists of the narratives submitted by each reporter
the Scottish firm William Denny & Brothers in 1880, and coded information that is used for inforrnation
a~d the first U.S. company to implement a company- retrieval and statistical analysis procedures.
wide suggestion program was National Cash Register Several provisions exist for disseminating ASRS
In 1892. The suggestion program gained rapid accep- outputs. These include alerting messages that are sent
tance following WorId War 11, when it was adapted out in response to immediate and hazardous situ-
by quality initiatives to meet various objectives, such ations, the CALLBACK safety bulletin, which is a
as sa!ety (Turrell, 2002). At the Toyota Motor Cor- monthly publication containing excerpts of incident
pOratlOn, the suggestion program is part of the Kaizen report narratives and added cornments (Figure 10), and
?r "continuous improvernent" approach to manufactur- the ASRS Directline, which is published to meet the
IUg and represents an extremely important feature of needs of airline operators and flight crews. In addi-
~e Toy~ta production system. Implemented in 1951, tion, in response to database search requests, ASRS
~ttook rune years to achieve a 20% participation rateo staff comrnunicates with the FAA and the National
n 19?9, data from the Toyota Motor manufacturing Transportation Safety Board (NTSB) on an institu-
plant 10 ~entucky indicated that 5048 of 7800 employ- tionallevel in support of various tasks, such as accident
ees contnbuted 151,327 ideas into the system and that investigations, and conducts and publishes researeh
nearly all were implemented (Leech, 2004), resulting related primarily to human performance issues.
z-< -< -< Z Z 2 Z -<
o <D
C/)
(1)
C/)
<D
C/)
O O O O 9l
z z z -<2 2 -< z z Z z ~
~ 6fil_o o o g¡ O O <Il
c/).
o o o o
~ O
::J
~ ¡¡;. ¡t ....
..... -c
3
<D :J <D. O
o. - o. e
'"
z -< -< -< -<
o
z o
O O ([)
C/)
<Il
tn
<Il
C/)
ro
C/) ao.:
ro
;::¡_
él
·V 2Z
O O
Z
O
z
o
Z
o
Z
o
Z
o 3"'
3
c
::l
~
~
::J
O
:::J
III
o
o
o.:
<Il
;::¡_
en
-<
ro
-<
(1)
-<
ro
-<-<
C1J (1)
-<
(1)
-<
ro
(/1 en C/) (/1 C/) en (/)
¡
i
L
HUMAN ERROR 743
Many major U.S. airIines have their own internal a citation with penalties ranging from warning letters
programs for tracking human errors, especially among to license revocation. However, American Airlines
pilots, and these programs usual1y rely on sorne pilots who file ASAP reports are assured that the FAA·
form of IRS. The Aviation Safety Action Partnership will exact no punishment. or less severe punishment,
(ASAP), an American Airlines program, is somewhat as long as the error was unintentional. It has been
unique in that it collaborates with the FAA to estimated that without ASAP the FAA would be aware
determine how pilot errors should be handled.· The of fewer than 1% of the errors of American Airline
e~eIit-review team ineludes an FAA official, company pilots (Kaye, 1999b).
managers, and representatives from the Allied Pilots In contrast to these fairIy conventionaI industry-
Association (the pilots' union). Typically, if the FAA specific IRSs, United Airlines has adopted a more
detennines that a pilot error has occurred, it issues sophisticated approach. to deaJing with pilot error.
744 DESIGN FOR HEALTH, SAFETY, AND COMFORl'
Its flight operations quality assurance program uses occupational medicine, and pharmacy. One example ís
optical recorders in most of its daily flights to the Veteran's Admi~stration Patient Safety Reporting
reduce pilot error by capturing a pilot' s every move Systern (PSRS), which developed out of an agreement
electronically (Kaye, 1999c). These disks are later in 2000 between NASA and the Department of
analyzed by computer, and if something wrong, Veteran's Affairs (VA). The PSRS allows .all VA
dangerous, or outside normal operating procedure is .medical facility staff to report voluntarily any events
identified, a team of 10 United Airlines pilots examines a~d concer~s relate? to patient. safety confidentially
the problem and determines a course of action. For without being subject to reprisals. The types of
example, if a proficiency issue is identified, the team events that can be reported inelude close calls (i.e.
can authorize training for that .pilot. The optical near misses), unexpected situations involving deatb'
recorders could also be connected to operating systems physical or psychological injury of a patient o;
other than the cockpit. For instance, when linked to employee, and lessons Iearned related to patient safety.
i15 maintenance systems, it enabled United Airlines to Ultimately, the information is made .available through
disco ver that sorne internal engine parts were cracking alerts, publications such as the Patient Safety BuIletin
from too much heat. Electronic monitoring presumes and research studies. Although still in use, the PSRS
relinquishing privacy; thus, acceptance of the program now serves. as a complement to a more recent reporting
will require that workers acknowledge the possibility system being operated by the VA that utilizes a
that more of their mistakes can become corrected. root-cause analysis methodology (Section 10.2) for
analyzing adverse events and near misses, and provides
6.4 Medical Incident Reporting Systems strategies for decreasing the likelihood of the event's
The medical industry is currently struggling with reoccurrence.
what has been termed an epidemic of adverse events Another example of a medical IRS is the Anesthe-
stemming from medical error. (This industry defines sia Critical Incident Reporting System (CIRS) operated
an adverse event as an injury or death resulting by the Department of Anesthesia at the University of
from medical management, and medical error as Basel in Switzerland. Using a Web-based interface,
the failure of a planned action to be completed as contributors worldwide can anonymously report infor-
intended.) Taking a cue from other complex high- mation on incidents in anesthesia practice and review
risk industries such as nuclear power and chemical information collected on those incidents. The CIRS
processing, the bealth care industry is increasingly IRS defines a reportable event as "an event under
considering, developing, and deploying IRSs to deal anesthetic care which has the potential to lead to an
with patient safety conceros and related issues. Despite undesired outcome if left to progress." Contributors
acknowledgment by the Institute of Medicine that there can also report events resulting from team interactions.
are an enorrnous number of preventable injuries to The design of this system was based on the experiences
patients (Kohn et al., 1999), implementing IRSs in the of the Australian AIMS study, another infíuential IRS
health care industry has lagged behind other industries for reporting anesthesia incidents.
and for good reason. Compared to other industries, As of this writing, the U.S. Senate has proposed
the health care industry interacts with the public on a a bill that would set up a confidential, voluntary
highly personal basis, and protecting reports on near system for reporting medical errors in hospitals without
misses, incidents, and accidents is likely to be met fear of litigation. The goal of the bill, which is .
with resistance frOID a public that especially in the pending committee review and action, is to encourage
United States, is entrenched in a culture of litigation .health care providers to report errors so they can
and that is seeing an increasing part of their income be analyzed by patient safety organizations for the
being allocated to health care costs (Section 1.1). purpose of producing better procedures and safety
Collecting reports on medical error that are anonymous protocols that could improve the quality of careo
may not appease the public-amnesty of unsafe acts Notably, in his statement supporting the passage of this
that lead to near misses and adverse events would bill, Donald Palmisano, the immediate past president
probably not go over very well. Interestingly, medical of the American Medica! Association, stated that
IRSs have been successful in gaining acceptance in "the Aviation Safety Reporting System serves as a
Australia and New Zealand, where legal protection for successful model for this system."
those who report events has been enforced (Rosenthal
et al., 2001). 6.5 Limitatlons 01 Incident Reporting Systems
Not surprisingly, standardization of definitions of Sorne IRSs, by virtue of their inability to cope with tbe
errors, near misses, and adverse events in the medical vast number of incidents in their databases, have appar-
industry, which is fundamental to the industry's ently become "victims of their own success" (Johnson,
ability to gather information, leam about patient 2002). The Federal Aviation Administration's (FAA's)
safety, and institute intervention strategies, has been ASRS and the Food and Drug Adrninistration' s Med-
difficult to establish. It is al so questionable whetber Watch Reporting System (designed to gather data on
a true safety culture that supports IRSs exists in regulated, marketed medical products, including pre-
the medical industry. Despite these issues, there scription drugs, specialized nutritional products, and
have been a number of successful implementations medical devices) both contain over a half a million
of IRSs in the health care industry, in particular incidents. Because their database technologies were
in transfusion medicine, intensive care, anesthesia, not designed to manage this magnitude of data, users
HUMAN ERROR 745
who query these systems are having trouble extracting so these systems may need to be driven by new models
useful information and often fail to ídentify important of organizational dynamics and armed with new levels
cases. This is particularly true of the many IRSs that of intel1igence (Dekker, 2005).
rely on relational database technology. In these sys- A much more fundamental problem with IRSs
tems, each incident is stored as a record and incident is the difficulty in assuring anonymity to reporters,
identifiers are used to link similar records in response especially in smaller organizations. Although most
to user queries. Relational database techniques, how- IRSs are eonfidential, anonymity is more conducive
ever, do not adapt well to changes in the nature of to obtaining disclosures of incidents. Unfortunately,
incident reporting or in the models of incident causa- anonyrnity precludes the possibility for follow-up
tion. Also, different organizations in the same industry interviews, which are often necessary for clarifying
tend to classify events differently, which reduces the reported information (Reason, 1997).
benefits of drawing on the experiences of IRSs across Being able to follow up interviews, however, does
different organizations. It can also be extremely dif- not always resol ve problems contained in reports. Gaps
ficult for people who were not in volved in the cod- in time between the submission of a report and the
ing and classification process to develop appropriate elicitation of additional contextual information can
queries (Johnson, 2002). result in important details being forgotten or confused,
Problems with IRSs can also arise when large num- especially if one considers the many forms of bias
bers of reports on minor incidents are stored. These that can affect eyewitness testimony (TabIe 11). Biases
database systems may then begin to drift toward report- that can affeet reporters of incidents can also affect
ing information on quasi-incidents and precursors of the teams of people (i.e., analysts) that large-scale
quasi-incidents, which may not necessarily provide the IRSs often employ to analyze and classify the reports.
IRS with increased predíctive capability (Amalberti, For example, there is evidence that persons who have
2001). As stated by Amalberti: "The result is a bloated received previous training in human factors are more
and costly reporting system with not necessarily bet- likely to diagnose human factors issues in incident
ter predictability, but where everything can be found; reports than persons who have not received this type
this system is chronically diverted from its true calling of training (Lekberg, 1997).
(safety) to serve literary or technical causes. When a . Variability among analysts can also derive from
specific point needs to be proved, it is (always) pos- the confusion that arises when IRSs employ classifica-
sible to find confirming elements in these extra-Iarge tion schemes for incidents that are based 00 detailed
databases" (p. 113). There is, howcver, a counterargu- taxonomies. Difficulty in discrirninating between the
ment to this view: that in the absence of a sufficient various terms in the taxonomy may result in low recall
number of true incidents, the judicious examinarion systems, whereby sorne analysts fail to identify poten-
of quasi-incidents may reveal vulnerabilities within tially similar incidents. In general, concerns associated
the system that would normaIly be concealed. In this with interanalyst reliability stemming from bias and
regard, exploiting the potential of quasi-incidents in differences in analysts' abilities can impede an organi-
IRSs suggests the possibility for a proactive capabil- zation's ability to learn. More specifically, Iimitations
ity that may indeed reflect the existence of a highly in analysts' abilities to interpret causal events reduces
evolved safety culture. the capability for organizations to draw important con-
There is a drift of a different sort that would clusions from incidents, and analyst bias can lead to
be advantageous to catalog, but unfortunately is not organizations using IRSs for supporting existing pre-
amenable to capture by the current state-of-the-art conceptions concerning human error and safety. As
in incident reporting. These drifts reftect the various alluded to earlier, training all analysts to the same
adaptations by an organization' s eonstituents to the standard, although a resource-intensive proposition for
externa1 pressures and conflictíng goals to which they large organizations, is necessary for minimizing vari-
are continuously subjected (Dekker, 2005). Drifting ability associated with inferring causality.
into failure may occur, for instance, when a worker Although there are no software solutions to all
confronts increasingly searce resources while under these problems, a number of recommendations dis-
pressure to meet higher production standards. If the cussed by Johnson (2002) deserve consideration. For
adaptive responses by the worker to these demands example, for IRSs that are confidential but not anony-
gradually become absorbed into the organization' s mous, computer-assisted interviewing techniques can
definition of normal work operations (Section 8), work mitigate sorne of the problems associated with follow-
contexts that may be linked to system failures are up elicitations of contextual details from reporters.
unlikely to be reported. The intricate, incremental, and By relying on frames and scripts that are selected
transparent nature of the adaptive processes underlying in response to information from tbe user, these tech-
these drifts is manifest at both the horizontal and niques can ensure that particular questions are asked
vertical levels of an organization. Left unchecked, in particular situatioos, thus reducing interanalyst
aggregation of these drifts seals an organization's fate biases sternming from the use of different interview
by effectively excluding the possibility for proactive approaches. The success of these approaches, how-
risk management solutions. In the case of the accident ever, depends 00 ensuring that the dialogue is appro- .
in Bhopal (Casey, 1993), these drifts were personified priate for the situation that the reporter is being
at alllevels of the responsible organization. Although asked to address. Inforrnation-retrieval engines that
IRSs can, in theory, monitor these types of drifts, to do are the basis for Web search also offer pro mise, due
~
...
Table 11 Forms of Eyewitness Testimony Biases in generated from Web-based searches imply, retu-¿
Reporting almost every report in the system (i.e., recall JUay
•. Confidence bias: arises when witnesses unwittingly be too high). Alternatives to relational databases and
place the greatest store in their colleagues who information retrieval techniques that have been Slig-
express the greatest confidence in their view of an gested inelude conversational case-based reasoning
incident. Previous work into eyewitness testimonies where the user must answer a number of question~
and expert judgments has shown that it may be in order to obtain information concerning incidents
better to place greatest trust in those who do not of interest. The possibility also exists for determin_
exhibit this fonn ot overconfidence (Johnson, 2003). ing differences among analysts in the patterns of
• Hindsight bias: arises when witnesses criticize their searches, and thus insights into their poten tial
individuals and groups on the basis of information biases, by tracing their interactions with these sys-
that may not have been available at the time of an tems (Johnson, 2003).
incident. FinaJly, a very different type of concern with
• Judgment bias: arises when witnesses perceive the IRSs arises when these systems are used as a basis
need to reach conclusions about the cause of an for quantitative human error applications. In these
incident. The quality of the analysis is less important situations, fue voluntary nature of the reporting may
than the need to make a decision. invalidate the data that are used for deriving error
• Pofitícal bias: arises when a judgment or hypothesis likelihoods (Thomas and Helrnreich, 2002). From a
from a high-status member commands influence
. probabilistic risk assessment (Section 5.1) and risk
because others respect that status rather than the
value of the judgment itself. This can be paraphrased
management perspective, this issue can undernuns
as "pressure from above."
decisions regarding allocating resources for resolving
• Sponsor bias: arises when a witness testimony can human errors: Which errors do you attempt to
affect indirectly the prosperity or reputation of the remediate if it is unclear how often the errors are
organization they manage or tor which they are occurring?
responsible. This can be paraphrased as "pressure
from below." 7 AUTOMATION ANO HUMAN ERROR
• Professional bias: arises when witnesses may be 7.1 Human Factors Consíderations
excluded frorn the society of their colleagues if they in Automation
submit a reporto This can be paraphrased as Innovations in technology will aJways occur and
. "pressure from beside."
will bring with them new ways of performing tasks
• Recognition bias: arises when witnesses have a
limited vocabulary of causal factors. They actively
and doing work. Whether the technology completely
attempt to make any incident "frt" with one of those
elirninates fue need for the human to perform a task
factors, irrespective of the complexity of the or results in new ways of performing tasks through
circumstances that characteríze the incident. automation of selective task functions, the human' s
• Confirmatíon bias: arises when witnesses attempt to tasks wiUprobably become reeonfigured (Chapter 60).
make their evidence confirm an initial hypothesis. The human is especially vulnerable when adapting
• Frequencybias: occurs when witnesses become to new technology. During this period, knowledge
familiar with particular causal factors because they concerning the technology and the impact it may
are observed most often. Any subseqLient incident is have when integrated into task activities is relatively
therefore likely to be classified according to one of unsophisticated and biases deriving from previous
these common categories irrespective of whether an work routines are still influentíal.
incident is actually caused by those factars. Automating tasks or system functions by replac-
• Recency bias: occurs when a witness is heavily ing the human's sensing, planning, decision makíng,
influenced by previous incidents. or manual activities with computer-based technology
• Weapon bias.' occurs when witnesses become fixated often requires making allocation of function deci-
. on the more "sensational" causes of an incident. For sions-that is, deciding which functions to assign to
example, they may focus on the driver behavior that fue human and which lo delegate to automatic con-
led to a collision rather than the failure of a safety belt trol (Sharit, 1997). Because these decisions ultimately
to prevent injury to the driver.
can have an impact on fue propensity for human
Source: Adaptad from Johnson (2002). error, consideration may also need to be given to
the level of automation to be incorporated into the
systern (Parasuraman et al., 2000; Kaber and Ends-
to their flexibility in exploiting semantic informa- ley, 2004). Higher levels imply that automation will
tion about the relationships between terms or phrases assume greater autonorny in decision making and con-
that are contained in a user' s query and in .the trol. The primary concem with technology-centered
reports. In sorne instances, these search techniques systems is that they deprive themselves of the benefits
have been integrated with relational databases in order deriving from the human' s ability to anticipate, seareh
to capitalize on fields- previously encoded into the for, and discem relevant data based on the eurrent con-
database. However, the integration of these tech- text; make generalizations and inferences based on past
ruques cannot assure users that their queries will experienee; and modify activities based on changing
find similar incidents (i.e., the precision may be constraints. Determining the optimal level of automa-
low), or as the large results lists that are typically tion, however, is a daunting task for the designer.
HUMAN ERROR 747
While levels of automation somewhere between the lowest and highest levels may be the most effective way to exploit the combined capabilities of both the automation and the human, identifying an ideal level of automation is complicated by the need also to account for the consequences of human error and system failures (Moray et al., 2000).

Many of the direct benefits of automation are accompanied by indirect benefits in the form of error reduction. For example, the traffic alert and collision avoidance system in aviation that assesses airspace for nearby traffic and warns the pilot if there is a potential for collision can overcome human sensory limitations, and robotic assembly cells in manufacturing can minimize fatigue-induced human errors. Generally, reducing human physical and cognitive workload enables the human to attend to other higher-level cognitive activities, such as the adoption of strategies for improving system performance. Reckless design strategies, however, that automate functions based solely on technical feasibility can often lead to a number of problems (Bainbridge, 1987). For instance, manual and cognitive skills that are no longer used due to the presence of automation will deteriorate, jeopardizing the system during times when human intervention is required. Situations requiring rapid diagnosis that rely on the human having available or being able quickly to construct an appropriate mental model will thus impose higher WM demands on humans who are no longer actively involved in system operations. The human may also need to allocate significant attention to monitoring the automation, which is a task humans do not perform well. These problems are due largely to the capability for automation to insulate the human from the process, and are best handled through training that emphasizes ample hands-on simulation exercises encompassing varied scenarios. The important lesson learned is that "disinvolvement can create more work rather than less, and produce a greater error potential" (Dekker, 2005, p. 165).

Automation can also be clumsy for the human to interact with, making it difficult to program, monitor, or verify, especially during periods of high workload. A possible consequence of clumsy automation is that it "tunes out small errors and creates opportunities for larger ones" (Wiener, 1985) by virtue of its complex connections to, and control of, important systems. Automation has also been associated with mode errors, a type of mistake in which the human acts based on the assumption that the system is in a particular mode of operation (either because the available data support this premise or because the human instructed the system to adopt that mode) when in fact it is in a different mode. In these situations, unanticipated consequences may result if the system remains capable of accommodating the human's actions. The tendency for a system to mask its operational mode represents just one of the many ways that automation can disrupt situation awareness.

More generally, when the logic governing the automation is complex and not fully understood by the human, the actions taken by automatic systems may appear confusing. In these situations, the human's tendency for partial matching and biased assessments (Section 3.2) could lead to the use of an inappropriate rule for explaining the behavior of the system, a mistake that in the face of properly functioning automation could have adverse consequences. These forms of human-automation interaction have been examined in detail in flight deck operations in the cockpit and have been termed automation surprises (Woods et al., 1997). Training that allows the human to explore the various functions of the automation under a wide range of system or device states can help reduce some of these problems. However, it is also essential that designers work with users of automation to ensure that the user is informed about what the automation is doing and the basis for why it is doing it. In the past, slips and mistakes by flight crews tended to be errors of commission. With automation, errors of omission have become more common, whereby problems are not perceived and corrective interventions are not made in a timely fashion.

Another important consideration is mistrust of automation, which can develop when the performance of automatic systems or subsystems is perceived to be unreliable or uncertain (Lee and Moray, 1994). Lee and See (2004) have defined trust as an attitude or expectancy regarding the likelihood that someone or something will help the person achieve his or her goal in situations characterized by uncertainty and vulnerability. As these authors have pointed out, many parallels exist between the trust that we gain in other people and the trust we acquire in complex technology, and as in our interactions with other people, we tend to rely on automation we trust and reject automation we do not trust. Mistrust of automation can provide new opportunities for errors, as when the human decides to assume manual control of a system or decision-making responsibilities that may be ill-advised under the current conditions.
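The dynamics implied by this account can be conveyed with a deliberately simple sketch. The fragment below is not the model proposed by Lee and See (2004); it merely assumes, for illustration, that trust is updated by exponential smoothing of observed automation performance and that the operator relies on the automation only when trust exceeds self-confidence in manual control, a relation consistent with the findings of Lee and Moray (1994). All parameter values are arbitrary.

# Toy illustration (not the published model): trust tracks observed automation
# performance through exponential smoothing, and the operator relies on the
# automation only when trust exceeds self-confidence in manual control.

def update_trust(trust, performance, rate=0.2):
    """Move trust toward the latest observation of automation performance
    (both on a 0-1 scale); a single conspicuous fault noticeably depresses trust."""
    return trust + rate * (performance - trust)

def relies_on_automation(trust, self_confidence):
    """Reliance heuristic in the spirit of Lee and Moray (1994)."""
    return trust > self_confidence

trust, self_confidence = 0.5, 0.6
observed_performance = [0.9, 0.9, 0.2, 0.9, 0.9, 0.9]  # one conspicuous failure
for trial, perf in enumerate(observed_performance):
    trust = update_trust(trust, perf)
    mode = "automatic" if relies_on_automation(trust, self_confidence) else "manual"
    print(f"trial {trial}: trust = {trust:.2f} -> {mode} control")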
Like many decisions people make, the decision to rely on automation can be strongly influenced by emotions. Consequently, even if the automation is performing well, the person's trust in it may become undermined if its responses are not consistent with expectations (Rasmussen et al., 1994). Mistrust of automation can also lead to its disuse, which impedes the development of knowledge concerning the system's capabilities and thus further increases the tendency for mistrust and human error. Overreliance on automation can also lead to errors in those unlikely but still possible circumstances in which the automation is malfunctioning, or when it encounters inputs or situations unanticipated in its design that the human believes it was programmed to handle.

Lee and See (2004) have developed a conceptual model of the processes governing trust and its effect on reliance that is based on a dynamic interaction among the following factors: the human, organizational, cultural, and work contexts; the automation; and the human-automation interface. As a framework for guiding the creation of appropriate trust in automation,
their model suggests that the algorithms governing the automation need to be made more transparent to the user, that the interface should provide information regarding the capabilities of the automation in a format that is easily understandable, and that training should address the varieties of situations that can affect the capabilities of the automation.

Organizational and work culture influences also need to be considered. If automation is imposed on workers, especially in the absence of a good rationale regarding its purpose or how human-automation interaction may enhance the work experience or improve the potential for job enrichment, the integrity and meaningfulness of work may become threatened, resulting in work cultures that promote unproductive and possibly dangerous behavioral strategies. Finally, as the work of Cao and Taylor (2004) described below suggests, the adverse effects that interacting with complex technology can have on team communication may require the need to address the concept of meta-trust, the trust people have that other people's trust in automation is appropriate (Lee and See, 2004).

7.2 Examples of Human Error in Commercial Aviation*

*This section is adapted from Kaye (1999a).

The cockpits of commercial airliners contain numerous automated systems. Central among these systems is the flight management system (FMS). The FMS can be programmed to follow an assigned flight plan route, allowing a plane to navigate itself to a series of checkpoints and providing the estimated time and distance to these checkpoints. It can also determine speed and power settings that optimize fuel consumption, prevent the plane from descending below an altitude restriction, and display navigational information. Working in conjunction with the FMS is the autopilot, which allows the plane to assume and maintain a specific heading, level off at an assigned altitude, or climb or descend at a specific rate, and an auto-throttle system, which sets the throttles for specific airspeeds. In addition, the traffic alert and collision avoidance system notifies pilots about potential collisions with other aircraft and provides instructions on how to avoid that aircraft, the stormscope warns pilots when severe weather lies ahead, and the wind shear system allows pilots to detect wind shear during takeoff and approach to landing. The pilot can also employ an automatic landing system. These automatic systems have the potential to reduce pilot workload significantly and thus enhance safety. However, they can perform so many functions that pilots can lose sight of where they are or what tasks they need to perform. Some examples of these situations are discussed below.

In 1998 the pilots of a Boeing 757 failed to notice that the auto-throttle system had disengaged. The pilots sensed a slight vibration, and after detecting a dangerously low airspeed, the captain correctly attributed the vibration to a loss of lift by the wings. To regain the required airspeed, the throttles were advanced and a slow descent was initiated. However, upon descent the aircraft nearly collided with another plane, and both planes needed to be instructed to adopt new courses. The captain claimed that no warning had been provided to alert the crew that the automatic throttle system had disengaged.

In the aftermath of the crash of American Airlines flight 965 near Cali, Colombia, in 1995, the FAA's human factors team suggested that pilots might not know how to interpret computer system information. The pilots of that flight accepted an offer to land on a different runway, forcing them to rush their descent. In the process, they incorrectly programmed their FMS to direct their plane to Bogotá, which was off course by more than 30 miles, and ultimately flew into a 9000-foot mountain.

On a normal approach into Nagoya, Japan, in 1994, the first officer of a China Airlines Airbus A-300 hit the wrong switch on the autopilot, sending the plane into an emergency climb. The throttles increased automatically and the nose pitched up. As the pilots reduced power and tried to push the nose down, the flight deck computers became even more determined to make the plane climb. The nose rose to 53 degrees, and despite adding full power, the airspeed dropped to 90 mph, which was too slow to maintain the plane in the air. The aircraft crashed tail first into the ground near the runway.

In 1998 a Boeing 737 bound for Denver was instructed by air traffic controllers to descend quickly to 19,000 feet to avoid an oncoming plane. The captain attempted to use the FMS to execute the descent, but the system did not respond quickly enough. Following a second order by air traffic controllers, the captain opted to turn off the FMS and assume manual control, resulting in a near miss with the other plane. The captain attributed his "error" to reliance on automation.

7.3 Adapting to Automation and New Technology

7.3.1 Designer Error

As is the case with user performance of various types of products, the performance of designers will also depend on the operational contexts in which they are working and will be susceptible to many of the same forms of errors (Smith and Geddes, 2003). Working against designers is the increased specialization and heterogeneity of work domains, which is making it exceedingly difficult for them to anticipate the effects on users of introducing automation and new technologies. Nonetheless, errors resulting from user interactions with new technologies are now often attributed to designers. Designer errors could arise from inadequate or incorrect knowledge about the application area (i.e., a failure by designers to anticipate important scenarios) or the inability to anticipate how the product will influence user performance (i.e., insufficient understanding by designers).

In reality, designers' conceptualizations are nothing more than initial hypotheses concerning the
collaborative relationship between their technological product and the human. Accordingly, their beliefs regarding this relationship need to be gradually shaped by data that are based on actual human interaction with these technologies, including the transformations in work experiences that these interactions produce (Dekker, 2005). However, as Dekker notes, in practice the validation and verification studies by designers are usually limited, providing results that may be informative but "hardly about the processes of transformation (different work, new cognitive and coordination demands) and adaptation (novel work strategies, tailoring of the technology) that will determine the sources of a system's success and potential for failure once it has been fielded" (p. 164). In the study on computerized physician order-entry systems discussed in Section 3.4, many of the errors that were identified were probably rooted in constraints of these kinds that were imposed on the design process.

Although designers have a reasonable number of choices available to them that can translate into different technical, social, and emotional experiences for users, like users they themselves are under the influence of sociocultural (Evan and Manion, 2002) and organizational factors (Figure 1). For example, the reward structure of the organization, an emphasis on rapid completion of projects, and the insulation of designers from the consequences of their design decisions can induce designers to give less consideration to factors related to ease of operation and even safety (Perrow, 1983). Although these circumstances would appear to shift the attribution of user errors from designers to management, designer errors and management errors both represent types of latent errors that are responsible for creating the preconditions for user errors (Reason, 1990). Perrow (1999) contends that a major deficiency in the design process is the inability of designers and management to appreciate human fallibility by failing to take into account relevant information that could be supplied by human factors and ergonomics specialists. This concern is given serious consideration in user-centered design practices (Nielsen, 1995). However, in some highly technical systems where designers may still be viewing their products as closed systems governed by perfect logic, this issue remains unresolved. The way the FAA has approached this problem has been through recommendations to manufacturers that they make displays and controls easier to use and that they develop a better understanding of pilot vulnerabilities to complex environments. For example, in Boeing's modern air fleets, all controls and throttles provide visual and tactile feedback to pilots; thus, the control column that a pilot normally pulls back to initiate climb will move back on its own when a plane is climbing on autopilot.

7.3.2 The Keyhole Property, Task Tailoring, and System Tailoring

Much of our core human factors knowledge concerning human adaptation to new technology in complex systems is derived from experiences in the nuclear power and aviation industries. These industries were forced to address the consequences of imposing on their workers major transformations in the way that system data were presented. In nuclear power control rooms, the banks of hardwired displays were replaced by one or a few computer-based display screens, and in cockpits the analog single-function displays were replaced by sophisticated software-driven electronic integrated displays. These changes drastically altered the human's visual-spatial landscape and offered a wide variety of schemes for representing, integrating, and customizing data. For those experienced operators who were used to having the entire data world available to them at a glance, adapting to the new technology was far from straightforward. The mental models and strategies that were developed based on having all system state information available simultaneously were not likely to be as successful when applied to these newly designed environments, making these operators more predisposed to errors than were their less experienced counterparts.

In complex work domains such as health care that require the human to cope with a potentially enormous number of different task contexts, anticipating the user's adaptation to new technology can become so difficult for designers that they themselves, like the practitioners who will use their products, can be expected to conform to strategies of minimizing cognitive effort. Instead of designing systems with operational contexts in mind, a cognitively less taxing solution is to identify and make available all possible information that the user may require but to place the burden on the user to search for, extract, or configure the information as the situation demands. These designer strategies are often manifest as computer mediums that exhibit the keyhole property, whereby the size of the available viewports (e.g., windows) is very small relative to the number of data displays that potentially could be examined (Woods and Watts, 1997). Unfortunately, this approach to design makes it more likely that the user can "get lost in the large space of possibilities" and makes it difficult to find the right data at the right time as activities change and unfold.
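The severity of a keyhole can be conveyed with a rough, illustrative calculation; the sketch below is not a metric proposed by Woods and Watts (1997), and all of its numbers are invented. It simply compares the number of displays viewable at once with the number of displays that could be relevant and estimates how many viewport changes a task would force on the user; as that burden grows, interface management becomes a task in its own right.

from math import ceil

# Illustrative numbers only: a workstation that can show 2 windows at a time,
# drawn from a space of 40 possible data displays, exhibits a severe keyhole.
viewable_at_once = 2
total_displays = 40
keyhole_ratio = viewable_at_once / total_displays

def viewport_changes(displays_needed, viewable=viewable_at_once):
    """Lower bound on the window swaps needed to see every display relevant
    to the current task phase at least once."""
    return max(0, ceil(displays_needed / viewable) - 1)

print(f"keyhole ratio: {keyhole_ratio:.2f}")
print("swaps needed when 7 displays are relevant:", viewport_changes(7))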
In a study by Cook and Woods (1996) on adapting to new technology in the domain of cardiac anesthesia, physiological monitoring equipment dedicated to cardiothoracic surgery was upgraded from separate devices to a computer system that integrated the functions of four devices onto a single color display. However, the flexibilities that the new technology provided in display options and display customization also created the need for physicians to direct attention to interacting with the patient monitoring system. By virtue of the keyhole property there were now new interface management tasks to contend with. These tasks derived in part from the need to access highly interrelated data serially, thus potentially degrading the accuracy and efficiency of the mental models the physicians required for making patient intervention decisions. New interface management tasks also included the need to declutter displays periodically to avoid
obscuring data channels that required monitoring. This requirement resulted from collapsing into a single device the data world previously made available by the multi-instrument configuration.

To cope with these potentially overloading situations, physicians were observed to tailor both the computer-based system (system tailoring) and their own cognitive strategies (task tailoring). For example, the physicians discovered that the default blood pressure display configuration for the three blood pressures that were routinely displayed was unsuitable: the waveforms and numeric values (derived from digital processing) changed too slowly and eliminated important quantitative information. Rather than exploit the system's flexibility, the physicians simplified the system by constraining the display of data into a fixed, spatially dedicated default organization. This required substantial effort, initially to force the preferred display configuration prior to the initiation of a case, then to ensure that this configuration is maintained in the event that the computer system performs automatic window management functions. To tailor their tasks, they planned their interactions with the device to coincide with self-paced periods of low criticality, and developed stereotypical routines to avoid getting lost in the complex menu structures rather than exploiting the system's flexibility. In the face of circumstances incompatible with task-tailoring strategies, which are bound to occur in this complex work domain, the physicians had no choice but to confront the complexity of the device, thus diverting information-processing resources from the patient management function (Cook and Woods, 1996). This irony of automation, whereby the burden of interacting with the technology tends to occur during those situations when the human can least afford to divert attentional resources, is also found in aviation. As noted, automation in cockpits can potentially reduce workload by allowing complete flight paths to be programmed through keyboards. Changes in the flight path, however, require that pilots divert their attention to the numerous keystrokes that need to be input to the keyboard, and these changes tend to occur during takeoff or descent, the phases of flight containing the highest risk and that can least accommodate increases in pilot workload (Strauch, 2002).

Task tailoring reflects a fundamental human adaptive process. Thus, humans should be expected to shape new technology to bridge gaps in their knowledge of the technology and fulfill task demands. The concern with task tailoring is that it can create new cognitive burdens, especially when the human is most vulnerable to demands on attention, and mask the real effects of technology change in terms of its capability for providing new opportunities for human error (Dekker, 2005).

7.3.3 Effects of New Technology on Team Communication

Cao and Taylor (2004) recently examined the effects of introducing a remote surgical robot on communication among the operating room (OR) team members. Understanding the potential for human errors brought about from interactions among team members in the face of this new technology requires closely examining contextual factors such as communication, teamwork, flow of information, work culture, uncertainty, and overload (Figure 1). In their study, a framework referred to as common ground (Clark and Schaefer, 1989) was used to analyze communication for two cholecystectomy procedures that were performed by the same surgeon: one using conventional laparoscopic instruments and the other using a robotic surgical system. Common ground represents a person's knowledge or assumptions about what other people in the communication setting know. It can be established through past and present experiences in communicating with particular individuals, the knowledge or assumptions one has about those individuals, and general background information. High levels of common ground would thus be expected to result in more efficient and accurate communication.

In the OR theater, common ground can become influenced by a number of factors. For instance, the surgeon's expectations for responses by team members may depend on the roles (such as nurse, technician, or anesthesiologist) that those persons play. Other factors that can affect the level of common ground include familiarity with team members, which is often undermined in the OR due to rotation of surgical teams, and familiarity with the procedure. When new technology is introduced, all these factors conspire to erode common ground and thus potentially compromise patient safety. Roles may change, people become less familiar with their roles, the procedures for using the new technology are less familiar, and expectations for responses from communication partners become more uncertain. Misunderstandings can propagate through team members in unpredictable ways, ultimately leading to new forms of errors.

The introduction of a remote master-slave surgical robot into the OR necessitates a physical barrier, and what Cao and Taylor (2004) observed was that the surgeon, now removed from the surgical site, had to rely almost exclusively on video images from this remote surgical site. Instead of receiving a full range of sensory information from the visual, auditory, haptic, and olfactory senses, the surgeon had to contend with a "restricted field of view and limited depth information from a frequently poor vantage point" (p. 310) and increased uncertainty regarding the status of the remote system. These changes potentially overload the surgeon's visual system and create more opportunities for decision-making errors, due to gaps in the information that is being received. Also, in addition to the need for obtaining information on patient status and the progress of the procedure, the surgeon had to cope with information-processing demands deriving from the need to access information about the status of the robotic manipulator. Thus, to ensure effective coordination of the procedure, the surgeon was now responsible for verbally distributing more information
to the OR team members than with conventional laparoscopic surgery.

Overall, significantly more communication within the OR team was observed under robotic surgery conditions than with conventional laparoscopic surgery. Moreover, the communication patterns were haphazard, which increased the team members' uncertainty concerning when information and what information should be distributed or requested and thereby the potential for human error resulting from miscommunication and lack of communication. Use of different terminologies in referring to the robotic system and startup confusion contributed to the lack of common ground. Although training on the use of this technology was provided to these surgical team members, the findings suggested the need for training to attain common ground. This could possibly be achieved through the use of rules or an information visualization system that could facilitate the development of a shared mental model among the team members (Stout et al., 1999).

8 HUMAN ERROR IN MAINTENANCE ACTIVITIES

To function effectively, almost all systems require maintenance. Most organizations require both scheduled (preventive) maintenance and unscheduled (active) maintenance. Whereas unscheduled maintenance is required when systems or components fail, preventive maintenance attempts to anticipate failures and thereby minimize system unavailability. Frequent scheduled maintenance can be costly, and organizations often seek to balance these costs against the risks of equipment failures. Lost in this equation, however, is a possible "irony of maintenance": an increased frequency in scheduled maintenance may actually increase system risk by providing more opportunities for human interaction with the system (Reason, 1997). This increase in risk is more likely if assembly rather than disassembly operations are called for, as the comparatively fewer constraints associated with assembly operations make these activities much more susceptible to various errors, such as identifying the wrong component, applying inappropriate force, or omitting an assembly step.

Maintenance environments are notorious for breakdowns in communication, often in the form of implicit assumptions or ambiguity in instructions that go unconfirmed (Reason and Hobbs, 2003). When operations extend over shifts and involve unfamiliar people, these breakdowns in communication can propagate into catastrophic accidents, as was the case in the explosion aboard the Piper Alpha oil and gas platform in the North Sea (Reason and Hobbs, 2003) and the crash of ValuJet flight 592 (Strauch, 2002). Incoming shift workers are particularly vulnerable to errors following commencement of their task activities, especially if maintenance personnel in the outgoing shift conclude their work at an untimely point in the procedure and fail to brief incoming shift workers adequately as to the operational context about to be confronted (Sharit, 1998). In these cases, incoming shift workers are placed in the difficult position of needing to invest considerable attentional resources almost immediately in order to avoid an incident or accident.

Many preventive maintenance activities initially involve searching for flaws prior to applying corrective procedures, and these search processes are often subject to various expectancies that could lead to errors. For example, if faults or flaws are seldom encountered, the likelihood of missing such targets will increase; if they are encountered frequently, properly functioning equipment may be disassembled. Maintenance workers are also often required to work in restricted spaces that are error inducing by virtue of the physical and cognitive constraints that these work conditions impose (Reynolds-Mozrall et al., 2000).

Flawed partnerships between maintenance workers and troubleshooting equipment can also give rise to errors. As with other types of automation or aiding devices, troubleshooting aids can compensate for human limitations and extend human capabilities when designed appropriately. However, these devices are often opaque and may be misused or disregarded (Parasuraman and Riley, 1997), depending on the worker's self-confidence, prior experiences with the aid, and knowledge of co-worker attitudes toward the device. For instance, if the logic underlying the software of an expert troubleshooting system is inaccessible, the user may not trust the recommendations or explanations given by the device (Section 7.1) and therefore choose not to replace a component that the device has identified as faulty.
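Part of the reason such mistrust can be hard to dismiss is statistical. The sketch below is an illustration added here rather than an analysis from the studies cited: it applies Bayes' rule to a hypothetical troubleshooting aid with invented hit and false-alarm rates, and shows that when the faults being sought are rare, many of the aid's replacement recommendations will turn out to be wrong, conditions under which the disuse described by Parasuraman and Riley (1997) readily develops.

# Hypothetical numbers: the probability that a component flagged by a
# troubleshooting aid is actually faulty, as a function of fault prevalence
# (Bayes' rule). Hit and false-alarm rates are invented for illustration.

def positive_predictive_value(prevalence, hit_rate=0.95, false_alarm_rate=0.05):
    """P(component is faulty | the aid recommends replacing it)."""
    flagged = hit_rate * prevalence + false_alarm_rate * (1.0 - prevalence)
    return (hit_rate * prevalence) / flagged

for prevalence in (0.20, 0.05, 0.01):
    ppv = positive_predictive_value(prevalence)
    print(f"fault prevalence {prevalence:.0%}: "
          f"{ppv:.0%} of the aid's recommendations point to a real fault")
# At 1% prevalence only about one recommendation in six is correct, so a worker
# who ignores the aid is frequently "rewarded" for doing so.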
Errors resulting from interruptions are particularly prevalent in maintenance environments. Interruptions due to the need to assist a co-worker or following the discovery that the work procedure called for the wrong tool or equipment generally require the worker to leave the scene of operations, and the most likely error in these types of situations is an omission. In fact, memory lapses probably constitute the most common errors in maintenance, suggesting the need for incorporating good reminders (Table 12). Reason and Hobbs (2003) emphasize the need for mental readiness and mental rehearsal as ways that maintenance workers can inoculate themselves against errors that could arise from interruptions, time pressure, communication, and unfamiliar situations that may arise.

Written work procedures are pervasive in maintenance operations, and there may be numerous problems with the design of these procedures that can predispose their users to errors (Drury, 1998). Violations of these procedures are also relatively common, and management has been known to consider such violations as causes and contributors of adverse events, a belief that is both simplistic and unrealistic. The assumptions that go into the design of procedures are typically based on normative models of work operations. However, the actual contexts under which real work takes place are often very different from those that the designers of the procedures have envisioned or were willing to acknowledge. To the followers of the procedures, who must negotiate
novel or unanticipated situations for which rules are not available.

Can companies with good cultures be differentiated from those with bad cultures? High-reliability organizations (Section 11) that anticipate errors and encourage safety at the expense of production, that have effective error-reporting mechanisms without fear of reprisals, and that maintain channels of communication across all levels of the company's operations generally reflect good cultures. Questionable hiring practices, poor economic incentives, inflexible and outmoded training programs, the absence of incident reporting systems and meaningful accident investigation mechanisms, managerial instability, and the promotion of atmospheres that discourage communication between superiors and subordinates are likely to produce poor organizational and work group cultures.

Errors associated with maintenance operations can often be traced to organizational culture. This was clearly the case in the crash of ValuJet flight 592 into the Florida Everglades in 1996 just minutes after takeoff. The crash occurred following an intense fire in the airplane's cargo compartment that made its way into the cabin and overcame the crew (Strauch, 2002). Unexpended and unprotected canisters of oxygen generators, which can inadvertently generate oxygen and heat and consequently ignite adjacent materials, had somehow managed to become placed onto the aircraft. Although most of the errors that were uncovered by the investigation were associated with maintenance technicians at SabreTech, the maintenance facility contracted by ValuJet to overhaul several of its aircraft, these errors were attributed to practices at SabreTech that reflected organizational failures. Specifically, the absence of information on the work cards concerning the removal of oxygen canisters from two ValuJet airplanes that were being overhauled led to the failure by maintenance personnel to lock or expend the generators. There was also a lack of communication across shifts concerning the hazards associated with the oxygen generators; although some technicians who had removed the canisters from the other aircraft knew of the hazards, others did not. In addition, procedures for briefing incoming and outgoing shift workers concerning hazardous materials and for tracking tasks performed during shifts were not in place. Finally, parts needed to secure the generators were unavailable, and none of the workers in shipping and receiving, who were ultimately responsible for placing the canisters on the airplane, was aware of the hazards.

Relevant to this discussion was the finding that the majority of the technicians that removed oxygen canisters from ValuJet airplanes as part of the overhaul of these aircraft were not SabreTech personnel but contractor personnel. In the absence of an adequately informed organizational culture, it comes as no surprise that management would be oblivious to the implications of outsourcing on worker communication and task performance. Further arguments concerning the importance of organizational culture for system safety can be found in Reason (1997) and Vicente (2004).

9.1 The Columbia Accident

The Columbia space shuttle accident in 2003 exposed a failed organizational culture. The physical cause of the accident was a breach in the thermal protection system on the leading edge of Columbia's left wing about 82 seconds after the launch. This breach was caused by a piece of insulating foam that separated from the external tank in an area where the orbiter attaches to the external tank. The Columbia Accident Investigation Board's (2003) report stated that "NASA's organizational culture had as much to do with this accident as foam did," that "only significant structural changes to NASA's organizational culture will enable it to succeed," and that NASA's current organization "has not demonstrated the characteristics of a learning organization" (p. 12).

To some extent NASA's culture was shaped by compromises with political administrations that were required to gain approval for the space shuttle program. These compromises imposed competing budgetary and mission requirements that resulted in a "remarkably capable and resilient vehicle" but one that was "less than optimal for manned flights" and "that never met any of its original requirements for reliability, cost, ease of turnaround, maintainability, or, regrettably, safety" (p. 11). The organizational failures are almost too numerous to document: unwillingness to trade off scheduling and production pressures for safety; shifting management systems and a lack of integrated management across program elements; reliance on past success as a basis for engineering practice rather than on dependable engineering data and rigorous testing; the existence of organizational barriers that compromised communication of critical safety information and discouraged differences of opinion; and the emergence of an informal command and decision-making apparatus that operated outside the organization's norms. According to the Columbia Accident Investigation Board, deficiencies in communication, both up and down the shuttle program's hierarchy, were a foundation for the Columbia accident.

These failures were largely responsible for missed opportunities, blocked or ineffective communication, and flawed analysis by management during Columbia's final flight that hindered the possibility of a challenging but conceivable rescue of the crew by launching the Atlantis, another space shuttle craft, to rendezvous with Columbia. The accident investigation board concluded: "Some Space Shuttle Program managers failed to fulfill the implicit contract to do whatever is possible to ensure the safety of the crew. In fact, their management techniques unknowingly imposed barriers that kept at bay both engineering concerns and dissenting views, and ultimately helped create 'blind spots' that prevented them from seeing the danger the foam strike posed" (p. 170). Essentially, the position adopted by managers concerning whether the debris strike created a safety-of-flight issue placed the burden on engineers to prove that the system was unsafe.

Numerous deficiencies were also found with the Problem Reporting and Corrective Action database,
a critical information system that provided data on any nonconformances. In addition to being too time consuming and cumbersome, it was also incomplete. For example, only foam strikes that were considered in-flight anomalies were added to this database, which masked the extent of this problem.

Finally, what is particularly disturbing was the failure of the shuttle program to detect the foam trend and appreciate the danger that it presented. Shuttle managers discarded warning signs from previous foam strikes and normalized their occurrences. In so doing, they desensitized the program to the dangers of foam strikes and compromised the flight readiness process. Many workers at NASA knew of the problem. However, in the absence of an effective mechanism for communicating these "incidents" (Section 6), proactive approaches for identifying and mitigating risks were unlikely to be in place. In particular, a proactive perspective to risk identification and management could have resulted in a better understanding of the risk of thermal protection damage from foam strikes, tests being performed on the resilience of the reinforced carbon-carbon panels, and either the elimination of external tank foam loss or its mitigation through the use of redundant layers of protection.

10 INVESTIGATING HUMAN ERROR IN ACCIDENTS AND INCIDENTS

10.1 Causality and Hindsight Bias

Investigations of human error are generally performed as part of accident and incident investigations (Chapter 41). In conducting these investigations, the most fundamental issue is the attribution of causality to incident and accident events. Currently, there are a variety of techniques that investigators can choose from to assist them in performing causal analysis (Johnson, 2003).

A related issue concerns the level of detail required for establishing causality (Senders and Moray, 1991). At one level of analysis a cause can be an interruption; at a different level of analysis the cause can be attributed to a set of competing neural activation patterns that result in an action slip. Dilemmas regarding the appropriate level of causal analysis are usually resolved by considering the requirements of the investigative analysis. Generally, analysts can be expected to employ heuristics such as satisficing (Section 3.2), whereby decisions and judgments are made that appear good enough for the purposes of the investigation. Investigators also need to be aware of the possible cognitive biases that reporters of incidents and accidents may be harboring (Table 11). These same biases can also play a part in how witnesses attribute blame and thus in how they perceive relationships between causes and effects.

What makes determining causes of accidents especially problematic for investigators is that they typically work with discrete fragments of information derived from decomposing continuous and interacting sequences of events. This ultimately leads to various distortions in the true occurrence of events (Woods, 1993). A further complication is hindsight bias, which derives from the tendency to judge the quality of a process based on whether positive or negative outcomes ensued (Fischhoff, 1975; Christoffersen and Woods, 1999). Because accident investigators usually have knowledge about negative outcomes, through hindsight they can look back and identify all the failed behaviors that are consistent with these outcomes (Section 1.1). It is then highly probable that a causal sequence offering a crisp explanation for the incident will unfold to the investigator. A foray through Casey's (1993) reconstructed accounts of a number of high-profile accidents attributed to human error would probably transform many in the lay public into hindsight experts.

Although hindsight bias can assume a number of forms (Dekker, 2001), they all derive from the tendency to treat actions in isolation and thus distort the context in which the actions took place. The pervasiveness of the hindsight bias has led Dekker (2005) to suggest the intriguing possibility that it may actually be serving an adaptive function; that is, the hindsight bias is not so much about explaining what happened as it is about future survival, which would necessarily require the decontextualization of past failures into a "linear series of binary choices." The true bias thus derives from the belief that the oversimplification of rich contexts into a series of clearly defined choices will increase the likelihood of coping with complexity successfully in the future. However, in reality, by obstructing efforts at establishing cause and effect, the hindsight bias actually jeopardizes the ability to learn from accidents (Woods et al., 1994) and thus the ability to predict or prevent future failures.

10.2 Methods for Investigating Human Error

Investigations of human error can be pursued using either informal approaches or methods that are much more systematic or specialized. Strauch's (2002) approach, which reflects a relatively broad and informal perspective to this problem, emphasizes antecedent factors (e.g., equipment, operator, maintenance, and cultural factors), data collection and analysis issues, and factors such as situation awareness and automation, all of which are interwoven with case studies. The more specialized methods for investigating human error are generally referred to as accident or incident analysis techniques, although in some cases these tools can also be used to assess the potential risks associated with systems or work processes. Examples of techniques used exclusively for investigating accidents include change analysis (Kepner and Tregoe, 1981) and the sequentially timed events plotting procedure (Hendrick and Benner, 1987).

Change analysis techniques are based on the well-documented general relationship between change and increased risk. These techniques make use of accident-free reference bases to identify systematically changes or differences associated with the accident or incident situation. A simple worksheet is usually all that is required for exploring potential changes contributory
(more specific) levels of the map, selecting the most applicable node at each level. The three upper-level nodes of the map correspond to equipment failures, personnel failures, and other failures; however, only the first two categories are analyzed for root causes. At the second level these three nodes are subdivided into 10 problem category nodes. Examples of these categories are equipment design problem, equipment misuse, contract employee, natural phenomena, and sabotage or horseplay. The third level of the map consists of nodes corresponding to 11 major root-cause categories; examples of these categories include procedures, human factors engineering, training, and communications. In the transition from the second to the third level, the map allows for a number of points of intersection between equipment failures and personnel failures, thus allowing all failures to be traced back to some type of human error. At the fourth level of the map these categories become subdivided into near root causes, which in turn are subdivided at the bottom level into a detailed set of root causes. To aid the investigator in using the root-cause map, examples for each node are provided in terms of typical issues and typical recommendations.

At this point in the process, a root-cause summary table can be generated that links each causal factor in the chart with one or more paths in the map that terminate at root causes and to recommendations that address each of these root causes. These tables then form the basis for investigative reports that comprise the final step of RCA.
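The structure just described is essentially a shallow tree that the investigator walks from general to specific nodes. The following sketch is purely schematic: the node names and the causal factor are invented for illustration and do not reproduce the actual map (see, e.g., ABS Consulting Group, 1998, in the chapter references). It shows one way the traversal and the resulting root-cause summary table could be represented.

# Schematic sketch only: the node names are invented and do not reproduce the
# actual root-cause map; they illustrate the level structure described in the text.

ROOT_CAUSE_MAP = {
    "personnel failure": {                       # level 1: upper-level node
        "procedure problem": {                   # level 2: problem category
            "procedures": {                      # level 3: major root-cause category
                "wrong procedure used": [        # level 4: near root cause
                    "procedure out of date",     # level 5: root causes
                    "two procedures cover the same task",
                ],
            },
        },
    },
}

def paths_to_root_causes(node, path=()):
    """Enumerate every path from the top of the map down to a root cause."""
    if isinstance(node, list):                   # reached the root-cause level
        return [path + (cause,) for cause in node]
    paths = []
    for name, child in node.items():
        paths.extend(paths_to_root_causes(child, path + (name,)))
    return paths

# One row of a root-cause summary table per causal factor and selected path,
# with a recommendation addressing the terminating root cause.
summary_table = [
    {"causal factor": "mechanic consulted a superseded work card",
     "path": path,
     "recommendation": "institute document control for work cards"}
    for path in paths_to_root_causes(ROOT_CAUSE_MAP)
]
for row in summary_table:
    print(row["causal factor"], "->", " / ".join(row["path"]))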
The availability of a systematic method for performing incident and accident investigations within a high-risk organization will increase an organization's potential for learning, improvement, and development of positive work cultures. However, as with IRSs, these benefits are anticipated only when these investigations are not used as a basis for reprisals and when workers are informed about and involved in the investigative process.

11 TOWARD MINIMIZATION OF HUMAN ERROR AND THE CREATION OF SAFE SYSTEMS

Human error is a complex phenomenon. Recent evidence from neuroimaging studies has linked an error negativity, an event-related brain potential probably originating from the anterior cingulate cortex, to the detection by individuals of action slips, errors of choice, and other errors (Nieuwenhuis et al., 2001; Holroyd and Coles, 2002), possibly signifying the existence of a neurophysiological basis for a preconscious action-monitoring system. However, suggestions that these kinds of findings may offer possibilities for predicting human errors in real-time operations (Parasuraman, 2003) are probably overstated. Event-related brain potentials may provide insight into attentional preparedness and awareness of response conflicts, but the complex interplay of factors responsible for human error (Section 3.1) takes these discoveries out of contention as meaningful explanatory devices.

Although managers often speak in terms of the need for eliminating human error, this goal is neither desirable nor reasonable. The benefits that derive from the realization that errors have been committed should not be readily dismissed; they play a critical role in human adaptability, creativity, and the manifestation of expertise. The elimination of human error is also inconceivable if only because human fallibility will always exist. Tampering with human fallibility, for example by increasing the capabilities of working memory and attention, would probably facilitate the design and production of new and more complex systems, and ultimately, new and unanticipated opportunities for human error. More realistically, the natural evolution of knowledge and society should translate into the emergence of new systems, new forms of interaction among people and devices, and new sociopolitical and organizational cultures that will, in turn, provide new opportunities for enabling human fallibility.

However, in no way should these suppositions detract from the goal of human error reduction, especially in complex high-risk systems. As a start, system hardware and software need to be made more reliable, better partnerships between humans and automation need to be established, barriers that are effective in providing detection and absorption of errors without adversely affecting contextual and cognitive constraints need to be put in place, and incident-reporting systems that enable organizations to learn and anticipate, especially when errors become less frequent and thus deprive analysts of the opportunity for preparing and coping with their effects, need to become more ubiquitous.

Organizations also need to consider the impact that various economic incentives may have on shaping work behaviors (Moray, 2000) and the adoption of strategies and processes for implementing features that have come to be associated with high-reliability organizations (HROs) (Rochlin et al., 1987; Roberts, 1990). By incorporating fundamental characteristics of HROs, particularly the development of cultures of reliability that anticipate and plan for unexpected events, try to monitor and understand the gap between work procedures and practice (Dekker, 2005), and place value in organizational learning, the adverse consequences of interactive complexity and tight coupling that Perrow's theory predicts (Section 3.3) can largely be countered.

In addition, methods for describing work contexts and for determining and assessing the perceptions and assessments that workers make in response to these contexts, as well as rigorous TA and CTA techniques for determining the possible ways that fallible humans can become ensnared by these situations, need to be investigated, implemented, and continuously evaluated in order to strengthen the predictive capabilities of human error models. These methods also need to be integrated into the conceptual, development, and testing stages of the design process to better inform designers (of both products and work procedures) about the potential effects of design decisions, thus
bridging the gap between the knowledge and intentions of the designer and the needs and goals of the user.

Problems created by poor designs and management policies traditionally have been dumped on training departments (CCPS, 1994). Instead of using training to compensate for these problems, it should be given a proactive role in minimizing, detecting, and recovering errors. This can be achieved through innovative training methods that emphasize management of task activities under uncertainty and time constraints; integrate user-centered design principles for establishing performance support needs (such as the need for planning aids); give consideration to the kinds of cues that are necessary for developing situation awareness (Endsley et al., 2003) and for interpreting common-cause and common-mode system failures; and utilize simulation methods effectively for providing extensive exposure to a wide variety of contexts. By including provisions in training for imparting mental preparedness, people will be better able to anticipate the anomalies they might encounter and the errors they might make, and to develop error-detection skills (Reason and Hobbs, 2003).

Although worker selection (Chapter 17) is a potentially explosive issue, it can be used to exploit individual variability in behavioral tendencies and cognitive capabilities and thus provide better human-system fits (Damos, 1995). Bierly and Spender (1995) have documented the extraordinary safety record of the U.S. nuclear navy and attributed it in part to a culture that insisted on careful selection of people who were highly intelligent, very motivated, and who were then thoroughly trained and held personally accountable for their tasks. These characteristics created the work culture context for communications that could: be carried out under conditions of high risk and high stress; flow rapidly either top-down or bottom-up through the chain of command; and encompass information about mistakes, whether technical, operational, or administrative, without fear of reprisals.

However, perhaps the greatest challenge in reducing human error is managing these error-management processes (Reason and Hobbs, 2003): defense strategies need to be aggregated coherently (Amalberti, 2001). Too often these types of error-reduction enterprises, innovative as they may be, remain isolated or hidden from each other. This needs to change: all programs that can influence error management need to be managed as a unified synergistic entity.

REFERENCES

ABS Consulting Group (1998), Root Cause Analysis Handbook: A Guide to Effective Incident Investigation, Government Institutes Division, Knoxville, TN.
Amalberti, R. (2001), "The Paradoxes of Almost Totally Safe Transportation Systems," Safety Science, Vol. 37, pp. 109-126.
Bainbridge, L. (1987), "Ironies of Automation," in New Technology and Human Error, J. Rasmussen, K. Duncan, and J. Leplat, Eds., Wiley, New York, pp. 273-276.
Barach, P., and Small, S. (2000), "Reporting and Preventing Medical Mishaps: Lessons from Non-medical Near Miss Reporting Systems," British Medical Journal, Vol. 320, pp. 759-763.
Bierly, P. E., and Spender, J. C. (1995), "Culture and High Reliability Organizations: The Case of the Nuclear Submarine," Journal of Management, Vol. 21, pp. 639-656.
Cao, C. G. L., and Milgram, P. (2000), "Disorientation in Minimal Access Surgery: A Case Study," in Proceedings of the IEA 2000/HFES 2000 Congress, San Diego, CA, Vol. 4, pp. 169-172.
Cao, C. G. L., and Taylor, H. (2004), "Effects of New Technology on the Operating Room Team," in Work with Computing Systems 2004, H. M. Khalid, M. G. Helander, and A. W. Yeo, Eds., Damai Sciences, Kuala Lumpur, Malaysia, pp. 309-312.
Casey, S. (1993), Set Phasers on Stun and Other True Tales of Design, Technology, and Human Error, Aegean Park Press, Santa Barbara, CA.
CCPS (1992), Guidelines for Hazard Evaluation Procedures, with Worked Examples, 2nd ed., Center for Chemical Process Safety, American Institute of Chemical Engineers, New York.
CCPS (1994), Guidelines for Preventing Human Error in Process Safety, Center for Chemical Process Safety, American Institute of Chemical Engineers, New York.
Christoffersen, K., and Woods, D. D. (1999), "How Complex Human-Machine Systems Fail: Putting 'Human Error' in Context," in The Occupational Ergonomics Handbook, W. Karwowski and W. S. Marras, Eds., CRC Press, Boca Raton, FL, pp. 585-600.
Clark, H. H., and Schaefer, E. F. (1989), "Contributing to Discourse," Cognitive Science, Vol. 13, pp. 259-294.
Columbia Accident Investigation Board (2003), Report Volume 1, U.S. Government Printing Office, Washington, DC.
Cook, R. I., and Woods, D. D. (1996), "Adapting to New Technology in the Operating Room," Human Factors, Vol. 38, pp. 593-611.
Cook, R. I., Render, M., and Woods, D. D. (2000), "Gaps in the Continuity of Care and Progress on Patient Safety," British Medical Journal, Vol. 320, pp. 791-794.
Cullen, D. J., Bates, D. W., Small, S. D., Cooper, J. B., Nemeskal, A. R., and Leape, L. L. (1995), "The Incident Reporting System Does Not Detect Adverse Drug Events: A Problem for Quality Improvement," Joint Commission Journal on Quality Improvement, Vol. 21, pp. 541-548.
Damos, D. (1995), "Issues in Pilot Selection," in Proceedings of the 8th International Symposium on Aviation Psychology, Ohio State University, Columbus, OH, pp. 1365-1368.
Dekker, S. W. A. (2001), "The Disembodiment of Data in the Analysis of Human Factors Accidents," Human Factors and Aerospace Safety, Vol. 1, pp. 39-58.
Dekker, S. W. A. (2005), Ten Questions About Human Error: A New View of Human Factors and System Safety, Lawrence Erlbaum Associates, Mahwah, NJ.
Dey, A. K. (2001), "Understanding and Using Context," Personal and Ubiquitous Computing, Vol. 5, pp. 4-7.
Dhillon, B. S., and Singh, C. (1981), Engineering Reliability: New Techniques and Applications, Wiley, New York.
Drury, C. G. (1998), "Human Factors in Aviation Maintenance," in Handbook of Aviation Human Factors, D. J. Garland, J. A. Wise, and V. D. Hopkin, Eds., Lawrence Erlbaum Associates, Mahwah, NJ, pp. 591-606.
Embrey, D. E., Humphreys, P., Rosa, E. A., Kirwan, B., and Rea, K. (1984), SLIM-MAUD: An Approach to Assessing Human Error Probabilities Using Structured Expert Judgment, NUREG/CR-3518, U.S. Nuclear Regulatory Commission, Washington, DC.
Endsley, M. R. (1995), "Toward a Theory of Situation Awareness," Human Factors, Vol. 37, pp. 32-64.
Endsley, M. R., Bolté, B., and Jones, D. G. (2003), Designing for Situation Awareness: An Approach to User-Centred Design, CRC Press, Boca Raton, FL.
Evan, W. M., and Manion, M. (2002), Minding the Machines: Preventing Technological Disasters, Prentice-Hall, Upper Saddle River, NJ.
Fischhoff, B. (1975), "Hindsight-Foresight: The Effect of Outcome Knowledge on Judgment Under Uncertainty," Journal of Experimental Psychology: Human Perception and Performance, Vol. 1, pp. 288-299.
Fraser, J. M., Smith, P. J., and Smith, J. W. (1992), "A Catalog of Errors," International Journal of Man-Machine Studies, Vol. 37, pp. 265-307.
Gertman, D. I., and Blackman, H. S. (1994), Human Reliability and Safety Analysis Data Handbook, Wiley, New York.
Grosjean, V., and Terrier, P. (1999), "Temporal Awareness: Pivotal in Performance?" Ergonomics, Vol. 42, pp. 443-456.
Hahn, A. H., and deVries, J. A., II (1991), "Identification of Human Errors of Commission Using Sneak Analysis," in Proceedings of the Human Factors Society 35th Annual Meeting, pp. 1080-1084.
Helander, M. G. (1997), "The Human Factors Profession," in Handbook of Human Factors and Ergonomics, 2nd ed., G. Salvendy, Ed., Wiley, New York, pp. 3-16.
Hendrick, K., and Benner, L., Jr. (1987), Investigating Accidents with STEP, Marcel Dekker, New York.
Hofstede, G. (1991), Cultures and Organizations: Software of the Mind, McGraw-Hill, New York.
Hollnagel, E. (1993), Human Reliability Analysis: Context and Control, Academic Press, London.
Hollnagel, E. (1998), Cognitive Reliability and Error Analysis Method, Elsevier Science, New York.
Hollnagel, E., Ed. (2003), Handbook of Cognitive Task Design, Lawrence Erlbaum Associates, Mahwah, NJ.
Holroyd, C. B., and Coles, M. G. H. (2002), "The Neural Basis of Human Error Processing: Reinforcement Learning, Dopamine, and the Error-Related Negativity," Psychological Review, Vol. 109, pp. 679-709.
Johnson, W. G. (1980), MORT Safety Assurance Systems, Marcel Dekker, New York.
Johnson, C. (2002), "Software Tools to Support Incident Reporting in Safety-Critical Systems," Safety Science, Vol. 40, pp. 765-780.
Johnson, C. (2003), Failure in Safety-Critical Systems: A Handbook of Accident and Incident Reporting, University of Glasgow Press, Glasgow.
Kaye, K. (1999b), "United Has Eye in the Sky: Optical Recorders Check Crews," South Florida Sun-Sentinel, September 27.
Kaye, K. (1999c), "Program Urges Error Reporting, Mitigates Penalty," South Florida Sun-Sentinel, September 27.
Kepner, C. H., and Tregoe, B. B. (1981), The New Rational Manager, Kepner-Tregoe Inc., Princeton, NJ.
Kirwan, B. (1994), A Guide to Practical Human Reliability Assessment, Taylor & Francis, London.
Kirwan, B. (1999), "Some Developments in Human Reliability Assessment," in The Occupational Ergonomics Handbook, W. Karwowski and W. S. Marras, Eds., CRC Press, Boca Raton, FL, pp. 643-666.
Kirwan, B., and Ainsworth, L. K. (1992), A Guide to Task Analysis, Taylor & Francis, London.
Kirwan, B., Martin, B. R., Rycraft, H., and Smith, A. (1990), "Human Error Data Collection and Data Generation," International Journal of Quality and Reliability Management, Vol. 7, No. 4, pp. 34-66.
Kjellén, U. (2000), Prevention of Accidents Through Experience Feedback, Taylor & Francis, London.
Kohn, L. T., Corrigan, J. M., and Donaldson, M. S., Eds. (1999), To Err Is Human: Building a Safer Health System, National Academy Press, Washington, DC.
Koppel, R., Metlay, J. P., Cohen, A., Abaluck, B., Localio, A. R., Kimmel, S., and Strom, B. L. (2005), "Role of Computerized Physician Order Entry Systems in Facilitating Medication Errors," Journal of the American Medical Association, Vol. 293, pp. 1197-1203.
Kumamoto, H., and Henley, E. J. (1996), Probabilistic Risk Assessment and Management for Engineers and Scientists, 2nd ed., IEEE Press, Piscataway, NJ.
Leape, L. L., Brennan, T. A., Laird, N. M., Lawthers, A. G., Localio, A. R., Barnes, B. A., Hebert, L., Newhouse, J. P., Weiler, P. C., and Hiatt, H. H. (1991), "The Nature of Adverse Events in Hospitalized Patients: Results from the Harvard Medical Practice Study II," New England Journal of Medicine, Vol. 324, pp. 377-384.
Lee, J. D., and Moray, N. (1994), "Trust, Self-Confidence, and Operators' Adaptation to Automation," International Journal of Human-Computer Studies, Vol. 40, pp. 153-184.
Lee, J. D., and See, K. A. (2004), "Trust in Automation: Designing for Appropriate Reliance," Human Factors, Vol. 46, pp. 50-80.
Leech, D. S. (2004), "Learning in a Lean System," Defense Acquisition University Publication, Department of Defense, retrieved May 12, 2004, from http://acc.dau.mil.
Lekberg, A. (1997), "Different Approaches to Accident Investigation: How the Analyst Makes the Difference," in Proceedings of the 15th International Systems Safety Conference, Sterling, VA, International Systems Safety Society, pp. 178-193.
Levy, J., Gopher, D., and Donchin, Y. (2002), "An Analysis
. Kaber, D. B., and Endsley, M. R (2004), "The Effects.of of Work Activity in the Operating Room: Applying
Level of Automation and Adaptive Automation 00 Psychological Theory to Lower the Likelihood of
Human Performance, Situation Awareness and Work- Human Error," in Proceedings of the Human Factors
load in a Dynamic Control Task," Theoretical Issues in and Ergonomics Society 46th Annual Meeting, Human
Ergonomics Seience, Vol. 4, pp. 113-153. Faetors and Ergonomies Society, Santa Monica, CA,
Kapur, K. C; and Lamberson, L. R. (1977), ReLiability in pp. 1457-1461.
Engineering and Design, Wiley, New York. Luczak, H. (1997), "Task Analysis," in Handbook of Human
Kaye, K. (1999a), "Automated Flying Harbors Hidden Per- Factors and Ergonomics, 2nd ed., G. Salvendy, Ed.,
ils," South Florida Sun-Sentinel, September 27. Wiley, New York, pp. 340-416.
Moray, N. (2000), "Culture, Politics and Ergonomics," Ergonomics, Vol. 43, pp. 858-868.
Moray, N., Inagaki, T., and Itoh, M. (2000), "Adaptive Automation, Trust, and Self-Confidence in Fault Management of Time-Critical Tasks," Journal of Experimental Psychology: Applied, Vol. 6, pp. 44-58.
Nielsen, J. (1995), Usability Engineering, Academic Press, San Diego, CA.
Nieuwenhuis, S. N., Ridderinkhof, K. R., Blom, J., Band, G. P. H., and Kok, A. (2001), "Error-Related Brain Potentials Are Differentially Related to Awareness of Response Errors: Evidence from an Antisaccade Task," Psychophysiology, Vol. 38, pp. 752-760.
Norman, D. A. (1981), "Categorization of Action Slips," Psychological Review, Vol. 88, pp. 1-15.
Parasuraman, R. (2003), "Neuroergonomics: Research and Practice," Theoretical Issues in Ergonomics Science, Vol. 4, pp. 5-20.
Parasuraman, R., and Riley, V. (1997), "Humans and Automation: Use, Misuse, Disuse, and Abuse," Human Factors, Vol. 39, pp. 230-253.
Parasuraman, R., Sheridan, T. B., and Wickens, C. D. (2000), "A Model for Types and Levels of Human Interaction with Automation," IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, Vol. 30, pp. 276-297.
Perrow, C. (1983), "The Organizational Context of Human Factors Engineering," Administrative Science Quarterly, Vol. 27, pp. 521-541.
Perrow, C. (1999), Normal Accidents: Living with High-Risk Technologies, Princeton University Press, Princeton, NJ.
Phillips, L. D., Embrey, D. E., Humphreys, P., and Selby, D. L. (1990), "A Sociotechnical Approach to Assessing Human Reliability," in Influence Diagrams, Belief Nets and Decision Making: Their Influence on Safety and Reliability, R. M. Oliver and J. A. Smith, Eds., Wiley, New York.
Potter, S. S., Roth, E. M., Woods, D. D., and Elm, W. C. (1998), "A Framework for Integrating Cognitive Task Analysis into the System Development Process," in Proceedings of the Human Factors and Ergonomics Society 42nd Annual Meeting, Human Factors and Ergonomics Society, Santa Monica, CA, pp. 395-399.
Prager, L. O. (1998), "Sign Here," American Medical News, Vol. 41, October 12, pp. 13-14.
Rasmussen, J. (1982), "Human Errors: A Taxonomy for Describing Human Malfunction in Industrial Installations," Journal of Occupational Accidents, Vol. 4, pp. 311-333.
Rasmussen, J. (1986), Information Processing and Human-Machine Interaction: An Approach to Cognitive Engineering, Elsevier, New York.
Rasmussen, J., Pejtersen, A. M., and Goodstein, L. P. (1994), Cognitive Systems Engineering, Wiley, New York.
Reason, J. (1990), Human Error, Cambridge University Press, New York.
Reason, J. (1997), Managing the Risks of Organizational Accidents, Ashgate, Aldershot, Hampshire, England.
Reason, J., and Hobbs, A. (2003), Managing Maintenance Error: A Practical Guide, Ashgate, Aldershot, Hampshire, England.
Reynolds-Mozrall, J., Drury, C. G., Sharit, J., and Cerny, F. (2000), "The Effects of Whole-Body Restriction on Task Performance," Ergonomics, Vol. 43, pp. 1805-1823.
Roberts, K. H. (1990), "Some Characteristics of One Type of High Reliability Organization," Organization Science, Vol. 1, pp. 160-176.
Robinson, A. G., and Stern, S. (1998), Corporate Creativity, Berrett-Koehler, San Francisco.
Rochlin, G., La Porte, T. D., and Roberts, K. H. (1987), "The Self-Designing High Reliability Organization: Aircraft Carrier Flight Operations at Sea," Naval War College Review, Vol. 40, pp. 76-90.
Rosenthal, J., Booth, M., and Barry, A. (2001), "Cost Implications of State Medical Error Reporting Programs: A Briefing Paper," National Academy for State Health Policy, Portland, ME.
Rouse, W. B., and Rouse, S. (1983), "Analysis and Classification of Human Error," IEEE Transactions on Systems, Man, and Cybernetics, Vol. SMC-13, pp. 539-549.
Rumelhart, D. E., and McClelland, J. L., Eds. (1986), Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1, Foundations, MIT Press, Cambridge, MA.
Sanders, M. S., and McCormick, E. J. (1993), Human Factors in Engineering and Design, 7th ed., McGraw-Hill, New York.
Senders, J. W., and Moray, N. P. (1991), Human Error: Cause, Prediction, and Reduction, Lawrence Erlbaum Associates, Mahwah, NJ.
Sharit, J. (1997), "Allocation of Functions," in Handbook of Human Factors and Ergonomics, 2nd ed., G. Salvendy, Ed., Wiley, New York, pp. 301-339.
Sharit, J. (1998), "Applying Human and System Reliability Analysis to the Design and Analysis of Written Procedures in High-Risk Industries," Human Factors and Ergonomics in Manufacturing, Vol. 8, pp. 265-281.
Sharit, J. (2003), "Perspectives on Computer Aiding in Cognitive Work Domains: Toward Predictions of Effectiveness and Use," Ergonomics, Vol. 46, pp. 126-140.
Shepherd, A. (2000), Hierarchical Task Analysis, Taylor & Francis, London.
Simon, H. A. (1966), Models of Man: Social and Rational, Wiley, New York.
Smith, P. J., and Geddes, N. D. (2003), "A Cognitive Systems Engineering Approach to the Design of Decision Support Systems," in The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies, and Emerging Applications, J. A. Jacko and A. Sears, Eds., Lawrence Erlbaum Associates, Mahwah, NJ.
Stout, R. M., Cannon-Bowers, J. A., Salas, E., and Milanovich, D. M. (1999), "Planning, Shared Mental Models, and Coordinated Performance: An Empirical Link Is Established," Human Factors, Vol. 41, pp. 61-71.
Strauch, B. (2002), Investigating Human Error: Incidents, Accidents, and Complex Systems, Ashgate, Aldershot, Hampshire, England.
Swain, A. D., and Guttmann, H. E. (1983), Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278, U.S. Nuclear Regulatory Commission, Washington, DC.
Thomas, E. J., and Helmreich, R. L. (2002), "Will Airline Safety Models Work in Medicine?" in Medical Error: What Do We Know? What Do We Do?, M. M. Rosenthal and K. M. Sutcliffe, Eds., Jossey-Bass, San Francisco, pp. 217-234.
Turrell, M. (2002), "Idea Management and the Suggestion Box," White Paper, Imaginatik Research, retrieved May 12, 2004, from http://www.imaginatik.com/web/nsf/docs/idea_reports_imaginatik.
Vicente, K. J. (1999), Cognitive Work Analysis: Toward Safe, Productive, and Healthy Computer-Based Work, Lawrence Erlbaum Associates, Mahwah, NJ.
Vicente, K. J. (2004), The Human Factor: Revolutionizing the Way People Live with Technology, Routledge, New York.
Wickens, C. D. (1984), "Processing Resources in Attention," in Varieties of Attention, R. Parasuraman and R. Davies, Eds., Academic Press, New York, pp. 63-101.
Wickens, C. D., Liu, Y., Becker, S. E. G., and Lee, J. D. (2004), An Introduction to Human Factors Engineering, 2nd ed., Prentice-Hall, Upper Saddle River, NJ.
Wiener, E. L. (1985), "Beyond the Sterile Cockpit," Human Factors, Vol. 27, pp. 75-90.
Wilde, G. J. S. (1982), "The Theory of Risk Homeostasis: Implications for Safety and Health," Risk Analysis, Vol. 2, pp. 209-225.
Woods, D. D. (1984), "Some Results on Operator Performance in Emergency Events," Institute of Chemical Engineers Symposium Series, Vol. 90, pp. 21-31.
Woods, D. D. (1993), "Process Tracing Methods for the Study of Cognition Outside the Experimental Psychology Laboratory," in Decision Making in Action: Models and Methods, G. Klein, R. Calderwood, and J. Orasanu, Eds., Ablex, Norwood, NJ, pp. 227-251.
Woods, D. D., and Watts, J. C. (1997), "How Not to Navigate Through Too Many Displays," in Handbook of Human-Computer Interaction, 2nd ed., M. Helander, T. K. Landauer, and P. Prabhu, Eds., Elsevier Science, New York, pp. 617-650.
Woods, D. D., Johannesen, L. J., Cook, R. I., and Sarter, N. B. (1994), "Behind Human Error: Cognitive Systems, Computers, and Hindsight," CSERIAC State-of-the-Art Report, Crew Systems Ergonomics Information Analysis Center, Wright-Patterson Air Force Base, OH.
Woods, D. D., Sarter, N. B., and Billings, C. E. (1997), "Automation Surprises," in Handbook of Human Factors and Ergonomics, 2nd ed., G. Salvendy, Ed., Wiley, New York, pp. 1926-1943.