Chapter 11: The Human–Technology Interface
• Using the same cards, they might log into their patients’ EHRs, access their patients’ drugs from a drug administration system, and administer the drugs using bar-coding technology (a minimal sketch of this kind of bar-code check follows).
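To make the bar-coding step concrete, here is a minimal, hypothetical sketch of the kind of match a bar-code medication administration (BCMA) system performs; the patient identifiers, drug codes, and matching rule are invented for illustration and are not drawn from any specific product.

```python
# Minimal, hypothetical sketch of a BCMA check: the scanned patient and drug
# bar codes are matched against active orders pulled from the EHR. All
# identifiers and orders here are invented for illustration.
ACTIVE_ORDERS = {
    # patient_id -> set of ordered medication bar codes (hypothetical)
    "PT-1001": {"NDC-0781-1506", "NDC-0054-0450"},
}

def verify_administration(patient_barcode, drug_barcode, orders=ACTIVE_ORDERS):
    """Return True only if the scanned drug is on the scanned patient's order list."""
    ordered = orders.get(patient_barcode)
    if ordered is None:
        print("Alert: patient not found in the system.")
        return False
    if drug_barcode not in ordered:
        print("Alert: scanned drug is not ordered for this patient.")
        return False
    print("Match confirmed; document the administration in the EHR.")
    return True

# Nurse scans the wristband, then the unit-dose package:
verify_administration("PT-1001", "NDC-0781-1506")   # confirmed
verify_administration("PT-1001", "NDC-9999-0000")   # alert: not ordered
```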
Other examples of human–technology interfaces
According to Rice and Tahir (2014), these include:
• a defibrillator
• a patient-controlled analgesia (PCA) pump
• any number of physiologic monitoring systems
• electronic thermometers
• telephones and pagers
Different brands or versions of the same device may present different interfaces.
• To enter data into an EHR, one might use a keyboard, a light pen, a touch screen, or voice.
• The resulting information might be delivered via a computer screen, a printer, or a smartphone.
Patient data might be displayed differently:
• in the form of text, images (e.g., the results of a brain scan), or even sound (an echocardiogram)
• in addition, the information may be arrayed or presented differently, based on roles and preferences
Human–technology interfaces mimic face-to-
face human encounters
• faculty members are increasingly using
videoconferencing technology to communicate with
their students
• telehealth allows nurses to use telecommunication and
videoconferencing software to communicate more
effectively and more frequently with patients at home by
using the technology to monitor patients’ vital signs,
supervise their wound care, or demonstrate a procedure.
According to Gephart and Effken (2013)
• Quinn (founder of the Design Rhythmics Sonification Research Laboratory at the University of New Hampshire) and Meeker (2000) created a “climate symphony.”
– Different musical instruments, tones, pitches, and phrases are mapped onto variables, such as the amounts and relative concentrations of minerals.
– The sonification helps researchers detect patterns in ice core data covering more than 110,000 years.
– Climate patterns take centuries to emerge and can be difficult to detect.
– The music condenses the entire 110,000 years into just a few minutes, making detection of patterns and changes much easier (a simple data-to-pitch sketch follows).
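To illustrate the idea of sonification, the following is a small, hypothetical sketch in Python that maps an invented series of mineral-concentration readings onto pitches and writes them out as a short audio clip; the data, the linear value-to-pitch mapping, and the output format are illustrative only and are not Quinn and Meeker’s actual technique.

```python
# Illustrative sketch of data sonification: map a (hypothetical) series of
# mineral-concentration readings onto pitches, then render them as audio.
import math
import struct
import wave

SAMPLE_RATE = 22050          # audio samples per second
NOTE_SECONDS = 0.25          # duration of each data point's tone

# Hypothetical concentration readings (e.g., one value per ice-core layer).
readings = [0.12, 0.15, 0.14, 0.22, 0.35, 0.33, 0.40, 0.28, 0.18, 0.12]

def value_to_frequency(value, lo, hi, f_min=220.0, f_max=880.0):
    """Linearly map a data value in [lo, hi] to a pitch in [f_min, f_max] Hz."""
    scaled = (value - lo) / (hi - lo) if hi > lo else 0.0
    return f_min + scaled * (f_max - f_min)

def tone(frequency, seconds, amplitude=0.4):
    """Generate raw 16-bit samples for a sine tone at the given frequency."""
    n_samples = int(SAMPLE_RATE * seconds)
    return [int(amplitude * 32767 * math.sin(2 * math.pi * frequency * i / SAMPLE_RATE))
            for i in range(n_samples)]

lo, hi = min(readings), max(readings)
samples = []
for value in readings:
    samples.extend(tone(value_to_frequency(value, lo, hi), NOTE_SECONDS))

# Write the result to a mono WAV file so the "melody" of the data can be heard.
with wave.open("sonified_readings.wav", "wb") as wav:
    wav.setnchannels(1)
    wav.setsampwidth(2)      # 16-bit samples
    wav.setframerate(SAMPLE_RATE)
    wav.writeframes(struct.pack("<" + "h" * len(samples), *samples))

print("Wrote", len(samples), "samples; rising pitch = rising concentration.")
```

Rising pitch corresponds to rising concentration, so a long-term trend that is hard to see in a table becomes an audible pattern within a few seconds of playback.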
The human–technology interface is ubiquitous in health care
• It takes many forms.
• A look at the quality of these interfaces follows.
• Be warned: it is not always a pretty picture.
The Human–Technology Interface Problem
• In The Human Factor, Vicente (2004)
– reviewed the safety problems in health care identified by the Institute of Medicine’s (1999) report and noted that the technology (defined broadly) in use often does not fit well with human characteristics.
– described his own studies of nurses’ PCA pump errors: nurses made the errors largely because of the complexity of the user interface, which required as many as 27 steps to program the device.
– Vicente and his colleagues then developed a PCA pump that required no more than 12 steps to program.
• Doyle (2005)
– reported that when a bar-coding medication system interfered with their workflow, nurses devised workarounds, such as removing the armband from the patient and attaching it to the bed because the bar-code reader failed to interpret bar codes when the bracelet curved tightly around a small arm.
• Koppel et al. (2005)
– reported that a widely used computer-based provider order entry (CPOE) system intended to decrease medication errors actually facilitated 22 types of errors.
– For example, the information needed to order medications was fragmented across as many as 20 screens, available medication dosages differed from those the physicians expected, and allergy alerts were triggered only after an order was written.
• Han et al. (2005)
– reported increased mortality among children admitted to
Children’s Hospital in Pittsburgh after CPOE implementation.
– Three reasons were cited for this unexpected outcome.
– First, CPOE changed the workflow in the emergency room.
• Before CPOE, orders were written for critical time-sensitive treatment
based on radio communication with the incoming transport team before
the child arrived.
• After CPOE implementation, orders could not be written until the
patient arrived and was registered in the system (a policy that was later
changed).
– Second, entering an order required as many as 10 clicks and took as long as 2 minutes; moreover, computer screens sometimes froze or response time was slow.
– Third, when the team changed its workflow to accommodate CPOE, face-to-face contact among team members diminished.
In 2005, a Washington Post article
• reported that Cedars-Sinai Medical Center in Los Angeles had
shut down a $34 million system after 3 months because of the
medical staff’s rebellion.
• Reasons for the rebellion included the additional time it took to
complete the structured information forms, failure of the system to
recognize misspellings (as nurses had previously done), and
intrusive and interruptive automated alerts (Connolly, 2005).
• Even though physicians actually responded appropriately to the
alerts, modifying or canceling 35% of the orders that triggered
them, designers had not found the right balance of helpful-to-
interruptive alerts.
• Such unintended consequences (Ash, Berg, & Coiera, 2004) or unpredictable outcomes (Aarts, Doorewaard, & Berg, 2004) of healthcare information systems
– may be attributed, in part, to a flawed implementation process
– but there were clearly also human–technology interaction issues: the technology was not well matched to the users and the context of care.
– In the pediatric case, a system developed for medical–surgical units was implemented in a critical care unit.
(Walsh & Beatty, 2002) & (Vicente, 2004).
(1) work domain ( functions of the system and identifies the information that
users need to accomplish their task goals.)
(2) control tasks (investigates the control structures through which the user
interacts with or controls the system)
(3) strategies (how work is actually done by users to facilitate the design of
appropriate human–computer dialogues.)
(4) social–organizational ( identifies the responsibilities of various users (e.g.,
doctors, nurses, clerks, or therapists) so that the system can support
collaboration, communication, and a viable organizational structure.)
(5) worker competencies ( identifies design constraints related to the users
themselves)
Analysts typically borrow tools (e.g., ethnography) from the social
sciences for the two remaining types. Hajdukiewicz, Vicente, Doyle,
Milgram, and Burns (2001) used CWA to model an operating room
environment. Effken (2002) and Effken et al. (2001) used CWA to analyze
the information needs for an oxygenation management display for an ICU.
Other examples of the application of CWA in health care are described by
Burns and Hajdukiewicz (2004) in their chapter on medical systems (pp.
201–238). Ashoori et al. (2014) used team CWA to reveal the interactions of the healthcare team in the context of work models in a birthing unit. They argued that team CWA enhances CWA in complex environments, such as health care, that require effective teamwork because it reveals additional constraints relevant to the workings of the team; the information gleaned about the teamwork could then be used in systems design.
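As a purely illustrative aid (not part of CWA itself), the sketch below shows one way an analyst might organize findings under the five CWA phases listed above; the guiding questions, data structure, and example observation are invented.

```python
# A minimal, hypothetical sketch of organizing findings from the five CWA phases
# listed above. The phase names follow Vicente (2004); the guiding questions and
# the recording structure are illustrative only.
from dataclasses import dataclass, field

@dataclass
class CWAPhase:
    name: str
    guiding_question: str
    findings: list = field(default_factory=list)

phases = [
    CWAPhase("work domain",
             "What system functions and information do users need to meet task goals?"),
    CWAPhase("control tasks",
             "Through what control structures does the user interact with the system?"),
    CWAPhase("strategies",
             "How is the work actually done by users?"),
    CWAPhase("social-organizational",
             "What are the responsibilities of each type of user?"),
    CWAPhase("worker competencies",
             "What design constraints relate to the users themselves?"),
]

# Example: record an observation from a (hypothetical) birthing-unit walkthrough.
phases[3].findings.append("Nurses relay fetal-monitor trends verbally to the obstetrician.")

for phase in phases:
    print(f"{phase.name}: {len(phase.findings)} finding(s) recorded")
```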
Axiom 2: The Design Process Should Be Iterative, Allowing for Evaluation and Correction of Identified Problems
Chernecky, Macklin, and Waller (2006) assessed cancer patients’ preferences for website design. For a number of design characteristics, such as display color, menu buttons, text, photo size, icon metaphor, and layout, participants were shown two or three options on a computer screen and selected the one they preferred.
Focus Groups
Typically used at the very start of the design process, focus groups can help the
designer better understand
users’ responses to potential interface designs and to content that might be included in
the interface.
Cognitive Walkthrough
In a cognitive walkthrough, evaluators step through the tasks a user would perform with the interface, asking at each step whether a typical user would know what action to take and would recognize that the action moves them toward their goal.
Heuristic Evaluation
Heuristic evaluation has become the most popular of what are called “discount usability evaluation” methods. The objective of a heuristic evaluation is to detect problems early in the design process, when they can be most easily and economically corrected. The methods are termed “discount” because they typically are easy to do and involve fewer than 10 experts (often experts in relevant fields such as human–computer technology or cognitive engineering). They are called “heuristic” because evaluators assess the degree to which the design complies with recognized usability rules of thumb or principles (the heuristics), such as those proposed by Nielsen (1994).
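As a rough illustration of how heuristic-evaluation findings might be recorded, the sketch below rates hypothetical interface problems against Nielsen’s (1994) ten heuristics using the severity scale commonly paired with this method; the issues and scores themselves are invented.

```python
# Rough sketch of recording heuristic-evaluation findings against Nielsen's (1994)
# ten usability heuristics. The example issues and severity ratings are invented.
NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose, and recover from errors",
    "Help and documentation",
]

# Severity scale commonly used with heuristic evaluation:
# 0 = not a problem ... 4 = usability catastrophe.
findings = [
    {"heuristic": "Error prevention",
     "issue": "Medication dose field accepts values with no upper bound",  # hypothetical
     "severity": 4},
    {"heuristic": "Recognition rather than recall",
     "issue": "Order entry spans many screens; users must remember earlier entries",
     "severity": 3},
]

def summarize(findings):
    """Print the most severe problems first so designers can triage fixes."""
    for f in sorted(findings, key=lambda f: f["severity"], reverse=True):
        print(f"[severity {f['severity']}] {f['heuristic']}: {f['issue']}")

summarize(findings)
```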
Field Study
In a field study, end users evaluate a prototype in the actual work setting just
before its general release. For example, Thompson, Lozano, and Christakis (2007)
evaluated the use of touch-screen computer kiosks containing child health–
promoting information in several low-income, urban community settings through an
online questionnaire that could be completed after the kiosk was used. Most users
found the kiosk easy to use and the information it provided easy to understand.
Researchers also gained a better understanding of the characteristics of the likely
users (e.g., 26% had never used the Internet and 48% had less than a high school
education) and the information most often accessed (television and media use,
and smoke exposure).
A Framework for Evaluation
Ammenwerth, Iller, and Mahler (2006) proposed the fit between individuals, tasks, and technology (FITT) model, which suggests that each of these factors be considered in designing and evaluating human–technology interfaces.
It is not enough to consider only the user and technology characteristics; the tasks that the technology supports must be considered as well.
The FITT model builds on DeLone and McLean’s (1992) information systems success model, Davis’s (1993) technology acceptance model, and Goodhue and Thompson’s (1995) task–technology fit model.
A notable strength of the FITT model
• It encourages the evaluator to examine the fit between the various pairs of components: user and technology, task and technology, and user and task (a brief scoring sketch follows this list).
• Johnson and Turley (2006) compared how doctors and nurses
describe patient information and found that doctors emphasized
diagnosis, treatment, and management, whereas the nurses
emphasized functional issues.
• Although both physicians and nurses share some patient information,
how they thought about patients differed.
• An EHR therefore needs to present information (even the same information) to the two groups in different ways.
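As a minimal, hypothetical sketch (not part of Ammenwerth et al.’s model), the code below structures an evaluation around the three pairwise fits named above; the 1–5 rating scale and the example scores are invented.

```python
# Hypothetical sketch of structuring an evaluation around the FITT model's three
# pairwise fits. The 1-5 rating scale and the example scores are invented for
# illustration; they are not part of Ammenwerth et al.'s (2006) model.
FIT_PAIRS = ("user-technology", "task-technology", "user-task")

def assess_fit(ratings):
    """Flag any pairwise fit rated below 3 on a 1 (poor) to 5 (excellent) scale."""
    return {pair: score for pair, score in ratings.items() if score < 3}

# Example: a CPOE system evaluated for ICU nurses (scores are made up).
ratings = {"user-technology": 4, "task-technology": 2, "user-task": 3}
for pair, score in assess_fit(ratings).items():
    print(f"Poor fit on {pair} (score {score}); redesign or workflow change needed.")
```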
Technology Acceptance Model and Task–Technology Fit Model
Davis’s (1993) technology acceptance model proposes that perceived usefulness and perceived ease of use largely determine whether users accept a technology; Goodhue and Thompson’s (1995) task–technology fit model proposes that a technology improves performance only when its capabilities match the demands of users’ tasks.
Increased attention to improving the human–technology interface through human factors approaches has already led to significant improvements in one area of health care: anesthesiology.
Anesthesia machines that once had hoses that would fit into any delivery port now have
hoses that can only be plugged into the proper port. Anesthesiologists have also been actively working with engineers to improve the computer interface through which they monitor their patients’ status, and they are among the leaders in investigating the use of audio techniques as an alternative way to maintain situational awareness.
No matter who is using the technology, the human–technology interface must bring together the user’s abilities and the technology’s functionality to meet the demands of the task.
THANK YOU!