Transcript
Welcome to Usability and Human Factors, Electronic Health Records and Usability.
This is Lecture a.
In this unit we will apply principles of usability and design to critiquing EHR systems and
to making recommendations for iterative improvement.
Slide 2
Slide 3
In 2010, three major organizations, the Agency for Healthcare Research and
Quality (AHRQ), the National Research Council (NRC), and the Healthcare Information and
Management Systems Society (HIMSS), published reports on usability. NRC was the
first: after a two-year study, in which experts travelled around the country looking at
some of the institutions with the best healthcare IT, it came to this conclusion:
While computer science has adequately met the needs of back-end systems,
what is needed is better front-end development that provides cognitive support to
clinicians. Usability is a critical part of the user experience.
The other reports focused on usability also, pointing out its influence in errors (which in
medicine can be fatal), user satisfaction, and productivity.
Slide 4
Slide 5
Users form their impression of software from their experience above all; poor
experiences can lead to profound dissatisfaction (including refusal to use the system),
abuse, dangerous workarounds, and other serious consequences. For example, we
have seen the results of poor usability affect the outcome of elections.
Slide 6
Slide 7
Current research by the National Center for Cognitive Informatics and Decision Making
in Healthcare (the NCCD) has found that a great user interface follows established
human interface design principles that are based on the way users (doctors, nurses,
patients etc.) think and work. There are 14 general design principles that can be applied
to the development of EHRs, and they are an expansion and elaboration of Nielsen’s 10
principles that are discussed in other units in this component. We will use these
guidelines in the remainder of this unit.
Slide 8
Let’s look at some reasons why EHR usability is not yet optimal. Vendor contracts may
forbid customers (even customers of the same EHR) to discuss their experiences.
Publication of screenshots and other information may be forbidden by copyright; this
hinders research.
Slide 9
The AHRQ report found that many legacy systems currently in use are more than 10
years old, and implementation plans can take decades. Best practices have not been
defined yet, though AHRQ and other associations are working on this. Expectations are
unclear, communication limited, and many vendors do not do formal usability testing, or
only do it to a limited extent. Because of the lack of formal standards and training,
usability may be perceived as overly subjective and therefore difficult to measure. As
we will show later, this is not the case.
Slide 10
However, the increased interest and focus on this problem means that there is
increasing involvement of users in design. The AHRQ report on vendor practices found
that vendors attempt to compete on usability, users demand better products, and plans
for formal usability testing are increasing. Vendors also say they are amenable to
changing design if given guidelines.
Slide 11
Some users and researchers are discouraged by the extremely poor usability of some
systems, which has led to errors (including fatal ones). Political and power struggles
can ensue in implementations, as the introduction of technology can change power
relationships, as well as radically alter workflow and work practices.
Slide 12
The AHRQ report on vendor practices includes the following quotes.
“The field is competitive so there is little sharing of best practices to the community. The
industry should not look towards vendors to create these best practices. Other entities
must step up and define [them] and let the industry adapt.”
“Products are picked on the amount of things they do, not how well they do them.”
“There are no standards most of the time, and when there are standards, there is no
enforcement of them. The software industry has plenty of guidelines and good best
practices, but in HIT, there are none.”
Slide 13
A study published in 2015 by Ratwani and colleagues found that there was a lack of
adherence to ONC certification requirements. For example, only 22% of the vendor
reports had used at least 15 participants with clinical backgrounds for usability tests.
Ratwani and colleagues stated, “The lack of adherence to usability testing may be a
major factor contributing to the poor usability experienced by clinicians. Enforcement of
existing standards, specific usability guidelines, and greater scrutiny of vendor UCD
processes may be necessary to achieve the functional and safety goals for the next
generation of EHRs.”
Slide 14
Slide 15
Take a look at this mock screenshot. What do you see that is suboptimal or could lead
to error?
Slide 16
Slide 17
The results section says that the result is negative, and the result is final. Most clinicians
who are busy would likely stop reading here.
Slide 18
Then, there is an addendum saying that the culture is actually positive for MRSA, a
dangerous infection that often spreads in hospitals.
Slide 19
This sort of bad design has several consequences. It forces clinicians to search for
indications of normalcy or danger. It is inconsistent with the lab system, which
normally flags abnormal results, and this can lead to miscommunication between
personnel. This is a real case in which the patient was not treated for a dangerous
infection for 24 hours, and the system is CCHIT certified despite the bad design. The
example also shows one of the changes from paper to computer: in a paper system the
erroneous first result could have been crossed out, preventing the mistake.
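The failure mode can be made concrete. Below is a minimal sketch (in Python, with hypothetical field names not taken from any real EHR) of display logic that derives the abnormal flag from the latest addendum rather than stopping at the first "final" result:

```python
from dataclasses import dataclass, field

@dataclass
class LabReport:
    """A lab report that may carry addenda (structure is hypothetical)."""
    result: str                                   # e.g. "Culture negative"
    status: str                                   # e.g. "final"
    addenda: list = field(default_factory=list)   # later corrections

    def effective_result(self) -> str:
        # The latest addendum supersedes the original result, so display
        # logic must never stop at the first "final" value it sees.
        return self.addenda[-1] if self.addenda else self.result

    def is_abnormal(self) -> bool:
        return "positive" in self.effective_result().lower()

report = LabReport(result="Culture negative", status="final")
report.addenda.append("Culture positive for MRSA")
print(report.effective_result())  # shows the addendum, not the stale negative result
```

The point of the sketch is that the flag is recomputed from the effective result; a screen that renders only the original "negative, final" fields reproduces the error described above.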
Slide 20
This slide shows an alphabetized problem list from a real system. It does not meet the
needs of clinicians, who would want to see the problems in order of severity or
importance.
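The mismatch can be illustrated with a small sketch (Python; the problem names echo this unit's examples, and the severity scores are invented for illustration) contrasting the system's alphabetical ordering with the severity ordering clinicians would want:

```python
# Hypothetical problem list; severity scores are invented for illustration.
problems = [
    ("Atrial fibrillation", 8),
    ("Diabetes mellitus type 2", 6),
    ("Medication use, long term", 1),
    ("Sepsis", 10),
]

# What the system shows: alphabetical order.
alphabetical = sorted(name for name, _ in problems)

# What clinicians want: most severe problems first.
by_severity = [name for name, score in sorted(problems, key=lambda p: -p[1])]

print(alphabetical[0])   # "Atrial fibrillation"
print(by_severity[0])    # "Sepsis"
```

Alphabetical order buries the most urgent problem in the middle of the list, while a severity sort surfaces it first, which is the design principle the critique above is making.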
Slide 21
The list is created by the system automatically, and the clinician does not have the
ability to edit or delete entries. The entries can be incorrect because many people put
information into the system, and may make selections for convenience, such as the
nurse who entered the atrial fibrillation diagnosis to speed up the order fulfillment.
Unbelievably, the wrong entry can only be removed by the vendor.
Slide 22
Hence the multiple diabetes diagnoses, only one of which is accurate. The lack of a
controlled terminology makes term management difficult. The list also includes useless
information, such as the 'medication use, long term' item.
Slide 23
Slide 24
This is a grid that the user must scroll to see some of the information. However,
when the user scrolls...
Slide 25
...the row and column headers that tell which information belongs to each column
disappear. Thus the user must keep track either mentally or (more likely) by placing
fingers on the screen; otherwise it would be easy to lose track of columns or misread
information, potentially causing errors.
Slide 26
This screen has excessive, repetitious information that is not needed and is distracting,
such as the units appearing in every cell instead of in the column headers. There is a
lack of focus and clarity; the lab panel components are scattered.
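The repetition can be factored out. A minimal sketch (Python; the analytes and values are invented for illustration) of moving units into the header so each cell carries only the number:

```python
# Invented lab panel values, for illustration only: analyte -> (value, unit).
panel = {
    "Sodium": (140, "mmol/L"),
    "Potassium": (4.1, "mmol/L"),
    "Glucose": (95, "mg/dL"),
}

# Cluttered layout: the unit is repeated in every cell.
cluttered_cells = {name: f"{value} {unit}" for name, (value, unit) in panel.items()}

# Cleaner layout: the unit is stated once in the column header,
# and each cell carries only the number.
headers = [f"{name} ({unit})" for name, (_, unit) in panel.items()]
cells = {name: value for name, (value, _) in panel.items()}
```

Stating the unit once in the header removes the visual noise without losing any information, since every value in a column shares the same unit.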
Slide 27
This concludes lecture a of Usability and Human Factors, Electronic Health Records
and Usability.
In this unit we examined the vendor practice reports by the Agency for Healthcare
Research and Quality, which provide key rules and roles for vendors. In addition, this
lecture provided examples of how wrong data can be entered into EHR systems. In the
next lecture we will continue by discussing usability concepts.
No Audio.
Slide 28 (Reference slide)
No Audio.
Slide 30 (Final slide)
No audio.
End.