
Heuristic evaluation

A heuristic evaluation is a usability inspection method for computer software that helps to identify usability problems in the user interface (UI) design. It specifically involves evaluators examining the interface and judging its compliance with recognized usability principles (the "heuristics"). These evaluation methods are now widely taught and practiced in the new media sector, where UIs are often designed in a short space of time on a budget that leaves little money for other types of interface testing.

Contents
Introduction
Nielsen
Gerhardt-Powals' cognitive engineering principles
Weinschenk and Barker classification
See also
References
Further reading
External links

Introduction
The main goal of heuristic evaluations is to identify any problems associated with the design of user
interfaces. Usability consultants Rolf Molich and Jakob Nielsen developed this method on the basis of
several years of experience in teaching and consulting about usability engineering. Heuristic evaluations are
one of the most informal methods[1] of usability inspection in the field of human–computer interaction.
There are many sets of usability design heuristics; they are not mutually exclusive and cover many of the same aspects of user interface design. Usability problems that are discovered are typically categorized, often on a numeric scale, according to their estimated impact on user performance or acceptance. The evaluation is frequently conducted in the context of use cases (typical user tasks), to give the developers feedback on the extent to which the interface is likely to be compatible with the intended users' needs and preferences.
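
Because discovered problems are recorded per heuristic and rated for estimated impact, evaluators often keep their findings in a simple structured list. The Python sketch below is purely illustrative: the record format, field names and the 0–4 severity scale are assumptions made for the example rather than anything prescribed by the method, although rating on a small numeric scale of this kind is common practice.

from dataclasses import dataclass

# Hypothetical record for a single heuristic-evaluation finding.
@dataclass
class Finding:
    heuristic: str    # which heuristic is judged to be violated
    location: str     # where in the UI the problem was observed
    description: str  # what the evaluator saw
    severity: int     # assumed scale: 0 = not a problem ... 4 = usability catastrophe

findings = [
    Finding("Visibility of system status", "File upload dialog",
            "No progress indicator while a large file uploads.", 3),
    Finding("Help users recognize, diagnose, and recover from errors", "Login form",
            "A failed login reports only 'Error 1017'.", 2),
]

# Findings are typically reviewed in order of estimated impact.
for f in sorted(findings, key=lambda f: f.severity, reverse=True):
    print(f"[{f.severity}] {f.heuristic}: {f.description} ({f.location})")

Sorting on the numeric rating is what turns the raw observations into the prioritized feedback described above.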

The simplicity of heuristic evaluation is beneficial at the early stages of design. This usability inspection method does not require user testing, which can be burdensome due to the need for users, a place to test them and payment for their time. A heuristic evaluation requires only one expert, reducing the complexity and time needed for evaluation. Most heuristic evaluations can be accomplished in a matter of days. The time required varies with the size of the artefact, its complexity, the purpose of the review, the nature of the usability issues that arise in the review, and the competence of the reviewers. Using heuristic evaluation prior to user testing will reduce the number and severity of design errors discovered by users. Although heuristic evaluation can uncover many major usability issues in a short period of time, a criticism that is often leveled is that results are highly influenced by the knowledge of the expert reviewer(s). This "one-sided" review consistently yields different results from software performance testing, with each type of testing uncovering a different set of problems.

Nielsen
Jakob Nielsen's heuristics are probably the most-used usability heuristics for user interface design. Nielsen developed the heuristics based on work with Rolf Molich in 1990.[1][2] The final set of heuristics that are still used today was released by Nielsen in 1994.[3] The heuristics as published in Nielsen's book Usability Engineering are as follows:[4]

1. Visibility of system status:
The system should always keep users informed about what is going on, through appropriate feedback within reasonable time.
2. Match between system and the real world:
The system should speak the user's language, with words, phrases and concepts familiar to
the user, rather than system-oriented terms. Follow real-world conventions, making information
appear in a natural and logical order.
3. User control and freedom:
Users often choose system functions by mistake and will need a clearly marked "emergency
exit" to leave the unwanted state without having to go through an extended dialogue. Support
undo and redo.
4. Consistency and standards:
Users should not have to wonder whether different words, situations, or actions mean the
same thing. Follow platform conventions.
5. Error prevention:
Even better than good error messages is a careful design which prevents a problem from
occurring in the first place. Either eliminate error-prone conditions or check for them and
present users with a confirmation option before they commit to the action.
6. Recognition rather than recall:
Minimize the user's memory load by making objects, actions, and options visible. The user
should not have to remember information from one part of the dialogue to another. Instructions
for use of the system should be visible or easily retrievable whenever appropriate.
7. Flexibility and efficiency of use:
Accelerators—unseen by the novice user—may often speed up the interaction for the expert
user such that the system can cater to both inexperienced and experienced users. Allow users
to tailor frequent actions.
8. Aesthetic and minimalist design:
Dialogues should not contain information which is irrelevant or rarely needed. Every extra unit
of information in a dialogue competes with the relevant units of information and diminishes
their relative visibility.
9. Help users recognize, diagnose, and recover from errors:
Error messages should be expressed in plain language (no codes), precisely indicate the
problem, and constructively suggest a solution.
10. Help and documentation:
Even though it is better if the system can be used without documentation, it may be necessary
to provide help and documentation. Any such information should be easy to search, focused
on the user's task, list concrete steps to be carried out, and not be too large.
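
As a concrete illustration of how two of these heuristics translate into code, the hypothetical Python snippet below applies heuristic 5 (confirm before an irreversible action) and heuristic 9 (plain-language error messages). The function, its messages and the "Error 1017" contrast are invented for the example and are not part of Nielsen's published material.

def delete_records(count: int, confirm: bool = False) -> str:
    # Heuristic 5, error prevention: ask for confirmation before an
    # irreversible action rather than relying on recovery afterwards.
    if not confirm:
        return (f"You are about to delete {count} records. "
                "Call again with confirm=True to proceed, or cancel now.")
    # Heuristic 9, help users recognize, diagnose, and recover from errors:
    # state the problem in plain language and suggest a remedy, instead of
    # returning an opaque code such as 'Error 1017'.
    if count <= 0:
        return ("Nothing was deleted: the record count must be a positive "
                "number. Check the selection and try again.")
    return f"Deleted {count} records."

print(delete_records(12))                # asks for confirmation first
print(delete_records(12, confirm=True))  # performs the action
print(delete_records(0, confirm=True))   # plain-language error message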
Gerhardt-Powals' cognitive engineering principles
Although Nielsen is considered the leading expert in heuristic evaluation, Jill Gerhardt-Powals also developed a set of cognitive engineering principles for enhancing human-computer performance.[5] These heuristics, or principles, are similar to Nielsen's heuristics but take a more holistic approach to evaluation. Gerhardt-Powals' principles[6] are listed below.

1. Automate unwanted workload:
Eliminate mental calculations, estimations, comparisons, and any unnecessary thinking, to free cognitive resources for high-level tasks.
2. Reduce uncertainty:
Display data in a manner that is clear and obvious to reduce decision time and error.
3. Fuse data:
Bring together lower level data into a higher level summation to reduce cognitive load.
4. Present new information with meaningful aids to interpretation:
New information should be presented within familiar frameworks (e.g., schemas, metaphors,
everyday terms) so that information is easier to absorb.
5. Use names that are conceptually related to function:
Display names and labels should be context-dependent, which will improve recall and
recognition.
6. Group data in consistently meaningful ways:
Within a screen, data should be logically grouped; across screens, it should be consistently
grouped. This will decrease information search time.
7. Limit data-driven tasks:
Use color and graphics, for example, to reduce the time spent assimilating raw data.
8. Include in the displays only that information needed by the user at a given time:
Exclude extraneous information that is not relevant to current tasks so that the user can focus
attention on critical data.
9. Provide multiple coding of data when appropriate:
The system should provide data in varying formats and/or levels of detail in order to promote
cognitive flexibility and satisfy user preferences.
10. Practice judicious redundancy:
Principle 10 was devised by the first two authors to resolve the possible conflict between
Principles 6 and 8, that is, in order to be consistent, it is sometimes necessary to include more
information than may be needed at a given time.
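
To make principles 3 and 8 more tangible, the hypothetical Python sketch below fuses several low-level readings into one higher-level status line and shows only that summary by default. The thresholds, labels and variable names are assumptions made for the example, not values taken from Gerhardt-Powals' paper.

readings = {"cpu_temp_c": 71.4, "fan_rpm": 2200, "disk_free_gb": 18.0}

def fuse_status(r: dict) -> str:
    # Principle 3, fuse data: collapse several raw values into a single
    # judgement the user can act on without doing the comparisons mentally.
    problems = []
    if r["cpu_temp_c"] > 85:
        problems.append("CPU running hot")
    if r["disk_free_gb"] < 10:
        problems.append("disk space low")
    if problems:
        return "Attention needed: " + ", ".join(problems)
    return "System healthy"

# Principle 8, show only what is needed now: display the fused summary by
# default and keep the raw numbers available on demand.
print(fuse_status(readings))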

Weinschenk and Barker classification


Susan Weinschenk and Dean Barker[7] classified the heuristics and guidelines used by several major providers into the following twenty types:[8]

1. User Control:
The interface will allow the user to perceive that they are in control and will allow appropriate
control.
2. Human Limitations:
The interface will not overload the user’s cognitive, visual, auditory, tactile, or motor limits.
3. Modal Integrity:
The interface will fit individual tasks within whatever modality is being used: auditory, visual, or
motor/kinesthetic.
4. Accommodation:
The interface will fit the way each user group works and thinks.
5. Linguistic Clarity:
The interface will communicate as efficiently as possible.
6. Aesthetic Integrity:
The interface will have an attractive and appropriate design.
7. Simplicity:
The interface will present elements simply.
8. Predictability:
The interface will behave in a manner such that users can accurately predict what will happen
next.
9. Interpretation:
The interface will make reasonable guesses about what the user is trying to do.
10. Accuracy:
The interface will be free from errors.
11. Technical Clarity:
The interface will have the highest possible fidelity.
12. Flexibility:
The interface will allow the user to adjust the design for custom use.
13. Fulfillment:
The interface will provide a satisfying user experience.
14. Cultural Propriety:
The interface will match the user’s social customs and expectations.
15. Suitable Tempo:
The interface will operate at a tempo suitable to the user.
16. Consistency:
The interface will be consistent.
17. User Support:
The interface will provide additional assistance as needed or requested.
18. Precision:
The interface will allow the users to perform a task exactly.
19. Forgiveness:
The interface will make actions recoverable.
20. Responsiveness:
The interface will inform users about the results of their actions and the interface’s status.

See also
Usability inspection
Progressive disclosure
Cognitive bias
Cognitive dimensions, a framework for evaluating the design of notations, user interfaces and
programming languages

References
1. Nielsen, J., and Molich, R. (1990). Heuristic evaluation of user interfaces. Proc. ACM CHI'90 Conf. (Seattle, WA, 1–5 April), 249–256.
2. Molich, R., and Nielsen, J. (1990). Improving a human–computer dialogue. Communications of the ACM 33, 3 (March), 338–348.
3. Nielsen, J. (1994). Heuristic evaluation. In Nielsen, J., and Mack, R.L. (Eds.), Usability Inspection Methods. John Wiley & Sons, New York, NY.
4. Nielsen, Jakob (1994). Usability Engineering. San Diego: Academic Press. pp. 115–148. ISBN 0-12-518406-9.
5. Gerhardt-Powals, Jill (1996). "Cognitive engineering principles for enhancing human-computer performance". International Journal of Human-Computer Interaction. 8 (2): 189–211. doi:10.1080/10447319609526147 (https://doi.org/10.1080%2F10447319609526147).
6. Heuristic Evaluation – Usability Methods – What is a heuristic evaluation? (https://usability.gov/methods/test_refine/heuristic.html#WhatisaHeuristicEvaluation) Usability.gov.
7. Weinschenk, S., and Barker, D. (2000). Designing Effective Speech Interfaces. Wiley.
8. Jeff Sauro. "What's the difference between a Heuristic Evaluation and a Cognitive Walkthrough?" (http://www.measuringusability.com/blog/he-cw.php). MeasuringUsability.com.

Further reading
Dix, A., Finlay, J., Abowd, G. D., & Beale, R. (2004). Human-Computer Interaction (3rd ed.). Harlow, England: Pearson Education Limited. p. 324.
Gerhardt-Powals, Jill (1996). "Cognitive Engineering Principles for Enhancing Human-Computer Performance". International Journal of Human-Computer Interaction, 8(2), 189–211.
Hvannberg, E., Law, E., & Lárusdóttir, M. (2007). "Heuristic Evaluation: Comparing Ways of Finding and Reporting Usability Problems". Interacting with Computers, 19(2), 225–240.
Nielsen, J. and Mack, R.L. (eds) (1994). Usability Inspection Methods. John Wiley & Sons Inc.

External links
Jakob Nielsen's introduction to Heuristic Evaluation (http://www.useit.com/papers/heuristic/) –
Including fundamental points, methodologies and benefits.
Alternate First Principles (Tognazzini) (http://www.asktog.com/basics/firstPrinciples.html) –
Including Jakob Nielsen's ten rules of thumb
Heuristic Evaluation at Usability.gov (https://www.usability.gov/methods/test_refine/heuristic.ht
ml)
Heuristic Evaluation in the RKBExplorer (https://web.archive.org/web/20080828060019/http://
www.rkbexplorer.com/explorer/#display=mechanism%2D{http://resex.rkbexplorer.com/id/resilie
nce-mechanism-4331d919})
Remote (online) Heuristic Evaluation Tool (http://www.usabilitest.com/features/Heuristic_Evalu
ation) at usabiliTEST.com.
