Evaluation in HCI
 Nowadays users expect much more than just a
usable system; they also look for a pleasing
and engaging experience. This makes it even
more important to carry out an evaluation.
 Imagine you have designed an app for teenagers to
share music, gossip, and photos. You have
prototyped your first design and implemented the
core functionality. How would you find out
whether it would appeal to them and whether they
would use it? You need to evaluate it – but how?
 In our human-centred approach to design, we
evaluate designs right from the first sketches, and
as the project progresses we keep improving our
prototypes and evaluating them.
 There are basically two types of evaluation:
◦ Formative Evaluation
◦ Summative Evaluation
 Formative evaluations involve evaluating a
product or service during development. The goal
is to make improvements in the design prior to
release. This means identifying or diagnosing the
problems, making and implementing
recommendations, and then evaluating again.
Formative usability is always done before the
design has been finalized. In fact, the earlier the
formative evaluation, the more impact the
usability evaluations will have on the design.
 Here are a few key questions you will be able
to answer with a formative approach:
• What are the most significant usability issues
preventing users from accomplishing their goals or
resulting in inefficiencies?
• What aspects of the product work well for the
users? What do users find frustrating?
• What are the most common errors or mistakes
users are making?
• Are improvements being made from one design
iteration to the next?
• What usability issues can you expect to remain
after the product is launched?
 Summative usability is about evaluating the dish
after it comes out of the oven. The usability
specialist running a summative test is like a food
critic who evaluates a few sample dishes at a
restaurant or perhaps compares the same meal in
multiple restaurants. The goal of summative
usability is to evaluate how well a product or piece
of functionality meets its objectives. Summative
testing can also be about comparing several
products to each other.
 Summative usability evaluations answer these
questions:
• What is the overall usability of our product?
• How does our product compare against the
competition?
• Have we made improvements from one product
release to the next?
 Two main types
◦ Expert Evaluation
 Heuristic Evaluation
 Cognitive Walkthroughs
◦ User Testing
 A simple, relatively quick, and effective method of
evaluation is to get an interaction design (or
usability) expert to look at the system and try
using it
 This is no substitute for getting real people to
use your design
 Expert evaluation is effective, particularly early in
the design process
Most usability engineering methods will
contribute substantially to the usability of an
interface …
…if they are actually used.
 What is it?
A discount usability engineering method
- Easy (can be taught in ½ day seminar)
- Fast (about a day for most evaluations)
- Cheap
 How does it work?
◦ Evaluators use a checklist of basic usability
heuristics
◦ Evaluators go through an interface twice
 1st pass: get a feel for the flow and general scope
 2nd pass: refer to the checklist of usability
heuristics and focus on individual elements
◦ The findings of evaluators are combined and
assessed
Nielsen: 10 Usability Heuristics
(based on extensive empirical testing)
• Visibility of system status
• Match between system and the real world
• User control and freedom
• Consistency and standards
• Error prevention
• Recognition not recall
• Flexibility and efficiency
• Aesthetic and minimalist design
• Help users diagnose and recover from errors
• Help and documentation
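A concrete way to run the second pass is to treat the ten heuristics as a fixed checklist and tag each finding with the heuristic it violates. A minimal sketch (the `flag` helper and the `findings` list are hypothetical, not part of any standard tool):

```python
# Nielsen's ten heuristics as the second-pass checklist
HEURISTICS = (
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition not recall",
    "Flexibility and efficiency",
    "Aesthetic and minimalist design",
    "Help users diagnose and recover from errors",
    "Help and documentation",
)

findings = []  # (element, heuristic, note) tuples collected during the pass

def flag(element: str, heuristic: str, note: str) -> None:
    """Record a violation, insisting on the agreed checklist wording."""
    assert heuristic in HEURISTICS, "unknown heuristic"
    findings.append((element, heuristic, note))

# e.g. the shopping-cart finding discussed later in the deck
flag("Check Out button", "Consistency and standards",
     "does not look like the other buttons")
print(f"{len(findings)} finding(s) checked against "
      f"{len(HEURISTICS)} heuristics")
```

Keeping the wording fixed makes it easy to merge findings from 3–5 evaluators in the debriefing session.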
 One expert won’t do
 Need 3–5 evaluators
 The exact number needed depends on a
cost-benefit analysis
 Debriefing session
◦ Evaluators rate the severity of all problems
identified
◦ Use an absolute 0–4 scale:
 0 – I don’t agree that this is a problem at all
 1 – Cosmetic problem only
 2 – Minor problem, low priority
 3 – Major problem, high priority
 4 – Usability catastrophe, imperative to fix
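In the debrief, one common convention is to take the median of the evaluators' independent severity ratings and fix the worst problems first. A minimal sketch with illustrative ratings (the sample findings echo the shopping-cart case study later in the deck):

```python
from statistics import median

SEVERITY_LABELS = {0: "not a problem", 1: "cosmetic", 2: "minor",
                   3: "major", 4: "catastrophe"}

# problem -> one 0-4 rating per evaluator (illustrative data only)
ratings = {
    "red used for both help and error messages": [2, 3, 2],
    "no 'Continue shopping' button": [3, 4, 3],
    "Recalculate sits right next to Clear Cart": [4, 4, 3],
}

# Highest median severity first = the fix-first list for the design team
for problem, scores in sorted(ratings.items(),
                              key=lambda kv: median(kv[1]), reverse=True):
    sev = median(scores)
    print(f"severity {sev} ({SEVERITY_LABELS[sev]}): {problem}")
```

The median is a reasonable choice here because it ignores a single outlier rating from one evaluator.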
 How does H.E. differ from User Testing?
◦ Evaluators have checklists
◦ Evaluators are not the target users
◦ Evaluators decide on their own how they want to
proceed
◦ Observer can answer evaluators’ questions about
the domain or give hints for using the interface
◦ Evaluators say what they didn’t like and why;
observer doesn’t interpret evaluators’ actions
 What are the shortcomings of H.E.?
◦ Identifies usability problems without indicating
how they are to be fixed.
 “Ideas for appropriate redesigns have to appear
magically in the heads of designers on the
basis of their sheer creative powers.”
 Red is used both for help messages and for error
messages (consistency, match real world)
 “There is a problem with your order”, but no
explanation or suggestions for resolution (error
reporting)
 No “Continue shopping” button (user control &
freedom)
 Recalculate is very close to Clear Cart (error
prevention)
 “Check Out” button doesn’t look like other
buttons (consistency, both internal & external)
 Must recall and type in cart title to load
(recognition not recall, error prevention,
efficiency)
 Expert Evaluation
◦ Heuristic Evaluation
◦ Cognitive Walkthroughs
 User Testing
 The cognitive walkthrough was originally
designed as a tool to evaluate walk-up-and-use
systems like postal kiosks, automated teller
machines (ATMs), and interactive exhibits in
museums where users would have little or no
training. However, the cognitive walkthrough has
been employed successfully with more complex
systems like CAD software and software
development tools to understand the first
experience of new users.
 The cognitive walkthrough is a usability
evaluation method in which one or more
evaluators work through a series of tasks and ask a
set of questions from the perspective of the user.
 Inputs:
◦ Prototype
◦ Task
◦ Sequence of actions to do the task in the
prototype
◦ User analysis
 For each action, the evaluator asks the following
questions:
 What is the user's goal, and why?
 Is the action obviously available?
 Does the action or its label match the goal?
 Is there good feedback?
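The walkthrough can be organized as a simple record sheet, one row per action and one column per question. A minimal sketch (the `walkthrough_sheet` helper is hypothetical; the action sequence echoes the cart-loading finding from the case study):

```python
# The four questions asked of every action in the task sequence
QUESTIONS = (
    "What is the user's goal at this step, and why?",
    "Is the required action obviously available?",
    "Does the action or its label match the goal?",
    "Is there good feedback once the action is taken?",
)

def walkthrough_sheet(actions):
    """Empty record sheet: evaluators fill in an answer (and any
    failure story) for every (action, question) pair."""
    return {action: {q: None for q in QUESTIONS} for action in actions}

# Hypothetical action sequence for loading a saved shopping cart
sheet = walkthrough_sheet(["open the cart menu",
                           "type the cart title",
                           "press Load"])
print(f"{len(sheet)} actions x {len(QUESTIONS)} questions to answer")
```

Any question answered "no" becomes a failure story: a predicted point where a first-time user would get stuck.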
 Expert evaluation is useful, but it is no substitute
for evaluating with the actual users.
 There are many different types of user
evaluation, for example:
◦ Formative Evaluation
◦ Field Study
◦ Controlled Experiments
 Usability testing involves collecting data using a
combination of methods in a controlled setting,
for example, experiments that follow basic
experimental design, observation, interviews, and
questionnaires. Often, usability testing is
conducted in labs, although increasingly
interviews and other forms of data collection are
being done remotely via phone and digital
communication (for instance, through Skype or
Zoom) or in natural settings.
 The primary goal of usability testing is to
determine whether an interface is usable by the
intended user population to carry out the tasks for
which it was designed. This involves investigating
how typical users perform on typical tasks. By
typical, we mean the users for whom the system
is designed (for example, teenagers, adults, and so
on) and the activities that it is designed for
them to be able to do (such as, purchasing the
latest fashions).
 As users perform the tasks, they may be recorded
on video. Their interactions with the software may
also be recorded, usually by logging software. User
satisfaction questionnaires and interviews can
also be used to elicit users’ opinions about how
they liked the experience of using the system.
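The "logging software" mentioned above can be as simple as a timestamped event stream from which task time and error counts are computed afterwards. A hypothetical sketch (not a real tool's API):

```python
import time

class InteractionLog:
    """Hypothetical usability-test logger: timestamped events per task."""

    def __init__(self):
        self.events = []  # list of {"t": ..., "event": ..., "task": ...}

    def record(self, event: str, task: str, **details):
        self.events.append({"t": time.time(), "event": event,
                            "task": task, **details})

    def task_time(self, task: str) -> float:
        """Seconds between the first and last event logged for a task."""
        stamps = [e["t"] for e in self.events if e["task"] == task]
        return max(stamps) - min(stamps)

    def error_count(self, task: str) -> int:
        return sum(1 for e in self.events
                   if e["task"] == task and e["event"] == "error")

log = InteractionLog()
log.record("task_start", task="checkout")
log.record("error", task="checkout", detail="hit Clear Cart by mistake")
log.record("task_end", task="checkout")
print(f"errors: {log.error_count('checkout')}, "
      f"time: {log.task_time('checkout'):.3f}s")
```

Combined with video and post-test questionnaires, this kind of log gives the objective half of the picture: what users actually did, not just what they say.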