08 GUI Evaluation Techniques

Human Computer Interaction

Evaluation Techniques

Human-Computer Interaction 1
What is evaluation?
 The role of evaluation:
 We still need to assess our designs and test our
systems to ensure that they actually behave as
we expect and meet user requirements.
What is evaluation? (Cont.)
 Evaluation should not be thought of as a
single phase in the design process.
 Evaluation should occur throughout the
design life cycle, with the results of the
evaluation feeding back into modifications to
the design.
 It is not usually possible to perform extensive
experimental testing continuously throughout
the design, but analytical and informal
techniques can and should be used.
Broad headings on evaluation
techniques
 We will consider evaluation techniques under
two broad headings:

 Evaluation Through Expert Analysis.

 Evaluation Through User Participation.
Goals of Evaluation
 Evaluation has three main goals:
 To assess the extent and accessibility of the
system's functionality.

 To assess the user's experience of the interaction.

 To identify specific problems with the system.
Goals of Evaluation (Cont.)
1. To assess extent and accessibility of
the system’s functionality:
 The system's functionality is important in that
it must accord with the user's requirements.

 So the design of the system should enable
users to perform their intended tasks more
easily.

 The use of the system must match the
user's expectations of the task.
Goals of Evaluation (Cont.)

2. To assess the user's experience of the
interaction:
 This includes considering aspects such as:
 How easy the system is to learn.
 Its usability.
 The user's satisfaction with it.
 The user's enjoyment and emotional response.
Goals of Evaluation (Cont.)

3. To identify specific problems with the
system:
 Unexpected results cause confusion amongst
users.

 Such problems relate to both the functionality and
usability of the design.
Objectives of User Interface Evaluation

 Key objective of both UI design and evaluation:

“Minimize malfunctions”
 Key reason for focusing on evaluation:

 Without it, the designer would be working "blindfolded".

 Designers wouldn't really know whether they are solving
customers' problems in the most productive way.
Evaluation Techniques
 Evaluation:
 Tests usability and functionality of system.
 Evaluates both design and implementation.
 Should be considered at all stages in the design life
cycle.

 But, in order for evaluation to give feedback to
designers, we must understand why a malfunction
occurs.

 Malfunction analysis:
 Determine why a malfunction occurs.
 Determine how to eliminate malfunctions.
Overview of Interface Evaluation
Methods
 Three types of methods:
 Passive evaluation.
 Active evaluation.

 Predictive evaluation (usability inspections).

 All three types of methods are useful for optimal
results:
 They are used in parallel.
 All attempt to prevent malfunctions.
1. Passive evaluation
 Performed while prototyping in a test.

 Does not actively seek malfunctions.


 Only finds them when they happen to occur.
 Infrequent malfunctions may not be found.

 Generally requires realistic use of a system.

 Users become frustrated with
malfunctions.
1. Passive evaluation. (Cont.)
Gathering Information:

a) Problem report monitoring:


 Users should have an easy way to register their
frustration / suggestions.

 Best if integrated with software.

1. Passive evaluation (Cont.)
b) Automatic software logs.
 Can gather much data about usage:
 Command frequency.
 Error frequency.
 Undone operations (a sign of malfunctions).

 Logs can be taken of:


 Just keystrokes, mouse clicks.
 Full details of interaction.
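The counts above can be sketched in code. A minimal Python sketch of log summarization; the log format and event names here are invented for illustration, not a real logging API:

```python
from collections import Counter

def summarize_log(events):
    """Summarize a UI event log: command frequency, error frequency,
    and undone operations (a possible sign of malfunctions)."""
    commands = Counter(e["command"] for e in events)
    errors = sum(1 for e in events if e.get("error"))
    undos = commands.get("undo", 0)
    return {"commands": commands, "errors": errors, "undos": undos}

# Hypothetical log entries: one dict per interaction event.
log = [
    {"command": "open"},
    {"command": "paste"},
    {"command": "paste", "error": True},   # failed paste
    {"command": "undo"},                   # user backing out of a mistake
    {"command": "save"},
]

summary = summarize_log(log)
print(summary["commands"]["paste"], summary["errors"], summary["undos"])
```

A real logger would timestamp events and record full interaction details, but even simple counts like these reveal which commands dominate and where errors cluster.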

1. Passive evaluation (Cont.)
c) Questionnaires:
 Useful to obtain statistical data from large
numbers of users.
 Proper statistical means are
needed to analyze results.
 Gathers subjective data about importance of
malfunction.
 Less frequent malfunctions may be more important.
 Users can prioritize needed improvements.
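As an illustration of "proper statistical means", Likert-style closed-ended responses might be summarized like this (a sketch; the question names and 1-5 scores are invented):

```python
from statistics import mean, stdev

# Hypothetical responses to 1 (strongly disagree) .. 5 (strongly agree)
# closed-ended questions, one score per respondent.
responses = {
    "easy_to_learn": [4, 5, 3, 4, 4, 2, 5],
    "satisfying":    [3, 3, 4, 2, 3, 3, 4],
}

for question, scores in responses.items():
    # Mean shows central tendency; stdev shows how much users agree.
    print(f"{question}: mean={mean(scores):.2f} stdev={stdev(scores):.2f}")
```

With large numbers of users, these summaries let the team rank where improvement is most needed.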

Questionnaires
 Set of fixed questions given to users.

 Limit on number of questions:


 Very hard to phrase questions well.
 Questions can be closed- or open-ended.

 Advantages:
 Quick and reaches large user group.

Open-ended questions
 Open-ended questions are those questions that
will solicit additional information from the
inquirer.

 Examples:
 How may/can I help you?
 Where have you looked already?
 What aspect are you looking for?
 What kind of information are you looking for?
 What would you like to know about [topic]?
 When you say [topic], what do you mean?
Closed-ended questions
 Closed-ended questions are those questions which can be
answered finitely, for example by "yes" or "no."
 Examples:
 a. Can I help you?
 b. May I help you?
 c. Can you give me more information?
 d. Have you searched elsewhere?
 e. Can you describe the kind of information you want?
 f. Can you give me an example?
 g. Could you be more specific?
 h. Are you looking for [topic]?
2. Active evaluation
 Actively study specific actions performed by users.
 Performed when prototyping is done.
 Gathering Information:
d) Experiments & usability engineering:
 Prove a hypothesis about measurable attributes of one or
more UIs.
 e.g. speed / learning / accuracy / frustration…
 In usability engineering, test against the goals of the system.
 Hard to control for all variables.

e) Observation sessions (Videotaped Evaluation):
 Also called 'interpretive evaluation'.
 Study active use on realistic tasks.
3. Predictive evaluation
 Studies of the system by experts rather than users.
 Performed when the UI is specified (useful even before a prototype
is developed).
 Can eliminate many malfunctions before users ever see the software.
 Also called "usability inspections".
 Gathering Information:
f) Heuristic evaluation.
 Based on a UI design principle document.
 Analyze whether each guideline is adhered to in the context of
the task and users.
 Can also look at adherence to standards.

g) Cognitive walkthroughs.
 Step-by-step analysis of:
 Steps in the task being performed.
 Goals users form to perform these tasks.
 How the system leads users through tasks.
Summary of evaluation techniques

Technique: When to use

a) Problem reporting: Always.
b) Automatic logs: In any moderately complex system,
and whenever there are large numbers of commands.
c) Questionnaires: Whenever there are large numbers of users.
d) Experiments & Usability Engineering: In special cases where it is
hard to choose between alternatives.
e) Observation sessions: Almost always, especially when the user has to
interact with a client while using the system.
f) Heuristic evaluation: Always.
g) Cognitive Walkthrough: When usability must be optimized.
Evaluating Designs
 The evaluation should occur throughout the
design process.

 These methods can be used at any stage in the
development process, from a design
specification, through storyboards and
prototypes, to full implementations, making
them flexible evaluation approaches.
1. Videotaped Evaluation
 A software engineer studies users who are actively
using the user interface:
 To observe what problems they have.
 The sessions are videotaped.
 Can be done in user’s environment.

 Activities of the user:
 Preferably the user talks to him/her-self as if alone in a room.
 This process is called 'co-operative evaluation' when the software
engineer and the user talk to each other.
1. Videotaped Evaluation (Cont.)
The importance of video:
 Using it, you can see what you want to see from the system.
 You can repeatedly analyze the recording, looking for different problems.

Tips for using video:


 Several cameras are useful.
 Software is available to help analyse video by dividing into
segments and labelling the segments.

2. Experiments
1. Pick a set of subjects (users):
 A good mix to avoid biases.
 A sufficient number to get statistical significance (so that
random happenings do not affect the results).
2. Pick variables to test:
 Variables are manipulated to produce different conditions:
 Should not have too many.
 They should not affect each other too much.
 Make sure there are no hidden variables.
3. Develop a hypothesis:
 A prediction of the outcome.
 The aim of the experiment is to show this is correct.
Variables

 Independent variable (IV):


 Characteristics changed to produce different conditions.
e.g. interface style, number of menu items…

 Dependent variable (DV):


 Characteristics measured in the experiment.
e.g. time taken, number of errors.
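A toy example of measuring a DV under two levels of an IV (a sketch; the timing data are invented, and a real experiment would add a significance test):

```python
from statistics import mean

# Independent variable: interface style (menus vs. typed commands).
# Dependent variable: task completion time in seconds.
times = {
    "menus":    [12.1, 10.4, 11.8, 13.0, 10.9],
    "commands": [15.2, 14.8, 16.1, 13.9, 15.5],
}

menu_mean = mean(times["menus"])
cmd_mean = mean(times["commands"])
# The hypothesis predicts menus are faster; the experiment checks this.
print(f"menus {menu_mean:.2f}s vs commands {cmd_mean:.2f}s, "
      f"difference {cmd_mean - menu_mean:.2f}s")
```

Each list is one experimental condition; only the IV differs between them, so any systematic difference in the DV can be attributed to the interface style.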

3. Heuristic Evaluation
 Developed by Jakob Nielsen & Rolf
Molich in the early 1990s.
 Helps find usability problems in a UI design.
 A heuristic is based on a UI guideline.
 Usability criteria (heuristics) are identified.
 The design is examined by experts to see if these
are violated.
Heuristic Evaluation (cont.)
 Evaluators go through the UI several
times:
 Inspect various dialogue elements.
 Compare them with a list of usability
principles.
 Usability principles:
 Nielsen's "heuristics".
 Competitive analysis & user testing of existing
products.
 Use violations to redesign/fix problems.
Heuristic Evaluation (cont.)
 A type of predictive evaluation:
 Use HCI experts as reviewers instead of users.
 Benefits of predictive evaluation:
 The experts know what problems to look for.
 Can be done before the system is built.
 Experts give prescriptive feedback.
 Important points about predictive evaluation:
 Reviewers should be independent of designers.
 Reviewers should have experience in both the application
domain and HCI.
 Include several experts to avoid bias.
 Experts must know the classes of users.
 Beware: novices can do some very bizarre things
that experts may not anticipate.
Heuristic Evaluation (cont.)
 Example heuristics:
 System behaviour is predictable.
 System behaviour is consistent.
 Feedback is predictable.

 Heuristics are being
developed for mobile devices, virtual
worlds, etc…
Nielsen's ten heuristics are:
1. Visibility of system status.
2. Match between system and the real world.
3. User control and freedom.
4. Consistency and standards.
5. Error prevention.
6. Recognition rather than recall.
7. Flexibility and efficiency of use.
8. Aesthetic and minimalist design.
9. Help users recognize, diagnose and
recover from errors.
10. Help and documentation.
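Findings from a heuristic evaluation are typically recorded against the heuristic violated, together with a severity rating. A minimal sketch in Python (the 0-4 severity scale follows Nielsen's convention; the specific problems listed are invented):

```python
from collections import Counter

NIELSEN_HEURISTICS = [
    "Visibility of system status",
    "Match between system and the real world",
    "User control and freedom",
    "Consistency and standards",
    "Error prevention",
    "Recognition rather than recall",
    "Flexibility and efficiency of use",
    "Aesthetic and minimalist design",
    "Help users recognize, diagnose and recover from errors",
    "Help and documentation",
]

# Each finding: (heuristic index, severity 0-4, description).
findings = [
    (0, 3, "No progress feedback while saving"),
    (3, 2, "Ok/Cancel button order differs between dialogs"),
    (0, 1, "Mode indicator hard to notice"),
]

# Count violations per heuristic to see where redesign effort should go.
violations = Counter(NIELSEN_HEURISTICS[h] for h, _, _ in findings)
worst = max(findings, key=lambda f: f[1])
print(violations.most_common(1)[0], worst[2])
```

Aggregating findings from several independent evaluators in this way helps prioritize which violations to fix first.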
H-1: Visibility of system status
 Provide feedback.
 Keep users informed about what is going on.
 Example: pay attention to response time.
 0.1 sec: no special indicators needed.
 1.0 sec: user loses the feeling of operating directly on the data.
 10 sec: maximum duration if the user is to stay focused on one
action.
 For longer delays, use percent-done progress bars.
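A percent-done indicator can be sketched as a simple text rendering (illustrative only; real GUI toolkits supply progress-bar widgets):

```python
def percent_done_bar(done, total, width=20):
    """Render a textual percent-done progress bar for long operations."""
    fraction = done / total
    filled = int(fraction * width)
    return "[" + "#" * filled + "-" * (width - filled) + f"] {fraction:.0%}"

print(percent_done_bar(30, 120))   # a quarter of the work completed
```

The bar gives the user both a sense of progress and an implicit time estimate, which is exactly the feedback long delays require.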
H-1: Visibility of system status
 Continuously inform the user about:
 What it is doing.
 How it is interpreting the user’s input.
 User should always be aware of what is
going on.

[Cartoon: the user issues 'Do it' and wonders 'What's it doing?'; the system replies 'This will take 5 minutes...', leaving time for coffee.]
H-1: Visibility of system status

[Screenshot: Microsoft Paint. The user wonders: What mode am I in now? What did I select? How is the system interpreting my actions?]
H-1: Visibility of system status
Be as specific as possible, based on user’s input.

Best within the context of the action.

H-1: Visibility of system status
Multiple files being copied,
but feedback is file by file.

Drawing Board LT

H-1: Visibility of system status
 Dealing with long delays:
 Cursors:
 For short transactions.

 Percent-done dialogs:
 Time left.
 Estimated time.

 Random, e.g. 'Contacting host (10-60 seconds)':
 For unknown times.

[Screenshot: progress dialog with a cancel button.]
H-2: Match between system and real world

 Speak the users’ language.


 Follow real world conventions.

 Dragging disk to trash.


 Should delete it, not eject it.

H-2: Match between system and real world

[Cartoon: "My program gave me the message Rstrd Info. What does it mean?" "That's restricted information." "But surely you can tell me!!!" "No, no… Rstrd Info stands for 'Restricted Information'." "Hmm… but what does it mean???" "It means the program is too busy to let you log on." "Ok, I'll take a coffee."]
H-2: Match between system and real world

Terminology based on users’ language for task.


 e.g. withdrawing money from a bank machine.

Use meaningful mnemonics, icons & abbreviations.
 e.g. File / Save:
 Ctrl + S (abbreviation).
 Alt F S (mnemonic for menu action).
 Toolbar icon with tooltip.
H-3: User control and freedom
 “exits” for mistaken choices, undo, redo.
 Don’t force down fixed paths.

 Wizards:
 Must respond to one question
before going to the next.
 Good for beginners.
 Have N versions.
 e.g. SketchUp 6

H-3: User control and freedom
[Cartoon: a trapped user asks 'How do I get out of this?']
H-3: User control and freedom
Users don’t like to feel trapped by the computer!
 Should offer an easy way out of as many situations as
possible.

Strategies:
 Cancel button (for dialogs waiting for user input).
 Universal Undo (can get back to previous state).
 Quit (for leaving the program at any time).
 Defaults (for restoring a property sheet).
H-4: Consistency & standards
Consistent syntax of input.
Consistent language and graphics:
 Same visual appearance across the system.
 Same information/controls in the same location on all windows.

[Illustration: inconsistent button orders such as 'Ok Cancel', 'Cancel Ok', 'Ok Accept Dismiss Cancel'.]

Consistent effects:
 Commands and actions have the same effect in equivalent situations.
 Predictability.
H-4: Consistency & standards
These are labels with a
raised appearance.

Is it any surprise that


people try and click on
them?

Why?

[Screenshots from the Peachpit website.]
H-5: Error prevention

 Make it difficult to make errors.

 Even better than good error message is


a careful design that prevents a
problem from occurring in the first
place.

H-6: Recognition rather than recall

 Make objects, actions, options, and directions visible.


 The user should not have to remember information
from one part of the dialog to another.

H-6: Recognition rather than recall

Computers are good at remembering; people are not!


Promote recognition over recall.
 Menus, icons, choice dialog boxes vs commands, field formats.
 Relies on visibility of objects to the user.

H-6: Recognition rather than recall

Gives input format, example and default.

H-7: Flexibility and efficiency of use

 Accelerators for experts


 e.g., keyboard shortcuts
 Allow users to tailor frequent actions
 e.g., macros
 Customized user profiles on the web

H-8: Aesthetic and minimalist design

 No irrelevant information in dialogues.
H-9: Help users recognize,
diagnose, and recover from
errors
 Error messages in plain language.
 Precisely indicate the problem.
 Constructively suggest a solution.

H-9: Help users recognize,
diagnose, and recover from errors
People will make errors!

Errors we make:
 Mistakes:
 Conscious actions lead to an error instead of the
correct solution.
 Slips:
 Unconscious behaviour gets misdirected en route to
satisfying a goal.
H-9: Help users recognize,
diagnose, and recover from
errors

[Screenshot: an unhelpful dialog. What is "error 15762"?]
H-9: Help users recognize,
diagnose, and recover from errors
Provide meaningful error messages:
 Error messages should be in the user's task language.
 Don't make people feel stupid. Compare, from worst to best:
 "Try again!"
 "Error 25."
 "Cannot open this document."
 "Cannot open 'chapter 5' because the application 'Microsoft
Word' is not on your system."
 "Cannot open 'chapter 5' because the application 'Microsoft
Word' is not on your system. Open it with 'Teachtext' instead?"
H-9: Help users recognize,
diagnose, and recover from errors
Prevent errors:
 Try to make errors impossible.
 Modern widgets: can only enter legal data.
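"Can only enter legal data" can be sketched as validation at the point of entry (a sketch; the channel-number range is invented for illustration):

```python
def parse_channel(text, lowest=1, highest=99):
    """Accept a channel number only if it is a legal value;
    rejecting input early prevents the error instead of reporting it later."""
    if not text.isdigit():
        raise ValueError(f"'{text}' is not a number")
    channel = int(text)
    if not lowest <= channel <= highest:
        raise ValueError(f"channel must be {lowest}-{highest}")
    return channel

print(parse_channel("2"))        # legal input passes through
try:
    parse_channel("abc")         # illegal input is blocked at entry
except ValueError as e:
    print("rejected:", e)
```

A widget built on such a check simply never lets illegal data reach the rest of the system, which is the design-level error prevention H-5 calls for.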
H-10: Help and documentation
 Easy to search.
 Focused on the user's task.
 List concrete steps to carry out.
 Not too large.
H-10: Help and documentation

Help is not a replacement for bad design!

Simple systems:
 Use minimal instructions.

Most other systems:
 Simple things should be simple.
 Provide a learning path for advanced features.

[Cartoon: a shelf of manuals, 'Volume 37: A user's guide to...']
Documentation and how it is
used
Many users do not read manuals.
Usually used when users are in some kind of
panic.
 paper manuals unavailable in many businesses!
 e.g. single copy locked away in system administrator’s
office.
 online documentation better.
 online help specific to current context.

Sometimes used for quick reference.


 list of shortcuts ...

2. Cognitive Walkthrough
Proposed by Polson and colleagues.
 Evaluates design on how well it supports user in learning
task.
 Focus on ease of learning.

 Usually performed by expert in cognitive psychology.


 Expert is told the assumptions about user population,
context of use, task details.

 The expert 'walks through' the design to identify potential
problems using psychological principles.
Cognitive Walkthrough (cont.)
 Walkthroughs require a detailed review of a sequence
of actions.

 In the cognitive walkthrough, the sequence of


actions refers to the steps that an interface will
require a user to perform in order to accomplish
some task.

A walkthrough needs four
things:
1. A specification or prototype of the system.
 It doesn't have to be complete, but it should be fairly
detailed.
2. A description of the task the user is to perform on the
system.

3. Written list of the actions needed to complete the


task with the proposed system.

4. An indication of who the users are and what kind of


experience and knowledge the evaluators can
assume about them.

Four Questions
The evaluator will answer these questions:

1. Is the effect of the action the same as the user’s goal


at that point?

2. Will users see that the action is available?

3. Once users have found the correct action, will they


know it is the one they need?

4. After the action is taken, will users understand the


feedback they get?

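The four questions can be applied systematically to each step of the action sequence; a sketch of such a record in Python (the actions and yes/no answers are illustrative, loosely based on the VCR example in these notes):

```python
# The four cognitive-walkthrough questions, asked of every action.
QUESTIONS = [
    "Is the effect of the action the same as the user's goal?",
    "Will users see that the action is available?",
    "Will they know it is the one they need?",
    "Will they understand the feedback they get?",
]

# One entry per user action: (action, [answer to Q1..Q4]).
walkthrough = [
    ("Press the 'timed record' button", [True, True, False, True]),
    ("Press digits 1 2 0 0",            [True, True, True, True]),
]

# Any 'no' answer flags a potential usability problem at that step.
problems = [
    (action, QUESTIONS[i])
    for action, answers in walkthrough
    for i, ok in enumerate(answers) if not ok
]
for action, question in problems:
    print(f"Potential problem at '{action}': {question}")
```

Recording the answers step by step, as here, is what turns the walkthrough into a concrete list of usability problems to fix.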
Cognitive Walkthrough
 For each task walkthrough considers:
 What impact will interaction have on user?
 What processes are required?
 What learning problems may occur?

 Analysis focuses on goals and knowledge:


 Does the design lead the user to generate the
correct goals?

Steps of a Cognitive
Walkthrough
 Define inputs.
 Convene analysts.
 Step through action sequences for each task.
 Record important information.
 Modify UI.

Define Inputs - Example

 Task: Move an application to a new folder or


drive
 Who: Win 2003 user
 Interface: Win 2003 desktop
 Folder containing desired app. is open.
 Destination folder/drive is visible.
 Action sequence...

Action Sequence

 Move mouse to app. icon.


 Right mouse down on app. icon:
 Result: App. icon highlights.
 Failure: The user may not know that the right
mouse button is the proper one to use.
 Success?: Highlighting shows something
happened, but was it the right thing?

Action Sequence (cont’d)

 Release mouse button:


 Result: Menu appears: Cut, Copy, Create
Shortcut, Cancel.
 Success: User is prompted for next action.
 Move mouse to “Cut”:
 Result: Selection highlights.
 Success: Standard GUI menu interaction.

Action Sequence (cont’d)

 Move mouse to destination icon:


 Result:
 App. icon follows mouse.
 Destination icon highlights when mouse reaches it.
 Success: Dragging is intuitive (and common in
GUIs) for moving. The feedback is appropriate.

Action Sequence (cont’d)

 Click mouse button:


 Result:
 App. icon disappears from under the mouse.
 App. icon disappears from original folder.
 App. icon appears in destination folder.
 Success: Standard GUI menu selection.
Feedback shows desired goal was accomplished.

Cognitive Walkthrough
Example:
 step 1: identify task
 step 2: identify action sequence for task
 user action: Press the ‘timed record’ button
 system display: Display moves to timer mode. Flashing
cursor appears after ‘start’.
 step 3: perform walkthrough
 for each action – answer the following questions
 Is the effect of the action the same as the user’s goal at that point?
 Will users see that the action is available?
 Once users have found the correct action, will they know it is the
one they need?
 After the action is taken, will users understand the feedback they
get?
 Might find a potential usability problem relating to icon on
‘timed record’ button.
Example:
Programming a VCR by remote control

[Figure: the VCR remote control and on-screen display, showing fields 'Start: 21:45', 'End:', 'Channel: 3', 'Date:', and a numeric keypad 1-9, 0.]
Example: VCR (Cont.)
 Task: Program the video to time-record a
program starting at 12.00 and finishing at
13.30 on channel 2 on 23 February 2008.
 Who: Assume user is familiar with VCRs but
not with this particular design.
 Action Sequence:
 User’s action ( UA ).
 System’s Display ( SD ).

 UA 1: Press the ‘timed record’ button
 SD 1: Display moves to timer mode. Flashing cursor appears after ‘start:’
 UA 2: Press digits 1 2 0 0
 SD 2: Each digit is displayed as typed and flashing cursor moves to next
position
 UA 3: Press the ‘timed record’ button
 SD 3: Flashing cursor moves to ‘end:’
 UA 4: Press digits 1 3 3 0
 SD 4: Each digit is displayed as typed and flashing cursor moves to next
position
 UA 5: Press the ‘timed record’ button
 SD 5: Flashing cursor moves to ‘channel’
 UA 6: Press digit 2
 SD 6: Digit is displayed as typed and flashing cursor moves to next position
 UA 7: Press ‘timed record’ button
 SD 7: Flashing cursor moves to ‘date’
 UA 8: Press digits 2 3 0 2 0 8
 SD 8: Each digit is displayed as typed and flashing cursor moves to next
position.
 UA 9: Press the ‘timed record’
 SD 9: Stream number in top right-hand corner of display flashes
 UA 10: Press the transmit button
 SD 10: Details are transmitted to video player and display returns to normal
mode
Example: VCR (Cont.)
 We must answer the four questions and tell a
story about the usability of the system:
 UA 1: Press the ‘timed record’ button
 Q1) Is the effect of the action the same as the user's
goal at that point?
 The timed record button initiates timer programming. It is
reasonable to assume that a user familiar with VCRs would
be trying to do this as his first goal.
 Q2) Will users see that the action is available?
 The ‘timed record’ button is visible on the remote controller.

Example: VCR (Cont.)
 Q3) Once users have found the correct action, will they
know it is the one they need?
 It is not clear which button is the ‘timed record’ button. The icon of
a clock is a possible candidate but this could be interpreted as a
button to change the time. Other possible candidates might be the
fourth button down on the left or the filled circle (associated with
record). In fact, the icon of the clock is the correct choice but it is
quite possible that the user would fail at this point. This identifies a
potential usability problem.
 Q4) After the action is taken, will users understand the
feedback they get?
 Once the action is taken the display changes to the timed record
mode and shows familiar headings ( start, end, channel, date ). It
is reasonable to assume that the user would recognize these as
indicating successful completion of the first action.

Evaluation through user
participation
 Some of the techniques we have considered
so far concentrate on evaluating a design or
system through analysis by the designer, or
an expert evaluator, rather than testing with
actual users.
 User participation in evaluation tends to
occur in the later stages of development
when there is at least a working prototype of
the system in place.
Styles of evaluation
 There are two distinct evaluation styles:
 Those performed under laboratory conditions.
 Those conducted in the work environment or ‘in
the field’.

Laboratory Studies
 In the first type of evaluation studies, users
are taken out of their normal work
environment to take part in controlled tests,
often in a specialist usability laboratory.
 This approach has a number of benefits and
disadvantages.
 A well equipped usability laboratory may
contain sophisticated audio/visual recording
and analysis facilities.
Laboratory Studies (Cont.)
 There are some situations where laboratory
observation is the only option.
 e.g. if the system is to be located in a dangerous or remote
location, such as a space station.
 Some very constrained single user tasks may be
adequately performed in a laboratory.
 Want to manipulate the context in order to uncover
problems or observe less used procedures.
 Want to compare alternative designs within a controlled
context.

Field Studies
 The second type of evaluation takes the
designer or evaluator out into the user’s work
environment in order to observe the system
in action.
 High levels of ambient noise, greater levels of
movement and constant interruptions, such
as phone calls, all make field observation
difficult.

Field studies (Cont.)
 The very ‘open’ nature of this situation means
that you will observe interactions between
systems and between individuals that would
have been missed in a laboratory study.
 The context is retained and you are seeing
the user in his 'natural environment'.
