Chapter 3
HUMAN FACTORS AS HCI THEORIES
SENSATION
which senses external information (e.g., visual, aural, haptic), and
PERCEPTION
which interprets and extracts basic meanings of the external information.
TASK MODELING AND HUMAN PROBLEM-SOLVING MODEL
Human problem-solving or information-processing efforts consist of the
following important parts:
MEMORY
which stores momentary and short-term information as well as long-term
knowledge. This knowledge includes information about the external
world, procedures, rules, relations, schemas, candidate actions to
apply, the current objective (e.g., accomplishing the interactive task
successfully), the plan of action, and so on.
DECISION MAKER/EXECUTOR
which formulates and revises a “plan,” then decides what to do based
on the knowledge in memory, and finally acts it out by
commanding the motor system (e.g., to click the left mouse button).
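To make the loop above concrete, here is a minimal Python sketch of the sense–perceive–remember–decide–act pipeline (my own illustration; all class and function names are hypothetical, not from the text):

```python
# Hypothetical sketch of the human information-processing stages.
# All names (sense, perceive, Memory, decide_and_act) are illustrative.

class Memory:
    """Stores short-term percepts and long-term knowledge (rules, goals)."""
    def __init__(self, goal):
        self.goal = goal          # current objective
        self.short_term = []      # momentary/short-term store
        self.rules = {"button_seen": "click_button"}  # long-term knowledge

    def remember(self, percept):
        self.short_term.append(percept)

def sense(raw_stimulus):
    # Sensation: pick up raw external information (visual, aural, haptic).
    return {"modality": "visual", "data": raw_stimulus}

def perceive(sensation):
    # Perception: extract a basic meaning from the raw sensation.
    return "button_seen" if "button" in sensation["data"] else "unknown"

def decide_and_act(memory):
    # Decision maker/executor: choose an action based on knowledge in
    # memory, then command the motor system (here, just a print).
    last_percept = memory.short_term[-1]
    action = memory.rules.get(last_percept, "do_nothing")
    print(f"motor command: {action}")

memory = Memory(goal="complete the interactive task")
memory.remember(perceive(sense("a button labeled OK")))
decide_and_act(memory)
```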
HUMAN FACTORS AS HCI THEORIES
The mismatch between the user’s mental model and the task
model employed by the interactive system creates the “gulf.”
On the other hand, when the task model and interface structure
of the interactive system map well to the expected mental
model of the user, the task performance will be very fluid.
HUMAN REACTION AND PREDICTION OF COGNITIVE PERFORMANCE
SENSATION AND PERCEPTION OF INFORMATION
Humans are known to have at least five senses. Among them, those
that are relevant to HCI are the modalities of visual, aural, haptic
(force feedback), and tactile sensation, along with their multimodal
combinations.
Taking external stimulation or raw sensory information (sometimes
computer generated) and then processing it for perception is the first
part of any human–computer interaction. Naturally, the information
must be supplied in a fashion amenable to human consumption, that is,
within the bounds of a human’s perceptual capabilities.
VISUAL
The human eye contains two types of photoreceptor cells that react to
light in different ways.
The cones, which are responsible for color and detail recognition,
are distributed heavily in the center of the retina (at the back of the
eyeball); this central region subtends only about 5° of the human field
of view (FOV) and roughly establishes the area of focus.
The rods, on the other hand, are distributed mainly in the periphery
of the retina and are responsible for motion detection and less
detailed peripheral vision. While details may not be sensed there, the
rods contribute to our awareness of the surrounding environment.
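As a rough worked example (my own illustration, assuming a desktop viewing distance of 60 cm), the ~5° area of focus covers only a small patch of a typical screen:

```python
import math

# Rough illustration (not from the text): how much of the screen falls
# inside the ~5-degree foveal region at a typical viewing distance.
viewing_distance_cm = 60.0   # assumed desktop viewing distance
foveal_angle_deg = 5.0       # foveal region subtends about 5 degrees

half_angle = math.radians(foveal_angle_deg / 2.0)
foveal_width_cm = 2.0 * viewing_distance_cm * math.tan(half_angle)

print(f"Foveal coverage at {viewing_distance_cm:.0f} cm: "
      f"about {foveal_width_cm:.1f} cm wide")   # ~5.2 cm
```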
Detail, color, brightness, and contrast are all very low-level raw visual
properties. Before these low-level features are finally
consolidated for conscious recognition (of a larger object) through the
visual information-processing pipeline, pre-attentive features may be
used to attract our attention.
Figure 3.12 shows several examples and how they can be used
collectively to design effective graphic icons.
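As a quick demonstration of a pre-attentive feature (my own example, assuming matplotlib is available), a single color singleton among uniform distractors “pops out” without serial search:

```python
import random
import matplotlib.pyplot as plt

# Illustrative pre-attentive "pop-out" demo (not from the text):
# one red circle among gray distractors is detected at a glance.
random.seed(1)
xs = [random.random() for _ in range(40)]
ys = [random.random() for _ in range(40)]

plt.scatter(xs[1:], ys[1:], c="gray", s=80)   # distractors
plt.scatter(xs[0], ys[0], c="red", s=80)      # pre-attentive color singleton
plt.axis("off")
plt.title("Color pop-out: a pre-attentive feature")
plt.show()
```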
SENSATION AND PERCEPTION OF INFORMATION
AURAL
INTENSITY (AMPLITUDE)
refers to the amount of sound energy and is synonymous
with the more familiar term, volume.
PHASE
refers to the time differences among sound waves that
emanate from the same source. Phase differences
occur, for example, because our left and right ears
may have slightly different distances to the sound
source and, as such, phase differences are also known
to contribute to the perception of spatialized sound
such as stereo.
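To make the ear-to-ear path difference concrete, here is a small worked example (my own illustration, assuming an ear separation of about 0.18 m and a speed of sound of 343 m/s) computing the interaural time difference for a source off to one side:

```python
import math

# Illustration (not from the text): interaural time difference (ITD),
# the arrival-time gap between the two ears that drives spatial hearing.
# Common far-field approximation: ITD = d * sin(theta) / c.
ear_separation_m = 0.18      # assumed distance between the ears
speed_of_sound = 343.0       # m/s in air at about 20 C

def itd_seconds(azimuth_deg):
    return ear_separation_m * math.sin(math.radians(azimuth_deg)) / speed_of_sound

print(f"{itd_seconds(90) * 1e6:.0f} us")  # source fully to one side -> ~525 us
print(f"{itd_seconds(30) * 1e6:.0f} us")  # 30 degrees off center    -> ~262 us
```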
SENSATION AND PERCEPTION OF INFORMATION
TACTILE AND HAPTIC
HAPTIC
is defined as the modality that takes advantage of touch
by applying forces, vibrations, or motions to the user. Thus,
haptic refers both to the sensation of force feedback and to
touch (tactile).
Important parameters of tactile perception include the tactile resolution,
the vibration frequency, and the pressure threshold. For force feedback,
the key characteristics of a haptic device include:
(a) the degrees of freedom (the number of directions in which force or torque can
be displayed),
(b) the force range (should be greater than 0.5 mN),
(c) the operating/interaction range (how much movement is allowed through the
device), and
(d) stability (how stable the supplied force is felt to be).
Stability is in fact a by-product of a proper sampling period, which refers to the
time taken to sense the current amount of force at the interaction point,
determine whether the target value has been reached, and reinforce it (a process
that repeats until a target equilibrium force is reached at the interaction point).
The ideal sampling rate is about 1000 Hz, and when the sampling rate falls
under a certain value, the robotic mechanism exhibits instability (e.g., in
the form of vibration) and thus lower usability. The dilemma is that providing a
high sampling rate requires a heavy computation load, not only in updating the
output force, but also in physical simulation (e.g., to check whether the 3-D cursor
has hit any virtual object). Therefore, a careful “satisficing” solution is needed to
balance the level of haptic device performance against the user experience
(Figure 3.18).
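The following is a minimal sketch of such a ~1000 Hz servo loop (my own illustration; read_position and output_force are hypothetical stand-ins for device-driver calls, and the 300 N/m stiffness is an assumed value):

```python
import time

# Minimal sketch of a ~1000 Hz haptic servo loop (illustrative only;
# read_position() and output_force() are hypothetical stand-ins for
# real device-driver calls, which this text does not specify).
SAMPLING_RATE_HZ = 1000.0
PERIOD_S = 1.0 / SAMPLING_RATE_HZ
STIFFNESS = 300.0        # N/m: assumed virtual-spring constant
WALL_X = 0.0             # virtual wall located at x = 0 (meters)

def read_position():
    # Hypothetical: report the current interaction-point position (m).
    return -0.002

def output_force(force_n):
    # Hypothetical: command the device actuators with a force (N).
    pass

for _ in range(1000):    # run the servo loop for about one second
    start = time.perf_counter()
    x = read_position()
    # Penalty-based rendering: push back only while penetrating the wall.
    force = STIFFNESS * (WALL_X - x) if x < WALL_X else 0.0
    output_force(force)
    # Missing the 1 ms budget lowers the effective rate, which, as noted
    # above, can make the device feel unstable (e.g., vibration).
    elapsed = time.perf_counter() - start
    if elapsed < PERIOD_S:
        time.sleep(PERIOD_S - elapsed)
```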
SENSATION AND PERCEPTION OF INFORMATION
MULTIMODAL INTERACTION
COMPLEMENTARY:
Different modalities can assume different roles and act in a
complementary fashion to achieve specific interaction objectives. For
example, aural feedback can signify the arrival of a phone call while the
visual display shows the caller’s name.
REDUNDANT:
Different modality input methods or feedback can be used to ensure
reliable achievement of the interaction objective. For instance, the ring of a
phone call can be simultaneously aural and tactile to increase the pick-up
probability (see the sketch after this list).
ALTERNATIVE:
Providing users with alternative ways to interact gives people more
choices. For instance, a phone call can be made either by touching a
button or by speaking the callee’s name, thereby promoting convenience
and usability.
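As promised above, a toy sketch of the redundant case (my own illustration; play_sound and vibrate are hypothetical stand-ins for platform APIs):

```python
# Toy illustration of redundant multimodal output (names are hypothetical).
def play_sound(msg):
    print(f"[aural]   {msg}")

def vibrate(msg):
    print(f"[tactile] {msg}")

def notify_incoming_call(caller):
    # Redundant: the same event is delivered over two modalities at once
    # to increase the chance that the user notices it.
    for channel in (play_sound, vibrate):
        channel(f"incoming call from {caller}")

notify_incoming_call("Alice")
```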
When multiple modalities are combined, their feedback must be
synchronized in time and consistent in representation. For instance, to
signify a button touch, the visual highlighting and the beep sound effect
must occur within a short time of each other (e.g., less than 200 ms)
to be recognized as one consistent event. The representation must also be
coordinated between the two: in the previous example, if there is
one highlighting, then there should be one corresponding beep.
When inconsistent, the interpretation of the feedback can be confusing, or
only the dominant modality will be recognized.
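A minimal sketch of this timing rule (my own illustration; the 200 ms window is taken from the example above, and the function name is hypothetical):

```python
# Illustrative check (not from the text) that two feedback events in
# different modalities are close enough in time to fuse into one percept.
FUSION_WINDOW_S = 0.200   # e.g., less than 200 ms, per the example above

def fuses_as_one_event(visual_t, audio_t, window=FUSION_WINDOW_S):
    """True if visual and audio feedback timestamps (seconds) co-occur."""
    return abs(visual_t - audio_t) < window

print(fuses_as_one_event(1.000, 1.050))  # 50 ms apart  -> True (one event)
print(fuses_as_one_event(1.000, 1.350))  # 350 ms apart -> False (confusing)
```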
Human Body Ergonomics (Motor Capabilities)
So far, we have mostly talked about human cognitive and
perceptual capabilities and how display or input systems must
be configured to match them.
Fitts’s law predicts the movement time MT for acquiring a target:
MT = a + b * ID, where ID = log2(A/W + 1)
Here a and b are coefficients specific to a given task (determined
empirically), A is the distance (amplitude) of the movement to the
target, W is the width of the target, and ID is called the index of
difficulty.
Note that the original Fitts’s law was created for interaction
with everyday objects (in the context of operation in factory
assembly lines) rather than for computer interfaces.
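As a worked example (my own illustration; the coefficient values a = 0.1 s and b = 0.15 s/bit are assumed for demonstration, not taken from the text):

```python
import math

# Worked Fitts's-law example (illustrative coefficients, not from the text).
a = 0.1    # seconds: assumed base reaction/preparation time
b = 0.15   # seconds per bit: assumed device-dependent slope

def movement_time(A, W):
    """Predicted time to acquire a target of width W at distance A."""
    ID = math.log2(A / W + 1)       # index of difficulty, in bits
    return a + b * ID

print(f"{movement_time(A=200, W=20):.3f} s")  # far, small target  -> ~0.619 s
print(f"{movement_time(A=50, W=40):.3f} s")   # near, large target -> ~0.275 s
```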