
Human-Computer Interaction 1

Unit 3. Understanding and Conceptualizing Interaction

Lesson 1: Understanding the Problem Space and Conceptualizing Interaction

A. Understanding the Problem Space


The problem space refers to the environment and context in which a design problem exists. To effectively solve any interaction
design challenge, designers must first understand this space. This includes:
1. Defining the Problem:
o What are you trying to solve?
o What is the user’s problem, and what pain points need addressing?
o Why does the problem exist, and what are its underlying causes?
2. Clarifying Goals:
o What are the objectives of the system or product?
o How will success be measured?
o Are there any competing goals?
3. Understanding Users:
o Who are the users? (Defining personas, identifying needs)
o What are their tasks and workflows?
o What are the limitations or constraints of the users?
4. Considering the Context of Use:
o In what environment will the system be used?
o Are there social, cultural, or environmental factors influencing use?
o What technological constraints exist?
By understanding the problem space, designers can avoid assumptions and biases, focusing instead on solving the right problem for
the right users.

B. Conceptualizing Interaction
Once the problem space is understood, the next step is to conceptualize interaction. This involves generating ideas about how users
will interact with the system to solve the problem. This includes:
1. Defining the Interaction Paradigm:
o What type of interaction will be most suitable? (e.g., command-based, direct manipulation, natural language)
o Is the system going to be desktop-based, mobile, or ambient?
o Will it use familiar paradigms like WIMP (Windows, Icons, Menus, Pointer) or innovative, multimodal
interactions?
2. Identifying the User Experience (UX) Goals:
o What experience do you want to create for the user? (e.g., fun, efficient, secure, immersive)
o How will the interaction style support these goals?
3. Mapping the Interaction:
o What actions will users need to take to achieve their goals?
o What feedback will the system provide to help users understand their actions?
o How will the system handle errors or unexpected user behavior?
4. Understanding Affordances and Constraints:
o What natural clues can guide users to interact correctly? (affordances)
o How will the system limit incorrect actions? (constraints)
5. Creating Prototypes and Wireframes:
o What low-fidelity prototypes can help visualize the interaction?
o How will wireframes guide the structure of the interface?
6. Iterative Design and Testing:
o How will user testing inform and refine the interaction?
o How can feedback loops be incorporated into the design process to ensure continuous improvement?
C. Tools for Conceptualizing Interaction
• Personas and Scenarios: Developing detailed user personas and interaction scenarios to explore potential user
journeys.
• Storyboarding: Creating visual narratives of how users will interact with the system.
• Task Flows: Diagramming the step-by-step actions users will take.
• Wireframing and Prototyping Tools: Using digital tools (e.g., Figma, Sketch) to create wireframes and low- or high-
fidelity prototypes.
D. The Importance of Collaboration
Conceptualizing interaction involves multiple stakeholders, including designers, developers, and users. Cross-disciplinary
collaboration ensures that the final design meets technical, business, and user requirements.
Lesson 2: Conceptual Models in HCI

A conceptual model is a high-level representation of how a system is structured
and how it operates, providing users with a framework for understanding how they
can interact with the system. In Human-Computer Interaction (HCI), conceptual
models play a vital role in the design process, ensuring that users can develop an
accurate mental model of how a system works.

Types of Conceptual Models


Conceptual models can vary depending on the system and the user goals. Some
common types include:
1. Instructive Models:
o Command-based systems: Users input explicit commands, and the
system executes them. For example, in a command-line interface (CLI),
users must understand a precise set of instructions to control the system.
2. Direct Manipulation Models:
o Graphical User Interfaces (GUIs): Users interact directly with objects
on the screen, such as icons, buttons, and sliders. This model often
mirrors real-world metaphors (e.g., a desktop metaphor), making
interactions intuitive.
3. Object-Action Models:
o Users select objects (e.g., files, images) and then perform actions (e.g.,
delete, open) on those objects. This model is commonly used in desktop
environments.
4. Conversational Models:
o Natural Language Interfaces: The system is designed to engage in a
conversational or question-answer exchange with the user, such as a
chatbot or virtual assistant.
5. Exploratory Models:
o Browsing Systems: Users explore data or media without a specific goal
in mind. Search engines and content recommendation systems (e.g.,
YouTube, Netflix) are examples of exploratory models.
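The object-action model above follows a fixed sequence: the user first selects an object, then applies an action to it. A minimal sketch of that sequence, with purely illustrative names (not taken from any real toolkit):

```python
# Illustrative sketch of an object-action conceptual model:
# the user selects an object first, then applies an action to it.

class File:
    def __init__(self, name):
        self.name = name
        self.is_open = False
        self.is_deleted = False

# Actions are defined independently of any particular object,
# mirroring how a desktop environment offers "Open" or "Delete"
# for whatever is currently selected.
def open_object(obj):
    obj.is_open = True
    return f"Opened {obj.name}"

def delete_object(obj):
    obj.is_deleted = True
    return f"Deleted {obj.name}"

# Object-action sequence: select, then act.
selected = File("report.txt")    # 1. user selects an object
message = open_object(selected)  # 2. user applies an action to it
```

Note the order: the object comes first and the set of available actions is determined by the selection, which is what distinguishes this model from command-based interaction.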

Lesson 3: Interface Metaphors


What Are Interface Metaphors?
An interface metaphor is a design technique in which the characteristics of an
interaction are modeled after familiar real-world objects or concepts. By mapping
these real-world objects to digital interfaces, users can apply their existing
understanding of physical objects or environments to new software systems.
Example:
• The "desktop" metaphor in operating systems (e.g., Windows, macOS) models
the digital workspace after a physical desk, with files, folders, and trash cans that
users can interact with just as they would in a physical office environment.
Types of Interface Metaphors
1. Direct Metaphors:
o These closely mimic real-world objects. For example, a trash can icon in
the interface allows users to delete files by dragging them into it, just like
in the physical world.
2. Indirect Metaphors:
o These use real-world objects but modify their properties or behaviors to fit
the digital context. For example, email systems use an envelope metaphor
for messages, but an email can be sent instantly, unlike physical mail.
3. Hybrid Metaphors:
o Hybrid metaphors combine elements from multiple real-world concepts to
help users understand more complex digital tasks. For example, a
shopping cart in e-commerce systems blends the metaphor of physical
shopping with digital purchasing workflows.

Lesson 4: Interaction Types

In Human-Computer Interaction (HCI), interaction types refer to the different ways
users communicate with and control digital systems. These types help designers
determine how users will provide input, how the system will respond, and how tasks
will be completed. Understanding these types is essential for creating intuitive,
user-friendly interfaces.

A. Instructing
Instructing interactions occur when users issue explicit commands or instructions to
a system. This is a direct, straightforward interaction type, where users know
exactly what they want the system to do.
Examples:
1. Command-line interfaces (CLI): Users type specific instructions (e.g., mkdir
foldername to create a folder).
2. Menu selection: Choosing an option from a drop-down menu (e.g., File →
Save).
3. Button clicks: Clicking an icon, like the "Save" or "Delete" button.
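The instructing pattern can be sketched as a tiny command interpreter: an explicit instruction goes in, the system executes it and confirms. The command name mirrors the CLI example above; the "filesystem" here is just a set, for illustration only:

```python
# Minimal sketch of an instructing-style interaction: the user issues an
# explicit command and the system executes it. The folder store is a
# plain set, standing in for a real filesystem.

folders = set()

def run_command(line):
    parts = line.split()
    if not parts:
        return "No command given"
    command, args = parts[0], parts[1:]
    if command == "mkdir" and args:
        folders.add(args[0])
        return f"Created folder {args[0]}"
    return f"Unknown command: {command}"

# Explicit instruction in, confirmation out.
message = run_command("mkdir photos")
```

The defining trait is that the user must already know the command vocabulary; the system does nothing until told exactly what to do.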

B. Conversing
Conversing interactions involve a dialogue between the user and the system. This
type mimics human conversation, where users provide input in the form of natural
language or simple queries, and the system responds accordingly. Conversing is often
used in natural language processing (NLP) and AI systems.
Examples:
1. Chatbots and virtual assistants: Users can ask Siri, Alexa, or Google
Assistant questions or give instructions using natural language (e.g., "What's the
weather today?").
2. Customer service bots: Users interact with bots via text to troubleshoot issues
or get information.
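A conversing interaction can be sketched as a dialogue loop. Real assistants rely on NLP models; the keyword lookup below is only an illustrative stand-in, and the canned answers are invented:

```python
# Minimal sketch of a conversing-style interaction: the user supplies a
# natural-language utterance and the system replies in kind. A keyword
# match stands in for real natural language processing.

def reply(utterance):
    text = utterance.lower()
    if "weather" in text:
        return "It's sunny and 24 degrees today."  # canned, illustrative answer
    if "time" in text:
        return "It's 10:30 AM."                    # canned, illustrative answer
    return "Sorry, I didn't understand that."

answer = reply("What's the weather today?")
```

Unlike instructing, the user phrases a request rather than a command, and the system must interpret it before responding.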
C. Manipulating
Manipulating interactions involve users interacting directly with objects in the
interface. These interactions are often based on real-world metaphors, such as
dragging, dropping, clicking, or swiping objects. Manipulating interaction types
emphasize direct engagement with the interface, often making it more intuitive.
Examples:
1. Drag-and-drop interfaces: Users can move files, resize windows, or rearrange
items (e.g., dragging files to a folder).
2. Touch interfaces: Pinch-to-zoom, swiping through images or apps on a
touchscreen.
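Manipulating interactions act on the objects themselves rather than on a command line. A drag-and-drop gesture, reduced to its state change (all names illustrative):

```python
# Minimal sketch of a manipulating-style interaction: the user acts
# directly on interface objects (dragging a file onto a folder) rather
# than typing a command. Containers are plain sets, for illustration.

desktop = {"report.txt"}
folder = set()

def drag_and_drop(item, source, target):
    # Moving the object IS the interaction; no command vocabulary needed.
    source.discard(item)
    target.add(item)

drag_and_drop("report.txt", desktop, folder)
```

The real-world metaphor (picking something up and putting it somewhere else) carries the meaning, which is why this style tends to feel intuitive.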

D. Exploring
Exploring interactions involve users navigating and browsing through a system or
environment. In this type, users are not necessarily performing specific tasks but are
exploring options, information, or content. It’s commonly seen in systems with large
sets of data or content, where users need to search, filter, or browse.
Examples:
1. Web browsing: Users navigate websites by clicking on links, exploring various
sections.
2. Virtual environments: In virtual reality (VR) or 3D applications, users explore
immersive environments.
3. Search engines: Users enter queries and explore the results.
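Exploring can be sketched as filtering a collection rather than executing a goal-directed command; the catalogue and query below are invented for illustration:

```python
# Minimal sketch of an exploring-style interaction: the user browses a
# content collection by filtering it, with no fixed end goal in mind.

videos = ["cooking basics", "cooking pasta", "travel vlog"]

def browse(query):
    # Substring match stands in for a real search/recommendation engine.
    return [v for v in videos if query in v]

results = browse("cooking")
```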

E. Responding
Responding interactions occur when the system takes the lead in the interaction,
and the user responds to prompts or system-initiated actions. These are common in
systems where the user must provide feedback, confirmations, or responses to
questions posed by the system.
Examples:
1. System prompts and notifications: The system might ask the user to confirm
an action (e.g., "Do you want to save this document?").
2. Forms and surveys: Users fill in fields or answer questions based on system
prompts.
3. Progressive dialogues: Multi-step processes where the system guides the user
through input tasks (e.g., software installations or setup wizards).
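In responding interactions the roles are reversed: the system initiates and the user only reacts. A sketch of a confirmation prompt, where the `ask` callback stands in for any input mechanism (dialog box, voice, form) and is purely illustrative:

```python
# Minimal sketch of a responding-style interaction: the system takes the
# initiative by posing a prompt, and the user merely answers it.

def confirm_save(ask):
    # System initiates the exchange with a prompt...
    answer = ask("Do you want to save this document? (y/n)")
    # ...and interprets the user's response.
    return answer.strip().lower() in ("y", "yes")

# Simulated user who answers "y" to whatever the system asks.
saved = confirm_save(lambda prompt: "y")
```

Contrast this with instructing: here the user never chooses when the exchange happens, only how to answer it.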
Lesson 5: Paradigms, Visions, Theories, Models, and Frameworks

A. Paradigms
A paradigm in HCI refers to a fundamental way of thinking about interaction design and
the relationship between humans and computers. It represents a shift in the dominant
practices and ideas of how technology should be designed and used.
Examples of HCI Paradigms:
1. The Desktop Paradigm:
o Based on the idea that the computer interface should resemble a physical
office desktop, with documents, files, and folders that can be
manipulated. This paradigm shaped early operating systems (e.g.,
Windows, macOS) and emphasizes the use of graphical user interfaces
(GUIs).
2. The Ubiquitous Computing Paradigm:
o This paradigm envisions technology embedded into everyday objects and
environments, making computing "invisible" and available everywhere.
Devices like smart homes and wearables reflect this paradigm.
3. The Tangible Computing Paradigm:
o Focuses on physical interaction with digital information. For example,
users might manipulate physical objects to interact with digital data (e.g.,
smartboards or interactive tables).

B. Visions
A vision in HCI is a futuristic, idealistic concept of how technology could evolve to
shape human lives and interaction. Visions often push the boundaries of current
paradigms, proposing novel and revolutionary ways of integrating technology into daily
life.
Examples of HCI Visions:
1. The Paperless Office:
o A vision of the future where all documents, records, and work processes
are digitized, eliminating the need for physical paper. While this vision
hasn’t fully materialized, aspects of it are seen in document management
systems and digital signatures.
2. Augmented Reality Everywhere:
o A vision of the future where augmented reality (AR) seamlessly integrates
into everyday life, overlaying digital information on the physical world
through smart glasses or other wearable devices.
3. The Internet of Things (IoT):
o A vision where everyday objects (like refrigerators, lights, or cars) are
connected to the internet, communicate with each other, and enhance
user experiences by automating tasks.
