Evaluation in Human-Computer Interaction
Definition:
Evaluation in HCI provides valuable insights into how users interact with a
computer system or application and how well the system meets their needs.
Without evaluation, it is difficult to know whether a system is truly usable,
efficient, and satisfying to use.
Evaluation helps to identify problems early in the design process, when they are
easier and less costly to fix.
It helps to ensure that the final product is of high quality and meets the needs
of its users.
HCI evaluations provide feedback that can improve the overall user experience,
increase efficiency and productivity, and reduce frustration and errors.
There are several different types of evaluation that can be used in Human-Computer
Interaction (HCI) research and design, including usability evaluation, functional
evaluation, user acceptance evaluation, accessibility evaluation, performance
evaluation, impact evaluation, A/B testing, heuristic evaluation, and cognitive
walkthrough.
A/B testing example: A team of marketers runs an A/B test on a website by showing
half of the visitors a version with a red "buy" button and the other half a version
with a blue "buy" button. They then measure the conversion rate for each version to
determine which button color is more effective.
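The comparison above can be sketched in a few lines of Python. The visitor and conversion counts are invented for illustration, and the decision rule is a standard two-proportion z-test, one common (not the only) way to judge whether the difference between variants is real.

```python
import math

def conversion_rate(conversions, visitors):
    """Fraction of visitors who clicked 'buy'."""
    return conversions / visitors

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical counts: 500 visitors saw each button color
z = two_proportion_z(60, 500, 45, 500)  # red: 60/500, blue: 45/500
```

With these made-up numbers |z| is about 1.55, below the conventional 1.96 cutoff for 95% confidence, so the team would need more traffic before concluding that one color outperforms the other.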
Problem:
A new social media platform for sharing and commenting on news articles is being
developed, but the development team is unsure if the platform is meeting the needs
of its intended users.
Solution: To evaluate the effectiveness and usability of the new social media
platform, the development team could use a combination of different types of
evaluation methods:
1. Usability evaluation: The team could conduct usability testing by having a group
of users perform a set of tasks on the platform, such as sharing an article, leaving
a comment, and searching for a specific topic. The team could measure the users'
performance, including task completion time and error rate, as well as their
satisfaction with the platform.
2. Functional evaluation: The team could conduct functional testing on the
platform to ensure that it is able to perform its intended tasks, such as posting,
commenting on, and searching for articles.
3. User acceptance evaluation: The team could administer a survey to users of the
platform to gather feedback on the platform's perceived usefulness, ease of use,
and overall satisfaction.
4. Accessibility evaluation: The team could conduct an accessibility evaluation on
the platform to ensure that it is compatible with assistive technologies and
compliant with accessibility guidelines for people with disabilities.
5. Performance evaluation: The team could conduct performance testing on the
platform to measure its response time, ability to handle large amounts of data,
and overall scalability.
6. Impact evaluation: The team could conduct a study to measure the impact of the
platform on users' knowledge and overall well-being.
7. A/B testing: The team could conduct an A/B test on the platform by showing
different versions of the interface to different users, and then measuring which
version is more effective in terms of user engagement and satisfaction.
8. Heuristic evaluation: The team could conduct a heuristic evaluation by
inspecting the platform against a set of established usability principles
(heuristics) such as consistency, visibility of system status, and error prevention.
9. Cognitive Walkthrough: The team could conduct a cognitive walkthrough by
simulating the thought process of a user trying to complete a task on the
platform and identifying potential problem areas, such as lack of clear
instructions or confusing navigation.
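The measurements described under usability evaluation (item 1 above) can be summarized with a short script. The session records below are invented, and the field names are illustrative rather than taken from any particular logging tool.

```python
from statistics import mean, median

# Hypothetical usability-test log: one record per attempted task
sessions = [
    {"task": "share article", "completed": True,  "seconds": 42,  "errors": 0},
    {"task": "share article", "completed": True,  "seconds": 75,  "errors": 2},
    {"task": "leave comment", "completed": False, "seconds": 120, "errors": 3},
    {"task": "search topic",  "completed": True,  "seconds": 30,  "errors": 0},
]

def usability_summary(records):
    """Completion rate, task-time, and error metrics over all attempts."""
    times = [r["seconds"] for r in records if r["completed"]]
    return {
        "completion_rate": sum(r["completed"] for r in records) / len(records),
        "mean_time_s": mean(times),      # successful attempts only
        "median_time_s": median(times),
        "errors_per_task": mean(r["errors"] for r in records),
    }

summary = usability_summary(sessions)
```

Reporting the median alongside the mean guards against one slow participant skewing the task-time figure.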
Problem:
A new computer rental services website is being developed, but the development
team is unsure if the website is meeting the needs of its intended users.
Solution: To evaluate the effectiveness and usability of the new computer rental
services website, the development team could use a combination of different types
of evaluation methods:
1. Usability evaluation: The team could conduct usability testing by having a group
of users perform a set of tasks on the website, such as searching for a specific
computer, reserving a rental, and completing the checkout process. The team
could measure the users' performance, including task completion time and error
rate, as well as their satisfaction with the website.
2. Functional evaluation: The team could conduct functional testing on the website
to ensure that all the features are working as intended, such as the rental
reservation system, the checkout process, the payment gateway, and the
inventory management system.
3. User acceptance evaluation: The team could administer a survey to users of the
website to gather feedback on the website's perceived usefulness, ease of use,
and overall satisfaction.
4. Accessibility evaluation: The team could conduct an accessibility evaluation on
the website to ensure that it is compatible with assistive technologies and
compliant with accessibility guidelines for people with disabilities.
5. Performance evaluation: The team could conduct performance testing on the
website to measure its response time, ability to handle large amounts of data,
and overall scalability.
6. Impact evaluation: The team could conduct a study to measure the impact of the
website on users' productivity and overall satisfaction.
7. A/B testing: The team could conduct an A/B test on the website by showing
different versions of the interface to different users, and then measuring which
version is more effective in terms of user engagement and satisfaction.
8. Heuristic evaluation: The team could conduct a heuristic evaluation by
inspecting the website against a set of established usability principles (heuristics)
such as consistency, visibility of system status, and error prevention.
9. Cognitive Walkthrough: The team could conduct a cognitive walkthrough by
simulating the thought process of a user trying to complete a task on the website
and identifying potential problem areas, such as lack of clear instructions or
confusing navigation.
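For the user acceptance surveys mentioned in both examples (item 3), one widely used instrument is the System Usability Scale (SUS): ten 1-5 ratings that reduce to a single 0-100 score, where odd-numbered items are positively worded and even-numbered items negatively. A minimal scorer, with a made-up response set:

```python
def sus_score(responses):
    """System Usability Scale score (0-100) for one respondent.

    `responses` is a list of ten 1-5 ratings for the standard SUS items.
    Odd-numbered items contribute (rating - 1); even-numbered items
    contribute (5 - rating); the sum is scaled by 2.5 to reach 0-100.
    """
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten ratings between 1 and 5")
    total = sum(r - 1 if i % 2 == 0 else 5 - r  # i=0 is item 1 (odd-numbered)
                for i, r in enumerate(responses))
    return total * 2.5

# Hypothetical respondent who mostly agrees with the positive items
score = sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1])
```

Scores above roughly 68 are commonly read as above-average usability, which gives the team a benchmark to report alongside raw survey comments.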
The Scenario:
Imagine you are designing a new recipe website, and you want to make sure that it's
easy for users to find and use the recipes they want. In order to do this, you might
conduct a heuristic evaluation. The evaluators would be experts in the field of
website design and usability, and they would examine the website to identify any
usability issues. They would use a set of established usability principles, such as
"Consistency and Standards" or "Error Prevention," to guide their evaluation. They
would then provide feedback on what they found, pointing out areas where the
website could be improved. For example, the evaluators might suggest that the
website's search function could be improved, or that the recipe categories could be
more clearly labeled.
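Findings from a heuristic evaluation like this are usually recorded with the violated heuristic and a severity rating; a common convention is Nielsen's 0-4 severity scale, where 4 marks a usability catastrophe. The findings below are invented for the recipe-site scenario to show one way of triaging them.

```python
from collections import Counter

# Hypothetical findings from a heuristic evaluation of the recipe site
findings = [
    {"heuristic": "Consistency and Standards", "severity": 2,
     "note": "Recipe categories use inconsistent labels"},
    {"heuristic": "Error Prevention", "severity": 3,
     "note": "Search accepts empty queries and shows a blank page"},
    {"heuristic": "Consistency and Standards", "severity": 1,
     "note": "Button styles differ between pages"},
]

def triage(findings, min_severity=3):
    """Count findings per heuristic and flag the high-severity ones."""
    per_heuristic = Counter(f["heuristic"] for f in findings)
    urgent = [f["note"] for f in findings if f["severity"] >= min_severity]
    return per_heuristic, urgent

counts, urgent = triage(findings)
```

Sorting the backlog by severity lets the design team fix the most damaging issues first rather than working through the list in discovery order.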
1. Design: The first step is to design the evaluation plan, which includes identifying
the goals and objectives of the evaluation, selecting appropriate methods, and
developing a data collection plan.
2. Evaluate: This step involves collecting and analyzing data, interpreting the results
and drawing conclusions about the system or application being evaluated.
3. Implement: Based on the evaluation results, appropriate changes are made to
the system or application to improve its usability, efficiency, and overall
satisfaction for the users.
4. Disseminate: This step involves sharing the results of the evaluation with
stakeholders such as the development team, users, and other relevant parties.
5. Evaluate again: The final step is to re-evaluate the system or application after the
improvements have been made to ensure that the changes have been effective
and that the system continues to meet the needs of the users.
Why is it important?
Using the DECIDE framework ensures that the evaluation process is systematic and
comprehensive, and that all important aspects of the evaluation process are considered,
making the evaluation more reliable and trustworthy.
An E-commerce Website
1. Design: The evaluation plan is designed with the following goals and objectives:
2. Evaluate: The data is collected by observing the users while they perform the tasks
and recording any difficulties they encounter, how long it takes them to complete
each task, and any feedback they provide. The data is analyzed and conclusions are
drawn about the website's usability, efficiency, and user satisfaction.
3. Implement: Based on the evaluation results, the design team makes changes to the
website, such as simplifying the navigation, adding clear calls to action, and improving
the checkout process.
4. Disseminate: The results of the evaluation are shared with stakeholders such as the
development team, users, and other relevant parties.
5. Evaluate again: The website is re-evaluated after the improvements have been made
by conducting follow-up user testing to ensure that the changes have been effective
and that the website continues to meet the needs of the users.
A Tourist Website
1. Design: The evaluation plan is designed with the following goals and objectives:
2. Evaluate: The data is collected by observing the users while they perform the tasks
and recording any difficulties they encounter, how long it takes them to complete
each task, and any feedback they provide. The data is analyzed and conclusions are
drawn about the website's usability, efficiency, and user satisfaction.
3. Implement: Based on the evaluation results, the design team makes changes to the
website, such as simplifying the navigation, adding filters and sorting options, and
improving the booking process.
4. Disseminate: The results of the evaluation are shared with stakeholders such as the
development team, users, and other relevant parties.
5. Evaluate again: The website is re-evaluated after the improvements have been made
by conducting follow-up user testing to ensure that the changes have been
effective and that the website continues to meet the needs of the users.
Practical Issues:
Ethical issues:
Example:
Problem: A team of HCI researchers wants to conduct an evaluation study of a new
virtual reality (VR) system, but they are facing practical and ethical issues.
Solution:
Practical issues:
Recruiting participants: The team reaches out to local universities and research
groups to recruit a diverse group of participants who are willing to participate in
the study. They also offer incentives such as payment or course credit for
participation.
Time and cost: To minimize time and cost, the team carefully plans and designs
the study, selecting the most appropriate methods for data collection and
analysis.
Data collection and analysis: The team trains themselves in the appropriate
methods for data collection and analysis, such as user testing and usability
testing, to ensure that the data is accurate and reliable.
Generalizing results: The team acknowledges that the results may not be
generalizable to the entire population of VR users, and they report the
limitations of the study in their findings.
Ethical issues:
Informed consent: The team obtains informed consent from all participants,
informing them of the purpose, procedures, and potential risks of the study.
Privacy and confidentiality: The team protects participants' personal information
and data by keeping it confidential and not sharing it with any third parties.
Deception: The team ensures that participants are not deceived about the
purpose or nature of the study by providing them with accurate and honest
information.
Risk of harm: The team assesses the potential risks of the study and takes
appropriate measures to minimize them, such as providing participants with a
way to exit the VR environment if they feel uncomfortable.
Fairness: The team designs the study to be fair and not discriminate against any
particular group of users, such as those with disabilities or different cultural
backgrounds.
This solution shows how the team of HCI researchers can address the practical and
ethical issues involved in evaluating a new VR system, by carefully planning and
designing the study, recruiting a diverse group of participants, obtaining informed
consent and protecting their privacy, ensuring the accuracy and honesty of the
information provided, assessing and minimizing the potential risks, and designing the
study to be fair to all participants.
Tourist Website
Solution:
Practical issues:
Recruiting participants: The team reaches out to local tourist agencies and travel
groups to recruit a diverse group of participants who are willing to participate in
the study. They also offer incentives such as discounts for future travel or gift
cards for participation.
Time and cost: To minimize time and cost, the team carefully plans and designs
the study, selecting the most appropriate methods for data collection and
analysis.
Data collection and analysis: The team trains themselves in the appropriate
methods for data collection and analysis, such as user testing and usability
testing, to ensure that the data is accurate and reliable.
Generalizing results: The team acknowledges that the results may not be
generalizable to the entire population of tourist website users, and they report
the limitations of the study in their findings.
Ethical issues:
Informed consent: The team obtains informed consent from all participants,
informing them of the purpose, procedures, and potential risks of the study.
Privacy and confidentiality: The team protects participants' personal information
and data by keeping it confidential and not sharing it with any third parties.
Deception: The team ensures that participants are not deceived about the
purpose or nature of the study by providing them with accurate and honest
information.
Risk of harm: The team assesses the potential risks of the study and takes
appropriate measures to minimize them, such as providing participants with a
way to exit the study if they feel uncomfortable.
Fairness: The team designs the study to be fair and not discriminate against any
particular group of users, such as those with disabilities or different cultural
backgrounds.
This solution shows how the team of HCI researchers can address the practical and
ethical issues involved in evaluating a new tourist website, by carefully planning and
designing the study, recruiting a diverse group of participants, obtaining informed
consent and protecting their privacy, ensuring the accuracy and honesty of the
information provided, assessing and minimizing the potential risks, and designing the
study to be fair to all participants.