

Pilot Usability Report: Google Maps


Web-Based Version

Technical Report
May 10, 2023

T-TUx Team Members


Andrea Ali-Rechtfertig
Elizabeth (Liza) Armstrong
Gerard Castaneda
Christina Mayne

Citation: Ali-Rechtfertig, A., Armstrong, E., Castaneda, G., & Mayne, C. (2023). Pilot usability
test: Google Maps web-based version. University of Missouri-Columbia, Course ISLT 9421:
Usability of Information Systems and Services.
Table of Contents

1.0 Executive Summary


2.0 Technology Description, Goals and Questions
3.0 Methodology
4.0 Participants
Table 3. Participant Demographic Information
5.0 Results
Table 4. Task Completion Success Rate
Table 5. Task SEQ Ratings
Table 6. Time on Task (mm:ss)
Table 7. System Usability Scale Scores
Table 8. Summary of Completion, Time on Task, SUS Scores
Table 9. Thematic Analysis
6.0 Design Recommendations
7.0 References
8.0 Team Project Reflections
9.0 Individual Reflections
Appendix A
Appendix B
Appendix C
Appendix D
Appendix E

1.0 Executive Summary



Project & Project goals

The primary goal of this pilot investigation was to assess the web-based version of Google Maps and its impact on the end-user experience. The research aimed to identify any flaws that might hinder a user's ability to use the system efficiently and to provide suggestions for improvement. The study followed a predefined test protocol, with each team member conducting two pilot tests.

Number of Participants & Important Demographics

The study included a total of 8 participants, ranging in age from 19 to 60, with education levels ranging from some college to a master's degree. All participants spoke English as their first language and reported no visual or hearing impairments. All participants classified their technical proficiency as intermediate and used the same browser, Google Chrome, on a Mac or PC.

Methods Applied

1. Participants were read an orientation script, and demographic information was obtained and documented in a Google Sheet.
2. The think-aloud process and interactions with research participants were recorded via Zoom.
3. Five tasks were given to participants.
4. As a new component of the pilot test, a qualitative analysis was conducted in which comments relating to (1) difficulty, (2) ease, or (3) surprise were recorded.
5. The System Usability Scale (SUS) was administered via a Google Form.
6. Specific metrics (task completion, time on task, SEQ ratings, and SUS scores) were recorded.

Main Results & List of Major/Minor Problems with Design Recommendations

The system has good usability, with participants successfully completing tasks and expressing surprise at some features, such as being able to send directions as a text message from their desktop. However, some minor problems were identified, such as the difficulty of choosing a panoramic view of a location and of entering a home address. One participant commented that the task was easier to complete on their phone. A design recommendation to address these minor issues is to make the process of selecting tools and entering information more intuitive and streamlined.

Usability Issues

Overall, our pilot test highlights the challenges participants faced with navigation, participants' perceived difficulty of the functions required to complete each task, the effectiveness and efficiency of Google Maps, and the need to improve the visibility of tools and actions. In general, efficiency, visibility, and navigation were the most explicitly mentioned areas of weakness.

2.0 Technology Description, Goals and Questions


The test administrators took notes on the comments expressed by the participants during the session using their choice of note-taking method. Sessions with the participants were recorded using Zoom, via synchronous observation online or in person. Zoom transcripts were available for additional support if testers needed to review any of the verbal statements relating to the qualitative analysis. The qualitative analysis was added, along with the user task data, to a single spreadsheet. The research team was interested in gaining a better understanding of the user's experience by collecting qualitative data on three themes (surprise, difficulty, and ease of use) that may have been expressed by the user. Quantitative data was recorded in the Google spreadsheet by timing tasks with a stopwatch. Lastly, Google Forms was used to collect user feedback and opinions.

By leveraging a combination of technologies, a number of valuable insights were gained:

● The efficiency and effectiveness of Google Maps
● The level of user efficiency with the product
● Identification of which parts of the product need to be improved
● Identification of where additional support could be needed
● Unexpected errors in the system
● Users' preferences

Overall, utilizing various technology tools enabled the collection of both qualitative and quantitative data to inform decision making and drive product enhancement.

The goals of this usability study were to explore the end-user experience of the web-based version of Google Maps, identify areas of weakness that may decrease a user's efficient use of the system, and provide recommendations for improvement to enhance ergonomic use of the platform. These goals were accomplished through the acquisition of quantitative and qualitative data, inquisitive evaluation of test results, and the use of a focused approach to provide high-impact and effective design recommendations.

The team seeks to evaluate and address the following questions in this report:
1. What are the most prevalent challenges with obtaining navigation information when
asked to perform five tasks in the web-based Google Maps platform?
2. How do test participants perceive the difficulty of the required tasks needed to obtain
navigation information?
3. Based on the metrics obtained from this study, what are the most critical design
elements that will need to be modified to enhance overall system navigation usability?

3.0 Methodology
The four researchers on Team T-TUx participated in writing tasks and a scenario about Google Maps, recruiting participants, testing these participants, and recording participant data.

The researchers created five tasks in Google Maps for the usability test, listed in Table 1, along
with corresponding goals. The tasks were centered around the following scenario: You are
taking your significant other (or close friend) for a special dinner next Friday at 6:00 p.m.
Table 1. User Goals and Tasks

Goal                                    Task
1  Locate Two Restaurants              Using Google Maps, find 2 dining establishments that have the highest reviews.
2  Map Directions to a Location        Create a route of your preferred method of travel from your location to the selected restaurant.
3  Estimate Travel Duration            Using a car or transit, designate a date and time of departure to arrive by 6 p.m.
4  Visualize a Location                Use the street view to get a panoramic view of the street in front of your restaurant.
5  Send Directions to a Mobile Device  Text a copy of the directions to your mobile device.

Eight participants, two from each researcher's personal network, were recruited for this usability study. While this convenience sample was used primarily due to time and cost restrictions, it also provided advantages in terms of allowing for follow-up questions more easily and for relatively consistent prior knowledge about Google Maps among participants (Bruzi, n.d.). Moreover, having eight participants was concluded to be an ample number for this study, as it exceeded the minimum of five suggested by Nielsen (2012a).

Before taking part in the study, participants were informed about the goals of the usability test
and the purpose of recruiting test participants. Participants’ willingness to participate by
completing a usability test for the web-based version of Google Maps was also confirmed.
During the recruitment process, the participants were informed of the test purpose, recording
procedures, and their right to choose not to participate at any time during the testing process.

Each researcher administered two test sessions, one for each of their recruited participants. A
typical test session lasted about 35 to 45 minutes. The test procedure is described below (see
Appendix A for a summary).

Before each session, the test administrator read the orientation script to participants (see
Appendix B), including such information as the purpose of the test, the number of tasks, the
process of thinking aloud, evaluation prompts, and the right to withdraw from the test at any
time. Also prior to the test, participants provided demographic information. To gather and record
this information, administrators read from and took notes in a “Usability Worksheet,” a formal
observation form (see Appendix C).

The think-aloud method was selected to immediately access user thoughts about Google Maps as users were completing the five tasks. In this method, participants were asked to "use the system while continuously thinking out loud — that is, simply verbalizing their thoughts as they move through the user interface" (Nielsen, 2012b). The think-aloud method was chosen because of its low cost, ease of use, and effectiveness (Nielsen, 2012b). The "concurrent think aloud," in which users verbalize during each task rather than afterwards, was used instead of alternatives such as a retrospective think aloud or a follow-up interview after all the tasks (Panagiotidi, n.d.). An advantage of this approach is that it reduced the cognitive burden on participants to recall the specific processes they used, thus likely increasing the accuracy of the data collected about their thoughts.

If permission to record was granted by the participant, the administrator began recording the test using Zoom. Then, using the "Usability Worksheet," the administrator read the first task, began the timer, and recorded the participant's responses during the think-aloud process. These notes were recorded on the "Usability Worksheet," and the administrator was careful to note positive, negative, and surprising reactions by participants. When the participant completed or gave up on the task, the administrator noted the end time and marked the completion status (pass = 1 or fail = 0) on the worksheet. Appendix D provides a description of the functions required to successfully complete each task in Google Maps. Afterwards, the administrator asked the participant to respond to the Single Ease Question (SEQ), "Overall, how would you rate this task to complete? 1-Very Difficult to 7-Very Easy" (Sauro, 2010), and recorded the participant's response on the worksheet. The SEQ, described by Sauro as "psychometrically vetted" (as cited in Birkett, 2022), was chosen over alternatives like the After Scenario Questionnaire (ASQ) and NASA-TLX due to its simplicity and brevity: a single question. After asking the SEQ, this process was repeated for the remaining four tasks.

After the completion of the five tasks, the administrator asked the participant the ten questions in the System Usability Scale (SUS) (see Appendix E). SUS, which is "an industry standard, with references in over 1300 articles and publications" (Usability.gov, n.d.), is valid and practical, consisting of only 10 questions as opposed to other questionnaires like the Software Usability Measurement Inventory (SUMI), which has 50 questions. For each question in SUS, users selected responses on a 1-5 scale, from strongly disagree to strongly agree. SUS was also advantageous in its suitability for testing small numbers of users and for assessing software like Google Maps (Mifsud, n.d.).

Participant responses to SUS were recorded in a Google Form, which allowed for easier
collection and summarization of the data than a simple paper form. Administrators recorded the
data collected from each “Usability Worksheet” into a shared Google sheet. As needed,
administrators analyzed the recordings for verification of the time needed to complete each task
and for key comments during the think aloud process.

4.0 Participants
The pilot usability test was conducted with eight participants. See Table 2 for the testing dates and participants, and Table 3 for the participant demographic information. The mean age of the eight participants was 42.8 years, with a minimum age of 19 years, a maximum age of 60 years, a mode of 44 years, and a median of 43 years. The participants' highest education levels ranged from some college to a master's degree. None of the eight participants reported having hearing or visual impairments. The native language of all eight participants was English, and all eight reported an intermediate level of technical proficiency. All eight participants completed the usability test using the Google Chrome browser; five participants used a PC and three used a Macintosh computer.

Table 2. Participant Number and Date of Testing

Participant Number    Date of Usability Test
P1 & P2               March 7, 2023
P3                    March 6, 2023
P4                    March 5, 2023
P5                    April 25, 2023
P6                    April 23, 2023
P7                    April 22, 2023
P8                    April 25, 2023

Note: Participant numbers do not reflect the chronological order of the usability testing dates.

Table 3. Participant Demographic Information

Participant   Age          Highest Education   Hearing      Visual       Native     Technical     Computer Used   Web Browser
Number        (in years)   Level               Impairment   Impairment   Language   Proficiency   for Testing     Used for Testing
P1            60           MD                  NONE         NONE         English    INT           PC              Chrome
P2            41           AD                  NONE         NONE         English    INT           PC              Chrome
P3            44           MD                  NONE         NONE         English    INT           MAC             Chrome
P4            42           BD                  NONE         NONE         English    INT           MAC             Chrome
P5            44           BD                  NONE         NONE         English    INT           PC              Chrome
P6            19           SC                  NONE         NONE         English    INT           PC              Chrome
P7            53           BD                  NONE         NONE         English    INT           MAC             Chrome
P8            40           MD                  NONE         NONE         English    INT           PC              Chrome

Table 3 Legend:
Highest Education Level:
● SC = Some College
● AD = Associate’s Degree
● BD = Bachelor's Degree
● MD = Master’s Degree
Technical Proficiency:
● INT = Intermediate

5.0 Results
The raw data of the usability test results can be found in this Google Sheet. The synthesis for each of the metrics is detailed below.

5.1 Task Completion Results

All eight participants successfully completed Task 1 (‘finding two restaurants’), Task 2 (‘creating
a route’), and Task 3 (‘designate the date and time of departure’). This yielded a 100%
completion rate for each of those respective tasks (Task 1, Task 2, and Task 3). Participant
eight (P8) was unable to complete Task 4 (‘generate a street view image of the restaurant’).
This resulted in seven of the eight participants completing Task 4 which yielded an 87.5%
completion rate. Participant 6 (P6) was unable to complete Task 5 (‘text copy of directions to
your device’) which also yielded a completion rate of 87.5% for this task.

Table 4. Task Completion Success Rate

Participant Task 1 Task 2 Task 3 Task 4 Task 5


P1 1 1 1 1 1
P2 1 1 1 1 1
P3 1 1 1 1 1
P4 1 1 1 1 1
P5 1 1 1 1 1
P6 1 1 1 1 0
P7 1 1 1 1 1
P8 1 1 1 0 1
Success Points 8/8 8/8 8/8 7/8 7/8
Success Rate 100% 100% 100% 87.5% 87.5%

Notes. Completion rates were calculated using a numeric scoring system. When a research
participant successfully completed the task, a value of one was assigned to the task for the
participant. When the participant was unable to complete the task, a value of zero was
assigned.
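
For reference, the completion-rate arithmetic described in the note above can be reproduced with a few lines of code. The sketch below is illustrative only (the variable names and script are not part of the study materials); it simply applies the binary pass/fail values from Table 4.

```python
# Sketch: computing per-task completion rates from binary pass/fail scores.
# The task-by-participant values below are copied from Table 4 (P1..P8).
completion = {
    "Task 1": [1, 1, 1, 1, 1, 1, 1, 1],
    "Task 2": [1, 1, 1, 1, 1, 1, 1, 1],
    "Task 3": [1, 1, 1, 1, 1, 1, 1, 1],
    "Task 4": [1, 1, 1, 1, 1, 1, 1, 0],  # P8 did not complete Task 4
    "Task 5": [1, 1, 1, 1, 1, 0, 1, 1],  # P6 did not complete Task 5
}

for task, scores in completion.items():
    rate = sum(scores) / len(scores) * 100  # e.g., 7/8 = 87.5%
    print(f"{task}: {sum(scores)}/{len(scores)} = {rate:.1f}%")
```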

5.2 Single Ease Question (SEQ) Results

Task 1 (‘finding two restaurants’) yielded the highest SEQ results (94.6%). Six of the eight
participants rated Task 1 a 7 (‘very easy’) while Participant 2 (P2) and Participant 4 (P4)
assigned a score of 5 and 6 respectively. Task 5 (‘text copy of directions to your device’)
demonstrated the next highest average rating with 87.5%. This was followed by Task 2 (82.1%)
where P4 rated this as a 1 (‘very difficult’). Task 3 (‘designate the date and time of departure’)
follows with an average rating of 67.9% with Participant 8 (P8) assigning a rating of 1. Finally,
Task 4 garnered the lowest rating with 53.6%. P8 gave this task a 1 as well which contrasts with
the score of a 7 assigned by Participant 5.

Table 5. Task SEQ Ratings

P1 P2 P3 P4 P5 P6 P7 P8 Average %
Task 1 7 5 7 6 7 7 7 7 94.6%
Task 2 7 6 7 1 7 7 5 6 82.1%
Task 3 4 5 5 5 5 6 7 1 67.9%
Task 4 3 6 3 5 7 3 2 1 53.6%
Task 5 7 7 7 7 7 4 7 3 87.5%
Average % 80% 83% 83% 69% 94% 77% 80% 51%

Notes. The Single Ease Question as described by Sauro (2010) was used to generate the rating results. Research participants assigned a value of 1 ('very difficult') through 7 ('very easy') following each task. Average results were calculated by dividing the mean rating for each task by the highest possible rating (7) and converting the value to a percentage.
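
As a worked example of this averaging, the short sketch below (illustrative only, not the team's analysis script) converts the raw 1-7 SEQ ratings for Task 1 from Table 5 into the reported percentage.

```python
# Sketch: converting raw SEQ ratings (1-7) into the percentages shown in Table 5.
# Task 1 ratings copied from Table 5 (P1..P8).
task1_ratings = [7, 5, 7, 6, 7, 7, 7, 7]

average_rating = sum(task1_ratings) / len(task1_ratings)  # 6.625
percent = average_rating / 7 * 100                         # 7 is the maximum possible rating
print(f"Task 1 SEQ: {percent:.1f}%")                       # 94.6%
```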

5.3 Task Time Calculation



Task 5 (‘text copy of directions to your device’) yielded the fastest completion time with four of
the eight research participants performing the task below the average of 0:49. The longest
amount of time on a single task was on Task 4 (‘generate a street view image of the restaurant’)
by Participant 7 (5:42). Comparatively, five participants completed Task 4 below the average
(2:08). Task 3 (‘designate the date and time of departure’) demonstrated the longest time
average of 2:15 with six of the eight performing under the average. Task 1 (‘finding two
restaurants’) reported an average of 1:47 while Task 2 (‘creating a route’) pulled an average
time of 1:31.

Table 6. Time on Task (mm:ss)

P1 P2 P3 P4 P5 P6 P7 P8 Average
Task 1 0:46 1:14 2:43 2:37 2:21 0:50 3:13 0:38 1:47
Task 2 0:44 0:28 0:20 4:09 0:51 0:45 3:56 1:02 1:31
Task 3 4:15 2:13 2:09 0:45 3:47 1:57 1:32 1:29 2:15
Task 4 1:53 0:48 4:16 0:33 0:17 2:14 5:42 1:26 2:08
Task 5 0:07 0:09 0:59 0:15 0:04 2:06 0:57 2:01 0:49
Average 1:33 0:58 2:05 1:39 1:28 1:34 3:04 1:19
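
Because times were recorded in mm:ss format, averaging them requires converting to seconds first. The sketch below is a hypothetical illustration of that conversion using the Task 5 times from Table 6; it is not the spreadsheet formula the team actually used.

```python
# Sketch: averaging mm:ss time-on-task values, as reported in Table 6.
def to_seconds(mmss: str) -> int:
    minutes, seconds = mmss.split(":")
    return int(minutes) * 60 + int(seconds)

def to_mmss(total_seconds: float) -> str:
    return f"{int(total_seconds // 60)}:{int(total_seconds % 60):02d}"

# Task 5 times copied from Table 6 (P1..P8).
task5_times = ["0:07", "0:09", "0:59", "0:15", "0:04", "2:06", "0:57", "2:01"]
average_seconds = sum(to_seconds(t) for t in task5_times) / len(task5_times)
print(to_mmss(average_seconds))  # 0:49, matching the reported average
```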

5.4 System Usability Scale

The System Usability Scale yielded an average score of SUS = 80, corresponding to a grade of A- on the grading scale. Based on the scores and the grading scale, six of the eight participants (P1, P2, P3, P5, P6, and P8) rated the system as "Acceptable," whereas the scores of two of the eight (P4 and P7) fell into the "Marginal" range. Participant 4 gave the lowest score (SUS = 60), resulting in a grade of D, while Participant 5 gave the highest score (SUS = 95), corresponding to a grade of A+.

Table 7. System Usability Scale Scores

                      P1           P2           P3           P4         P5           P6           P7         P8           Average
SUS Score             85           92.5         77.5         60         95           77.5         67.5       85           80
Grade Scale           A-           A+           B            D          A+           B+           C          A+           A-
Acceptability Range   Acceptable   Acceptable   Acceptable   Marginal   Acceptable   Acceptable   Marginal   Acceptable   Acceptable

Note. Grade scale and acceptability range as outlined by Sauro (2018).
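
For readers unfamiliar with SUS scoring, the standard procedure converts the ten 1-5 responses into a 0-100 score: odd-numbered (positively worded) items contribute the response minus 1, even-numbered (negatively worded) items contribute 5 minus the response, and the sum is multiplied by 2.5. The sketch below illustrates this with a hypothetical response set; the raw per-item responses collected in the Google Form are not reproduced in this report.

```python
# Sketch: standard SUS scoring (0-100). The responses below are hypothetical,
# not a participant's actual answers from this study.
responses = [5, 1, 5, 2, 4, 1, 5, 2, 4, 2]  # items 1-10, each rated 1-5

raw_score = 0
for item_number, response in enumerate(responses, start=1):
    if item_number % 2 == 1:        # odd items are positively worded
        raw_score += response - 1
    else:                           # even items are negatively worded
        raw_score += 5 - response

sus_score = raw_score * 2.5
print(sus_score)  # 87.5 for this hypothetical response set
```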

5.5 Overall Metrics

Table 8 provides a summary of completion rates, SEQ ratings, mean time on task, and the average SUS score. Across all five tasks, the average completion rate was 95%. SEQ ratings averaged 77.1%, with the lowest rating on Task 4 ('generate a street view image of the restaurant') (53.6%) and the highest on Task 1 ('finding two restaurants') (94.6%). The average time on task across all tasks was 1:42. Finally, the average SUS score was 80, indicating an acceptable usability range.

Table 8. Summary of Completion, Time on Task, SUS Scores

Task      Task Completion   SEQ Rating   Mean Time on Task   SUS Average Score
Task 1    100%              94.6%        1:47
Task 2    100%              82.1%        1:31
Task 3    100%              67.9%        2:15
Task 4    87.5%             53.6%        2:08
Task 5    87.5%             87.5%        0:49
Average   95%               77.1%        1:42                80

Thematic analysis using an inductive approach was utilized to help identify patterns in the participants' thoughts regarding their experience during the usability testing process. Data was coded and then arranged into specific themes.

Table 9. Thematic Analysis

Theme: Participant found difficulty with a task.
Keywords: difficult, hard, strenuous, arduous, laborious, tough, demanding
Number of times the theme was detected, by task: Task 1: 4, Task 2: 3, Task 3: 5, Task 4: 7, Task 5: 2

Theme: Participant found the task to be easy.
Keywords: easy, effortless, simple, straightforward, no problem, painless
Number of times the theme was detected, by task: Task 1: 6, Task 2: 9, Task 3: 3, Task 4: 4, Task 5: 6

Theme: Participant identified an element of surprise with a task.
Keywords: surprise, wow, did not realize, can't believe, shock, eye-opener, amazed, woah
Number of times the theme was detected, by task: Task 1: 1, Task 2: 0, Task 3: 0, Task 4: 1, Task 5: 0

The data analyzed during the thematic analysis process indicated that very few elements of surprise (whether positive or negative) surfaced.
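
The coding itself was done by hand from the worksheets and Zoom transcripts, but the keyword-based logic summarized in Table 9 can be approximated programmatically. The sketch below is purely illustrative; the comments in it are invented examples, not participant quotes.

```python
# Sketch: counting theme keywords in transcribed comments, loosely mirroring
# the keyword columns of Table 9. The comments below are invented; the team's
# actual coding was done by reviewing the worksheets and Zoom transcripts.
THEME_KEYWORDS = {
    "difficulty": ["difficult", "hard", "strenuous", "arduous", "laborious", "tough", "demanding"],
    "ease": ["easy", "effortless", "simple", "straightforward", "no problem", "painless"],
    "surprise": ["surprise", "wow", "did not realize", "can't believe", "shock", "eye-opener", "amazed", "woah"],
}

comments = [
    "That was really hard to find, the street view tool was buried.",
    "Sending the directions to my phone was simple.",
    "Wow, I did not realize I could text directions from the desktop.",
]

counts = {theme: 0 for theme in THEME_KEYWORDS}
for comment in comments:
    lowered = comment.lower()
    for theme, keywords in THEME_KEYWORDS.items():
        if any(keyword in lowered for keyword in keywords):
            counts[theme] += 1  # count at most one detection per comment per theme

print(counts)  # {'difficulty': 1, 'ease': 1, 'surprise': 1}
```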

The top two tasks that participants found most difficult based on the comments were “Task #4:
Use the street view to get a panoramic view of the street in front of your restaurant” and “Task
#3: Designate a date and time of departure to arrive by 6pm.”

The top three tasks that participants found the easiest, based on the comments, were "Task #2: Create a route of your preferred method of travel from your location to the selected restaurant," followed by "Task #1: Using Google Maps, find 2 dining establishments that have the highest reviews" and "Task #5: Text a copy of the directions to your mobile device," which received equal scores.

Errors or Problems Identified by the Participants



Task #1: There were no reports of any specific errors or problems identified by the participants
as part of the usability testing.

Task #2: The usability testing showed mixed results on the user’s understanding of using pins
and/or manipulating pin drops to add and/or change the route.

Task #3: The usability testing showed mixed results on each user's ability to understand and use this feature. The option to use this feature appropriately is not clear when selecting a specific mode of travel, and additional guidance and/or prompting on the use of this feature would be preferred. Overall, participants felt this feature is buried and needs to be easier to use.

Task #4: The usability testing demonstrated that participants found it challenging to identify the street view option or the easiest way to navigate to it. Additional guidance and/or prompting on the use of this feature would be preferred.

Task #5: There were no reports of any specific errors or problems identified by the participants
as part of the usability testing.

User Reflections

Task #1: Positives: The usability testing for task number one yielded multiple positive comments. The general consensus showed that the participants appreciated having multiple options available from the interface to search and filter by specifics to complete the task. The participants thought the way search result information was presented made it relatively easy to inform their decision making. Participants were generally satisfied with the simplicity of the search bar and with how quickly search results were returned.

Negatives: The usability testing for task number one yielded one negative comment. The comment indicated that, while you can filter by ratings, price, cuisine, etc., there is no way to search specifically by "fine dining" (as indicated by the task specifics). Additionally, the term "fine dining" is subjective, and the results returned in the search can vary based on reviews or ad sponsorship, potentially skewing the results.

Task #2: Positives: The usability testing results for task number two indicated that most participants found creating a route intuitive and appreciated that multiple route options were suggested and available for selection. All participants indicated that identifying and selecting a mode of travel was an easy task. Other comments included an appreciation for being able to get directions to any place and for the additional information the interface offers that might be helpful depending on the mode of travel (e.g., elevation, terrain).

Negatives: The usability testing results for task number two indicated that participants found the option of dropping a pin for address selection challenging or non-existent, creating barriers to task completion. Further research showed that participants who wish to drop multiple pins on Google Maps need to customize their own map using the Create Map option. This opens a custom map where a user can drop as many pin icons as desired (Dube, 2021).

Task #3: Positives: The usability testing for task number three generated a few positive comments regarding the ease of locating the option to add an arrival/departure time to each route, which helps calculate the duration of travel. Participants commented that this was a nice feature to have to assist with time management and planning.

Negatives: The usability testing of task number three generated a few negative comments about
completing the task when selecting public transit. There is some ambiguity regarding how to
calculate the duration of travel when factoring in the walking distance to arrive at the public
transit depot.

Task #4: Positives: The usability testing generated positive comments with regard to the street view option. Participants were overall satisfied with this feature and found it helpful for familiarizing themselves with the area, points of interest, and the view of the final destination. Comments from those satisfied with the street view option demonstrated appreciation for the feature.

Negatives: The usability testing showed several negative comments and drawbacks regarding the street view option. Participants were not satisfied with the lack of directions and/or instructions for enabling street view. In addition, the map and/or images can be out of date due to closed locations, new construction, etc., so participants should use the street view with caution.

Task #5: Positives: The usability testing showed positive results for this task. Participants felt
the system was user friendly and the functionality was easy to use. Comments overall indicated
that this feature is nice to have and very simple to use.

Negatives: No negative comments were presented for this task as part of the usability testing.

6.0 Design Recommendations


1. Design Recommendation: If a Google Maps user identifies that an image is out of date
and/or incorrect using Google Maps street view, consider implementing an option for a
user to report a problem with the imagery. Google Maps currently offers participants the
ability to report a crash, speed trap, slowdown, construction, lane closure, stalled
vehicle, or an object on the road when navigating. Adding the option to report issues with
imagery would be an easy way to receive feedback from the general public and will help
contribute to keeping Street View up to date.
2. Good Usability: Main features, including but not limited to the search bar and the category buttons (i.e., restaurants, hotels, things to do, etc.) at the top of the page, indicate their functions clearly and allow participants to quickly access the information they are seeking.
3. Design Recommendation: Improve the visibility of actions. Participants struggled to find accessible tools. For example, in task four, participant comments included: "It was buried," "I know what I need, but I cannot get there," and "It needs to be easier to navigate." In task 5, participants continued to struggle to find tools, and one comment was, "I don't know how to do that, oh wait I finally see an icon at the bottom of the screen." From the qualitative analysis, participants said, "It is difficult to find the street view option if you don't know where to find that tool," and "Several landmarks showed up when I chose a panoramic view. Maybe these are supposed to be landmarks," indicating that Google Maps did not direct and lead the user through the correct steps to narrow down what was needed. Also, simplify menu options: participants found the menu options on the Google Maps interface overwhelming. One participant said, "I wish I didn't have to manually enter the address of the locations on the left fields. Dropping pins would have been much easier."

7.0 References

Birkett, A. (2022, December 20). 8 ways to measure UX satisfaction. CXL. https://cxl.com/blog/8-ways-to-measure-ux-satisfaction/#h-1-system-usability-scale-sus

Bruzi, N. (n.d.). Six ways to recruit participants for remote usability testing. Usability Geek. https://usabilitygeek.com/recruit-participants-for-remote-usability-testing/

Dube, R. (2021, August 28). How to drop multiple pins on Google Maps. Lifewire. https://www.lifewire.com/drop-multiple-pins-on-google-maps-5197232

Mifsud, J. (n.d.). Usability metrics: A guide to quantify the usability of any system. Usability Geek. https://usabilitygeek.com/usability-metrics-a-guide-to-quantify-system-usability/

Nielsen, J. (2012a, June 3). How many test users in a usability study? Nielsen Norman Group. https://www.nngroup.com/articles/how-many-test-users/

Nielsen, J. (2012b, January 15). Thinking aloud: The #1 usability tool. Nielsen Norman Group. https://www.nngroup.com/articles/thinking-aloud-the-1-usability-tool/

Panagiotidi, M. (n.d.). Think-aloud testing for usability: How to get more out of your data. Uxspot.io. https://uxspot.io/think-aloud-testing-for-usability.html

Sauro, J. (2010, March 2). If you could only ask one question, use this one. MeasuringU. https://measuringu.com/single-question/

Sauro, J. (2018). 5 ways to interpret a SUS score. MeasuringU. https://measuringu.com/interpret-sus-score/

Usability.gov. (n.d.). System usability scale (SUS). https://www.usability.gov/how-to-and-tools/methods/system-usability-scale.html

8.0 Team Project Reflections


During the planning stages of a usability test, it is important to identify the functions required to complete each task and to operationalize what passing the test means. In addition, the comments written on the spreadsheet in the qualitative analysis section needed to be referenced or labeled in some way so that it was clear exactly which task each comment referred to.
Although our usability test did not implement video-cued retrospective or gaze-cued retrospective methods, the use of traditional methods of collecting data on user behavior and feedback, such as think-alouds, scenarios, observations, and surveys, provided valuable insights into the user's thought process and decision-making. Through the aforementioned methods, our team was able to collect data on user behavior and feedback, including the discovery of severe usability issues with Google Maps, without the need for expensive technology.

9.0 Individual Reflections


Andrea Ali-Rechtfertig: The project taught me the importance of careful planning and
preparation and the various methods of data collection, along with the use of technology. During
the project, I learned a lot about the available technology which included knowing when to use
them appropriately. I gained an understanding of the various methods of data collection and the
different technologies that can be used to gain valuable insights from the participants. It was
interesting to learn about the GCR and VCR industry-standard cognitive methods and how they
can be used to identify specific usability issues. I would like to add that my team members were
a significant part of the success of the pilot test project, which helped us to achieve our goals
effectively. I would like to further acknowledge the exceptional skills of my team members
Christina, Liz, and Gerard. Christina was an excellent communicator, and she ensured that
everyone was on the same page throughout the project. Liz was exceptional in her ability to
analyze and further investigate a topic and method. Her skills in data analysis were instrumental
in helping us to analyze the data collected from the participants accurately. Gerard was the
most organized person I have ever worked with. His skills in project management kept us on
track and organized throughout the project. His attention to detail ensured that no aspect of the
project was overlooked, and everything was completed in a timely manner. Overall, each
member brought unique skills and strengths to the project, and I am grateful for the
contributions. Working collaboratively with them has been a great learning experience!

Elizabeth (Liza) Armstrong: I appreciated working on a team for our usability project. It
facilitated generating ideas, such as choosing to focus on Google Maps and selecting/refining
our 5 tasks. It also lightened the workload in terms of testing 8 users, which would have been
much more burdensome for a single researcher. As a team, I also enjoyed gaining new
perspectives and recommendations. For example, we often shared articles with one another
that could support particular sections. After reading our results section, written by a team
member, I also realized that my understanding of how to accomplish certain tasks in Google
Maps was limited. For example, to find the rating for the restaurants, the users I observed

hovered over restaurants to find their ratings; however, some of my team members’ participants
filtered the locations in Google to restaurants, which created a list in which the ratings for nearby
restaurants were provided. This situation highlighted for me that it would have been useful as a
group to operationalize and define each path to “pass” each task in the initial stages of our
project. Working as a group also made me aware of the effective strategies that my group
members were using in terms of organizing meetings, sending email updates with recaps, and
even formatting Table of Contents in Google. My group members came to our group already
having strong team working skills and set us up for success early on, such as by creating
observation worksheets and Google sheets for consistent testing and amassing of our data.
This process highlighted the importance of these collaborative skills in creating a successful final usability project.

Gerard Castaneda: This project provided me with insight to the nuances associated with
usability testing and the importance of evaluating systems holistically. I learned a lot from
working with my team on the design and implementation of this test that I would not have
realized otherwise. This collective effort allowed me the opportunity to view project tasks and
testing approaches from a variety of different angles. Furthermore, the use of developing
metrics and incorporating qualitative and quantitative data to support decision making will be
extremely valuable in my future work. I value the need for enhancing the user experience and I
will take lessons learned from this project and course forward as I strive to continue to improve
the overall system design of my future technology projects.

Christina Mayne: Participating in this project over the duration of the semester leading up to
the final usability testing report project has been an insightful experience. Although I had no
background in usability testing, it became obvious very quickly that usability testing is a
necessary, essential process to ensure that you are able to build an effective, enjoyable
experience for your participants whether they are interacting with a website, app, or product. I
have learned that usability testing is the primary means of identifying areas of trouble or
confusion and/or to identify areas of opportunities for improvement. During the design process
of an app, website, or product, it can be common for those who are working on a design project
to lose sight of potential usability issues as their own in-depth knowledge and close connection
to the design can blind them from seeing usability challenges. Therefore, it is important to bring
in new participants to participate in usability testing to determine if there are any usability issues
or bugs, confusion, and confirm a user’s ability to complete specific actions in a functional and
efficient way. Throughout this project, I have learned many different things in regards to usability
testing, but the main take-home message can be summarized with the following quote: "To find ideas, find problems. To find problems, talk to people." - Julie Zhuo

Appendix A
Testing Procedure

1. Read the orientation script to the participant.

2. Obtain demographic information and document it on the Demographic and Usability Test Results Google Sheet.

3. Turn on the recording device to be used for the session (Audacity/Zoom).

4. Begin Task #1:
   1. Read the task.
   2. Start the timer.
   3. Document think-aloud items (as the task is being completed) on the Demographic and Usability Test Results Google Sheet.
   4. Document positive items identified (as the task is being completed).
   5. Document negative items identified (as the task is being completed).
   6. End the timer.
   7. Document whether the participant passed or failed the completion of the task.
   8. Document the time in minutes and seconds used to complete the task.
   9. Verbally ask the Single Ease Question (SEQ): "Overall, how would you rate this task to complete? 1-Very Difficult to 7-Very Easy"
   10. Document the SEQ score.

5. Repeat the procedure used for Task #1 for the remaining tasks #2 through #5.

6. Bring up the System Usability Scale (SUS) questions found on the Google Form on the computer and ask the participant to answer the SUS questions (or ask them verbally if preferred): https://forms.gle/xbS2QZapdSKcGbL66.

7. Thank the participant for their time and notify them that the test is concluded.

Appendix B

Orientation Script

[Administrators read the script below to participants at the start of each usability test
session.]

Hello, my name is [moderator]. Thank you for participating in today’s session. The purpose
is to evaluate the usability of Google Maps, particularly 5 tasks. We would like to keep this
session to about 45 minutes. Do you still wish to participate in this study?

Note: If the participant declines to participate, thank the participant and conclude the interaction. A new participant must be found.

In this session, I will ask you a few questions about yourself, and then read a brief scenario
and ask you to complete the 5 tasks in Google Maps. As you complete these activities,
please try to do whatever comes naturally to you, and I will observe you and take notes
while you do them. To aid in my observation, please try to think out loud while you’re
working on the tasks; in other words, let me know whatever comes to mind. I may also
record our session with (Audacity/Zoom) to support my observation. Please be aware that
this recording is only for reference purposes and the recording will be deleted after the
transcriptions of our interaction are completed. Do you consent to this session being
recorded?

Note: If the participant declines to be recorded, you may proceed with the test but
you will not be allowed to record the session.

After you complete each task or need to move on, please let me know, and I will ask you to
rate the difficulty of each task. At the end of our session, you will complete a brief online
survey about Google Maps.

Be aware this session is not a test, and there are no incorrect answers. If you have any
questions during the process, please ask. Additionally, if you feel uncomfortable, you may
stop at any time. Your feedback is valuable and will help us determine the usability of
Google Maps. Do you have any questions before we begin?

Appendix C

Demographics and Usability Test Results

This document is a duplicate of the questions on the Google Sheet Usability Pilot Test - Group 2 Data.

This document can be printed and used if you prefer to write notes on a separate paper before entering data on the Google Sheet.

Demographics:

Age (in years): ________________

Highest Educational Level of Test Participant:

Doctoral Degree
Masters Degree
Bachelor's Degree
High School Diploma
GED
Some High School
Elementary School

Is the Test Participant Hearing Impaired?

No
Yes

Is the Test Participant Visually Impaired?

No
Yes

What is the Test Participant's Native Language?

English
Not English

How does the Test Participant classify their Technical Proficiency?

Novice
Intermediate
Advanced

What Type of Computer was used for this test?

PC
MAC

Web Browser (We will attempt to use Chrome as 1st Choice)

Chrome

If not Chrome, enter what was used



Demographics and Usability Test Results

Task #1
Scenario:

You are taking your significant other (or close friend) for a special dinner next Friday at
6:00 p.m.

Task 1: Using Google Maps, find 2 dining establishments that have the highest reviews.

1. Did the participant Pass or Fail completing this task?

PASS
FAIL

2. Time in Minutes and Seconds (e.g. 2:45 = 2 min 45 sec)

_______________________ minutes:seconds

3. Positive Comments about the system

4. Negative Comments about the system

5. Highlights from participant's "Think-a-loud" thought process

6. SEQ Question: Overall, how would you rate this task to complete?

1-Very Difficult to 7-Very Easy

Rating = ________________

Demographics and Usability Test Results

Task #2

Task 2: Create a route of your preferred method of travel from your location to the
selected restaurant.

1. Did the participant Pass or Fail completing this task?

PASS
FAIL

2. Time in Minutes and Seconds (e.g. 2:45 = 2 min 45 sec)

_______________________ minutes:seconds

3. Positive Comments about the system

4. Negative Comments about the system

5. Highlights from participant's "Think-a-loud" thought process

6. SEQ Question: Overall, how would you rate this task to complete?

1-Very Difficult to 7-Very Easy

Rating = ________________

Demographics and Usability Test Results

Task #3

Task 3: Designate a date and time of departure to arrive by 6pm.

1. Did the participant Pass or Fail completing this task?

PASS
FAIL

2. Time in Minutes and Seconds (e.g. 2:45 = 2 min 45 sec)

_______________________ minutes:seconds

3. Positive Comments about the system

4. Negative Comments about the system

5. Highlights from participant's "Think-a-loud" thought process

6. SEQ Question: Overall, how would you rate this task to complete?

1-Very Difficult to 7-Very Easy

Rating = ________________

Demographics and Usability Test Results

Task #4

Task 4: Use the street view to get a panoramic view of the street in front of your
restaurant.

1. Did the participant Pass or Fail completing this task?

PASS
FAIL

2. Time in Minutes and Seconds (e.g. 2:45 = 2 min 45 sec)

_______________________ minutes:seconds

3. Positive Comments about the system

4. Negative Comments about the system

5. Highlights from participant's "Think-a-loud" thought process

6. SEQ Question: Overall, how would you rate this task to complete?

1-Very Difficult to 7-Very Easy

Rating = ________________

Demographics and Usability Test Results

Task #5

Task 5: Text a copy of the directions to your mobile device.

1. Did the participant Pass or Fail completing this task?

PASS
FAIL

2. Time in Minutes and Seconds (e.g. 2:45 = 2 min 45 sec)

_______________________ minutes:seconds

3. Positive Comments about the system

4. Negative Comments about the system

5. Highlights from participant's "Think-a-loud" thought process

6. SEQ Question: Overall, how would you rate this task to complete?

1-Very Difficult to 7-Very Easy

Rating = ________________

Appendix D

Description of Functions in Google to Successfully Complete each Task

Evaluators expected participants to use the Google Maps functions listed below for each task. These descriptions were used to determine if the user passed or failed each task.

Task 1: Using Google Maps, find 2 dining establishments that have the highest reviews.
Relevant Google Maps features. Users must:
● enter a location in Google Maps
● hover over several icons with either a wine glass or a fork and knife (indicating restaurants). By hovering, users can see the number of stars out of a 5-star rating; the higher the star rating, the better the restaurant is considered. OR filter places on the map to restaurants (select "More" and then "Restaurants"), then skim through the list of restaurants and identify two with the most stars.
Pass: Name 2 restaurants with the highest stars near the chosen location.

Task 2: Create a route of your preferred method of travel from your location to the selected restaurant.
Relevant Google Maps features. Users must:
● select a restaurant on Google Maps
● select the "Directions" button in Google Maps
● enter their starting location
● if desired, change the method of travel from driving (car icon) to a different method (e.g., select the man for walking or the train for public transit)
Pass: Point to a route that you prefer to get from your location to the selected restaurant.

Task 3: Using a car or transit, designate a date and time of departure to arrive by 6 p.m.
Relevant Google Maps features. Users must:
● select "Arrive by" from the options
● select Friday
● enter 6 p.m.
Pass: Given the time that Google estimates for the route selections, users must accurately say the time they must leave their starting location to arrive by 6 p.m.

Task 4: Use the street view to get a panoramic view of the street in front of your restaurant.
Relevant Google Maps features. Users must:
● select the "little man" in the lower right corner of Google Maps and then select the street in front of the restaurant, OR drag the "little man" onto the street in front of the restaurant (both methods result in "Street View")
● move the cursor to get a view of the street panoramically
Pass: Users see and move a street view image in front of their selected restaurant.

Task 5: Text a copy of the directions to your mobile device.
Relevant Google Maps features. Users must:
● exit from the street view by selecting "x" (top right of the screen)
● select "Send directions to your phone," a link on the left side of Google Maps
● select a phone or email address to send the directions to, OR add a device to do so (by following the directions in Google Maps to "Connect a device to your Google Account")
Pass: Users receive the directions on their mobile device.

Appendix E

SUS Survey (Sauro, 2010)

In Google Forms (https://forms.gle/xbS2QZapdSKcGbL66)

Each item is rated on a scale from 1 (Strongly Disagree) to 5 (Strongly Agree).

1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.
