Enhancing User Experience of Interior Design Mobile Augmented
Reality Applications
Keywords: User Interface, Augmented Reality (AR), Indoor Decoration, Mobile Augmented Reality Application.
Abstract: Intuitive user interface design is of utmost importance for mobile applications, especially when dealing with
new technologies like Augmented Reality (AR). In this paper, a user study evaluating the user experience of an
AR 3D furniture arrangement mobile application is presented. Our gesture design targets one-handed use to ease
the use of the application. First, the user interface is developed based on recommendations from the literature, and
users evaluate it through a set of five tasks in terms of the System Usability Scale (SUS), the Handheld Augmented
Reality Usability Scale (HARUS), task completion time, and the number of user errors. The obtained
evaluation results are then used to refine the user interface. The research outcome can help in
developing a better user experience for a wider range of AR applications.
Kandil, A., Al-Jumaah, B. and Doush, I.
Enhancing User Experience of Interior Design Mobile Augmented Reality Applications.
DOI: 10.5220/0010630400003060
In Proceedings of the 5th International Conference on Computer-Human Interaction Research and Applications (CHIRA 2021), pages 101-108
ISBN: 978-989-758-538-8; ISSN: 2184-3244
Copyright © 2021 by SCITEPRESS – Science and Technology Publications, Lda. All rights reserved
of the task should require very little memorization, and textual information should be in an understandable format and easy to read.

Singh and Singh (2013) presented the main features of an AR application and its challenges. First, an AR application needs suitable sensors to read the environment properly and recognize the scene. Second, after recognizing the scene, the AR application needs to use trigger matching and image augmentations to understand the discovered scene and place/display the augmented information. Third, an AR application needs to provide technologies that allow interaction between the user and trigger matching. Fourth, an AR application should provide an information infrastructure, such as cloud services, to support the user's longer-term context. Lastly, AR requires considerable computing and communication infrastructure to allow all the previous technologies to work as expected. Users of an AR application can face several challenges, such as information overload.

2.2 AR for Interior Design

Developing interior design solutions using AR can help ease the process of placing different pieces of furniture virtually into the physical room. Tong et al. (2019) proposed a real-time AR application called AR Furniture which allows users to see furniture in different colors and styles. The solution utilizes deep-learning-based semantic segmentation and fast color transformation. The system uses eye-gaze to read the environment and provides virtual reality (VR) content.

Hui (2015) proposes the AR3D model, which creates a 3D model based on the original interior construction plan. The AR3D model combines different sources of information to provide a generated environment with real and virtual objects. It allows users to interact with and get real-time feedback on the changes made, as well as showing the relationship between the virtual and real objects and how both can influence each other in the same space. The main advantage of the AR3D model is that it helps reduce design errors caused by inappropriate spatial partitioning, management, and construction problems. Moreover, AR3D models can help customers understand the interior design project, saving cost and time for the designer once the physical construction is done. Finally, the AR3D model allows the designer to try every design concept available from the user and get feedback in real time.

Tsai et al. (2016) present an approach that uses AR to model 3D objects of various home products and appliances. The application supports portrait view only, and the UI was designed to accommodate two-handed use. As a result, the program's main buttons were placed on the lower edges, making one-handed use impractical.

Seow (2018) developed a 3D furniture AR application. The solution allows users to rotate the 3D furniture by pressing a button once to activate clockwise rotation; when pressed again, the object rotates counterclockwise. Because furniture arrangement is cumbersome, Motwani et al. (2017) propose to solve this issue using AR on mobile devices. Their solution uses “Image Targets” that serve as reference points for the 3D objects being rendered in real time.

3 METHODOLOGY

Our goal is to enhance user interaction when using AR interior design applications. We developed an AR solution called Furniture Augmented Reality App (FAR App). FAR App is developed using Android Studio, with Kotlin and Java as the main programming languages. Google's ARCore is used to implement the AR component of the application. In addition, Google's Sceneform was used to render the 3D furniture in the real-world scene.

3.1 UI Design Choices

Our application uses the smartphone's touch screen as the main source of input. For the application to be easy to use, we were careful in the placement of buttons on the screen to ensure that they can be easily reached by the fingers even in one-handed use. We also made sure that the main camera screen was not cluttered, so as not to obstruct the camera view while also not overwhelming the user with options.

The hand gesture interface for interacting with the 3D furniture was selected based on its familiarity to users from other applications on the phone. The following gestures were selected: pinch to zoom, tilt to rotate, and tap to place an object. In addition, a brief tutorial is available to help users get familiar with the app's different gestures. Some other features were suggested to us by users during the first phase of testing, such as the “undo last item” button, among others, to be detailed in a later section.
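The pinch-to-zoom and rotation gestures described above ultimately reduce to small transform updates on the selected furniture node. The following is a minimal sketch of that mapping, with a hypothetical helper class and illustrative scale limits that are not from the actual FAR App code; in Sceneform, the resulting values would feed a node's setLocalScale and setLocalRotation.

```java
// Hypothetical helper mirroring the pinch-to-zoom and rotation gestures
// of Section 3.1. Class name and limits are illustrative, not FAR App code.
public class GestureTransform {
    static final float MIN_SCALE = 0.25f; // assumed lower bound
    static final float MAX_SCALE = 4.0f;  // assumed upper bound

    // Apply a pinch factor to the current scale, clamped so furniture
    // cannot shrink to nothing or dwarf the room.
    public static float pinchScale(float current, float pinchFactor) {
        float next = current * pinchFactor;
        return Math.max(MIN_SCALE, Math.min(MAX_SCALE, next));
    }

    // Accumulate a rotation gesture into a yaw angle, wrapped to [0, 360).
    public static float rotateYaw(float currentDeg, float deltaDeg) {
        float next = (currentDeg + deltaDeg) % 360f;
        return next < 0 ? next + 360f : next;
    }
}
```

Clamping and wrapping keep the object in a predictable state no matter how aggressively the user gestures, which supports the one-handed interaction goal.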
The first step in evaluating our application is to set certain tasks that are unified for all users to follow. The following tasks were used to evaluate the proposed solution: choosing and placing specific virtual furniture, searching for a 3D object, resizing the selected object, rotating an object, repositioning an object, and clearing the scene (i.e., removing all objects).

The evaluation was conducted to gauge the efficiency of individual task completions and to get feedback from users for improving the usability of the proposed solution.

3.3.2 Evaluation Process

4.1 First Version of the FAR Application

4.1.1 SUS Result

The SUS questionnaire is used to evaluate the overall application usability. The following table shows the 10 SUS questions used for the evaluation:

Table 1: SUS Questionnaire.

SUS Question | Relevance to Application Usability
Q8 | I found this application very cumbersome/awkward to use. | Measures the user friendliness of the application.
Q9 | I felt very confident using this application. | Measures the simplicity of the user interface.
Q10 | I needed to learn a lot of things before I could get going with this application. | Measures the learnability of the application.

Each question of the SUS is scored on a five-level scale. The results are demonstrated in Figure 1. According to the results, only 40% of participants said they would use the application frequently, 20% thought that they would not, and 40% were neutral. Moreover, 72% of the participants thought that the application was not unnecessarily complex. In terms of learnability, 80% of participants thought that they did not need assistance before using the app. In addition, 92% of participants thought that most people would learn to use this application very quickly, and 84% did not feel that they had to learn a lot of things before using the app. This illustrates that the UI created in the first version of the app was a success in terms of learnability, with the majority of the participants agreeing that it is easy to learn to use the application quickly and without complications. Moving on to another important aspect of the application, its various functions: below is a chart that summarizes the SUS questionnaire.

Table 2: HARUS Questionnaire.

HARUS Question | Relevance to Application Usability
Q1 | I thought that the information displayed on screen was confusing. | Testing the novel visualization metaphors that are introduced by AR.
Q2 | I think that interacting with this application requires a lot of body-muscle effort. | Testing the application while moving around the real environment.
Q3 | I felt that using the application was comfortable for my arms and hands. | Measures the strain on the hands and arms when using the application.
Q4 | I found it easy to input information through the application. | Testing the novel interaction metaphors that are introduced by AR.
Q5 | I think the application is easy to control. | Testing the novel interaction metaphors that are introduced by AR.
Q6 | I think that interacting with this application requires a lot of mental effort. | Measures the amount of information presented on the small screen.
Q7 | I thought the amount of information displayed on screen was appropriate. | Testing the novel visualization metaphors that are introduced by AR.
Q8 | I felt that the information display was responding fast enough. | Measures the latency issues that result from limited processing power and network connection.
Q9 | I felt that the display was flickering too much. | Measures the tracking and registration errors that result from factors such as dynamics and lighting.
Q10 | I thought the words and symbols on screen were easy to read. | Measures the legibility issues that result from ambient light, glare, etc.
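For reference, raw five-point SUS responses are converted to the 0-100 SUS score in the standard way: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5. A small sketch of this conversion follows (a hypothetical helper, not the authors' analysis code):

```java
// Standard SUS scoring: 10 items rated 1-5, producing a 0-100 score.
public class SusScore {
    public static double compute(int[] responses) {
        if (responses.length != 10)
            throw new IllegalArgumentException("SUS has 10 items");
        int sum = 0;
        for (int i = 0; i < 10; i++) {
            // Odd-numbered items (index 0, 2, ...) are positively worded;
            // even-numbered items (index 1, 3, ...) are negatively worded.
            sum += (i % 2 == 0) ? responses[i] - 1 : 5 - responses[i];
        }
        return sum * 2.5;
    }
}
```

Scores above roughly 68 are conventionally read as above-average usability, which is why a single 0-100 number is useful when comparing the two versions of the application.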
Each question of the HARUS scale was scored using a five-point scale. The results are presented in Figure 2. They show that 84% of the participants thought that the information displayed on the screen was not confusing. These results show that our application succeeded in allowing users to understand the novel visualization metaphors introduced by the AR concept. Most MAR applications are used while moving in the real world, so some of them can be very hard to use if they demand a lot of body-muscle effort. In our evaluation, 88% of users thought that the application did not need a lot of body-muscle effort. Furthermore, our application showed great results in terms of minimal strain on the users' hands, as 84% of the participants thought that their hands were comfortable while using the application. Additionally, the application gave users a comfortable experience: 72% did not find the device difficult to hold while using the application. Also, 84% of users thought that the application was easy to use. This shows that our application introduced simple and easy novel interaction metaphors that let users understand how to use the application and all its features without confusion. MAR applications are susceptible to showing too much information on the small screens of most mobile phones, which can require a lot of mental effort from users to learn the application. The evaluation shows that 84% of users did not think that interacting with our application required a lot of mental effort. We minimize the mental effort with a simple and intuitive UI that does not include many buttons on the main AR interface activity. Small screens are not the only limitation of mobile phones; limited processing power and network connection can also limit their AR capabilities. 76% of participants thought that the information displayed was responding fast enough.

4.1.3 Time-on-Task Result

The final part of our evaluation was calculating the time-on-task to measure the efficiency of completing each task successfully. Since the application was developed during the COVID-19 pandemic, some participants were evaluated online, while others were tested in person. Participants tested online sent us screen recordings, from which we calculated the time it took them to perform each task successfully.

The minimum, maximum, and average time-on-task across all users for each task is presented in Figure 3. The results show that the task with the highest average was the first task, and the second-highest was Task 5. This demonstrates that most users took a long time to find the side menu holding the virtual furniture to place in the scene, and also to find the clear button, which was likewise placed in the side menu.

Figure 3: Version 1 Min, Max and Avg task times.

4.1.4 User Recommendations

After testing, the participants sent multiple suggestions on how to improve the user experience after using the app. Many participants wanted more options and buttons on the main UI screen. One participant suggested adding an undo button, while another suggested a clickable button for the side menu instead of dragging from the side.

The following features were suggested: a dedicated button for the side menu / floating menu, an undo button, moving the “clear scene” button outside the side menu, tweaking the tutorial, and tweaking the calibration instructions.

The last question on the survey asked participants if they encountered any problems with the application. Some participants reported that the tutorial shown when they start the app is too lengthy. Some of the participants experienced minor lagging and hiccups in the overall performance of the app. Others
reported that they did not know there was a menu at the side due to the lack of a visual indicator (see Figure 4). One participant reported that the initial calibration was inaccurate on white floors because the calibration dots were white as well.

4.2 Second Version of FAR App

After analyzing the feedback given by the participants in the first testing round, the side menu was removed and replaced by a pop-up menu button on the main screen. The second modification was shortening the tutorial to be short and precise. In addition, a visual indicator was added for the menu, as demonstrated in Figure 6.

The participants that were already tested on the first version evaluated the application again after one month. In addition, they were not given any time to familiarize themselves with the application, and the tasks were reordered to ensure unbiased results.

4.2.1 SUS Result

Using the same SUS questions as in the first version of the application (Table 1) and following the same procedure mentioned before, we gathered the responses presented in Figure 6.

Figure 6: Version 2 SUS Result.

In comparison to the first version, in terms of frequency of use, 59% of participants felt that they would use FAR App more often after the UI changes. Another improvement is the reduced complexity of the application, with 92% of participants stating that it is not unnecessarily complex to use. Furthermore, 100% of participants found the application easy to use this time. Also, 92% of users did not need assistance to use the application properly, compared to 68% in the first version. Lastly, 95% did not find the application awkward/cumbersome to use, while 92% felt confident using the application, up from 72% and 76% respectively.
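Percentages like those above are agreement rates over the five-point responses; the paper does not state the cut-off, so the helper below assumes that ratings of 4 and 5 count as agreement (a hypothetical sketch, not the authors' analysis code):

```java
// Hypothetical helper: share of five-point responses at or above a
// cut-off, rounded to a whole percentage as reported in the text.
// The cut-off of 4 (agree / strongly agree) is an assumption.
public class AgreementRate {
    public static long percentAgreeing(int[] responses, int cutoff) {
        long agreeing = 0;
        for (int r : responses) {
            if (r >= cutoff) agreeing++;
        }
        return Math.round(100.0 * agreeing / responses.length);
    }
}
```

Computing the same rate over both testing rounds gives the before/after pairs quoted in this section.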
5 DISCUSSION

As is apparent in the results, the tasks specifically affected by the changes made in the second version of the application are object placement / searching and the “Clear all” function. These two tasks were affected because the items menu was made easier to access and because the “Clear all” button is directly accessible from the main

Figure 11: For example, instead of placing a chair, the user places a sofa.
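Per-task comparisons like these rest on the time-on-task aggregates reported in Section 4.1.3 (Figure 3); the minimum, maximum, and mean for one task can be reproduced from the per-participant timings with a simple aggregation (a hypothetical sketch, not the authors' scripts):

```java
import java.util.Arrays;

// Hypothetical helper: summarize one task's completion times in seconds.
public class TaskTimes {
    // Returns {min, max, mean} for the given per-participant timings.
    public static double[] summarize(double[] seconds) {
        double min = Arrays.stream(seconds).min().orElseThrow();
        double max = Arrays.stream(seconds).max().orElseThrow();
        double mean = Arrays.stream(seconds).average().orElseThrow();
        return new double[]{min, max, mean};
    }
}
```

Running this per task for each version yields the before/after bars that make the menu-access improvement visible.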