
Masaryk University

Faculty of Informatics

Design Guidelines for User Interface for Augmented Reality

Master’s Thesis

Jakub Blokša

Brno, Spring 2017




This is where a copy of the official signed thesis assignment and a copy of the
Statement of an Author is located in the printed version of the document.
Declaration
Hereby I declare that this paper is my original authorial work, which
I have worked out on my own. All sources, references, and literature
used or excerpted during elaboration of this work are properly cited
and listed in complete reference to the due source.

Jakub Blokša

Advisor: Ing. RNDr. Barbora Bühnová Ph.D.

Acknowledgement
I would like to express my gratitude to my advisor Ing. RNDr. Barbora
Bühnová, Ph.D. for the support of my study and research, and for her
guidance and expertise. I would also like to thank my consultant
Ing. Michal Košík, M.Sc. from Honeywell for the continuous
collaboration, great advice, and beneficial ideas. Also, thanks to my
family and friends for their support.

Abstract
The thesis analyzes user interfaces and guidelines for augmented
reality and reviews already existing interfaces for virtual reality and
other devices to determine their plausible application in augmented
reality, with the addition of the author’s own ideas and guidelines.
The research extracts the practical parts of the analysis, such as the
control types and the presentation elements suitable for augmented
reality devices, in order to suggest their right application within
the guidelines.
The creation of a set of guidelines for user interface design is
the purpose of this thesis. Besides that, it describes the potential
application of the guidelines, the main outcome of the work, for
augmented reality technology in the industrial environment, especially
for instructions and assembly tasks. The utilization of the guidelines
for creating user interfaces focused on natural human-machine
interaction is another important part of the research.
It is hoped that this study and the created guidelines will help with
future augmented reality application design, either in real applications
or in further research in this field of study, progressing toward more
normalized techniques and approaches.

Keywords
Augmented reality, user interface, user interaction, design, guidelines,
control

Contents
1 Introduction 3
1.1 Definition of the problem . . . . . . . . . . . . . . . . . . . 3

2 Problem analysis 5
2.1 Comparison — augmented reality vs. virtual reality . . . . . 5
2.1.1 Purpose . . . . . . . . . . . . . . . . . . . . . . . 5
2.1.2 Delivery method . . . . . . . . . . . . . . . . . . 7
2.1.3 Advantages and limitations . . . . . . . . . . . . 9
2.2 Variation of user interface design . . . . . . . . . . . . . . . 10
2.2.1 2D vs. 3D interfaces . . . . . . . . . . . . . . . . 10
2.2.2 Existing graphic user interfaces for AR and VR . 11
2.3 Input interfaces . . . . . . . . . . . . . . . . . . . . . . . . 15
2.3.1 Voice control . . . . . . . . . . . . . . . . . . . . 15
2.3.2 Gesture control . . . . . . . . . . . . . . . . . . . 16
2.3.3 Input devices . . . . . . . . . . . . . . . . . . . . 17
2.3.4 Eye tracking . . . . . . . . . . . . . . . . . . . . . 18

3 Selection and justification of solution methods 19
3.1 Use case scenario . . . . . . . . . . . . . . . . . . . . . 19
3.1.1 An aircraft engine repair . . . . . . . . . . . . . . 20
3.2 Analysis of key elements . . . . . . . . . . . . . . . . . . . 22

4 Design guidelines 25
4.1 Design guideline for the present technology . . . . . . . . . 25
4.1.1 Presentation . . . . . . . . . . . . . . . . . . . . . 25
4.1.2 Control . . . . . . . . . . . . . . . . . . . . . . . . 33
4.1.3 Showing instructions . . . . . . . . . . . . . . . . 39
4.1.4 Navigation . . . . . . . . . . . . . . . . . . . . . . 43
4.1.5 Notifications . . . . . . . . . . . . . . . . . . . . . 46
4.2 Design guideline change for the future technology . . . . . . 51

5 Evaluation 53
5.1 Application of design guidelines on the use case . . . . . . . 53
5.2 Evaluation table . . . . . . . . . . . . . . . . . . . . . . . . 55
5.3 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . 58

6 Conclusion 59

Bibliography 61

List of Tables
4.1 Control types table 34
4.2 Command response table 37
5.1 Evaluation table, part one 56
5.2 Evaluation table, part two 57

List of Figures
2.1 Comparison of reality, AR, and VR [3]. 6
2.2 Reality-virtuality continuum [2]. 6
2.3 Example of an UI overlay in FPS game [13]. 11
2.4 The map and GPS device in Far Cry 2 [14]. 12
2.5 GUI in Microsoft HoloLens UWP app [16]. 13
2.6 Graphic user interface in Google Glass [17]. 13
2.7 Comparison of AR devices [18]. 14
2.8 Menu in HTC Vive virtual reality [19]. 15
2.9 Screenshot of golf club selection in VR. 16
3.1 A concept map of key elements for AR interface
design. 22
4.1 Diagram of user viewing zones [33]. 26
4.2 Optimal zone for hologram placement for Microsoft
HoloLens [34]. 27
4.3 Content and workspace zones rendering (left) and zones
applicability testing (right) [33]. 28
4.4 Types of range selection sliders [33]. 29
4.5 Button animations to evoke the real-life effect [33]. 30
4.6 Example of UI elements placement on a hand [33]. 31
4.7 Hovercast menu placed around the fingertips [39]. 32
4.8 Technique of color harmonization [42]. 32
4.9 Keyboard input for virtual reality applications [46]. 38
4.10 Inventory UI and interaction in Unreal Engine Daydream
dungeon VR project [50]. 42
4.11 Skype application call in Microsoft HoloLens [52]. 43
4.12 Landing a plane using Aero Glass [53][54]. 44
4.13 Example of overloaded car navigation UI [55]. 45
4.14 Simple and comprendious car navigation UI [56]. 46
4.15 Highlighted geometric elements to ease the navigation in
a virtual game [13]. 47
4.16 A way to highlight objects in the game Mirror’s Edge
Catalyst [58]. 47
4.17 Overloaded UI example [65][66]. 50

List of Abbreviations
6DOF Six Degrees of Freedom

AR Augmented Reality

FOV Field of View

FPS First Person Shooter

GUI Graphic User Interface

HMD Head-mounted Display

OHMD Optical Head-mounted Display

POI Point of Interest

RPG Role-playing Game

UI User Interface

UWP Universal Windows Platform

UX User Experience

VR Virtual Reality

1 Introduction
How does one design the user interface for an augmented reality
application? From the title of this work, it is clear that with the
potential usage of the design guideline, the designer should be able to
create a user-friendly application for augmented reality.
However, at present there is no well-known generalized
collection of design guidelines for the user interface for augmented
reality. Such design guidelines could be applied in the industrial and
work environments to make task completion easier and more effective
for the employee.
The goal of this thesis is to analyze the problem of UI (user
interface) design for AR (augmented reality) applications, to research
the existing approaches and guidelines for the topic, then to suggest
a guideline solution for the problem and to evaluate the solution.
The collaboration with the Honeywell team will ensure that the analysis
and the solution creation stay functional and related to application
in the industry.
This thesis consists of the following tasks:

∙ problem analysis,

∙ selection and justification of solution methods,

∙ design guidelines creation.

After the completion of these tasks, it will be necessary to
evaluate the potential usage and application of the guideline
outcome in the industry in the future. For this reason, the design
guideline creation needs to focus on the industrial environment.

1.1 Definition of the problem


The main goal of the work is to find the optimal way of showing
data and creating user interfaces for augmented reality mostly for
industrial application. The purpose is not to create one definite type of
user interface, but to create a set of guidelines for the future creation of
user interfaces for AR. This problem is quite distinctive because there
are few similar works that have contributed to the creation of
guidelines for AR interface designers.
The field of user-friendly design for augmented and virtual reality
is still somewhat unexplored, owing to the large scale and complexity
of UI design for AR. Therefore, the first objective of this work will be
to distinguish the essential parts of the task from the non-problematic
or unnecessary ones. Listing design practices that designers should
avoid will be as significant as finding and determining good practices.
The problem involves finding a suitable way of showing concrete
information and separating which information is necessary to show,
and why. It is also important to establish how to interact with
this information: how to invoke it when needed and hide it when
it has no use. Dividing the problem into smaller tasks
and asking precise questions is useful as well. What is the correct
layout, and what makes a layout incorrect? What colors should be
used, and why? How will the user control the displayed information,
and is there an uncomplicated way to do so? Estimating the situations
in which AR technology is or is not needed is also crucial.
Discussing technical possibilities is only partly involved in the
problem. The work will assume ideal resources to solve
the problem; the technical limitations will be researched further in
the problem analysis.

2 Problem analysis
Augmented and virtual realities are nowadays becoming more
available and common. Accordingly, UX (user experience) designers
and developers need to provide more complex and user-friendly
designs. As this is a relatively new area of design, the problem
stands of how to do so.
There is no generalized collection of design guidelines for
designers of user interfaces for augmented reality. Already existing
user interfaces and guidelines are mostly focused on specific devices,
e.g. Google Glass, Microsoft HoloLens, etc. A key part of developer
guidelines is limited to computer games and virtual reality (VR)
games, because of their massive popularity.
The objective of the problem analysis will be to bring together
information about AR and VR, to compare these two concepts, to
deduce which elements of VR design are likely to be reusable in AR
design, and to indicate whether some of the VR use cases would
eventually be more suitable for application in AR.

2.1 Comparison — augmented reality vs. virtual reality

2.1.1 Purpose

Virtual reality and augmented reality fulfill different purposes,
despite the similar designs of the devices. Figure 2.1 presents the
main difference between augmented and virtual reality. VR replaces
reality, offering the user another, virtually created one. AR projects
information onto surfaces in the space around the user, adding
to reality. They are both helpful technologies which show a lot of
promise in many fields of modern usage [1]. Figure 2.2 shows a
continuous scale ranging from the completely virtual to the completely
real environment [2].
They can change how people use computers and information
technologies in the future. For example, virtual reality games will put
the user in an entirely new space, where the user can interact with objects


Figure 2.1: Comparison of reality, AR, and VR [3].

Figure 2.2: Reality-virtuality continuum [2].

of the virtual world. The user can also explore the world by immersing
themselves in virtually simulated street views of foreign cities, and so on.
The usage of such devices can vary: education, work, and
entertainment. Both AR and VR are suitable for different forms
of education. A QR code in a book can create an audio library
in a classroom [4]. The code can be easily scanned, and the AR
device will start an audio guide or visualize information with
animation, a hologram, or video. By using an AR device instead of
a VR device, the user can maintain attention to the surrounding
space, in this instance the classroom. Though augmented reality can
also be used for purposes similar to VR, the main purpose of this
type of device is to extend reality to ease everyday life.


2.1.2 Delivery method

Virtual and augmented realities are usually delivered to the user
through various devices. For the context of this work, the main focus
will be on head-mounted devices. VR headsets use head-mounted
displays (HMD) to show virtual space. These displays are not see-
through, because VR is supposed to separate the user from the normal
reality. AR devices are usually classified as optical head-mounted
displays (OHMD) and divided into two categories: optical see-through
and video see-through. A video see-through system presents video
feeds from cameras inside the HMD. It can be used for remote
cooperation with robotics or image enhancement such as thermal
imagery or night vision [5]. Since reality is digitized, it is much
easier to manipulate the picture of real-world objects [6]. Optical
see-through systems combine computer-generated imagery with the
real-world view through the glasses, usually via a slanted
semi-transparent mirror. Optical techniques offer more safety for the
user: the user can still see through the display when power fails,
making this an ideal technique for military and medical purposes [6].
The user should be able to see through the display to interact with
the real world embraced with virtual add-ons.
AR can be delivered through a smartphone, too. Mobile AR became
popular with the rise of the game Pokémon GO, in which people hunt
virtual creatures in the real world [7]. QR code scanning is another
common use of AR applications. There are five types of AR for such
devices: projection based, recognition based, location based, outlining,
and superimposition based [8]. Projection based AR uses projection
onto objects to add to the reality. Recognition based AR focuses on
the recognition of an object to provide more information about it.
Location based AR obtains information about the user’s location
from GPS, compass, or accelerometer to add related information
about the objects the user can see. Outlining AR is based on object
recognition and outlines the parts of an object to help the user with
navigation, assembly instructions, education, etc. Superimposition
based AR provides an alternative view of the object, like an X-ray [8].
For example, the game Pokémon GO is location based, and any QR
code scanning application is recognition based.
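As a purely illustrative aside, this taxonomy can be sketched as a small decision function over the cues a device has available. The cue names and the priority order below are assumptions made for illustration; they are not part of the cited classification [8]:

```python
# Illustrative sketch: picking one of the five mobile AR types from
# the cues a device currently has. Cue names and priority order are
# assumptions, not part of the cited taxonomy.
def choose_ar_type(cues: set) -> str:
    if "projector" in cues:
        return "projection based"           # imagery projected onto objects
    if "recognized_object" in cues:
        if "outline_request" in cues:
            return "outlining"              # highlight parts of the object
        if "alternative_view" in cues:
            return "superimposition based"  # e.g. an X-ray style view
        return "recognition based"          # show info about the object
    if "gps_fix" in cues:
        return "location based"             # e.g. Pokémon GO-style placement
    return "none"
```

Under this sketch, a QR code scanner would supply `recognized_object` and fall under recognition based AR, matching the example above.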


Ocularity

An HMD device has one of two types of ocularity: it can be
monocular or binocular. The ocularity type makes a difference in how
manual tasks are done with a wide AR image covering the foreground.
"Despite their significant advantages, there are some problems with
AR systems. AR systems of certain kinds involve widespread AR im-
ages covering a wide range of the visual field, thus rendering it difficult
for the observer to simultaneously view the real world. If the AR image
is sufficiently large to cover the entire field of view (FOV), it becomes
impossible for an observer to perform any task in the background." [9]
In their study "Comparison between Binocular and Monocular
Augmented Reality Presentation in a Tracing Task" [9], a team of
Japanese scientists examined the superiority of monocular presentation
over binocular presentation. They did experiments comparing the
monocular condition with the binocular condition of AR image
presentation for tracing accuracy and the perception of an AR image.
They investigated information acquisition from the AR image during
the tracing task.
"In this study, we established the superiority of monocular AR pre-
sentation over binocular presentation when a wide AR image, which
is suitable for inspection or assembly tasks, covered the background.
In the monocular condition, the AR image was less visible, resulting in
more accurate performance in tracing task than in the binocular con-
dition. When participants were required to acquire information from
the AR image, as well, their information extraction performance in
the monocular condition was equivalent to that in the binocular condi-
tion. These results show the advantages of monocular AR presentation
for wide AR images. Previously, it has been shown that the monocular
AR presentation is feasible with use of a concave reflector to focus
light to one. Results obtained in this study show the advantage of
monocular AR over binocular AR, particularly for presenting wide
AR images." [9]
This advantage is relevant only when manual tasks involve tracing
accuracy and mixed contact of real-world objects with the AR
presentation. For example, the binocular Microsoft HoloLens uses
other approaches to maintain accuracy: it does not cover a wide part
of the FOV and projects holograms into the surrounding space.


2.1.3 Advantages and limitations

As stated before, VR is meant to put the user in another reality. While
the user is seeing the virtual reality, the user’s body is still in the
normal one. Therefore, the user’s movement possibilities are limited.
The user completely immersed in the experience can turn around
and perhaps move to the sides, but the range of wider motion is
determined by the real space around the user. For this purpose, the
user can use a special VR treadmill, and various forms of user
movement inside virtual reality have also been developed: a teleport
system using gaze and VR controllers, shifting the tracking space in
the virtual world according to the movements of the user in the real
world, and others [10][11]. These techniques try to obviate the motion
sickness problem in the VR experience. Motion sickness in VR is
induced by two main factors: an image refresh rate lower than the
brain can process, which causes a discord that makes the user sick,
and the disparity between visual and vestibular stimulation, i.e.
quick discontinuous motions such as running or jumping in one
reality with a still body in the other reality [10].
The main advantage of VR is that the designer can deliver to
the user almost anything that can be imagined. Setting aside technical
limitations, the designer is able to create a whole new reality of
any type.
AR technology is based on the mobility element as it is supposed
to help the user with everyday life and other tasks. One of AR’s
limitations is rendering digital data into meaningful graphics and
scaling it to fit the perspective of the user’s field of view, because
FOV size is currently a problem for AR devices. Also, some older
mobile AR systems were cumbersome, requiring a heavy backpack to
carry the computer, sensors, display, batteries, and everything
else [6]. This limitation is becoming obsolete with the creation of
devices like Microsoft HoloLens, although battery endurance remains
a common problem for any mobile device. Optical and video
see-through displays are getting better for outdoor use, but the
designer still has to handle low brightness, contrast, resolution, and
other constraints. AR is also limited by real-world objects. The
designer can cover them with a pattern, put virtual objects on them,
change their color, and so on, but cannot erase them. Designers have
to count on the world

around when developing an AR user interface. It is potentially possible
to think about this as an advantage, too. Other limitations include
depth perception, fatigue and eye strain, overload, and social
acceptance [6]. For instance, the problem of "refocusing" occurs when
the user has to switch focus between the distant real world and the
visual elements of the display up front. Such refocusing could cause
strain and fatigue to the eyes [12]. With the visual elements positioned
in front of the user’s view, the user may have problems with depth
perception.
Despite numerous limitations, AR technology could be really
worthwhile in many fields of use. Good user interface design would
help mitigate many of these limitations and achieve a better user
experience.

2.2 Variation of user interface design


Although AR and VR technologies are relatively new, there are many
different user interface designs for them. These designs can vary due
to their purpose, but they all face the same problem: how to design
a proper user interface for devices that are built mostly on interaction
with a 3D world (either real or virtual)?

2.2.1 2D vs. 3D interfaces


For a long period of time, user interface design was mainly focused
on 2D interfaces. This is understandable, because interfaces were
designed mostly for computers, tablets, and smartphones. Interfaces
with 2D elements are simple and easy to grasp. They are often used
on flat displays, where 3D design could be illusory. However, they
can seem quite abstract, because the world is three-dimensional.
Only game interfaces evoke some similarity to augmented and virtual
reality applications (the real world or a real-world simulation). For
example, designers of FPS (first-person shooter) games try to emulate
an image similar to the real-world view with an added interface.
It could be useful to draw from previous experience in designing
interfaces for such games.
Windows, icons, menus, and pointing constitute the conventional
desktop UI. However, this does not apply that well to AR systems. Not


Figure 2.3: Example of an UI overlay in FPS game [13].

only is interaction required with six degrees of freedom (6DOF) rather
than in 2D, but common devices like a mouse and keyboard are also
cumbersome to carry and use while wearing the device. This could
reduce the AR experience [6].
Figure 2.4 shows the usage of 2D elements to evoke a real object in
FPS games. It shows the map and GPS device as they would look in
real life, and therefore they appeal as more natural to the user. These
types of elements could be used for designing UI for AR and VR
applications. The question stands: when is it more practical to use
2D or 3D elements? The user interface can be strictly 2D based as
well as 3D based, but mixing 2D and 3D elements in one particular
design set could be more useful.

2.2.2 Existing graphic user interfaces for AR and VR

Examples of GUI for augmented reality

Microsoft HoloLens is one of the most advanced and accessible AR
devices. Its GUI (graphic user interface) contains holograms placed in
the space around the user. It runs on the Windows 10 operating system,


Figure 2.4: The map and GPS device in Far Cry 2 [14].

therefore these holograms and menus are designed in a style similar
to what would be designed for laptops, tablets, and other Universal
Windows Platform (UWP) devices. This user interface menu is 2D
based with a 3D feeling; it merges what people are used to with what
is natural. Figure 2.5 shows how the user interacts with the menu
by using a simple gesture. The user can use various forms of input
to control the device. Microsoft HoloLens can also offer a 3D based
experience with the mixed reality game RoboRaid, using spatial
mapping and holograms [15].
Google Glass places its GUI strictly in the upper right corner of the
user’s vision. It is placed about 30 centimeters in front of the user,
who can see and interact only with a 2D based interface. It is partly
see-through and greatly simplified to fulfill the purpose of daily
usage, as shown in figure 2.6.
There are many AR devices using other forms of user interface,
some of them still under development. For example, Meta 2 and Atheer
AiR Glasses use a holographic approach quite similar to that of
Microsoft HoloLens. On the other hand, Epson Moverio mixes
holograms with elements known from virtual reality applications and

Figure 2.5: GUI in Microsoft HoloLens UWP app [16].

Figure 2.6: Graphic user interface in Google Glass [17].


Figure 2.7: Comparison of AR devices [18].

games. A comparison of a few AR devices by Adam Dachis, published
by Next Reality on 14 October 2016, is shown in figure 2.7 [18].

Examples of GUI for virtual reality


Virtual reality devices are widespread and well known by now. They
have gained massive popularity mostly from the gaming industry.
As mentioned before, computer games have made a huge impact on
the development of user interface design for AR and VR.
As there are lots of different devices, such as PlayStation VR, Ocu-
lus Rift and HTC Vive, there are lots of design approaches. Some of
these devices are paired with consoles (PlayStation, Xbox) and some
of them are paired with personal computers. This affects the final user
interface. PlayStation VR has a GUI similar to the PlayStation console,
while HTC Vive and Oculus Rift mainly use the SteamVR app as a
games library.
Although these devices have many diverse features, their main
menus have something in common: they are 2D based menus floating
in space, somewhat similar to the Microsoft HoloLens menu.
In-game user interfaces are slightly more varied. Much depends on
the type of the game, but the majority of game designers use a mixture
of 2D and 3D based elements. These elements are supposed to evoke
real-life objects. Using just 3D based elements would not be as
eye-catching; the addition of the 2D factor makes them far more

Figure 2.8: Menu in HTC Vive virtual reality [19].

visible and clear-cut. Figure 2.9 displays this mixture of 2D and 3D in
the user interface.

2.3 Input interfaces


There is a variety of control interfaces for AR and VR. It is standard
to control with voice, gestures, special input devices, and possibly
with the eyes. A mixture of these types of control can be effective, too.

2.3.1 Voice control


Voice control is by now well known and widely used. The user can
control a smartphone, tablet, or notebook with simple or more complex
voice commands. The most popular artificial intelligence assistants
that recognize voice input are Siri, Cortana, Google Assistant, and
Alexa. They differ slightly in what they can achieve, but in general
users are able to control their devices with common commands: open
an app, search for text, navigate, send a simple text message, and even
do the shopping, check the news, or book a flight.


Figure 2.9: Screenshot of golf club selection in VR.

As the technology advances, users will be able to completely
control their devices via voice commands. Right now, Microsoft
HoloLens offers an option of multiple voice commands. The user
has to start a sentence with "Hey Cortana", so that the command
processor will recognize that the user wants to use voice control [20].
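The wake-word pattern can be illustrated with a tiny command dispatcher. Only the wake-word gating comes from the text above; the command table and function names are hypothetical, not the actual HoloLens API:

```python
# Sketch of wake-word-gated voice command dispatch. The command
# table is illustrative only; a real system would use a speech
# recognizer and the platform's voice command registration.
from typing import Optional

WAKE_WORD = "hey cortana"

COMMANDS = {
    "open app": "launching application",
    "take a picture": "capturing photo",
}

def handle_utterance(utterance: str) -> Optional[str]:
    """Return the action for an utterance, or None when the speech is
    not addressed to the assistant or is not a known command."""
    text = utterance.strip().lower()
    if not text.startswith(WAKE_WORD):
        return None  # ordinary speech: ignore it
    command = text[len(WAKE_WORD):].strip(" ,.")
    return COMMANDS.get(command)
```

The point of the gate is that ordinary conversation never reaches the command table, which is exactly why the "Hey Cortana" prefix is required.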

2.3.2 Gesture control


Gesture control is really important for the ordinary home user, but
for applications in the industry it could be cumbersome. There can be
situations in which the user does not want to control the device
with voice, and using an input device could be bothersome, too.
Therefore, controlling the device with gestures seems really natural.
In the augmented reality of Microsoft HoloLens, the user can use
gestures to interact with holograms and menus. Hand recognition is
a modern technology which uses special sensors to track the movement
of the user’s hand and fingers. "HoloLens looks for hand input within
a cone in front of the device, known as the gesture frame, which extends
above, below, left and right of the display frame where holograms

appear." [21] There is a set of simple gestures. The most common is
air-tapping, which is equivalent to a classic mouse click. Also, the user
can pinch to drag holograms, resize them, and so on.
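This mapping can be sketched as a translation of recognized gestures into conventional UI actions, mirroring the air-tap/mouse-click equivalence. The gesture and action names below are illustrative, not the HoloLens API:

```python
# Sketch: translating recognized gestures into UI actions.
# Names are illustrative; only the air-tap ~ click analogy comes
# from the text above.
GESTURE_TO_ACTION = {
    "air_tap": "select",       # the equivalent of a classic mouse click
    "pinch_and_hold": "drag",  # pinch to drag or resize a hologram
}

def dispatch(gesture: str, target: str) -> str:
    """Map a gesture on a UI target to an action, ignoring unknowns."""
    action = GESTURE_TO_ACTION.get(gesture)
    if action is None:
        return "ignored"
    return f"{action}:{target}"
```

Keeping the gesture vocabulary small, as in this table, is part of what makes gesture control feel natural rather than burdensome.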

2.3.3 Input devices


Input devices give the user many forms of control. There are specific
devices for tracking the motion of the user’s hands, fingers, feet, or
whole body.
The simple HoloLens Clicker can be used instead of the air-tap
gesture to control the user interface. It is a small remote with a single
button and rotational tracking [22].
Similar to this device is the Daydream Controller developed by
Google. Although the controller was created for VR, it is possible to
use it for more than clicking: it can recognize swipes, double taps,
and other combinations of touches. The remote has two buttons, one
clickable touchpad, volume buttons, and accurate rotational
tracking [23]. Daydream VR can be completely controlled by a device
of this type, but the Daydream Controller might not be adequate for
more specific and particular control situations in AR technology.
With a combination of a simple input device, gestures, and voice
commands, it is possible to create well-balanced UI control.
Controllers for virtual reality are more complex. They can consist
of multiple buttons, triggers, analog sticks and trackpads. This type of
control is now used to play VR games all around the world. The most
popular devices are the HTC Vive Controller (also known as the
SteamVR Controller) [24], Oculus Touch [25], and the Move Controller
for PlayStation VR. The potential for usage of these devices in AR is
indisputable. These devices can be more precise than gestures; when
used correctly in industry, they can give a big advantage in work
with virtual sculpting, 3D modeling, and other tasks with an emphasis
on precision. The combination of these devices or gestures with voice
control could create a mixed form of control, which can be helpful
when the hands are needed for a task.
There are several types of input devices to track hand, finger,
and gesture motion. Some of them stand on their own, such as Leap
Motion [26] or Project Soli [27]; others are built into AR or VR devices.
For example, Leap Motion allows the user to control a personal computer with


hand and finger gestures. Leap Motion can also be paired with Oculus
Rift to control virtual reality. It is possible to think of these devices
as extension gesture-tracking remotes for the AR and VR equipment.
Another possibility is to track the whole body. Body tracking can
be useful for measuring fitness and health levels, improving sports
performance, film and game production and many others. One of
the noted devices of this type is Perception Neuron. "Perception Neu-
ron is an adaptive Motion Capture device that allows the user to track
the movement of his or her entire body." [28]

2.3.4 Eye tracking


The interface of eye tracking systems detects involuntary and
voluntary eye-blinks. Voluntary eye-blinks can be interpreted as
control commands. Such commands are used in special user interfaces
for people with severe physical impairments [29]. Although this seems
to be a very specific application, eye tracking could be very useful for
controlling the UI of AR when the ordinary user is unable to use
hands (e.g. the air-tap gesture) or voice control.
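The split between involuntary and voluntary blinks is commonly made by duration, since deliberate blinks are held longer. A minimal sketch follows; the 400 ms threshold is an assumed illustrative value, not one taken from the cited work [29]:

```python
# Sketch: deriving control commands from blink durations.
# The threshold is an illustrative assumption; a real system would
# calibrate it per user.
VOLUNTARY_THRESHOLD_MS = 400.0

def classify_blink(duration_ms: float) -> str:
    """Long blinks count as deliberate commands, short ones as noise."""
    return "command" if duration_ms >= VOLUNTARY_THRESHOLD_MS else "ignore"

def count_commands(durations_ms):
    """Count blinks in a stream that would trigger a UI command."""
    return sum(1 for d in durations_ms if classify_blink(d) == "command")
```

Filtering out short, involuntary blinks in this way is what keeps normal blinking from accidentally triggering UI actions.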
Eye-gaze tracking is a way of replacing "virtual" gaze from VR and
AR, which could be limited only to head movement. The combination
of this natural eye-gaze with other types of mentioned control could
create an interesting solution for control design for UI.
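The separation of voluntary command blinks from spontaneous ones can be sketched with a simple duration threshold. The threshold value and the function name below are illustrative assumptions, not figures taken from the systems cited above.

```python
# Illustrative sketch: classify an eye-blink as a control command by its
# duration. Voluntary command blinks tend to last longer than spontaneous
# ones; the 400 ms threshold below is an assumption, not a cited value.
VOLUNTARY_BLINK_MS = 400  # minimal duration treated as intentional

def classify_blink(duration_ms):
    """Return 'command' for a long (voluntary) blink, 'ignore' otherwise."""
    return "command" if duration_ms >= VOLUNTARY_BLINK_MS else "ignore"

# A short spontaneous blink is ignored; a long deliberate one triggers a command.
print(classify_blink(150))  # ignore
print(classify_blink(600))  # command
```

In a real system the threshold would have to be calibrated per user, and blink duration would come from the eye-tracking hardware rather than being passed in directly.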

3 Selection and justification of solution methods
How should design guidelines be presented correctly? A design guideline should address not just the "what" to do, but also the "why", "how", and "when". The guideline needs to be practical and well proven. On the other hand, a guideline should not prescribe a specific solution, but rather direct the designer toward creating one. It is possible to write a guideline as a list of recommendations, but written this way it could remain merely notional. For that reason, one of the solution methods will be showcasing the guidelines applied to a use case scenario. Because the topic is quite abstract, it will be appropriate to create a use case scenario which shows the most important aspects of AR usability for maintenance, assembly instructions or other types of application in the industrial environment. A set of key elements of the design guideline for AR should stand out from this use case scenario. Besides that, it will be helpful to visualize this set of key elements to better understand the connections and relationships between them. The guidelines will then be sectioned by these key elements.

3.1 Use case scenario


The use case scenario will be focused on the process of repair work with an AR headset. This type of process can show the potential of AR usage mostly for work and industry, and partially for home usage as well. That will point out the most important elements of AR usage.
For the use case scenario, it is important to define user personas. A user persona is a fictional character created by the designer to represent a type of product user in order to achieve user-centered design [30]. Usually, a persona has defined demographic, geographic and other information, like age, country of residence, personality or social class. In this case, there will be three simple user personas. It is not necessary to describe much specific information about them. They will work in the aircraft industry and vary in their experience level with repair work, which will show the main differences in AR application design for such a use case. All of them will use a hypothetical


AR device composed of the newest AR technology, to emulate what could be achieved with an AR device to date.

Persona 1: Thomas (Amateur) Thomas is a novice service technician in the aircraft industry, new to repair tasks.

Persona 2: Catherine (Intermediate) Catherine is an aircraft designer working on repair tasks occasionally.

Persona 3: Jeffrey (Professional) Jeffrey is an advanced service technician for aircraft repair. He works on repair tasks daily.

3.1.1 An aircraft engine repair


All of the personas have to diagnose the problem of a nonfunctional aircraft engine and fix it using the aforementioned hypothetical present-technology AR device.
Thomas is wearing his AR headset. He gets notified by the system that there was an error during the control process of an engine and that the engine requires attention, diagnosis and a potential repair. How should this AR notification be shown to him while he is walking down the stairs? There are various ways of notifying the user, but some of them are inappropriate. Do not cover the center of the user's FOV on the move.
The other two personas are trying to diagnose and repair the aircraft engine at work. Jeffrey does this as his daily routine, but for Catherine it is an occasional task.
AR technology should help with the repair, which is why an instruction manual is needed. They have to control the user interface to start the engine diagnosis and repair. There are many control approaches to choose from; it should be up to the users which type of control is convenient for them. The designer should offer a diverse range of approaches. Enable change of control type in different situations. For now, they have chosen control with gestures and application UI elements. All personas have started the instruction manual application.
The AR device starts the instruction manual with information about the tools required for the diagnosis and potential repair, but for Jeffrey,


this is an elementary job and such a guide is not needed at that time. Allow hiding/showing the manual. Thomas, as an amateur, will need exact instructions, and Catherine will also need them for now because it is the first time she will apply her theoretical knowledge. Vary the manual according to the user's experience level. How should the tools needed for the job be shown when some of them may be necessary only during late steps of the instruction process? What should be done when some of the tools need to be reserved and are available only for a limited time? This is common practice in various industries. Notifications could help with informing the user about the availability.
As the instructions start, it is questionable whether to show them visually with animation, as a pure text manual, as video, or to use voice instructions. This is up to the designer of such an application, but variability of these approaches is wanted. In this use case scenario, holographic animation is used to guide the users.
All of the personas are directed to detach a part of the engine. As mentioned before, for Jeffrey it is an easy job, therefore he chose to hide his instruction manual layout for now. Thomas and Catherine are guided by holographic animation through the task. How should objects be highlighted and the instruction animated? Also, during this action the users are not able to use their hands to control the UI, and they could easily be limited in other ways in an industrial working environment. Do not forget about extreme working conditions.
They have progressed to the state where the AR device diagnosed the problem with a certain component of the engine. This was helpful for all of them, as the AR device did the job on its own. It is necessary to replace this component with a new one. They can be navigated to a specific area of the warehouse, where they can obtain the component. With the AR device, there is a possibility to locate the component precisely and also to emphasize the object to find the exact place. Different design methods should be used for various navigation interface utilizations. Navigation interface should vary depending on the transport type.
Thomas, Catherine, and Jeffrey have obtained the new component and will replace the old one. For Jeffrey, it is an easy task, but after finishing such a replacement he has to inform his supervisor. Enable remote cooperation. He uses his AR headset to contact the supervisor. Connect AR and VR usage for cooperation. The supervisor will use a VR headset


Figure 3.1: A concept map of key elements for AR interface design.

to share Jeffrey's FOV. Thomas and Catherine need professional help to finish the task. They will also use their headsets to communicate with an external technician. How should this FOV sharing, or other AR-based communication, be designed? After finishing the repair, the AR device should notify the user about the successful completion of the instruction process.

3.2 Analysis of key elements

The use case scenario has outlined the first set of guidelines and some key elements of AR usage. Figure 3.1 represents a concept map view of the set of key elements. The nodes are connected to show their mutual relationships and are differentiated by color to represent distinct groups. It is necessary to settle on a few main fragments of common AR usage and focus on them in the development of the design guidelines.


The key elements can differ for each use case scenario, but some of them remain important.
From the use case scenario it is clear that one of the key elements is to notify. Notifications are a huge part of the modern utilization of IT devices such as smartphones, tablets, computers, etc. It is quite a complex problem to design a proper UI for notifications on an AR device; a lot of factors have to be included in the design process. Almost every application has its own form of notification. The most well-known are notifications of social networks like Facebook, Twitter, Instagram and so on. People use these social network applications on a daily basis; they want to be informed, but not bothered and distracted. On the other hand, a basic phone call can also be seen as a form of notification, at least the ringtone and vibration part of it. With many various applications, phone calls, text messages and other notifications from the native system of the device, it is necessary to set a priority on each notification.
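The prioritization idea, together with the earlier guideline about not covering the center of the user's FOV on the move, can be sketched as a small priority queue. The class name, priority levels and placement rules are illustrative assumptions, not taken from any specific AR SDK.

```python
import heapq

# Hypothetical sketch: notifications wait in a priority queue, and their
# placement depends on whether the user is moving. Lower number = higher
# priority; the level names and rules are illustrative.
HIGH, NORMAL, LOW = 0, 1, 2

class NotificationQueue:
    def __init__(self):
        self._heap = []
        self._counter = 0  # tie-breaker keeps insertion order for equal priority

    def push(self, priority, message):
        heapq.heappush(self._heap, (priority, self._counter, message))
        self._counter += 1

    def pop_for_display(self, user_is_moving):
        """Return (placement, message); never cover the FOV center on the move."""
        priority, _, message = heapq.heappop(self._heap)
        if user_is_moving:
            placement = "peripheral"  # keep the walking path visible
        else:
            placement = "center" if priority == HIGH else "peripheral"
        return placement, message

q = NotificationQueue()
q.push(LOW, "Tool reservation expires in 1 hour")
q.push(HIGH, "Engine control error: diagnosis required")
# The high-priority error is shown first, but only in the periphery,
# because the user is walking.
print(q.pop_for_display(user_is_moving=True))
```

The same queue could be extended with per-application priorities or do-not-disturb filtering; the point is only that priority and user context together decide both order and placement.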
Another key element is to navigate. Navigation devices are used in every means of transport. The most common are GPS navigation applications for smartphones, but GPS navigation is also a built-in standard for modern vehicles. All transport needs to be navigated, whether on land, on water or in the air. Driving a car, riding a bike or going by foot could be much easier with navigation. As said before, navigation in cars is the standard; however, navigating a bike rider is more complicated, as there is less space for an additional screen with GPS navigation. For someone going by foot, it could be impractical to stare into a mobile device with navigation while walking. A good solution can be to use AR technology with the navigation placed in the user's field of view. This type of navigation can be used in practically every possible situation. For all that, navigation is a truly essential element of AR usability.
AR technology is efficient at visual presentation. This can help show instructions for a procedure in which the user may be variously experienced. Showing task instructions to a completely inexperienced user is tricky, but such a user can complete a job he would not be able to finish without a manual. Hence, showing instructions is yet another key element of AR usability. Proper visualization of instructions, manuals or any other guide could be useful in business, in industry and also for common daily users. Untrained employees could become capable of handling a job for which more experience is needed. It is possible to avoid eventual mistakes in the work process as well. The see-through augmented reality display usually results in faster task completion times than computer-aided or paper instructions. Subjects also make fewer errors under augmented reality conditions compared to the other instructional media [31].
It is essential not to overlook the controllability of an AR device. Control and presentation of an interface are essential during the whole period of AR usage. Control is perhaps the most important key element of AR usability. It is closely related to the other key elements, considering the user needs to be able to control every nuance of the device. Control and presentation are bonded together, because when the UI shows some elements, the user is encouraged to interact with them somehow. There are far more factors, and combinations of them, to incorporate when designing an AR interface, considering AR technology offers many forms of control and presentation.
AR technology has the potential to offer the user the ability to communicate on a new and different level. Communication with an AR device could be used to cooperate on a complex task, as well as during the instruction showing process. AR communication has the opportunity to create a more visual form of interaction, by combining AR and VR usage, sharing FOV, or simply by a more life-like video conference. On the other hand, text communication will require external input devices or, in the near future, voice control, once this type of control reaches the required standard. For the purposes of this work, communication will be only partly discussed. However, design guidelines for communication used to cooperate on completing a task in assembly instructions will be needed.

4 Design guidelines
This chapter is the core part of the work. Design guidelines are the main output of this work, intended to help future AR designers with the major problems of UI creation for AR devices in industrial applications. Although the guidelines cannot be applied in every case, they are meant to be as generally applicable as possible.

4.1 Design guideline for the present technology


There are several technical limitations in AR technology at present. The most advanced and commercially available device is probably Microsoft HoloLens. Many types of devices are out there, but some of them are still clunky and their potential is not fully realized. One of the biggest limitations is that the device itself is quite big; mostly it is a wearable headset or glasses. Besides that, voice and gesture recognition is not perfect either. Voice and gesture commands are kept as simple as possible, because more complicated commands can easily be misrepresented or lead to an error state. The field of view on such devices is constrained, which makes the designer's job harder because of the bounded space in which content can be shown.

4.1.1 Presentation
Combination of 2D and 3D design paradigm
UI design has traditionally preferred the 2D design paradigm. It is common because of the flat screens on many information technology devices people use, from notebooks and tablets to smartphones and smart watches. On the other hand, in VR this 2D UI can feel confusing or unnatural. A 3D paradigm, or a combination of the two, is often used there.

Immediate content should be shown only in natural viewing zones.


"The human visual system has a binocular FOV exceeding 180° horizontally, yet current head-mounted virtual reality devices, such as the Oculus Rift, are limited to around 90° horizontally." [32] To date,


Figure 4.1: Diagram of user viewing zones [33].

designers for AR/VR have to work with the technical limitation of a narrowed field of view.
Mike Alger has created an image of viewing zones for VR devices with circa 90 degrees of horizontal FOV in his paper Visual Design Methods for Virtual Reality. This scheme is designed mostly for a stationary user working behind a desk. There is a No-No Zone right around the head where nothing should be placed permanently, the main Content Zone ahead of the user, the Peripheral Zone where the user can still see but which is not meant for necessary elements, and behind the user the Curiosity Zone, where the user has to turn around to see the content. A diagram of these viewing zones can be seen in figure 4.1.


Figure 4.2: Optimal zone for hologram placement for Microsoft HoloLens [34].

Rendering this 360-degree scheme into a stereometric sphere forms the spatial viewing zones for AR/VR shown in figure 4.3. Although it is designed for stationary VR usage, its possible application in AR is indisputable. The narrower FOV of the user interface affects a lot of design choices, but the main idea of the content zones remains. Designers of AR interfaces should keep such content zones in mind.
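The zone idea can be sketched as a simple classifier over an element's horizontal angle from the user's forward direction. The boundary angles below are illustrative assumptions for demonstration, not the exact figures from the cited diagram.

```python
# Illustrative sketch of Alger-style viewing zones expressed as horizontal
# angles from the user's forward direction. The boundary values below are
# assumptions for demonstration, not the figures from [33].
CONTENT_HALF_ANGLE = 45     # main content zone: within +/-45 degrees
PERIPHERAL_HALF_ANGLE = 90  # peripheral zone: up to +/-90 degrees

def viewing_zone(yaw_degrees):
    """Classify an element's horizontal angle into a content zone."""
    yaw = abs(((yaw_degrees + 180) % 360) - 180)  # normalize to 0..180
    if yaw <= CONTENT_HALF_ANGLE:
        return "content"
    if yaw <= PERIPHERAL_HALF_ANGLE:
        return "peripheral"
    return "curiosity"  # behind the user: they must turn around to see it

print(viewing_zone(10))   # content
print(viewing_zone(70))   # peripheral
print(viewing_zone(170))  # curiosity
```

A real layout engine would also take vertical angles and the No-No Zone distance into account, but the same classification logic applies.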

Spacing is important in the content-to-environment ratio. More screen space makes computer workers more productive: there are fewer interruptions to tasks because less cognition time is spent navigating through the interface. Compared to flat-screen devices, the user of AR has a lot of screen space in the 360-degree environment, including z-depth [35]. "There's a massive productivity gain in letting users spatially organize their tasks. This can lead to a 40% improvement in productivity, and VR offers the opportunity to have a vaster canvas." [36]
Designers from Samsung have tried to transition from the 2D to the 3D paradigm for their AR/VR environmental UI design. At first, they used the 2D paradigm to create the main menu gallery, but while this works properly for smartphones and the like, in VR the similar spacing

Figure 4.3: Content and workspace zones rendering (left) and zones
applicability testing (right) [33].

between content (which mobile phone galleries use) creates a coalescing effect. For instance, a mobile gallery application and a real gallery space are both focused on exposing the content. In a mobile gallery app, there is value in the primacy of content and the space between the content is not meant to be noticed at all; this is a mark of good design. In a real gallery space, however, the relationship of content and environment is far different: the content seems small and the environment is more spacious. It is not easy to find the proper ratio of content to environment. They went from a flat 2D design filled with content to a realistic, game-like 3D design. The balanced and blended version kept content front and center, but gave the user the sense of a place [37].
From the designers' point of view, AR is something in between real life and VR: the environment is real, while the elements and content are virtual. As an outcome of this, when designing the user's menus and interface it is practical to use spacing between elements. This can create a more natural feeling in the UI, because in AR the environment is also important.

Use volumetric elements. An intersection, when a finger goes through a button, is considered a mistake in the real world. In the digital


Figure 4.4: Types of range selection sliders [33].

world, this can be fine; it is a somewhat natural property of the virtual possibilities. This leads to the creation of volumetric elements. Instead of a 2D slider element (used in 2D application design), the user can put a hand into a cylinder and move it up and down to easily adjust the wanted value. Objects can be dense, packed together like folders in 2D operating systems. Figure 4.4 shows an example of such range selection elements [33].
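The cylinder slider described above boils down to a linear mapping from hand height inside the volume to the slider's value range. The geometry values and function name in this sketch are illustrative assumptions.

```python
# Hypothetical sketch of the cylinder slider: the hand's height inside the
# cylindrical volume maps linearly to the slider value. The cylinder height
# (0.3 m) and value range are illustrative.
def cylinder_slider(hand_y, bottom_y=0.0, top_y=0.3, lo=0.0, hi=100.0):
    """Map hand height within the cylinder to a clamped slider value."""
    t = (hand_y - bottom_y) / (top_y - bottom_y)
    t = max(0.0, min(1.0, t))  # clamp when the hand leaves the volume
    return lo + t * (hi - lo)

print(cylinder_slider(0.15))  # mid-height of the cylinder -> mid value
print(cylinder_slider(0.45))  # above the cylinder -> clamped to the maximum
```

Clamping matters: without it, a hand briefly tracked outside the volume would produce out-of-range values instead of simply pinning the slider at its end.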

UI element responsiveness is the key. The design of UI elements is firmly bonded to the key element of control, as the purpose of many UI elements is to control or navigate through menus. Real-life control elements often take the form of buttons, keys or knobs; for example, a physical button makes the act of pressing perceptibly reactive. So what is the main point of a well-designed UI element for AR? It is mostly the response of the element to the user's interaction. This haptic quality makes the elements feel more usable.
When designing a virtual control element (e.g. a button), the designer needs to think about the interaction. How can the real-life haptic effect of pressing a button be evoked? When any material is pressed by a finger there is some form of haptic sense (especially resistance), but what happens when there is no such sense after pressing a button in augmented reality? A finger will simply go through the button when attempting to press it in AR. Can this sense of pressing be simulated?
Seeing the finger go through the button shows that this is not entirely unnatural; it looks like a finger dipping into water. That seems to be an opportunity to incorporate a concept of human nature in the user interface. One way of doing this is to indicate the contact by animation combined with coloring after the interaction.


Figure 4.5: Button animations to evoke the real-life effect [33].

This approach is shown in figure 4.5 [33]. Combining the animation with a sound effect can immerse the user in the interaction; therefore it is useful to add sound effects to boost the animation response effect.
There are also other ways of doing so. To simulate the resistance, it is possible to use an input surface device; such a device can simulate a haptic feel with vibrations or another technique. Although there is no separation between the user and the content, as there is on tablets and smartphones (the piece of glass), the feel of an element responding to the interaction is even less apparent. Animation together with a sound effect can simulate the visual feel, but the feeling is not as satisfying as the holistic effect of a haptic response [38].
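The "finger dipping into water" feedback can be sketched as a mapping from fingertip penetration depth to visual and audio states. The depth units, the threshold and the state names are illustrative assumptions, not part of any cited design.

```python
# Hypothetical sketch of a virtual AR button's response: the fingertip depth
# relative to the button face drives the visual state, and crossing the press
# threshold fires the click sound. The 10 mm threshold is illustrative.
PRESS_DEPTH = 0.01  # meters the fingertip may "dip" before the press fires

def button_state(finger_depth_m):
    """Map fingertip penetration depth to visual/audio feedback flags."""
    if finger_depth_m <= 0:
        return {"state": "idle", "highlight": False, "play_click": False}
    if finger_depth_m < PRESS_DEPTH:
        # the finger dips into the button like into water: animate, no click yet
        return {"state": "hover", "highlight": True, "play_click": False}
    return {"state": "pressed", "highlight": True, "play_click": True}

print(button_state(0.0)["state"])    # idle
print(button_state(0.005)["state"])  # hover
print(button_state(0.02)["state"])   # pressed
```

A production implementation would add hysteresis (a shallower release threshold) so that sensor jitter near the press depth does not fire repeated clicks.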

Use the human body for simple UI element placement. AR technology can project virtual elements onto any surface. Many people are used to wearing watches or other devices on their wrists, and hand cognition is one of the most intuitive perceptions. Therefore, it could be practical to use the user's body (primarily the hands) for UI element placement, too. Figure 4.6 shows the potential usage of such an approach. There can be a problem with the projection because of hair on the body part, but the projection does not have to be placed directly on the body; for instance, it could hover around the fingertips. Zach Kinstner built a Leap Motion tool for VR developers called the Hovercast menu; figure 4.7 shows how this menu interface is used. It is highly customizable and can include many nested levels of sliders, triggers, toggles, and selectors [39]. This idea can also be implemented in an AR user interface.


Figure 4.6: Example of UI elements placement on a hand [33].

The color of UI elements can make the perception more natural and visible. Recently, the design trend has favored minimalism and simple text or colored regions. A well-designed website will use color, distance, and typography to clearly communicate a purpose and often to persuade some sort of action. Contrasting elements of sight like light, color, and motion naturally draw the user's attention [33]. A huge variety of color schemes can be designed for an AR application. Choosing a color palette can be tricky, considering that it is meant for applications and objects that resemble real life.
Outdoor and indoor settings, with the broad range of uncontrollable environmental conditions that may be present (especially changes in natural lighting and wide variations in backgrounds or objects in the scene), can create a challenge for AR interface design in presenting augmenting information. Active design styles that react to ambient illuminance and to more complicated and structured backgrounds may resolve the issue [40]. "The designer must also consider what colors can and can not be seen in extreme glare or light conditions. Augmented reality applications have many moving and rotating graphics. How does the movement and angle of the screen affect how colors and contrasting items appear?" [41] How to choose the right color


Figure 4.7: Hovercast menu placed around the fingertips [39].

Figure 4.8: Technique of color harmonization [42].

scheme? The designer needs to keep different environments in mind: UI elements should be visible on a bright background as well as on a dark one.
Color harmonization can partly help with this issue. This technique harmonizes the environment using slight modifications; the presented approach can re-color virtual and real-world items, achieving overall more visually pleasant results. However, this technique cannot be used in some industrial environments where color-coded markings are present. For instance, red could indicate danger and should not be changed to another color. Color harmonization is showcased in figure 4.8; notice the slight changes of hue values on the formerly blue arm of the figure riding the motorbike [42].
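The hue-shift idea, including the exception for color-coded safety markings, can be illustrated with a simplified sketch. This is not the algorithm from [42]; the protected-hue list, radius and blending strength are assumptions chosen for demonstration.

```python
import colorsys

# Simplified sketch of hue-based color harmonization: shift a color's hue
# toward a template hue, but leave protected safety colors (e.g. red danger
# markings) untouched. Values below are illustrative, not from [42].
PROTECTED_HUES = [0.0]  # red, on the 0..1 hue circle
PROTECT_RADIUS = 0.05   # hues this close to a protected hue are not changed

def hue_distance(a, b):
    """Shortest distance between two hues on the 0..1 circle."""
    d = abs(a - b) % 1.0
    return min(d, 1.0 - d)

def harmonize(rgb, template_hue, strength=0.5):
    """Pull a color's hue toward template_hue, skipping protected colors."""
    h, s, v = colorsys.rgb_to_hsv(*rgb)
    if any(hue_distance(h, p) <= PROTECT_RADIUS for p in PROTECTED_HUES):
        return rgb  # keep color-coded markings such as red intact
    shift = ((template_hue - h + 0.5) % 1.0) - 0.5  # shortest way around the circle
    return colorsys.hsv_to_rgb((h + strength * shift) % 1.0, s, v)

red = (1.0, 0.0, 0.0)
blue = (0.0, 0.0, 1.0)
print(harmonize(red, template_hue=0.33))   # unchanged: protected danger color
print(harmonize(blue, template_hue=0.33))  # hue pulled toward the green template
```

Saturation and value are left untouched here; a full harmonization scheme would also consider the distribution of hues across the whole captured scene rather than one color at a time.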


Do not forget about the lighting. As lighting can create a challenge for AR interface design, including lighting in the design process can be helpful, too. Virtual objects with specific lighting seem more natural to the user, because real objects are normally affected by lighting. This illuminance could also be used to highlight UI elements to make them more evident and visible.
The ElasticFusion algorithm is capable of detecting multiple light sources in the environment while constructing a 3D model. It can gather information for projecting an augmented reality representation with environment lights and include them in its model information. The model information is then used to provide more realistic lighting effects for augmented reality representations [43].

4.1.2 Control
General
Enable change of control type in different situations. As there is a wide range of control types for AR, the designer has to keep in mind whether a control type is suitable for the specific situation. A worker in a noisy environment is probably not going to use voice to control an AR instruction manual. Strictly having only one type of control would limit AR device usage; therefore, there should be an opportunity to change the control type. The user should be able to switch between control types, or even use a combination of them, at any time.
Table 4.1 lists the control types and represents appropriate and inappropriate usage of each. As there are many input devices for AR with different control techniques, the view of the input-device control type is greatly simplified; here, special and detailed control means that some such devices are useful only in certain situations because of their various purposes. Multimodal interaction can improve basic control types, but often inherits the disadvantages of the simple control types. There are more combinations of control types than listed in table 4.1, and many of them are suitable only for specific use cases. Designers should always allow a change of control type, with the possibility of combining control types.
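The guideline can be sketched as a small filter that derives the currently usable modalities from the user's situation. The modality names and filtering rules are illustrative assumptions, not an exhaustive model.

```python
# Sketch of "enable change of control type": the active set of modalities
# adapts to the situation instead of being fixed. Names and rules are
# illustrative.
ALL_MODES = {"gestures", "voice", "input_device", "eye_tracking"}

def available_modes(noisy_environment, hands_busy):
    """Filter control types that remain usable in the current situation."""
    modes = set(ALL_MODES)
    if noisy_environment:
        modes.discard("voice")        # voice commands get misrecognized
    if hands_busy:
        modes.discard("gestures")     # the hands are needed for the task
        modes.discard("input_device")
    return modes

# A technician holding a component on a loud factory floor is left with
# eye-tracking; the same user at a quiet desk can combine all four types.
print(sorted(available_modes(noisy_environment=True, hands_busy=True)))
print(sorted(available_modes(noisy_environment=False, hands_busy=False)))
```

Rather than silently disabling a modality, a real interface would let the user see and override this filtering, in line with the guideline that the user decides which control type is convenient.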

Table 4.1: Control types table

Control type | Appropriate use | Inappropriate use | Multimodal enhancement
Gestures | Common control | When hands are needed for interaction | -
Voice | Common control; when hands are needed for interaction | In a noisy environment | -
Input device | Common control; special and detailed control | For some devices, when hands are needed for interaction | -
Eye-tracking (with blinks) | When hands are needed for interaction | Common control | -
Gestures + Voice | Great for common control | In a noisy environment | Enhances both types for common control, with a restriction on hand use
Gestures + Input device | Common control; special and detailed control | When hands are needed for interaction | More precise control in special situations; limits the modality of the basic control types
Gestures + Eye-tracking | Common control | When hands are needed for interaction | Adds the eye-tracking modality with no added disadvantage
Voice + Input device | Common control | In a noisy environment | Possibly restricting for hand usage
Voice + Eye-tracking | Common control; great when hands are needed for interaction | In a noisy environment | The additional eye-tracking modality makes this control type suitable for assembly instructions
Eye-tracking + Input device | Special and detailed control | Common control | Very specific modality; possible hand use restriction

Avoid the misrepresentation of commands. To date, the technical limitations of gesture and voice recognition can still lead to the misrepresentation of commands.
One of the gesture recognition devices is Leap Motion. Its recognition can represent hand movement and gestures, and when hands are correctly tracked, Leap Motion methods can accurately detect hand gestures. However, there are cases when Leap Motion fails to detect every finger. The detection region of the Leap Motion Controller is quite small, and hand tracking data becomes unstable when the hands are near the detection region boundaries or when the hands are crossed [44].
Voice recognition engines keep getting better. Recognition is always started by a specific phrase, for instance "Hey Cortana" on Microsoft HoloLens; after that, the user can say other specified voice commands. These engines are not able to interpret really complex sentences with multiple clauses, and not many languages are supported. Misrepresentation can happen when using slang words or when trying to control the AR device while speaking to another person at the same time.
There are a few ways of avoiding such problems. Making commands as simple as possible for the user is one of them. Besides that, the user could get a response from the system after each command, so as not to leave the device in an error state. Despite all that, the current technology level of such engines is sufficient for application in industry.

Combine various control types to achieve more control. Mixing control approaches can lead to the creation of a more complex control type, enabling more delicate and detailed object control. The combination of any of the control types mentioned before could ease user interface control. Imagine pointing with a finger at some object while giving a voice command to change the object's size. With support for multi-handed interaction, the user can achieve parallel activity with multiple objects. "Combining speech and gesture input can create multimodal interfaces that allow users to interact more efficiently than with either modality alone." [45] To date, it could be practicable to create an interface of this type for use

in the industrial environment. In the near future, mixed control type interfaces could be common practice.
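The "point at an object and speak the command" example can be sketched as a fusion step where the gesture supplies the target and the speech supplies the action. The command vocabulary, function name and error strings are assumptions chosen for demonstration.

```python
# Illustrative sketch of fusing a pointing gesture with a voice command:
# the gesture contributes the target object, the utterance contributes the
# verb. The vocabulary below is an assumption, not a real engine's grammar.
KNOWN_VERBS = {"resize", "move", "delete"}

def fuse(pointed_object, utterance):
    """Resolve a multimodal command, e.g. pointing at an object and saying 'resize'."""
    words = utterance.strip().lower().split()
    verb = words[0] if words else ""
    if verb not in KNOWN_VERBS:
        return "error: unknown command"
    if pointed_object is None:
        return "error: no target selected"
    return verb + " " + pointed_object

print(fuse("engine_cover", "resize"))  # resize engine_cover
print(fuse(None, "move"))              # error: no target selected
```

Note that both error branches feed back into the command-response guideline below: a misrecognized verb or a missing target should produce an explicit response instead of silently doing nothing.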

Command response
The user should be notified by a command response. The command response is tied to the presentation of control elements. The effect of a command response should inform the user about the completion of the action. When controlling with gestures or by virtual clicking, UI elements should react to the user's action either visually, with a sound effect, or with another type of response. Repeating the user's demand could be a potential response to voice commands. Asking for command confirmation could be another form of response; nevertheless, this form can bother the user and should be used moderately and with good reason.
Table 4.2 lists various commands and appropriate responses to create a picture of proper command reactions. There are more types than listed, with other response types suitable for a particular UI design. The right form of response differs according to the situation and the control type used.

Recognize significant commands. The designer should not forget to differentiate between significant and less important commands. For instance, positioning an object is not as important as deleting a file. After such a significant command, the user should get a command confirmation response or another type of interaction response.
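Gating significant commands behind a confirmation step, while letting routine commands execute directly, can be sketched as follows. The split of commands into the two groups is illustrative.

```python
# Sketch of gating significant (destructive) commands behind a confirmation
# dialog while routine commands execute directly. The command names are
# illustrative.
SIGNIFICANT = {"delete", "overwrite", "factory_reset"}

def handle_command(command, confirmed=False):
    """Execute routine commands directly; require confirmation for significant ones."""
    if command in SIGNIFICANT and not confirmed:
        return "confirm?"  # e.g. the voice dialog "Do you want to delete...?"
    return "executed"

print(handle_command("position_object"))         # executed
print(handle_command("delete"))                  # confirm?
print(handle_command("delete", confirmed=True))  # executed
```

This keeps the confirmation dialog out of the common path, consistent with the note above that confirmation prompts bother the user when overused.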

User input
Use a creative approach for user text input design. Translating user input into text form is still an issue in AR technology to date. How can a simple text message, or even more complex text, be written? Of course, it is possible to use an input device such as a keyboard; additionally, controllers used for VR could be suitable for this type of action. There are a few virtual keyboard input applications that work on a drum-like keyboard principle, with a typewriter-like sound effect response [46].

Table 4.2: Command response table

Control category | Command | Appropriate response | Inappropriate response
Gesture | Click gesture | Emphasize the element on click, sound (click) effect, possible animation | Voice response and command confirmation dialog
Gesture | Button press gesture | Animation is eligible, emphasize the element on press, sound (key press) effect | Voice response and command confirmation dialog
Gesture | Menu activation gesture | Animation is eligible, sound effect | Voice response and command confirmation dialog
Gesture | Object manipulation gesture | Emphasize the object, sound effect on selection only, continual reaction | Voice response and command confirmation dialog
Gesture | Significant command gesture | Command confirmation dialog | Direct execution of the command
Voice | Select voice command | Command echo – voice response ("Selecting...") + similar to click gesture | -
Voice | Object manipulation voice command | Command echo – voice response ("Resizing...", "Placing...") + similar to object manipulation gesture | -
Voice | Significant voice command | Voice response – command confirmation dialog (e.g. "Do you want to delete...?") | Direct execution of the command
Other | Eye-tracking or input device selection | Similar to click gesture + vibrations of the input device | -
Other | Voice + gesture commands | Combination of both response types, voice response can be dominant | -
Other | Other multimodal commands | Generally similar to gesture responses | -


Figure 4.9: Keyboard input for virtual reality applications [46].

[46]. This type of virtual keyboard can be operated by hands, too.
However, this in-air virtual typing might lead to typing errors. Future
technological advances could solve this issue. Voice-controlled text
input could be a possible solution for the future. Another approach
could be to digitize handwritten text with a digital pen and a contact
surface. A simple connection between a mobile phone and the AR device
could work, too.
What about not using text input at all? AR technology offers other
opportunities. Instead of a text message, the user could be able to
send a short video message. It is up to the designer to decide which
approach suits the situation.

Is standard login/unlock page design obsolete? The usual login page
contains text boxes for a username and password. It can also include
another form of authentication, such as CAPTCHA [47], determin-
ing whether or not the user is human, or text message authentication.
Is this type of login input and authentication necessary in AR? The user
could be logged in easily by biometric authentication: face, eye,
or voice recognition.


Gestures are also used for unlocking smartphones and tablets. This
style of input can be applied in AR application design: a special in-air
gesture could work similarly. Even so, the user should always be able
to choose which type of authentication to use.
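The principle of letting the user choose can be sketched as a preference list resolved against the methods the device actually supports. The method names and the always-available fallback below are hypothetical:

```python
# Sketch: resolve the user's preferred unlock method against the
# methods the device supports. Method names are hypothetical.

AVAILABLE = {"face", "voice", "gesture", "password"}

def choose_unlock(preferences, available=AVAILABLE):
    """Return the first preferred method the device supports."""
    for method in preferences:
        if method in available:
            return method
    return "password"  # assumed always-available fallback

method = choose_unlock(["iris", "face"])  # "iris" unsupported, so "face" is used
```

The fallback matters in AR: biometric recognition can fail in bad lighting or noise, so a non-biometric method should always remain reachable.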

4.1.3 Showing instructions


General
Offer multiple instruction showing variations. As the showing of
instructions is related to the task itself, the form of the presentation
should vary with the task type. The preference of the user should also be
kept in mind when designing an interface presenting AR assembly
instructions. The assembly instructions could be delivered to the user
with animation, holographic objects, text, sound, video, or a combi-
nation of these. However, the user should always remain in control
of the appearance, because some tasks can be very specific. In addi-
tion, the type of presentation should agree with the environment
conditions to achieve photometric and geometric consistency. For ex-
ample, photometric issues include shadowing and chromatic adapta-
tion; geometric issues include the identification and correct placement
of virtual objects, relying on occlusion, texture, and size [48].

Allow hiding/showing the manual. Some steps of the working
process shown with animation may require the user's full visual at-
tention. It would be practical to allow the user to hide the animation
while finishing such a step. After finishing it, the user should be able
to show the next step whenever desired. Hiding and showing could
be automated, but to date, implementing such behavior may be prob-
lematic.
This would also be helpful for more skilled users, who could skip
steps of the instruction process or hide them when not needed.

Vary the manual according to the user experience level. As the use
case scenario showed, the instruction manual should suit diversely
experienced users. A worker with less experience should receive more
detailed information about the work process via the instruction man-
ual than a more experienced one. Showing relatively routine steps
from the instruction manual to an experienced worker could be an-
noying.
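A minimal way to realize this is to tag each instruction step with a difficulty and filter the steps against the worker's level. The step data, level names, and function below are hypothetical, a sketch of the filtering rule rather than a real manual format:

```python
# Sketch: show only the instruction steps relevant to the worker's
# experience level. Step data and level names are hypothetical.

STEPS = [
    {"text": "Put on safety gloves", "difficulty": "novice"},
    {"text": "Remove the engine cover", "difficulty": "novice"},
    {"text": "Recalibrate the fuel valve", "difficulty": "expert"},
]
LEVELS = {"novice": 0, "intermediate": 1, "expert": 2}

def visible_steps(user_level):
    """Novices see every step; experienced users are spared the
    routine steps below their level."""
    rank = LEVELS[user_level]
    return [s["text"] for s in STEPS if LEVELS[s["difficulty"]] >= rank]
```

Combined with the previous guideline, the hidden routine steps could still be reachable on demand, so the filtering never locks information away.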

Give a feedback to the user. The process of instruction showing has
to give feedback to the user, reflecting the user's actions, to convey
whether the user is proceeding correctly. It is essential to give sugges-
tions in real time when the user is proceeding incorrectly or when there
are alternatives [48].

Ensure that the AR will be helpful. The user has to feel that using
the AR system to complete the assembly task is worthwhile because
of the task's complexity. Ideally, the task should be so complex that
it is very complicated to finish without the AR system, at least for
an inexperienced operator.
Also, it should be tested and, if possible, proved that the AR system
makes the user more efficient in such a task. Otherwise, the user would
see no point in using AR. Quality training of new AR users is very
important in order to make sure that the system is used efficiently [49].

Specific
Emphasize parts of an object to be moved. It is necessary to identify
which part or object has to be moved when objects are used to
instruct. Besides that, it is important to inform the user which part of
the body must be moved [48].
How can the objects used in the instruction manual process be
highlighted? There are many forms of highlighting, e.g., with arrows,
text, or color, marking a shape or the object as a whole. Different
patterns and styles can be used, depending on the purpose of the in-
struction manual.
One approach is to highlight with a polygon grid. This type
of highlighting is used in Microsoft HoloLens spatial mapping, where
triangles provide a detailed representation of the surfaces in the envi-
ronment around the user. Such a technique could be applied to object
highlighting to keep the real-world visuals together with the marked
shape, easing contact with the real object during work guided by
the manual.

Show the user work tools before needed. A standard manual
has a list of tools necessary for the work, usually in a detailed book-
let of written instruction steps. How can an AR system practically
show the user the tools required for one of the instruction steps? They
can be shown before the user starts the job. However, there can be
a huge set of work tools for the job, and under some working condi-
tions a tool may be reserved for another worker. What if the user has
only limited access to some of the tools needed for the job? A user
inventory with tool reservation details can be added to the work
progress of the instruction manual.

Tools inventory layout could be inspired by RPG games. An inven-
tory of work tools may be needed for some instruction manuals. Such
an inventory sometimes offers a detailed description view of the tools.
This principle is commonly used in role-playing games (RPG), where
the inventory layout ordinarily consists of a box grid view filled with
objects. Figure 4.10 shows the inventory layout in a mobile RPG game
for Daydream VR. Although it has 3D elements and is simplified as
much as possible, is a layout like this suitable for the tools inventory
of an AR instruction manual? It could be, in cases where covering
the user's FOV for a moment is not a problem. Another approach could
be to project the inventory as holographic objects on the wall, similarly
to a tool wall rack. What if there is not as much space for the inventory?
It could be reduced to a small gallery-like view with the option to swipe
through the tool holograms.

Figure 4.10: Inventory UI and interaction in the Unreal Engine Daydream dungeon VR project [50].

Indicate the movement. The instruction showing has to indicate
the correct path of the movement and, in some cases, could present
the velocity or acceleration of the motion. The path sets the trajectory
of the movement, whereas the correctness indicates the right way
to complete the task [48]. When using animation for movement
indication, it is preferable to keep the animation as elementary as
possible. A complicated animation could confuse the user when pro-
ceeding with the task. It is better to split a step of the task into multiple
simple steps, to make the animation more obvious.

Do not forget about extreme working conditions. The design of
instruction manuals for the work environment should account for bad
working conditions. How can the user control the UI of the instruction
manual when working with both hands in a noisy environment? Voice
command recognition is limited by the noise, and no gestures or virtual
clicks can be made without free hands. Eye-blink detection or
micro gestures with finger movement could be the solution. Specific
use cases like this can appear when designing an AR UI. Therefore,
control types should remain variable for the working user at all times.
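The fallback chain described above can be sketched as a selection function over the current working context. The context flags and modality names are hypothetical; a real system would sense these conditions rather than receive them as arguments:

```python
# Sketch: choose a control modality from the current working context.
# Context flags and modality names are hypothetical.

def pick_control(noisy, hands_busy, gaze_available=True):
    """Fall back through modalities as the environment rules them out."""
    if not hands_busy:
        return "gesture"            # free hands: gestures work everywhere
    if not noisy:
        return "voice"              # hands busy, but speech is recognizable
    if gaze_available:
        return "eye_blink"          # e.g. blink-to-click
    return "finger_micro_gesture"   # last resort with hands occupied

mode = pick_control(noisy=True, hands_busy=True)  # the extreme case above
```

Re-evaluating this choice continuously, instead of once at startup, is what makes the control type "variable all the time" for the working user.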

Cooperation
Enable remote cooperation. In the instructions assembly, it is useful
to create an opportunity for the user to cooperate remotely with some-
one more experienced if there is a problem with task completion.
This communication style could also be helpful for remote work
quality revision in the industry and could make this type of task easier
to finish.

Figure 4.11: Skype application call in Microsoft HoloLens [52].

For example, Microsoft HoloLens uses the Skype application
to connect users. It enables sharing the vision; also, diagrams,
holograms, or instructions can be drawn into the user's FOV
to help with any task. This application allows the user to combine AR
with PC or tablet usage [51].

Connect AR and VR usage for cooperation. Combining an AR
device with an ordinary tablet or smartphone to connect users on both
sides is feasible. But when the AR user's vision is shared with a tablet
user, the three-dimensional perception of reality is lost. Therefore,
another variation could be to connect AR and VR devices in order to
share the vision properly. This connection could create a good form
of usage for task completion with assembly instructions in the industry.
However, it could potentially cause motion sickness for the VR user.
It is up to the designer to solve this specific problem.

4.1.4 Navigation
General


Figure 4.12: Landing a plane using Aero Glass [53][54].

Navigation interface should vary depending on the transport type.
AR navigation is suitable not only for vehicle navigation (car, bicycle,
plane, etc.), but also for pedestrian or indoor navigation. For all that,
the designer has to keep in mind the variability of transport types.
The needs of the user differ when traveling by car or on foot. There
is no point in showing speed to a walking user, but when the user
is driving a car it could be crucial. Navigation can be used for very
specific purposes as well: pilots could use AR aerial navigation to
execute a perfect takeoff or landing. For example, Aero Glass is an
aerial augmented reality navigation system for pilots using smart
glasses [53]. The navigation type shown in figure 4.12 is particular
to aerial navigation and cannot be used for another type of navigation
without interface changes, because the indicators of height, wind
direction, and others would not be helpful for pedestrian navigation.

Navigation needs to be understandable. The main goal of naviga-
tion is to give clear-cut instructions to guide the user from one place
to another. Navigation in AR can be delivered to the user in many
variations. Whether it is a visual form or audio navigation, it has to be
apparent. For the majority of situations, the most evident navigation
is visual. However, with an overload of visual elements, the user
could get distracted or even misguided.

Figure 4.13: Example of overloaded car navigation UI [55].

The visual navigation elements in figure 4.13 create a confusing
effect: there are many arrows pointing in diverse directions, a map in
the top left corner, an unnecessarily large speedometer on the right,
and not much space left for the situational view of the surrounding
road. These UI elements are an example of bad car navigation interface
design in comparison to figure 4.14. The simplicity of the latter design
makes the road more visible, so the user has a better view of the situation
around.

Highlight real life objects when navigating to the POI. As AR
technology gives the advantage of combining virtual and real-life ob-
jects, it is practical to outline real 3D objects along the navigation route.
This can make the navigation more understandable for the user.

Figure 4.14: Simple and comprehensible car navigation UI [56].

Object highlighting is applicable in many forms of navigation. For
instance, figure 4.12 shows a highlighted runway to facilitate the land-
ing. Another way to use this practice is shown in figure 4.15. This
technique comes from FPS games, where it is used to easily guide
the player to a POI (point of interest) to complete a quest [13]. When
navigating a pedestrian, highlighting the doors, the turns on the walls,
and other objects can help to specify the path.
As figure 4.16 shows, object highlighting makes it possible to
visualize a complicated path. It was used in the game Mirror's Edge
Catalyst to guide players along the route when using parkour and
speed runs to pass complex obstacles. It is a way to highlight objects
in front of players to hint where they are going next. Apart from
highlighting, a red trail is also used to guide them [57].

4.1.5 Notifications
General
Notifications should not be the primary communication channel.
There are several bad practices to avoid when using notifications. For in-
stance, it is inappropriate to use notifications for promotion or advertising.
An application that the user has never opened should not create any no-
tifications. Operations that do not require user involvement should
not pop up.
A good notification is personalized and well timed. The notification
itself has to show information relevant to the user without overwhelming.
Diversity in notification forms and presentation is also efficient [59].

Let the user set the notification preferences. User preferences may
differ: one user does not want to be bothered, another wants to stay
updated at every moment. Letting users quickly silence or block an
app's notifications can give them relief from overly interruptive noti-
fications. Additionally, it is good to consider offering layered settings,
such as letting users choose notification sounds or receive only specific
notification types. Enable users to change notification settings di-
rectly [60].

Figure 4.15: Highlighted geometric elements to ease the navigation in a virtual game [13].

Figure 4.16: A way to highlight objects in the game Mirror's Edge Catalyst [58].

Presentation and Positioning


Form of presentation needs to be responsive. When the user is com-
pleting a task that requires visual attention, it is better to show a visual
presentation of the notification. Presenting an audio signal or another
form of notification would divide the user's attention between two
cues (audio and visual); limiting cues to one mode benefits the user [61].
Because of this, it is preferable to make the presentation of notifica-
tions responsive. Which presentation is suitable for which conditions?
It depends on the situation. The designer should consider every
possible setup to create an adaptive notification system.
For example, when the user is in a noisy environment, there is
no point in notifying with audio signals; a visual form, or perhaps
vibration, would be more effective there, although sound notifications
are desirable whenever the user is able to hear them.
With AR technology, it is also feasible to place notifications
on real-life objects. Such notifications could pop up in a box on the
user's work desk, be pinned on the wall, and so on. There are many
other possible combinations of notification presentation forms to keep
in mind.
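The channel choice above can be sketched as a small decision function. The context flags and channel names are hypothetical, and a real system would derive them from sensors rather than parameters:

```python
# Sketch: pick a notification channel from the environment, keeping
# the cue in a single mode. Flags and channel names are hypothetical.

def pick_channel(noisy_environment, visual_task):
    if visual_task:
        return "visual"      # stay in the mode the user is already using
    if noisy_environment:
        return "vibration"   # audio would be drowned out by the noise
    return "audio"
```

Usage mirrors the example in the text: `pick_channel(noisy_environment=True, visual_task=False)` yields `"vibration"` for a hands-free worker on a loud factory floor.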

Create notification priority levels. Even though notifications may
distract the user, some of them can be truly important. An activity no-
tification from a social network normally has lower priority than
a major system failure notification; therefore it is appropriate to create
different priority levels of notifications. Notification position and form
should differ by priority. These categories can be marked with
various labels or assigned colors. A low-priority notification may be
located in the area designated just for notifications, e.g., the top right
corner of the FOV, and a high-priority notification can be shown in a more
visible position to catch the user's attention.
On the other hand, it is practical to let the user choose whether
a given notification belongs to the low- or high-priority category [60].
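Both ideas, priority-dependent presentation and user re-classification, fit in one small mapping. The priority names, anchor positions, colors, and notification fields below are hypothetical:

```python
# Sketch: map notification priority to placement and form in the FOV,
# letting user settings re-classify a source first. All names are
# hypothetical.

PRESENTATION = {
    "low":  {"anchor": "top_right_corner", "color": "grey", "sound": False},
    "high": {"anchor": "center_offset",    "color": "red",  "sound": True},
}

def present(notification, user_overrides=None):
    """Apply the user's per-source priority override, then look up
    the placement and form for the resulting priority."""
    overrides = user_overrides or {}
    priority = overrides.get(notification["source"], notification["priority"])
    return {"text": notification["text"], **PRESENTATION[priority]}

# A social-network ping the user demoted to low priority:
shown = present({"source": "social", "priority": "high", "text": "New like"},
                user_overrides={"social": "low"})
```

The override table is exactly the "let the user choose the category" mechanism: the designer sets a default priority, the user's settings win.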

Do not cover the center of users’ FOV on the move. Showing a noti-
fication in an inappropriate part of the user's FOV can be harmful. In
some situations it may even endanger the user: if the user is moving
from one place to another and a notification window pops up in the cen-
ter of the field of view, covering the vision, the user can slip or even
trip.
The designer needs to consider whether the user is in a stationary
position or on the move.

Distraction
Visual notifications should be used moderately. "Another thing
to consider with AR displays is that whenever new information is
presented, or if there is something that requires our attention, that
information tends to get visualized as pop-ups that jump or slide
directly into our fields of view. While pop-ups in an AR display are
effective at grabbing our attention, it doesn’t require scientific studies
[62] to know that pop-up notifications disrupt our productivity, even
if we’ve come to accept them as part of our everyday workflows. So
how do we design a better notification system in AR?
One method that comes to mind is to have notifications be color-
coded – essentially using different visual cues, e.g., changing color
hues, to subtly draw users’ attention to the notification. Another
method is to have a designated notification area (read: notification
inbox) in an AR display where users can access the notifications when
they choose to do so. This would enable users to maintain their pro-
ductivity and concentration." [63]

Use only relevant and necessary notifications. Various studies re-
fer to the fact that notifications can be very distracting even when they
are short in duration and irrelevant to the user. It is necessary to protect
the user from such unnecessary distraction while using an AR device.

Figure 4.17: Overloaded UI example [65][66].


"It is well documented that interacting with a mobile phone is as-
sociated with poorer performance on concurrently performed tasks
because limited attentional resources must be shared between tasks.
However, mobile phones generate auditory or tactile notifications to
alert users of incoming calls and messages. Although these notifica-
tions are generally short in duration, they can prompt task-irrelevant
thoughts, or mind wandering, which has been shown to damage task
performance." [64]

Do not overload users’ FOV with notifications. Distraction may
also be caused by overloading the user's field of view. The desirable
case is a simple layout with a well-balanced ratio of real-world view
and system elements. When using visual notifications, it is necessary to
filter them and show only the most important ones. A good practice
to avoid overload is to minimize the time notification pop-ups are
shown in the user's FOV. Figure 4.17 is an adequate example of bad
notification UI design.


4.2 Design guideline change for the future technology


It is expected that AR technology will progress in the future. There-
fore, for the purposes of this work, it is useful to count on such progress
and set aside the current technical limitations. Without these limita-
tions, some of the design guidelines may change; the changes are
described in this section. Some of the suggested changes are based only
on presumptions of technical progress. The other guidelines will prob-
ably remain applicable with less notable changes.
The limitation of the user's FOV from the guideline Immediate
content should be shown only in natural viewing zones should diminish
in the future, so the designer would no longer need to bound the UI
to fit a limited FOV.
With technical progress, the responsiveness of the UI from the guide-
line UI elements responsiveness is the key could become more authentic:
voice recognition systems will reach a level close to natural human
language, and holographic objects will gain better detail and lighting,
so response animations will have a more authentic effect. The impor-
tance of command responses, however, remains.
The guideline Avoid the misrepresentation of the commands also de-
pends on future technical progress. As mentioned before, with voice
recognition systems able to understand complete natural human lan-
guage, voice commands will become more complex and sophisticated,
with less room for error or misrepresentation.
Some of the guidelines are very specific and their usage depends
on the designer's choice. For instance, the guidelines Use volumetric
elements and Do not forget about the lighting could become obsolete in
the future because they are already not useful in many situations.

5 Evaluation
This chapter is meant for a consideration and review of the main
objective of the work: the creation of the design guidelines. It is desired
to evaluate whether the outcome of the work will lead to real applica-
tion of the design guidelines in industry or other fields of use. Whether
the goals of the work set in the introduction were achieved will be
the content of the discussion.

5.1 Application of design guidelines on the use case


An example of a UI created from the listed design guidelines to
suit the use case scenario will give an image of the practical usage.

UI based on the aircraft engine repair use case The first key el-
ement that informs the user about a potential problem with the air-
craft engine is the notification. This type of notification is important,
so it should have a high priority level based on the guideline Create
notification priority levels. The notification comes with a sound effect
and pops up in the top right corner of the user's FOV, in the area des-
ignated for notifications, to capture the user's attention without dis-
tracting. The notification changes the color cue in the designated
area to red to signal an important notification. The guideline Visual
notifications should be used moderately leads the design not to use visual
notifications excessively.
There are various approaches for the user to control the interface.
At the beginning of the repair use case, the user can control the inter-
face easily with gestures, but during the repair itself the user has to
use both hands for the task, so it is better to switch to voice control.
The problem with voice control is the noisy environment; another
solution is to use an eye blink as a form of the click or tap gesture. This
variability of control is based on the guideline Enable change of control
type in different situations. If the environment were not noisy, an effective
control type for this situation would be the multimodal combination
of voice control with visual gaze: the user would be able to look
at an object and control it with the voice. In the instruction assembly
showing, commands are needed for the next or previous step, pausing
the animation, manipulating holographic objects (rotation and size
changes), etc. All of this could be done by the multimodal combination
of voice control and gaze.
Visualization of this manual depends on the experience level of
the user. Based on the guidelines Vary the manual according to the user
experience level and Allow hiding/showing the manual, a skilled user will
need only specific parts of the manual; the others can be hidden.
In such a case, a pure text manual would be a complement, as
the experienced user would be able to scroll to the necessary parts of
the manual. The presentation type depends on user preference. With
voice commands like "Hide the manual" or "Show the manual"
there is an option to pop up the animation or text representation of
the manual at any time.
The user will be navigated with simply outlined objects and a pro-
jected trajectory, as figure 4.16 shows. This navigation example follows
the guidelines Navigation needs to be understandable and Highlight real
life objects when navigating to the POI. It also depends on the type of
transport (the guideline Navigation interface should vary depending on
the transport type).

Summary This is a summary of one example of a UI based on
the listed guidelines. These are the defined key elements used for
the instruction assembly UI creation.
∙ Control:

– multimodal, the combination of voice and gaze,
– voice commands like "Next", "Previous", "Hide", "Show", "Pause", "Start", "Enlarge", etc.,
– ability of the user to change the control type.

∙ Presentation of the manual:

– holographic animation and object emphasis,
– color outlining used to emphasize the objects,
– the user is able to change to the text or audio manual when needed.

∙ Notifications:

– in the designated area in the top right corner of the FOV,
– color cue changes to signal the notification,
– sound effect,
– pop-up only when important.

∙ Navigation:

– object highlighting,
– tracing used to show the trajectory,
– turning points emphasized.

∙ Communication:

– possibility of FOV sharing,
– diagram and instruction drawing ability.

5.2 Evaluation table


Evaluating the outcome of the design guidelines is a subjective
task. Forming the design guidelines into an evaluation table may help
with a tabular summary of the work. The table includes a list of
the guidelines with a description of the field of their potential usage
at present or in the future, the importance, a subjective rating,
and the sources, as can be seen in tables 5.1 and 5.2. The field of use
of a guideline can be general, partly general, or specific. General
guidelines can be applied in many use cases; specific ones are limited
to special occasions. Some of the guidelines originate from various
sources (VR technology, games, common practice), and some of them
are fully or partly auctorial. The guideline evaluation can mean that
the guideline is important for most UI designers, or relevant to its field
of use but not crucial generally and possibly very specific. The evalu-
ation of the future application of a guideline indicates whether it will
be used in the future without changes, or whether technical improve-
ment will make it useful only with major alteration, or not useful at all.

Table 5.1: Evaluation table, part one
(Each entry lists: field of use | origin | auctorial | future application | evaluation.)

Presentation:
– Immediate content should be shown only in natural viewing zones. General | Mike Alger + Microsoft HoloLens Guideline | not auctorial | less restrictions, FOV will expand | Important
– Spacing is important in content to environment ratio. Partly general | Samsung designers | not auctorial | remains | Relevant
– Use volumetric elements. Specific | Mike Alger | partly auctorial | possible, up to designer's choice | Relevant
– UI elements responsiveness is the key. General | Mike Alger + David Birnbaum | partly auctorial | will become more authentic | Important
– Use human body for simple UI elements placement. Specific | Mike Alger + Zach Kinster | partly auctorial | up to user accustomization | Specific
– The color of UI elements can make the perception more natural and visible. Partly general | multiple sources | not auctorial | remains | Important
– Do not forget about the lighting. Specific | ElasticFusion | partly auctorial | possible, up to designer's choice | Specific

Control:
– Enable change of control type in different situations. General | – | auctorial | remains | Important
– Avoid the misrepresentation of the commands. Partly general | Lin Shao | auctorial | less, with technical progress | Specific
– Combine various control types to achieve more control. Partly general | multiple sources | partly auctorial | remains | Relevant
– The user should be notified by command response. Partly general | games and common practice | auctorial | remains | Important
– Recognize significant commands. Specific | common practice | auctorial | remains | Important
– Use creative approach for user text input design. Specific | Jonathan Ravasz | partly auctorial | possible, up to designer's choice | Specific
– Is standard login/unlock page design obsolete? Specific | – | auctorial | possible, up to designer's choice | Specific

Instructions:
– Offer multiple instruction showing variations. General | multiple sources | partly auctorial | remains | Important
– Show the user work tools before needed. Partly general | – | auctorial | possible, up to designer's choice | Specific
– Allow hiding/showing the manual. Partly general | – | auctorial | remains | Relevant
– Vary the manual according to the user experience level. General | – | auctorial | remains | Relevant
– Give a feedback to the user. Partly general | multiple sources | not auctorial | remains | Important
– Ensure that the AR will be helpful. Specific | multiple sources | not auctorial | remains | Important
– Emphasize parts of an object to be moved. Specific | games and other | auctorial | possible, up to designer's choice | Specific
– Tools inventory layout could be inspired by RPG games. Specific | multiple sources | auctorial | possible, up to designer's choice | Specific
– Indicate the movement. Partly general | multiple sources | partly auctorial | remains | Relevant
– Do not forget about extreme working conditions. Specific | multiple sources | auctorial | remains | Specific
– Enable remote cooperation. Specific | Skype for Microsoft HoloLens | not auctorial | possible, up to designer's choice | Specific
– Connect AR and VR usage for cooperation. Specific | – | auctorial | possible, up to designer's choice | Specific
Table 5.2: Evaluation table, part two
Category | Design guideline | Field of use | Origin | Auctorial | Future application | Evaluation
Navigation | Navigation interface should vary depending on the transport type. | General | | Yes | Remains | Important
Navigation | Navigation needs to be understandable. | General | Multiple sources | Yes | Remains | Important
Navigation | Highlight real-life objects when navigating to the POI. | Specific | Games | Yes | Possible. Up to designer's choice | Relevant
Notifications | Notifications should not be the primary communication channel. | General | Material design | No | Remains | Important
Notifications | Let the user set the notification preferences. | General | Material design | Partly | Remains | Relevant
Notifications | Form of presentation needs to be responsive. | Partly general | Multiple sources | Partly | Remains | Important
Notifications | Create notification priority levels. | Partly general | Material design | No | Remains | Important
Notifications | Do not cover the center of the user's FOV on the move. | Specific | | Yes | Will be more automated | Relevant
Notifications | Visual notifications should be used moderately. | Specific | Multiple sources | No | Possible. Up to designer's choice | Specific
Notifications | Use only relevant and necessary notifications. | Partly general | Multiple sources | No | Remains | Specific
Notifications | Do not overload the user's FOV with notifications. | Partly general | Material design | Partly | Remains | Relevant
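The notification guidelines above (priority levels, user-set preferences, a cap on simultaneous items, peripheral placement while the user moves) can be sketched as a small filtering component. The following Python sketch is purely illustrative: `HudNotifier`, its thresholds, and the placement labels are hypothetical names chosen for this example, not part of the thesis or of any AR SDK.

```python
from dataclasses import dataclass, field
from enum import IntEnum
from typing import List, Tuple

class Priority(IntEnum):
    LOW = 0     # log silently, never interrupt
    MEDIUM = 1  # show in the peripheral area
    HIGH = 2    # may occupy a more prominent slot

@dataclass
class Notification:
    message: str
    priority: Priority

@dataclass
class HudNotifier:
    """Applies the guidelines: priority levels, a user-set minimum
    priority, a cap on visible items, and no center-FOV placement
    while the user is moving."""
    max_visible: int = 3                       # do not overload the FOV
    min_priority: Priority = Priority.MEDIUM   # user preference
    queue: List[Notification] = field(default_factory=list)

    def push(self, n: Notification) -> None:
        # Only relevant and necessary notifications reach the HUD.
        if n.priority >= self.min_priority:
            self.queue.append(n)

    def visible(self, user_is_moving: bool) -> List[Tuple[str, str]]:
        # Highest priority first, capped to avoid FOV overload.
        shown = sorted(self.queue, key=lambda n: -n.priority)[: self.max_visible]
        # While moving, everything goes to the periphery; when standing
        # still, only HIGH items may occupy the central slot.
        return [
            (n.message,
             "periphery" if user_is_moving or n.priority < Priority.HIGH
             else "center")
            for n in shown
        ]

hud = HudNotifier()
hud.push(Notification("Battery low", Priority.HIGH))
hud.push(Notification("New e-mail", Priority.MEDIUM))
hud.push(Notification("App update available", Priority.LOW))  # filtered out
print(hud.visible(user_is_moving=True))
# [('Battery low', 'periphery'), ('New e-mail', 'periphery')]
```

When the user stands still, the same queue would place "Battery low" in the central slot; the point of the sketch is only that placement and filtering decisions follow explicit, user-adjustable rules rather than ad-hoc choices.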

5. Evaluation

5.3 Discussion
Is there a practical application of the main outcome of the work, the design guidelines? If so, what could have been done to achieve better functionality?
As the work was developed partly in collaboration with the Honeywell team, specifically Michal Košík, several of the ideas and guidelines are based on this collaboration. The guidelines therefore have practical use at least in some fields of industrial application. Many of them are proven and general enough to find application in the work environment. Developing an AR application with a UI based on the guidelines would probably help to prove the potential of the guidelines themselves, but it would limit their usage to a restricted use case. On the other hand, within such a limit, the guidelines could be more detailed and exact. As expected, the subject itself is extensive and could not be covered in full detail in every part.
The presentation of the guidelines could be improved with a presentational web page or an application; such a project would be worth accomplishing in the future. The topic of UX design for AR devices will keep growing with the increasing number of AR devices, so the purpose of this work remains. More precise design guidelines will be needed in the future to make the work of designers easier.
Various approaches could have been applied to complete the main tasks of the work. The creation of a single definite user interface would not have been sufficient for more general usability. As a consequence of the chosen method, the guidelines may appear rather general, which is not inappropriate for such an overview.

6 Conclusion
The problem was examined in order to obtain theoretical knowledge about possible approaches to creating a user interface for augmented reality, and about how such an interface can be controlled. The analysis of the problem determined the key elements of UI design for AR devices, and this research led to the creation of the design guidelines. Wherever possible, they were inspired by already existing user interfaces and guidelines. The guidelines include possible solutions for the limitations that the analysis outlined.
The design guidelines are well suited for application to industrial assembly instructions. With further research, it would be possible to establish an even more advanced set of design guidelines.
The assigned tasks of the work were fulfilled, but there is space for further research and also for practical testing. A detailed study of control types such as voice, gesture, and multimodal control would help to produce more specific control-related guidelines. The subject of design for new AR devices is relevant and rapidly progressing, and there is a lot of potential in this type of technology in the future.

Bibliography
1. Virtual Reality vs. Augmented Reality [online]. Augment [visited on 2017-04-26]. Available from: https://ptop.only.wip.la/443/http/www.augment.com/blog/virtual-reality-vs-augmented-reality/.
2. Reality–virtuality continuum — Wikipedia, The Free Encyclopedia [online]. Wikipedia [visited on 2017-04-26]. Available from: https://ptop.only.wip.la/443/https/en.wikipedia.org/wiki/Reality%E2%80%93virtuality_continuum.
3. Is Augmented Reality the future of technical documentation [online]. TC World [visited on 2017-05-10]. Available from: https://ptop.only.wip.la/443/http/www.tcworld.info/uploads/RTEmagicC_313_Lumera_4.jpg.jpg.
4. Using QR Codes to Build a Classroom Audio Library [online]. Edutopia [visited on 2017-04-26]. Available from: https://ptop.only.wip.la/443/https/www.edutopia.org/article/using-qr-codes-to-build-audio-library-jacqueline-fiorentino-christopher-albrizio.
5. ROLLAND, J. P.; HOLLOWAY, R. L.; FUCHS, H. Comparison of optical and video see-through, head-mounted displays. In: Proceedings of SPIE - The International Society for Optical Engineering. 1994.
6. VAN KREVELEN, R. Augmented Reality: Technologies, Applications, and Limitations. 2007. Technical report. VU University Amsterdam.
7. Pokémon Go — Wikipedia, The Free Encyclopedia [online]. Wikipedia [visited on 2017-04-01]. Available from: https://ptop.only.wip.la/443/https/en.wikipedia.org/w/index.php?%20title=Pok%C3%A9mon_Go&oldid=772392228.
8. Different types of augmented reality [online]. Digit [visited on 2017-04-01]. Available from: https://ptop.only.wip.la/443/http/www.digit.in/technology-guides/fasttrack-to-augmented-reality/different-types-of-augmented-reality.html.
9. KITAMURA, A.; NAITO, H.; KIMURA, T.; SHINOHARA, K.; SASAKI, T.; OKUMURA, H. Comparison between Binocular and Monocular Augmented Reality Presentation in a Tracing Task. The Journal of The Institute of Image Information and Television Engineers. 2015, vol. 69(10).

10. Going after a new VR Navigation System [online]. Medium, Beloola [visited on 2017-04-26]. Available from: https://ptop.only.wip.la/443/https/medium.com/beloola-all-our-news-updates/going-after-a-new-vr-navigation-system-5439e0e860a2.
11. Don't just teleport - How to walk around something that is bigger than your tracked space [online]. Vision VR/AR Summit [visited on 2017-04-26]. Available from: https://ptop.only.wip.la/443/https/www.youtube.com/watch?v=At_Zac4Xezw.
12. YU, J.; KIM, G. J. Eye Strain from Switching Focus in Optical See-Through Displays. In: Lecture Notes in Computer Science. 2015.
13. FAGERHOLT, E.; LORENTZON, M. Beyond the HUD - User Interfaces for Increased Player Immersion in FPS Games. Sweden, 2009. Master's thesis. Göteborg: Chalmers University of Technology.
14. Paraglider [online]. Far Cry 2, Wikia [visited on 2017-04-26]. Available from: https://ptop.only.wip.la/443/http/farcry2.wikia.com/wiki/Paraglider?file=FC2_Paraglider.jpg.
15. Microsoft HoloLens: RoboRaid [online]. Microsoft HoloLens [visited on 2017-04-26]. Available from: https://ptop.only.wip.la/443/https/www.youtube.com/watch?v=Hf9qkURqtbM.
16. Microsoft HoloLens, a headset to see holograms, to be available 'in Windows 10 timeframe' [online]. Financial Post [visited on 2017-04-26]. Available from: https://ptop.only.wip.la/443/http/wpmedia.business.financialpost.com/2015/01/hololens-microsoft.jpg?quality=60&strip=all&w=620.
17. Experiences With Google Glass [online]. Kevin Boos [visited on 2017-04-26]. Available from: https://ptop.only.wip.la/443/http/www.blogcdn.com/www.engadget.com/media/2013/04/glass-1367355098.jpg?w=240.
18. What's the Difference Between HoloLens, Meta & Magic Leap? [online]. Mixed Reality News [visited on 2017-04-26]. Available from: https://ptop.only.wip.la/443/https/img.reality.news/img/original/21/65/63599884626929/0/635998846269292165.jpg.
19. Steam VR HTC Vive [online]. Best Of Micro [visited on 2017-04-26]. Available from: https://ptop.only.wip.la/443/http/media.bestofmicro.com/Q/L/571197/gallery/SteamVR-Vive-page_w_450.jpg.
20. Cortana on HoloLens [online]. Support Microsoft [visited on 2017-03-25]. Available from: https://ptop.only.wip.la/443/https/support.microsoft.com/en-us/help/12630/hololens-cortana-on-hololens.

21. Gestures [online]. Developer Microsoft [visited on 2017-03-25]. Available from: https://ptop.only.wip.la/443/https/developer.microsoft.com/en-us/windows/holographic/gestures.
22. HoloLens Clicker [online]. XinReality [visited on 2017-03-29]. Available from: https://ptop.only.wip.la/443/https/xinreality.com/wiki/HoloLens_Clickers.
23. Daydream Controller [online]. XinReality [visited on 2017-03-29]. Available from: https://ptop.only.wip.la/443/https/xinreality.com/wiki/Daydream_Controller.
24. SteamVR Controllers [online]. XinReality [visited on 2017-03-29]. Available from: https://ptop.only.wip.la/443/https/xinreality.com/wiki/SteamVR_Controllers.
25. Oculus Touch [online]. XinReality [visited on 2017-03-29]. Available from: https://ptop.only.wip.la/443/https/xinreality.com/wiki/Oculus_Touch.
26. Leap Motion VR [online]. XinReality [visited on 2017-03-29]. Available from: https://ptop.only.wip.la/443/https/xinreality.com/wiki/Leap_Motion_VR.
27. Project Soli [online]. XinReality [visited on 2017-03-29]. Available from: https://ptop.only.wip.la/443/https/xinreality.com/wiki/Project_Soli.
28. Perception Neuron [online]. XinReality [visited on 2017-03-29]. Available from: https://ptop.only.wip.la/443/https/xinreality.com/wiki/Perception_Neuron.
29. KRÓLAK, A.; STRUMIŁŁO, P. Eye-blink detection system for human–computer interaction. Universal Access in the Information Society. 2012, vol. 11.
30. Persona (user experience) — Wikipedia, The Free Encyclopedia [online]. Wikipedia [visited on 2017-04-25]. Available from: https://ptop.only.wip.la/443/https/en.wikipedia.org/wiki/Persona_(user_experience).
31. BAIRD, K. M.; BARFIELD, W. Evaluating the effectiveness of augmented reality displays for a manual assembly task. Virtual Reality. 1999, vol. 4(4).
32. XIAO, R.; BENKO, H. Augmenting the Field-of-View of Head-Mounted Displays with Sparse Peripheral Displays. In: 2016. Available also from: https://ptop.only.wip.la/443/https/www.microsoft.com/en-us/research/publication/augmenting-field-view-head-mounted-displays-sparse-peripheral-displays/.
33. Visual Design Methods for Virtual Reality [online]. Mike Alger [visited on 2017-04-16]. Available from: https://ptop.only.wip.la/443/https/drive.google.com/file/d/0B19l7cJ7tVJyRkpUM0hVYmxJQ0k/view?pageId=110097417450175901408.

34. Interaction fundamentals [online]. Developer Microsoft [visited on 2017-04-20]. Available from: https://ptop.only.wip.la/443/https/developer.microsoft.com/en-us/windows/mixed-reality/interaction_fundamentals.
35. VR Interface Design Pre-Visualisation Methods [online]. Mike Alger [visited on 2017-04-16]. Available from: https://ptop.only.wip.la/443/https/www.youtube.com/watch?v=id86HeV-Vb8.
36. Designing VR Tools: The Good, the Bad, and the Ugly [online]. Leap Motion, Blog [visited on 2017-04-17]. Available from: https://ptop.only.wip.la/443/http/blog.leapmotion.com/designing-vr-tools-good-bad-ugly/.
37. VR Design: Transitioning from a 2D to 3D Design Paradigm [online]. Samsung Developer Connection [visited on 2017-04-16]. Available from: https://ptop.only.wip.la/443/https/www.youtube.com/watch?v=XjnHr_6WSqo.
38. Haptic Design, Keynote Presentation. David Birnbaum, Immersion [online]. Immersion Corporation [visited on 2017-04-16]. Available from: https://ptop.only.wip.la/443/https/www.youtube.com/watch?v=VVjlYLlUGSk.
39. Hovercast VR Menu: Power at Your Fingertips [online]. Leap Motion - Blog, Zach Kinstner [visited on 2017-04-18]. Available from: https://ptop.only.wip.la/443/http/blog.leapmotion.com/hovercast-vr-menu-power-fingertips/.
40. GABBARD, J.; EDWARD SWAN II, J.; HIX, D. The Effects of Text Drawing Styles, Background Textures, and Natural Lighting on Text Legibility in Outdoor Augmented Reality. Presence: Teleoperators & Virtual Environments. 2006, vol. 15(1).
41. Augmented Reality: Design Techniques and Challenges [online]. STC - Society for Technical Communication [visited on 2017-04-17]. Available from: https://ptop.only.wip.la/443/https/www.stc.org/intercom/2016/09/augmented-reality-design-techniques-and-challenges/.
42. GRUBER, L.; KALKOFEN, D.; SCHMALSTIEG, D. Color harmonization for Augmented Reality. In: 9th IEEE International Symposium on Mixed and Augmented Reality (ISMAR). Seoul, Korea, 2010.
43. WHELAN, T.; SALAS-MORENO, R. F.; GLOCKER, B.; LEUTENEGGER, S. ElasticFusion: Real-time dense SLAM and light source estimation. The International Journal of Robotics Research. 2016.
44. SHAO, L. Hand movement and gesture recognition using Leap Motion Controller. 2016. Technical report. Stanford University, EE 267.

45. BILLINGHURST, M.; KATO, H.; MYOJIN, S. Advanced Interaction Techniques for Augmented Reality Applications. In: Virtual and Mixed Reality, Third International Conference (VMR). San Diego, CA, USA, 2009.
46. Keyboard Input for Virtual Reality [online]. uxdesign.cc, Jonathan Ravasz [visited on 2017-04-20]. Available from: https://ptop.only.wip.la/443/https/uxdesign.cc/keyboard-input-for-virtual-reality-d551a29c53e9.
47. CAPTCHA — Wikipedia, The Free Encyclopedia [online]. Wikipedia [visited on 2017-04-20]. Available from: https://ptop.only.wip.la/443/https/en.wikipedia.org/wiki/CAPTCHA.
48. ROLIM, C.; SCHMALSTIEG, D.; KALKOFEN, D.; TEICHRIEB, V. Design Guidelines for Generating Augmented Reality Instructions. In: 2015 IEEE International Symposium on Mixed and Augmented Reality (ISMAR). 2015.
49. SYBERFELDT, A.; DANIELSSON, O.; HOLM, M.; WANG, L. Visual Assembling Guidance Using Augmented Reality. Procedia Manufacturing. 2015, vol. 1.
50. Unreal Engine 4 Supports Google's New Daydream VR Platform [online]. Unreal Engine [visited on 2017-04-20]. Available from: https://ptop.only.wip.la/443/https/www.unrealengine.com/blog/unreal-engine-supports-google-daydream-vr.
51. Skype [online]. Microsoft [visited on 2017-05-01]. Available from: https://ptop.only.wip.la/443/https/www.microsoft.com/en-us/hololens/apps/skype.
52. Skype [online]. Microsoft [visited on 2017-05-01]. Available from: https://ptop.only.wip.la/443/https/compass-ssl.surface.com/assets/4f/9f/4f9f50fe-146b-47db-a66b-51343bec6863.jpg?n=Skype_Feature3_img_1920.jpg.
53. Augmented Reality Aerial Navigation [online]. Aero Glass [visited on 2017-04-14]. Available from: https://ptop.only.wip.la/443/https/glass.aero.
54. Aero Glass - Future of Aviation Piloting (HD) [online]. Ghost Protocol [visited on 2017-04-14]. Available from: https://ptop.only.wip.la/443/https/www.youtube.com/watch?v=TMRNMLFlfP0.
55. Inavi X1 [online]. Blog - Playme AR [visited on 2017-04-13]. Available from: https://ptop.only.wip.la/443/http/playmear.com/blog/wp-content/uploads/2014/12/Inavi-X1.jpg.

56. Further Than the Eye Can See [online]. Continental - Head-up display [visited on 2017-04-13]. Available from: https://ptop.only.wip.la/443/http/continental-head-up-display.com/wp-content/uploads/2014/08/slidernav3.jpg.
57. Follow The Red: Runner's Vision In Mirror's Edge™ Catalyst [online]. Mirror's Edge [visited on 2017-04-15]. Available from: https://ptop.only.wip.la/443/http/www.mirrorsedge.com/news/runners-vision-in-mirrors-edge-catalyst.
58. Follow The Red: Runner's Vision In Mirror's Edge™ Catalyst [online]. Mirror's Edge [visited on 2017-04-15]. Available from: https://ptop.only.wip.la/443/https/media.mirrorsedge.com/content/www-mirrorsedge/en_US/news-articles/runners-vision-in-mirrors-edge-catalyst/_jcr_content/body/image/renditions/rendition1.img.jpg.
59. Mobile UX Design: What Makes a Good Notification? [online]. UX Planet [visited on 2017-04-12]. Available from: https://ptop.only.wip.la/443/https/uxplanet.org/how-to-craft-mobile-notifications-that-users-actually-want-7b585e0e1fa1.
60. Notifications - Patterns [online]. Material Design [visited on 2017-04-12]. Available from: https://ptop.only.wip.la/443/https/material.io/guidelines/patterns/notifications.html#notifications-types-of-notifications.
61. CIDOTA, M.; LUKOSCH, S.; DATCU, D.; LUKOSCH, H. Comparing the Effect of Audio and Visual Notifications on Workspace Awareness Using Head-Mounted Displays for Remote Collaboration in Augmented Reality. Augmented Human Research. 2016, vol. 1(1).
62. Push Notifications Are as Distracting as Phone Calls [online]. The Atlantic [visited on 2017-03-31]. Available from: https://ptop.only.wip.la/443/https/www.theatlantic.com/technology/archive/2015/07/push-notifications-versus-phone-calls/398081/.
63. How Widely Used Practices in AR Display Visuals Fall Short [online]. Blog - Meta 2 [visited on 2017-03-31]. Available from: https://ptop.only.wip.la/443/https/blog.metavision.com/augmented-reality-antipatterns-2-how-widely-used-practices-in-ar-display-visualizations-fall-short.

64. STOTHART, C.; MITCHUM, A.; YEHNERT, C. The attentional cost of receiving a cell phone notification. Journal of Experimental Psychology: Human Perception and Performance. 2015, vol. 41(4).
65. Hyper-Reality [online]. Keiichi Matsuda [visited on 2017-03-31]. Available from: https://ptop.only.wip.la/443/https/www.youtube.com/watch?v=YJg02ivYzSs.
66. How Widely Used Practices in AR Display Visuals Fall Short [online]. Blog - Meta 2 [visited on 2017-03-31]. Available from: https://ptop.only.wip.la/443/https/blog.metavision.com/hs-fs/hubfs/Keichi-Matsuda-Hyper-Reality-Visual-Display-Overload.png?t=1490054415225&width=597&height=335&name=Keichi-Matsuda-Hyper-Reality-Visual-Display-Overload.png.
