
Expert Systems With Applications 166 (2021) 114037

Contents lists available at ScienceDirect

Expert Systems With Applications


journal homepage: www.elsevier.com/locate/eswa

Review

Eye tracking algorithms, techniques, tools, and applications with an emphasis on machine learning and Internet of Things technologies
Ahmad F. Klaib *, Nawaf O. Alsrehin, Wasen Y. Melhem, Haneen O. Bashtawi, Aws A. Magableh
Information Systems Department, Faculty of Information Technology and Computer Science, Yarmouk University, Irbid 21163, Jordan

A R T I C L E  I N F O

Keywords:
Eye tracking techniques
Eye tracking applications
Electrooculography
Infrared oculography
Internet of Things
Machine learning
Scleral coil
Video oculography
Cloud computing
Fog computing
Choice modeling
Consumer psychology
Marketing

A B S T R A C T

Eye tracking is the process of measuring where one is looking (point of gaze) or the motion of an eye relative to the head. Researchers have developed different algorithms and techniques to automatically track the gaze position and direction, which are helpful in different applications. Research on eye tracking is increasing owing to its ability to facilitate many different tasks, particularly for the elderly or users with special needs. This study aims to explore and review eye tracking concepts, methods, and techniques by further elaborating on efficient and effective modern approaches such as machine learning (ML), Internet of Things (IoT), and cloud computing. These approaches have been in use for more than two decades and are heavily used in the development of recent eye tracking applications. The results of this study indicate that ML and IoT are important aspects in evolving eye tracking applications owing to their ability to learn from existing data, make better decisions, be flexible, and eliminate the need to manually re-calibrate the tracker during the eye tracking process. In addition, they show that eye tracking techniques have more accurate detection results compared with traditional event-detection methods. In addition, various motives and factors in the use of a specific eye tracking technique or application are explored and recommended. Finally, some future directions related to the use of eye tracking in several developed applications are described.

Abbreviations

ADHD: Attention Deficit Hyperactivity Disorder
AIoT: Artificial Intelligence of Things
ALS: Amyotrophic Lateral Sclerosis
AR: Augmented Reality
ASD: Autism Spectrum Disorder
CCD: Charged Coupled Device
DA: Discriminant Analysis
ECAS: Edinburgh Cognitive and Behavioral ALS Screen
ElSe: Ellipse Selector
EOG: Electrooculography
EUD: End-User Development
ExCuSe: Exclusive Curve Selector
FREDA: Fast and Robust Ellipse Detection Algorithm
HCI: Human Computer Interaction
Helmholtz: Hermann Ludwig Ferdinand von Helmholtz
HMD: Head-Mounted Displays
HMM: Hidden Markov Models
HRI: Human–Robot Interaction
IOG: Infrared Oculography
IoT: Internet of Things
IR: Infrared Light
LSS: Lensless Smart Sensors
ML: Machine Learning
NBC: Naive Bayes Classification
PCCR: Pupil Center Corneal Reflection
RANSAC: Random Sample Consensus
ROIs: Regions of Interest
SMI: SensoMotoric Instruments
SSC: Scleral Search Coil
SVM: Support Vector Machine
VOG: Video Oculography
VR: Virtual Reality
ZM: Zernike Movement

* Corresponding author.
E-mail addresses: [email protected] (A.F. Klaib), [email protected] (N.O. Alsrehin), [email protected] (W.Y. Melhem), [email protected] (H.O. Bashtawi), [email protected] (A.A. Magableh).

https://doi.org/10.1016/j.eswa.2020.114037
Received 7 March 2020; Received in revised form 16 September 2020; Accepted 16 September 2020
Available online 28 September 2020
0957-4174/© 2020 Elsevier Ltd. All rights reserved.

1. Introduction

Eye tracking is the process of tracking the movement of the eyes to
know exactly where a person is looking and for how long. Eye tracking systems measure the eye position, movement, and pupil size at a specific time to detect areas in which the user has an interest. Eye trackers have been used in different research areas, including visual systems (Barr, 2008; Hristozova, Ozimek, & Siebert, 2018; Moreno-Esteva, White, Wood, & Black, 2018; Ulutas, Özkan, & Michalski, 2020); psychology and neuroscience (Maria et al., 2019; Migliaccio, MacDougall, Minor, & Della Santina, 2005); psycholinguistics and healthcare (Park, Subramaniyam, Hong, Kim, & Yu, 2017; Chen, Fu, Lo, & Chi, 2018); user experience and interaction (Lukander, 2016); professional performance, consumer research, and marketing (Hang, Yi, & Xianglan, 2018); the economy, clinical research, and education (Colliot & Jamet, 2018); sports performance and research, product design, and software engineering (Obaidellah, Al Haek, & Cheng, 2018; Zohreh Sharafi, 2015); transportation (Noland, Weiner, Gao, Cook, & Nelessen, 2017); and virtual reality (VR) (Clay, König, & König, 2019). Eye trackers have also been increasingly used for rehabilitative and assistive applications. With the help of scientific research, valuable information can be inferred and used to benefit elderly and special needs patients in particular, who need technology to make their lives easier. Eye tracking systems use software algorithms for pupil detection, image processing, data filtering, and the recording of eye movement by means of a fixation point, fixation duration, and saccades.

The recent increase in the development of eye tracking applications is due to recent technological advancements in both hardware and software. Equipment that was once cumbersome, time-consuming, and expensive has recently been transformed into devices that are inexpensive, unobtrusive, and wearable, and that produce data that can be rapidly analyzed through special software. Current innovation in computing capabilities allows machine learning (ML) algorithms to be integrated with eye tracking devices and adds learning functionalities from captured data to generate smarter eye tracking devices. A large variety of hardware and software approaches have been implemented by research groups or companies based on technological progress (Sarkar, Sanyal, & Majumder, 2017; Verma, 2011).

Users have become accustomed to new and innovative technologies connected to the Internet, a field widely known as the Internet of Things (IoT), and eyewear technologies such as goggles and glasses are examples of such innovations. Goggles are safety glasses surrounding the eye to prevent exposure to particulates, water, or chemicals. iMotions eye tracking glasses are used to collect and analyze real-world eye tracking data using glasses from Tobii, Pupil Labs, Argus Science, and SensoMotoric Instruments (SMI). iMotions allows for advanced analysis using tools such as heatmaps, gaze replays, and areas of interest (AOI) to generate output metrics such as the time to first fixation and time spent. iMotions also provides automated gaze-mapping from dynamic environments to static scenes for simpler aggregation and analysis (Haywood, 2019). Research and improvements in this area are progressing to include eye tracking devices, the applications of which have been put into real use.

ML is a process that allows machines to learn and adapt through experience. Artificial intelligence (AI) refers to a wider concept in which machines can execute tasks smartly. AI applies several algorithms and techniques, such as deep learning and ML, to solve actual problems and smartly execute tasks. AI and IoT are perfect examples of two technologies that complement each other and should be tightly integrated. Merely connecting devices and remotely monitoring and controlling them is not enough to enable the devices to make smart decisions or take actions on their own. The real value is obtained when devices learn from each other and from their specific use. This occurs when devices can tune their responses, change their behavior, or adapt based on what they have learned over time. To allow devices to be smartly connected, learn from each other over time, and take actions on their own, it is necessary to implement ML and artificial intelligence algorithms and techniques, creating a field called the Artificial Intelligence of Things (AIoT) (Bunz & Janciute, 2018).

This paper mainly focuses on 1) presenting the major algorithms and techniques used for eye tracking and the benefits of incorporating ML and IoT in various eye tracking technologies, 2) researching and addressing the most well-known and novel applications in eye tracking, and 3) organizing the literature related to eye tracking using a structured approach to bring together all studies to date on how ML and IoT technologies, along with their applications, help people working in the field of eye tracking. In addition, the results from this study may contribute to the further development of the technologies in this field. The initial sections of this paper are directed toward people who are completely new to eye tracking as well as inexperienced researchers. Section 8 is intended for researchers experienced in eye tracking, where we provide our discussions and present some future directions.

Despite the progress made in the various aspects of eye tracking research, only a few surveys exist that review the growing body of literature with a focus on using ML algorithms along with IoT technologies and their applications. Majaranta and Bulling reviewed state-of-the-art research in eye tracking and gaze estimation technologies along with some of their challenges and real-life applications (Majaranta & Bulling, 2014). In addition, the authors designed some solutions and provided some guidelines on matching user requirements with different features of eye tracking systems to find a suitable system for each task. Regarding eye tracking, various principles used to measure eye movements have also been reviewed (Singh & Singh, 2012).

Majaranta and Bulling (Majaranta & Bulling, 2014) have also explored different visual features in eye images, factors involved in the selection of an eye tracking method, and applications of an eye tracking technique. Kiefer et al. investigated the utilization of an eye tracking methodology as input to spatial cognition, cartography, and geographic information science along with its challenges and opportunities for future research (Kiefer, Giannopoulos, Raubal, & Duchowski, 2017).

In medical applications, Königa and Buffalo (2017) and Harezla and Kasprowski (2018) have reviewed state-of-the-art research related to the application of eye tracking technology in medicine. They have provided insight into such studies and the ways in which eye imaging is employed in medical applications. The authors showed that more extensive studies on cue processing and medical decision-making are needed. Al-Moteri et al. provided an overview of eye tracking technology and how it helps in understanding and improving diagnostic interpretation. In addition, they focused on how eye tracking has been used to understand medical interpretations and encourage medical education and training, the perceptual and cognitive processes involved in medical interpretation, and how eye tracking will be used in future medical applications (Al-Moteri, Symmons, Plummer, & Cooper, 2017; Thevenot, López, & Hadid, 2018).

In education and training, Rosch and Vogel-Walcutt reviewed eye tracking applications as tools for measuring cognitive load and how such measurements can be used for enhancing real-time training (Rosch & Vogel-Walcutt, 2013). For multimedia learning, Alemdag and Cagiltay explored how eye tracking technology has been used with relevant variables to study the cognitive processes during multimedia learning (Alemdag & Cagiltay, 2018). In addition, Beach and McConnel reviewed state-of-the-art research on the ways in which eye tracking methodologies are being used to understand the learning process of teachers (Beach & McConnel, 2019), whereas Colliot and Jamet used eye tracking techniques to investigate the effects of using a video recording of a teacher on the learning experience of students when applying an e-learning module (Colliot & Jamet, 2018). Finally, Chen et al. reviewed the advantages of using eye tracking technologies for application in learning and education (Chen, Lai, & Chiu, 2010).

In the area of human–computer interaction (HCI), various studies (Majaranta & Bulling, 2014; Drewes, 2010; Shimata et al., 2015; Poole & Ball, 2006) have explored state-of-the-art research in using eye tracking to evaluate the usability of HCI or to lead to a system interaction when using eye movements as an input mechanism. In addition, the authors


provided future opportunities for the use of eye tracking in HCI research, along with suggestions to overcome some of the challenges and complexities faced by advanced interactive systems to enable effective applications. In addition to the above areas, researchers have provided several reviews on using eye tracking techniques in tourism (Scott, Zhang, Le, & Moyle, 2017), software engineering (Sharafia, Soha, & Guéhéneuc, 2015), online shopping and visual marketing (Hang et al., 2018), and control and command intelligent systems (Bazrafkan, Kar, & Costache, 2015).

In this review article, we focus on using ML, IoT, and cloud computing with eye tracking techniques. A classification of different ML algorithms that use eye tracking data, along with the research goal and the strengths and limitations of the algorithms, is presented. In addition, we present healthcare, medical, HCI, accessibility, education, e-learning, and car assistant eye tracking applications. Moreover, this research covers possible future directions of eye tracking techniques and applications using ML and IoT, as well as other research areas that may be interesting to readers, such as the fields of gaming, marketing, advertising, and VR.

The remainder of this paper is organized as follows. Section 3 provides some background and definitions for eye tracking concepts and gaze estimation techniques along with the structure of the literature considered. Eye tracking techniques using ML algorithms are described in Section 4. Section 5 presents eye tracking techniques that apply IoT. Section 6 shows the use of cloud computing with eye tracking. Some eye tracking applications are presented in Section 7. Section 8 presents our viewpoints and experience, limitations of current eye tracking applications, and possible directions for future studies.

2. Collection and selection of papers

2.1. Data sources

In this study, we do not follow a typical review of individual papers; instead, we provide a systematic qualitative approach, which reviews state-of-the-art research on improving eye tracking algorithms, techniques, tools, and applications. Several well-known digital library databases were selected for the literature search, and the selection was based on their relevance to the computer science community and the availability of papers for download. We focused on collecting papers from IEEEXplore, ACM Digital Library, SCOPUS, Citeseer, Citebase, Microsoft Academic, DBLP Computer Science Bibliography, ScienceDirect (Elsevier), and Springer. To ensure the quality of this review, only academic papers published in peer-reviewed journals were included. The search was performed on the search engines of the abovementioned libraries on different days. To establish a clear cut-off date, we limited our search to papers published until June 1, 2020. To produce a more homogeneous result, papers published after this date were not considered, which will also enable any future updates to this study to achieve greater precision in terms of coverage and development.

2.2. Search strings

To build our search string, we use the Population, Intervention, Comparison, Outcome, Context (Barbara, 2004) criteria to frame the search keywords. These criteria have been adapted from (Kitchenham & Charters, 2007) based on the context of computer science. This approach helps researchers define the relevant terms based on keywords that have been used in previous studies and to gain expert insight.

Each search query consists of a variation of the words “eye tracking,” followed by words related to stimuli and/or strategy. The details of the search queries are as follows:

(1) Eye tracking: To ensure that the search results are related to eye tracking or eye trackers, the first keyword in every search query must be “eye track,” “eye-track,” or “gaze,” depending on the context.

(2) Stimuli: To make sure we find papers related to a specific topic, we included terms such as “applications,” “tools,” “algorithms,” “approaches,” and “techniques.”

(3) Strategy: Because our focus is to find papers related to ML and IoT, we include “machine learning,” “artificial intelligence,” “Internet of Things,” “IoT,” “cloud computing,” “fog computing,” and “edge computing” in the search queries.

2.3. Paper inclusion and exclusion criteria

Inclusion and exclusion criteria set the boundaries for the review. They are determined after the research question is set, and usually before the search is conducted; however, it is necessary to scope searches to determine the appropriate criteria. Numerous factors can be used as inclusion or exclusion criteria. Table 1 shows the main inclusion and exclusion criteria.

Table 1
Inclusion Criteria.

Articles were included only if they related to eye tracking algorithms, techniques, tools, and applications
We include only published articles
We use articles written only in English
We include peer-reviewed articles from 2000 to 2020
We focus on articles that describe the intervention in detail
We focus on articles that use rich descriptions
We focus on articles that are fully available
We focus on articles that describe the outcomes clearly

2.4. Selection of the main studies

The above string of keywords was applied to the above-listed databases in accordance with the predefined inclusion and exclusion criteria listed in Table 1.

As a result of the search strategy, a good number of studies were identified. First, the duplicates and redundancies were removed, and the abstracts of the remaining papers were reviewed to target those studies that are directly related to the topic. The full text of each article was reviewed at the end of the screening process. In total, 180 research articles, published between 2000 and 2020 (30 from 2019 and 8 from 2020), as shown in Fig. 1, were selected.

The aim of this paper is to give insight into those studies and the ways in which they utilize eye imaging in medical applications. First, the eye tracking methods and algorithms, along with the eye movement characteristics, are described. Attention is focused specifically on the VOG technique based on eye image processing, which is commonly used in current eye tracking solutions. Based on this knowledge, the experiments used in the aforementioned studies are detailed.

Although these studies provided answers for many questions, there are other unanswered questions for which additional studies are required. The presented survey may prove helpful in this regard.

There are several methods for measuring eye movement; some use video images to extract the eye position, while others use search coils or are based on an electro-oculogram.

2.5. Literature structure

Because eye tracking techniques require a precise procedure to track the eye gaze, it is important to study novel approaches to widen the use of eye tracking on a higher level. Fig. 2 shows the structure of our review paper, in which we group the literature on eye tracking techniques and applications using the IoT along with ML algorithms and cloud/fog computing.

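To illustrate how a query is assembled from the three keyword groups in Section 2.2 (the exact syntax accepted by each library's search engine differs; the function and term lists below are only a hypothetical sketch, not the verbatim queries used):

```python
# Hypothetical sketch of the query construction described in Section 2.2:
# alternatives within a keyword group are joined with OR, and the three
# groups (eye tracking, stimuli, strategy) are joined with AND.
EYE_TERMS = ['"eye track"', '"eye-track"', '"gaze"']
STIMULI = ['"applications"', '"tools"', '"algorithms"', '"approaches"', '"techniques"']
STRATEGY = ['"machine learning"', '"artificial intelligence"', '"Internet of Things"',
            '"IoT"', '"cloud computing"', '"fog computing"', '"edge computing"']

def build_query(*groups):
    """Join the terms of each group with OR, then the groups with AND."""
    return " AND ".join("(" + " OR ".join(group) + ")" for group in groups)

query = build_query(EYE_TERMS, STIMULI, STRATEGY)
print(query)
```

In practice, each library's advanced-search form would receive one such conjunction, with the eye tracking group always present as the first clause.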

Fig. 1. Years of publication of research articles considered.

Fig. 2. Structure of our review paper in which we group the literature on eye tracking techniques and applications using ML and IoT.

3. Background and definitions

3.1. Eye tracking concept

The human eye consists of the retina, pupil, cornea, sclera, iris, and other components. The activities of these components can be captured as signals to be used in different applications (Majaranta & Bulling, 2014). Eye detection is the process of locating the eye and measuring the eye gaze. There are several methods that are used for gaze tracking, such as shape-based, feature-based, appearance-based, and hybrid methods (Hansen & Ji, 2010).

3.2. Gaze estimation techniques

The main goal of gaze tracking is to pinpoint the gaze direction toward different objects. Gaze estimation techniques can be applied for determining the gaze position, which relies on connections between data from the image and the gaze direction. In general, eye movements are classified into saccades and fixations, whereas gaze estimation techniques can be classified into feature- and appearance-based techniques (Hansen & Ji, 2010). A saccade occurs when the gaze moves quickly between objects, and a fixation is when the gaze is directed to a specific object for a brief period of time.

Feature-based gaze estimation techniques are applied by detecting a set of varied features of the eyes that are insensitive to disparities in lighting and viewpoint; the iris and pupil, including the contour of the pupil, need to be clearly detectable, with no corneal reflections. In addition, 3D and geometry-based methods must be applied. An appearance-based gaze estimation technique, by contrast, is applied by extracting image contents and mapping them to positions on a screen. Therefore, finding the point of regard, relevant features, and personal variations are key steps in this technique. Examples of this technique are manifold, and include a gray scale unit, cross ratio, Gaussian interpolation, and morphable models (Chennamma & Yuan, 2013; Sorate, Vpkbiet, & Chhajed, 2017).

3.3. Eye tracking techniques

Four eye tracking techniques have been the focus of most studies in this field and in developing novel eye tracking applications. They are the scleral search coil technique, infrared oculography (IOG), electrooculography (EOG), and video oculography (VOG).
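Before turning to the individual techniques, the appearance-based mapping described in Section 3.2 can be made concrete with a small sketch. This is an illustrative simplification with synthetic numbers (not one of the cited models): a single image feature, the normalized horizontal pupil position, is mapped to a horizontal screen coordinate by least-squares calibration. Practical systems fit far richer regressors over full eye images.

```python
# Illustrative appearance-based calibration (synthetic, 1-D): a scalar
# image feature (normalized horizontal pupil position) is mapped to a
# horizontal screen coordinate with closed-form least squares, standing
# in for the interpolation/morphable models cited in Section 3.2.

def fit_linear(features, targets):
    """Closed-form least squares for gaze = a * feature + b."""
    n = len(features)
    mean_f = sum(features) / n
    mean_t = sum(targets) / n
    cov = sum((f - mean_f) * (t - mean_t) for f, t in zip(features, targets))
    var = sum((f - mean_f) ** 2 for f in features)
    a = cov / var
    b = mean_t - a * mean_f
    return a, b

# Calibration: the user fixates known screen points while the feature is recorded.
calib_features = [0.10, 0.35, 0.60, 0.85]   # normalized pupil x in the eye image
calib_targets = [120, 480, 840, 1200]       # screen x in pixels

a, b = fit_linear(calib_features, calib_targets)
print(round(a * 0.475 + b))   # gaze estimate for an unseen feature → 660
```

The same idea generalizes to two dimensions (one regressor per screen axis) and to higher-order polynomial or Gaussian-interpolation mappings.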

3.3.1. Scleral search coil technique


The scleral search coil technique was first introduced by Robinson in
1963 (Robinson, 1963) and uses a contact lens with mirrors attached to
a wire coil that moves in a magnetic field. The field induces a voltage in
the coil to produce a signal that represents the eye position. An inte­
grated coil in the contact lens allows detecting the orientation of the coil
in a magnetic field. The eye movements can be recorded by monitoring
the infrared wavelength range that is reflected by the mirror and
recorded by eye trackers for accurately moving an image in line with the
eye movement. A scleral search coil is shown in Fig. 3.
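As a toy illustration of the induced-voltage principle (the real demodulation electronics and calibration are far more involved; the simple proportionality model here is an assumption made for the sketch): with two orthogonal alternating fields, the demodulated voltage from the eye coil varies with the cosine of the angle between the coil axis and each field axis, so a horizontal rotation angle can be recovered from the two amplitudes.

```python
import math

# Toy model of search-coil angle recovery (an illustrative assumption,
# not a published signal chain): each orthogonal field induces a voltage
# proportional to the cosine of the angle between the coil axis and that
# field axis, so the rotation angle falls out of atan2 and any shared
# calibration gain cancels in the ratio.

def eye_angle_deg(v_field1, v_field2):
    """Recover the eye rotation angle (degrees) from the two coil voltages."""
    return math.degrees(math.atan2(v_field2, v_field1))

# Noise-free voltages for a 10 degree rightward rotation:
theta = math.radians(10.0)
print(round(eye_angle_deg(math.cos(theta), math.sin(theta)), 1))   # → 10.0
```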
The Helmholtz approach is a major approach in the scleral search coil technique. However, it cannot be used with head movements. Head movements can be accommodated by other popular approaches such as the Rubens coil geometry, an older approach that can cover magnetic fields similar to the Helmholtz and Merritt geometries and reduces the error rate (Eibenberger, Eibenberger, & Rucci, 2016).

The original search coil technique has been extended to include a reference coil, digital data acquisition panels, and a receiver program, with the goal of enhancing the scleral coil in terms of cost and reliability. The extended technique is considered inexpensive and robust. The hardware may benefit from different operational amplifiers to reduce noise, which will be tested in future developments (Eibenberger, Eibenberger, Roberts, Haslwanter, & Carey, 2015).

Few experiments use a scleral coil because the coil is prone to breakage, can only be worn for a short period of time, and does not work on sensitive eyes or patients who blink excessively (Murphy, Duncan, Glennie, & Knox, 2001). Intraocular pressure is created when wearing coil contact lenses, so patients who suffer from glaucoma or ocular hypertension should be treated with care if required to wear coil lenses.

The advantages of this technique are high accuracy, good resolution, 3D data representation, and a high sampling rate, whereas the main disadvantages are its complicated implementation, invasiveness, and the need for expensive gear. Therefore, many medical labs refrain from using this approach (McCamy et al., 2015; Vincent, Alonso-Caneiro, & Collins, 2017).

3.3.2. Infrared oculography

The infrared oculography (IOG) technique measures the strength of an infrared light that is mirrored from the sclera, which provides a variety of information about the eye position. The light can be produced from a pair of glasses. This approach relies mainly on light and pupil detection algorithms. To solve the issue of head movement sensitivity, a reference point, known as a corneal reflection or glint, is included using an infrared light source when this technique is applied. Fig. 4 shows the infrared light used and the typical setup of an infrared oculography technique, which is known to produce fewer disturbances than an EOG.

Fig. 4. Infrared oculography approach setup (Kurzhals et al., 2016).

The IOG technique has been widely studied and researched using many different algorithms and commercial applications, such as the Starburst algorithm (Păsărică, Bozomitu, Cehan, Lupu, & Rotariu, 2015; Mestre, Gautier, & Pujol, 2018; Wang & Ji, 2017), circular Hough transformation, FREDA (Martinikorena, Cabeza, Villanueva, Urtasun, & Larumbe, 2018), low pass filtering (Lupu, Bozomitu, & Cehan, 2014), Haar-like features (Santini, Fuhl, & Geisler, 2017; Wilson & Fernandez, 2006), K-means (Santini, Fuhl, Geisler, & Kasneci, 2017), random sample consensus (Picanço & Tonneau, 2018), ExCuSe (Wilson & Fernandez, 2006), ElSe (Wilson & Fernandez, 2006), Pupil Labs (Fuhl, Tonsen, Bulling, & Kasneci, 2016), Euclidean distance (Bobić & Graovac, 2016), binocular-based methods (Sesma-Sanchez & Hansen, 2018), the Camshift algorithm (Li, Li, & Qin, 2014), pupil center corneal reflection (Park, Jung, & Yim, 2015), and Swirski (Santini, Fuhl, Geisler, & Kasneci, 2017). The use of pupil recognition allows an improved emphasis on the pupil rather than the iris and sclera of the eye (Picanço & Tonneau, 2018; Păsărică, Bozomitu, Cehan, Lupu, & Rotariu, 2015; Fuhl et al., 2016). An advantage of the IOG technique is its ability to smoothly handle eye blinking. However, it is unable to quantify torsion movement (Lupu & Ungureanu, 2013; Chennamma & Yuan, 2013; Sorate & Chhajed, 2017).
algorithms. To solve the issue of head movement sensitivity, a reference
attached to the area that surrounds the eyes to detect an electric field
point, which is known as a corneal reflection or glint, is included using
that occurs while the eyes are rotating by measuring fluctuations in the
an infrared light source when this technique is applied. Fig. 4 shows the

Fig. 3. Scleral Search Coil contact lenses (Dietz, Schork, & André, 2016).


skin. Horizontal and vertical eye movements are recorded separately using electrodes. However, the signal can be altered even without eye movements. Fig. 5 shows a wearable EOG device with sensors attached.

EOG is not an approach for daily use; its application can be beneficial in medical fields and laboratories. This approach is linearly related to eye movements, and it can use head movements for tracking (Chennamma & Yuan, 2013; Sorate et al., 2017). The EOG eye gaze interface is easy to use owing to its simple arrangement. However, the use of EOG is limited because of eye drifts and eye conditions (Tsui, Jia, Gan, & Hu, 2007; Constable et al., 2017). A solution has been developed for this problem: users can use a pointer and switcher to easily turn the call device on/off or to forward a 1-bit signal to a PC with a high level of stability and precision (Tsui, Jia, Gan, Hu, & Yuan, 2007).
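Because the EOG voltage is roughly proportional to eye rotation, basic event detection can be performed with a velocity threshold on the signal. The sketch below is a deliberately simple illustration (the threshold value and single-sample difference as velocity are assumptions, not a published detector):

```python
# Illustrative EOG event detection: a large sample-to-sample change in the
# horizontal EOG channel marks a saccade onset, while low-velocity
# stretches between detections correspond to fixations.

def detect_saccades(samples, velocity_threshold):
    """Return indices where the absolute first difference exceeds the threshold."""
    return [i for i in range(1, len(samples))
            if abs(samples[i] - samples[i - 1]) > velocity_threshold]

# Synthetic trace (microvolts): fixation near 0, one saccade to ~80, new fixation.
eog = [0, 1, 0, 2, 80, 81, 80, 79]
print(detect_saccades(eog, velocity_threshold=20))   # → [4]
```

Real detectors additionally low-pass filter the signal, merge adjacent detections, and treat the large biphasic spikes caused by blinks as a separate event class.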

3.3.4. Video Oculography


A typical setup consists of a video camera that records the move­
ments of the eyes and a computer that saves and analyses the gaze data.
The VOG approach can either use visible or infrared lights. The VOG is a
non-invasive system that performs the eye tracking remotely. There are
two ways to implement video eye tracking based on the number of
cameras used: the first approach uses a single camera, whereas the other
uses multiple cameras (Majaranta & Bulling, 2014). Both types of eye
trackers, remote or head mounted, have a major drawback if used in HCI
systems owing to changes in the head position. For remote trackers, this
can be resolved by using two cameras: one stereo camera or wide-angle camera to search for the person in front, and another to point to the person’s face and zoom in. An example of VOG using two cameras is shown in Fig. 6.

Fig. 5. Electrooculography device (Coutrot et al., 2018).

Fig. 6. Video oculography using two cameras (Bulling & Gellersen, 2010).

Single camera systems: Single camera systems capture limited-field, high-resolution images that are fixed at a single point. This method uses an infrared light source to produce a corneal reflection that is used as a reference point for the gaze estimation. The pupil–glint vector remains relatively constant under eye and head movement, whereas the glint alone clearly changes its position when the head moves. Several commercial systems use a single camera and one infrared light. The main difficulty in this approach is the limited view required to sufficiently capture high-resolution images. Better results can be gained by adding multiple light sources to the setup (Chennamma & Yuan, 2013; Sorate et al., 2017).

Multiple camera eye trackers: A large field of view is required for free head motion. Multiple cameras are utilized to achieve this goal, either through wide-angle lens cameras or movable narrow-angle lens cameras. One camera is used for the eye, and the other camera is used to capture the head location. All the data collected from these cameras are combined and used to estimate the point of gaze. Eye tracker systems use two cameras to form a stereo vision system for calibrating the computation of the pupil center, which is achieved using 3D coordinates.

Videos generated from these cameras, such as VOG, were introduced to find solutions for many problems faced by image-based eye tracking techniques. Some of these problems include the difficulty of detecting the eyes due to head movement, unclear eyes because of blinking, or bent eyelashes. VOG helps in increasing the accuracy of eye tracking systems and in observing eye movement disorders. The video system can


be easily handled, and it allows head movements and a fully remote recording. However, the video recording system is expensive and requires more storage and devices with high computational capabilities. In addition, when recording closed eyes, the eye torsion cannot be measured (Chennamma & Yuan, 2013; Sorate et al., 2017).

Many studies have used the VOG approach with different algorithms and tools, such as EyeScout (Khamis, Hoesl, Klimczak, Reiss, Alt, & Bulling, 2017), Goggles (Kong, Lee, Lee, & Nam, 2018), a morphology approach (Kong et al., 2018), the position of the corneal reflection (Abdulin, Friedman, & Komogortsev, 2017), a charge-coupled device (Sakatani & Isa, 2004), GazePath (Van Renswoude, Raijmakers, Koornneef, Johnson, & Hunnius, 2018), a bright-pupil algorithm (Goni, 2004), the Matrox imaging library (Goni, 2004), polynomial-based VOG models (Murphy et al., 2001), glint positioning (Ding, Luo, & Deng, 2018), a Haar method with a 6D optical tracker (Migliaccio et al., 2005), Tobii (Klaib, Alsrehin, Melhem, & Bashtawi, 2019), and a pupil framework to automate the analysis of the gaze data (Picanço & Tonneau, 2018) on the pupil of the dark eye with up to 12 reflections of the cornea (Mestre, Gautier, & Pujol, 2018). According to a previous study, we summarize the advantages and disadvantages of the four most popular eye tracking techniques in Table 2.

Table 2
A summary of the eye tracking techniques along with their advantages and disadvantages.

Scleral Search Coil
Advantages: Very high temporal and spatial resolution; invaluable research tool; highly accurate.
Disadvantages: Invasive method; rarely used clinically; may cause intraocular pressure; the coil can be worn only for a short period of time; does not work on sensitive eyes; requires anesthesia of the eye; complicated settings.

Infrared Oculography (IOG)
Advantages: Used in light and darkness; handles blinking smoothly.
Disadvantages: Unable to quantify torsional eye movement; limited movement of the head; invasive method.

Electrooculography (EOG)
Advantages: Used in medical fields and laboratories; very high temporal and spatial resolution; inexpensive; machine learning has been used to improve accuracy; analyzes the data in real time by using a microcontroller.
Disadvantages: Not used daily; affected by the noise around the eye; invasive method; complicated settings.

Video Oculography (VOG)
Advantages: Uses visible light or infrared light; clinical observation of eye movement disorders; the video recording system is easily handled; uncomplicated settings; can allow head movement and fully remote recording; machine learning techniques can be used to gain higher accuracy; not expensive.
Disadvantages: Limited spatial resolution; recording with closed eyes is not possible; invasive and noninvasive.

4. Eye tracking using machine learning

ML is an approach to learn from previous data using a variety of algorithms and find predictions to eventually come up with a model that shows accurate results. Several algorithms and techniques have been introduced to eye tracking, including regression, neural networks, naive Bayes classification, and the support vector machine (SVM).

An SVM is an extremely effective and commonly used ML approach in classification, object recognition, and image querying (Tong, 2001). In addition, SVM was used with a variety of feature extraction and face recognition algorithms in eye tracking techniques. With electrooculography (EOG), it was used to detect signal saccades, fixations, and blinks to detect repetitive patterns of eye movements. The minimum repetition, minimum redundancy, and maximum relevance features were combined with the SVM classifier and the person-independent and leave-one-person-out strategies to train the model. Evaluation results showed an accuracy rate of 76.1% across all categories (Bulling, Ward, Gellersen, & Troster, 2011).

An SVM and Zernike moments (ZM) have been used when the template matching algorithm or the Haar face detection algorithm is not able to detect the eyes (Singh, Mittal, & Walia, 2011). ZM was especially helpful in reducing problematic views, specifically in the case of upside-down images or unclear visibility owing to lighting issues.

Template matching finds similarities between a template and another image. This method was employed to sort items, where the template is considered as a miniature image or part of an image. This technique was typically used by Ahuja to identify complicated images and the things they reference (Ahuja & Tuli, 2013). The gradient orientation pattern is a technique that uses vectors to generate an image that will be used to track the human eye in real time. A template update algorithm was also included to help match eye images. The reason behind using the template algorithm was to build a new template in each frame based on the image of the eye, and then enhance the results by using the matching method (Hotrakool, Siritanawan, & Kondo, 2010). The template algorithm also helps with light fluctuations that occur during the tracking process. A benefit of using the gradient orientation pattern is its ability to deliver significant matching results and handle complex deviations in lighting. However, as a limitation to this approach, if a frame is not matched, it will be rejected. Another study that explored the applications of using template matching with eye tracking was proposed by Liu and Liu (2010) under complex situations where detecting and capturing eye movements may be difficult.

ML algorithms such as Bayesian models and neural networks were used to deliver highly accurate tracking. An open source software was developed by Santini for real-time head mounted eye tracking, to detect an eye pupil and estimate the gaze (Santini, Fuhl, Geisler, & Kasneci, 2017). The software can be customized to match developer requirements by swapping algorithms. The software delivers progressive settings, gaze estimation, and blinking detection using Bayesian models. MATLAB-Simulink has been successful in detecting winks and blinks using a single camera with video processing capabilities and a microcontroller board (Rupanagudi, Bhat, Ranjani, Srisai, Gurikar, & Pranay, 2018). Moreover, ML algorithms have been used in an EOG and have achieved good accuracy levels; the aim is to improve and enhance the EOG interface. An eyewear tool was developed to analyze data in real time using a microcontroller, to capture and analyze signals. For real-time and reliable results, researchers used a wavelet transform and a neural network to process the signals. Fuhl et al. (2016) proposed a system that obtained 92% reliability by applying ML techniques to a VOG approach that results in higher accuracy during the tracking process. Table 3 shows a classification of the ML algorithms that use eye tracking data along with their specific applications, strengths, and limitations.

5. Eye tracking using Internet of Things

Developments in mobile communication devices, information technology, sensors, and data have led to a significant enhancement in IoT and novel innovative technologies and services. An IoT micro-sensor was used to integrate objects into wireless networks that allow objects to interact with one another, leading to reliable and continuous exchanges of relevant information and improving the quality of a user's life. The expansion of IoT has taken over the scope of homes, health care, environmental monitoring, inventory
management, intelligent transportation, and numerous other fields (Haslgrübler, Fritz, Gollan, & Ferscha, 2017).

Table 3
A classification of different machine learning algorithms that use eye tracking data along with the research goal and the algorithm strengths and limitations.

Regression
Research goal or task: Analyzing visual world eye tracking data using multilevel logistic regression in psycholinguistic research (Barr, 2008; Ahmad, 2018), automatic classification of health information (Pian, Khoo, & Chi, 2017), analysis of eye movement data using mixed effect modeling with Poisson regression (Huang, Luh, Sheu, Tzeng, & Chen, 2012), and improving eye tracking calibration accuracy using symbolic regression (Hassoumi, Peysakhovich, & Hurter, 2019).
Strengths: A generalization of logistic regression, known as multinomial logistic regression, can be used to handle polytomous variables.
Limitations: Robust standard errors are not available for multilevel regression approaches. The implementation of an empirical logit transformation with multinomial data is not clear.

Convolutional Neural Networks (CNN)
Research goal or task: The use of deep learning to semantically segment medical images (Stember et al., 2019; Krafka, Khosla, Kellnhofer, Kannan, Bhandarkar, Matusik, & Torralba, 2016), classification of eye tracking data (Yin, Juan, & Chakraborty, 2018), strabismus recognition (Chen et al., 2018), fetal ultrasound plane detection (Cai, Sharma, Chatelain, & Noble, 2018), confusion prediction (Salminen, Nagpal, Kwak, An, & Jung, 2019), deep neural networks for human-level saccade detection (Bellet, Bellet, Nienborg, Hafed, & Berens, 2019), and feature extraction to develop a hybrid eye tracker on a smartphone (Brousseau, Rose, & Eizenman, 2020).
Strengths: Highly accurate even when the size of the training set is small. CNNs are a good tool when developing eye tracking systems for VR and AR.
Limitations: Feature selection, sample size, and data analysis are important steps during the learning and may affect the accuracy of the results if they are not appropriately controlled.

Deep learning
Research goal or task: Collaborative computer aided diagnosis (Khosravan et al., 2019), eye gaze estimation in augmented spaces (Lemley, Kar, & Corcoran, 2018), how learning strategies are reflected in the eyes (Catrysse et al., 2017), video saliency prediction (Jiang, Xu, Liu, Qiao, & Wang, 2018), event detection (Zemblys, Niehorster, Komogortsev, & Holmqvist, 2018), egocentric visual perception (Hristozova, Ozimek, & Siebert, 2018), surgical robot control (Li, Hou, Wei, Song, & Duan, 2018), and eye-blinking detection on smartphones (Han, Kim, & Park, 2018).
Strengths: Deep learning has improved the accuracy and performance of learning models and detection methods.
Limitations: Lack of large datasets for training more sophisticated deep analysis. The efficiency of a search is still a significant issue in state-of-the-art detection methods.

Artificial Neural Network (ANN)
Research goal or task: Eye-center detection (Ibrahim, Zin, & Ibrahim, 2018), driver fatigue (He, Li, Fan, & Fei, 2016), temporal data processing (Malakhova, Shelepin, & Malashin, 2018), and predicting mental workload in nuclear power plants (Wu, Liu, Jia, Tran, & Yan, 2019).
Strengths: The ANN-based prediction model is proven to generate a high correlation coefficient between the original and the predicted data.
Limitations: The small sample size reduces the statistical power and reliability of the prediction model. A field study that controls the lighting conditions is necessary to validate the generated results.

Naïve Bayes
Research goal or task: Classifying individual eye tracking patterns with the help of basic eye tracking metrics using a naive Bayes classifier (Bhattarai & Phothisonothai, 2019), better understanding of children's visual cognitive behaviors (Moreno-Esteva et al., 2018), successful collaboration in pair programming (Villamor & Rodrigo, 2018), Gaussian naive Bayes for Autism Spectrum Disorder (Thapaliya et al., 2018), prediction of human intention and emotion analysis (He et al., 2018), and predicting levels of learning (Parikh, 2018).
Strengths: Works better for small sample sizes, and the training is fast.
Limitations: Cannot incorporate feature interactions.

Support Vector Machine (SVM)
Research goal or task: Detecting readers with dyslexia (Rello & Ballesteros, 2015), eye-state classification (Dong, Zhang, Yue, & Hu, 2016), gaze latent support vector machine for image classification (Wang, Thome, & Cord, 2016; Wang, Thome, & Cord, 2017), classifying major depression patients and healthy controls (Xinfang et al., 2019), content-based discovery for Web map service (Hu et al., 2016), evaluating orthodontic treatment from the lay perspective (Wang et al., 2016), measuring learning attention in eLearning (Liu, Chang, & Huang, 2013), eye-gaze estimation for face detection and head pose estimation (Cabrera-Quirós, 2014), identification of literacy skills (Lou, Liu, Kaakinen, & Li, 2017), on-board estimation of alertness of drivers (George, 2012), classification of the age of toddlers (Dalrymple, Jiang, Zhao, & Elison, 2019), and a personalized control system (Yoo, Yeon, & Jeong, 2016).
Strengths: Using SVM to predict dyslexia from eye tracking measures is feasible.
Limitations: To efficiently use an SVM, the size of the dataset should be significant. Limited generalizability owing to the sample size.

Random Forest (RF)
Research goal or task: Event detection (Zemblys et al., 2018; Zemblys, 2016), pupil localization (Kacete, Seguier, & Royan, 2016; Markuš, Frljak, Pandžić, Ahlberg, & Forchheimer, 2014), and blinking detection (Marcos-Ramiro, 2014).
Strengths: RF can use the features as they are; there is no need to scale, center, reduce, or transform them in any way.
Limitations: The RF algorithm cannot learn the sequence and context information directly from the raw data and produces the final output without the need for any post-processing.

Hidden Markov Model (HMM)
Research goal or task: Using HMM and discriminant analysis (DA) models to suggest a method that focuses on the scan path of the interest and state of mind of people depending on variations (Coutrot et al., 2018), analysis of fixation sequences during visual inspection of front panels in a home appliance facility (Ulutas et al., 2020), and classification of mobile eye tracking data (Kit & Sullivan, 2016).
Strengths: HMM may help discover human visual behavior patterns. It may also be used to assess whether the operator gender has an influence on visual behavior or not.
Limitations: HMMs are dependent on the structure of the visual content, and thus computer vision tools may be required to detect and track regions of visual interest (ROIs) as they move.

At present, important facilities such as the monitoring of vehicle dynamics and smart navigation are being provided using IoT, which can connect anything electronic to an existing Internet connection. IoT can be used as a car assistant, to reduce car accidents that occur owing to driver drowsiness. IoT is also used to monitor eye and head movements (Murauer & Haslgrübler, 2017). When a tracking system detects an abnormal eye blinking rate, the driver will be alerted to stay awake. To benefit from the use of IoT in eye tracking, Tourch7 was incorporated using EyeDee technology (Morozkin & Swynghedauw, 2017).

Many applications have been developed that use an eye tracking mechanism with IoT, to enhance the quality of life. For example, in education, assessing the social interactions of students is a key part of measuring their visual attention. An interactive framework was developed by Kangas et al. that tracks and analyzes students' eyes during the learning process, triggering students to focus more to increase their attention and interaction (Kangas, Špakov, Isokoski, Akkil, Rantala, & Raisamo, 2016). This framework can fully benefit from modern interaction technology, allowing learning to reach a higher level.

Another example of using an eye gaze mechanism with IoT technology is the development of assistive systems that help elderly and special needs people to direct and control any device in their smart homes. This approach focuses on usability and reliability, which are the most important aspects of such systems (Haslgrübler, Fritz, Gollan, & Ferscha, 2017; Kangas et al., 2016; Gill & Vogelsang, 2016; Bissoli, Lavino-Junior, Sime, Encarnação, & Bastos, 2019).

A lensless smart sensor (LSS) is a low-power, low-cost visual sensing technology that adds optical and thermal sensing capabilities to an IoT device in a form that cannot be achieved with a traditional lensed system. An LSS captures information-rich scene data in a tiny form using a revolutionary new approach to optical sensing, and enables users to experience augmented reality without sacrificing style, design, or comfort, by providing eye tracking hardware that is invisible to the outside world. It uses both appearance- and feature-based algorithms to accurately track the user's gaze from a wide field of view and across the entire range of eye motion from an extremely close distance (Gill & Vogelsang, 2016; Abraham, Urru, Wilk, Tedesco, Walsh, & O'Flynn, 2017).

Internet shopping and advertising on social media are on the rise these days, so it is important to track and analyze the user's interests and interaction. Using IoT to track customers will help increase online shopping. SMI Eye Tracking Glasses and Pupil Labs have been used to track a customer's eyes to capture the customer's focus and know when the customer interacts (Haslgrübler, Fritz, Gollan, & Ferscha, 2017).

A Midas touch problem occurs when users unintentionally select objects just by gazing at them. Solutions to this problem include gaze
fixation, blinking, and external triggers. Researchers have proposed a gaze-mediated interaction method, called DualGaze (Mohan, Goh, Fu, & Yeung, 2018), which does not rely on an external trigger or a separate menu to confirm a selection. It is an interaction method that requires users to perform different gaze gestures to select and confirm an object. When a user looks at an object to select, a confirmation flag pops up adjacent to the object to confirm which point the user is looking at. Gazing at the object will then confirm the selection, much as if the user had pressed "Enter," which leads to an intentional confirmation of the object in less time and a correct selection. This also addresses the Midas touch problem in VR. Evaluation results have shown that DualGaze improves the familiarity of use and is significantly more accurate while maintaining a comparable selection speed. There are several classifications for an eye action. Xiao et al. (Cohn, Xiao, Moriyama, Ambadar, & Kanade, 2003) provided a three-category classification (blink, flutter, and non-blinking) for only the right eye. The classification will be used to determine the input for the tracking to obtain the right time for detection.

Electrography and VOG have been highly discussed in the medical field, and both have been widely studied for many different applications. Their differences show that electrography is suitable for eye tracking while a patient is sleeping, and for certain 3D tracking purposes. Owing to enhancements in technology and the focus on a more reliable low-cost technique, VOG has also been proven to be more efficient in many different fields, and its earlier limitation of low sampling rates has faded owing to the use of high-end image sensor systems (Kong et al., 2018).

6. Cloud computing and eye tracking

There are several challenges to state-of-the-art eye tracking research, including slow and local computations of huge data aggregations and analysis, a massive number of distributed mobile users, limited research laboratories, the merging of eye tracking with augmented reality systems, and the fact that next-generation interactive systems and applications will be ubiquitous. These challenges are preventing the development of online processing and pervasive usage. Distributed technologies and cloud computing represent solutions to some of these problems, allowing instant access to large amounts of eye tracking data of a massive number of users at once (Vrzakova et al., 2013). The eye tracking research community expects cloud-based eye tracking applications to be available in the near future. This places more pressure on considering the current challenges and special requirements from HCI aspects, through parallel computing, embedded and network systems, and hardware optimization. Many other important topics, such as ethical and privacy issues, along with other social implications of pervasive cloud-based eye tracking, need to be carefully examined in future research. We argue that combining eye tracking research with cloud computing will allow efficient data storage, sharing, processing, modeling, and evaluation. In addition, it may help with conducting a wide range of usability studies and questionnaires along with eye tracking and interaction analyses to access and generate useful information at any time and location.

Future interactive applications and pervasive cloud-based eye tracking solutions will force designers and developers to focus more on design principles, hardware requirements and improvements, intelligent display management, an immediate interaction response, auto-calibration systems, and application requirements and development. EyeCloud (DAO, 2014) was one such attempt combining distributed algorithms with eye tracking into a cloud-based structure. A cloud-based structure uses parallel computation to send/receive messages through different nodes, adding network latency to the overall latency; the approach pays off when the network latency is less than the processing time saved in the cloud structure. In addition, the efficiency and performance of the EyeCloud depend on designing distributed algorithms along with the problem itself. Some problems might be distributed while others might not. For example, it might not be possible to deploy an online fixation detection algorithm on each node in parallel, and thus each slave node handles a certain number of concurrent eye tracker tasks. Otherwise, a real-time heatmap rendering algorithm is required to divide computations into small tasks, each of which is computed on a slave node. Therefore, running as many tasks as possible in parallel leverages cloud computing. In addition, there are other factors that should be carefully considered, such as the number of nodes, the ways to distribute/merge data and/or tasks among these nodes, and a time segment. EyeCloud showed that cloud computing can achieve better handling than a single computer by decreasing the running and delay times.

We should not use cloud computing (DAO, 2014):

• when computational requirements are complex, particularly when using a large amount of data with a long upload time;
• when the number of nodes required for analysis is not too large;
• when the cloud requires many communication messages, which can lead to a slow computation speed.

A cloud-based approach represents a centralized model, which may be a problem that can be solved through fog computing, a decentralized architectural pattern that brings computing resources and application services closer to the edge. Fog computing complements the cloud and improves its efficiency by reducing the amount of data that is transferred to the cloud for processing and analysis while also improving security, a major concern in the IoT industry. IoT nodes are closer to the action, but they do not have sufficient computing and storage resources to conduct analytics and ML tasks. By contrast, although cloud servers have the required horsepower, they are far from being able to process data and respond in real time (Dickson, 2016). To the best of our knowledge, studies combining eye tracking with fog computing do not exist; however, some attempts can be found in fog computing for sustainable smart cities (Perera, Qin, Estrella, Reiff-Marganiec, & Vasilakos, 2017) and in fog computing trends, architectures, requirements, and research directions (Naha, Garg, Georgakopoulos, Jayaraman, & Gao, 2018).

Real-time synchronization, security, offline and batch processing, classification with feedback, and the design of cloud-based distributed algorithms represent some future directions for combining eye tracking with cloud and fog computing.

7. Eye tracking applications

Eye tracking is a valuable tool for any type of research on human behavior. It can be used to develop applications in a variety of fields, such as healthcare and medicine, psychology, marketing, engineering, education, and gaming, as well as enhancing human–computer interactions using the eyes for navigation and control. Herein, we focus on four application fields owing to their real values and based on our interests and experience.

7.1. Healthcare and medical applications

Eye tracking techniques have been widely used as a research tool in healthcare and medical applications. Healthcare researchers have recently shown that using eye tracking techniques in medical decision-making achieved better results. In general, the quality and performance of appropriate medical procedures and treatments, along with the use of advanced technologies, make the study of the human mind and body much easier, and therefore enhance the process of medical decision-making. In addition, the information provided by these technologies is useful in revealing abnormalities in people's feelings and intentions. The evolution of current technologies indicates that eye tracking systems can help people who suffer from certain medical issues (Königa & Buffalo, 2017). Eye tracking provides a rich source of data in an orderly time range that helps a set of medicinal theories to resolve issues regarding the inability to deduce cognitive processes. In addition, it allows for a precise diagnosis or monitoring of wellbeing.
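The offloading trade-off described for EyeCloud in Section 6 above, where sending work to the cloud pays only when the upload latency plus the distributed compute time beats local processing, can be sketched with a toy cost model. Every function name, parameter, and number below is an illustrative assumption, not a measurement from EyeCloud or any real deployment.

```python
# Toy cost model for the cloud-offloading trade-off discussed in Section 6.
# All parameters and numbers are illustrative assumptions, not measurements
# from EyeCloud or any real deployment.

def local_time(task_seconds, n_tasks):
    """Estimated wall time when all tasks run on a single local machine."""
    return task_seconds * n_tasks

def cloud_time(data_mb, upload_mbps, task_seconds, n_tasks, n_nodes,
               msg_overhead_s=0.05):
    """Estimated wall time when tasks are spread over cloud slave nodes."""
    upload = data_mb * 8 / upload_mbps               # one-off data transfer
    compute = task_seconds * -(-n_tasks // n_nodes)  # ceil: parallel rounds
    messages = msg_overhead_s * n_tasks              # coordination traffic
    return upload + compute + messages

def should_offload(data_mb, upload_mbps, task_seconds, n_tasks, n_nodes):
    """Offload only when the cloud estimate beats local processing."""
    return (cloud_time(data_mb, upload_mbps, task_seconds, n_tasks, n_nodes)
            < local_time(task_seconds, n_tasks))

# Many small heatmap tasks over a fast link: parallelism wins.
print(should_offload(50, 100, 2.0, 64, 16))      # True
# Huge raw video over a slow link, few tasks: local processing wins.
print(should_offload(20000, 10, 2.0, 4, 16))     # False
```

The three conditions listed above for avoiding cloud computing map directly onto this model: a long upload time inflates the transfer term, a small number of tasks shrinks the parallel gain, and chatty coordination inflates the message overhead.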


Patients suffering from amyotrophic lateral sclerosis (ALS) face difficulties in speaking or writing. A software tool that uses neuropsychological screening instruments based on the Edinburgh Cognitive and Behavioral ALS Screen has been developed (Keller et al., 2017). It uses an infrared sensor, called EyeLive, to track the eye movements through a predefined user interface. It does not use a mutual camera-based approach; instead, it uses infrared sensors built upon a pair of glasses to identify the direction of the eye gaze. Current entry level systems for people suffering from ALS have certain problems: They are expensive, are not robust under different lighting conditions, and require special calibrations. To address these challenges, Zhang et al. have proposed an eye-gesture communication system, called GazeSpeak, that has certain characteristics, including a low-cost design, robustness, easy learnability, high user satisfaction, portability, the capability of running on smartphones, and a higher bandwidth than an e-tran board (Zhang & Kulkarni, 2017). In addition, it decodes eye gestures into predicted spoken words in real time and enables communication with different interfaces for speakers and translators. Moreover, it outperforms eye gaze transfer (both low-tech and common use) in terms of both connection speed and ease of use, with a low rate of misrepresentation.

Naji Khosravan et al. developed an algorithm that uses a variable collaborative-computer aided diagnosis (C-CAD) system to integrate CAD with eye tracking systems (Khosravan et al., 2019). The authors used graph clustering to convert eye tracking data into a graph model, which can be used in an actual radiology room. In addition, the system includes a 3D deep learning algorithm designed into a new multi-task platform to concurrently split and recognize suspicious areas. The authors tested their system on lung cancer patients in the presence of multiple radiologists. The results showed that their system improves the accuracy, efficiency, and applicability in a real radiology room setting.

To study the behavior of infants during normal interactions, an eye tracking system was used to record their gaze while they were playing with their mothers. Franchak et al. studied the infants' visual exploration during spontaneous, unconstrained, natural interactions with their caregivers, objects, and obstacles (Franchak, Kretch, Soska, & Adolph, 2011). The results revealed that the visual exploration of infants was opportunistic and depended on the availability of information and the limitations of their bodies. The most frequent recurrences of the infants' gazes were most often in the direction of hand movements and crawling, but the least frequent were in the direction of leg movements. Assistive systems were researched by Enkelejda et al. to help individuals with glaucoma and evaluate their functional ability in everyday tasks (Kasneci, Black, & Wood, 2017). The authors conducted standardized experimental studies and used the results to develop customized solutions.

Individuals with autism spectrum disorder (ASD) suffer from the integration of audiovisual methods. Finke et al. aimed to monitor the behavior of children with and without ASD through the use of video games and by controlling their faces and reactions using the Tobii eye tracking system (Finke, Wilkinson, & Hickerson, 2017). The results of their study showed that children with ASD were attracted to video games in the same way as ordinary children (Hansen & Ji, 2010). Itziar et al. used an eye tracking system to determine if infants suffer from ASD owing to genetics in their early development. The preliminary results of the study showed the advantages and limitations of using eye tracking to register visual attention measures and how the complexity of the observed features presents a great number of challenges regarding the quality of data obtained through eye tracking (Lozano, Campos, & Belinchón, 2018).

Another attempt to support the rehabilitation of cognitive functions, this time in children with attention deficit hyperactivity disorder, was developed by Garcia et al. (Garcia-Zapirain, de la Torre Díez, & López-

children the ability to build their learning and attention skills. Evaluation results showed that the system can help children with attention deficit and learning problems.

Lauermann et al. (Lauermann et al., 2017) developed an evaluation model to evaluate the impact of eye tracking technology on the quality of optical coherence tomography angiography (OCT-A) images in patients with age-related macular degeneration (AMD). The evaluation results showed that using active eye tracking technology improves the image quality in OCT-A imaging in terms of the involuntary movement of AMD patients in cases of a higher acquisition time. Table 4 summarizes a list of some diseases that use eye tracking technology along with the main findings and eye tracking techniques used.

Many researchers have developed medical solutions for elderly and special needs assistance through ML and eye tracking technologies. Songpo and Jeremy developed a human–robot-interaction (HRI) modality based on 3-D gaze data that helps users with motion impairment express what tasks they want the robot to perform by directly looking at an object of interest in the real world (Li, Zhang, & Webb, 2017). Their method uses a binocular eye tracker to detect the pupil and gaze vector to estimate the eye gaze, and then interprets the human gaze and converts it into an effective interaction modality. The evaluation results demonstrated the efficiency and effectiveness of gaze-based HRI, which is expected to help users who suffer from impaired mobility in their daily lives.

Elderly people often cannot remember where they leave certain objects, and thus they use a visual search more frequently than others. Thus, a visual search plays a key role in their ability to complete everyday activities. There have been several studies analyzing eye movements during a visual search. First, it is necessary to detect whether a user is searching for an object, and then the required assistance can be provided. Dietz et al. proposed an eye tracking multimodal approach to support the visual search process and assist elderly people (Dietz, Schork, & André, 2016). In addition, the authors used a head mounted display (HMD) technique to explore multiple strategies that provide the user with the location of the desired object. Pupil Pro has been applied to manage a head mounted tracker with dual cameras: an infrared camera and a camera for the surrounding view. Furthermore, the authors showed that, based on the initial experiments, using an eye tracking mechanism can help in addressing some of the challenges faced by a visual search.

Eye imaging and eye movement visualization and metrics used in medical applications were studied by Harezlak and Kasprowski (2018); they highlighted that eye tracking applications in the medical field that may have significant potential are yet to be explored and investigated, particularly for children, such as the use of eye tracking in therapy and in games with gaze contingent interfaces, which may have a positive effect on their vision.

An SVM classifier has been used to help elderly people automatically detect objects visually, to identify the point in time when they need assistance. Dietz et al. developed a complete mobile eye and head tracking device. Their evaluation results using Google Glass eye tracking have demonstrated the feasibility of the approach and achieved a recognition rate of 97.55% with a leave-one-user-out evaluation method (Dietz et al., 2017). The authors used Google Glass over Tobii or Pupil Labs owing to the feedback it provides to the users. Klaib et al. proposed a model that utilizes a VOG approach through Tobii technology with added voice interfaces using the Azure cloud to help control home appliances. It does not use wearable technology; through VOG with multiple cameras, it delivers several advantages: an accurate gaze estimate, portability of the video recording system, and fully remote recordings. It traces the user through reflected infrared light patterns and calculates the gaze position automatically. The flexibility and reliability in interacting with the system, and the accuracy in the movement of the pointer, make this model a reliable solution for simplifying some daily tasks of the elderly and special needs users (Klaib et al., 2019).
Coronado, 2017). The authors used a technological platform based on The current literature has demonstrated that eye tracking data is a
the .NET framework and two physiological sensors: a hand recognition powerful tool to provide valuable information on healthy or unhealthy
sensor (Leap Motion) and eye tracking device (Tobii X1 Light) to provide aging processes along with normative values for some oculo-metrics
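Gaze-driven interfaces such as the appliance-control model above map an estimated gaze position to on-screen targets and confirm a selection by dwell time. The sketch below illustrates that common selection idiom only; the button layout, dwell threshold, and appliance names are illustrative assumptions, not details of the cited system:

```python
# Illustrative dwell-time selection: a stream of gaze samples is mapped to
# on-screen appliance buttons, and a command fires only after the gaze has
# rested on the same button for a minimum dwell time.
from dataclasses import dataclass

@dataclass
class Button:
    name: str
    x: int
    y: int
    w: int
    h: int

    def contains(self, gx, gy):
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h

class DwellSelector:
    def __init__(self, buttons, dwell_ms=800):
        self.buttons = buttons
        self.dwell_ms = dwell_ms
        self._current = None  # button the gaze is currently resting on
        self._since = 0       # timestamp when that rest began

    def feed(self, t_ms, gx, gy):
        """Feed one gaze sample; return a button name when a dwell completes."""
        hit = next((b for b in self.buttons if b.contains(gx, gy)), None)
        if hit is not self._current:
            self._current, self._since = hit, t_ms
            return None
        if hit is not None and t_ms - self._since >= self.dwell_ms:
            self._since = float("inf")  # suppress repeats until the gaze moves away
            return hit.name
        return None

buttons = [Button("lamp", 0, 0, 200, 200), Button("fan", 300, 0, 200, 200)]
selector = DwellSelector(buttons, dwell_ms=800)
# Samples every 100 ms, all resting inside the "lamp" button.
events = [selector.feed(t, 100, 100) for t in range(0, 1000, 100)]
fired = [e for e in events if e]
```

With the gaze resting on the lamp button, exactly one "lamp" command fires once the 800 ms dwell elapses; jitter onto another button resets the timer.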

11
A.F. Klaib et al. Expert Systems With Applications 166 (2021) 114037

However, most of the eye tracking data were generated in a laboratory environment, which may differ when generated in a real-life situation. Thus, it is important to study the importance of eye tracking data in natural situations and how they impact elderly people.

Table 4
A list of diseases that use eye tracking technology along with the main findings.

Study | Disease/Research Goal | Device/Sensor | Main findings
(Scott et al., 2017) | ALS and other locked-in diseases | Optical sensors/emitters | EyeLive does not use a common camera-based approach, so it reduces eye strain, increases accuracy, enhances mobility, and improves user-friendliness.
(Sharafia et al., 2015) | Lung cancer | C-CAD with eye-tracking with graph clustering | Provides a paradigm-shifting CAD system, called collaborative CAD (C-CAD), that unifies CAD and eye-tracking systems in realistic radiology room settings.
(Kitchenham, 2004) | Glaucoma | Mobile | Glaucoma patients might be supported through development of personalized solutions that combine VR technology with eye-tracking algorithms, in which the tracking should be done under standardized experimental conditions.
(Hansen & Ji, 2010) | Autism spectrum disorder (ASD) | Tobii eye-tracking system | ASD children were attracted to video games in the same way as ordinary children.
(Chennamma & Yuan, 2013) | Amyotrophic lateral sclerosis (ALS) | EyeTribe connected to the notebook | Demonstrates the feasibility and reliability of a mobile eye-tracking screening tool to help ALS patients; it can be easily administered even at the patient's bedside, usually in less than one hour; the eye-tracking based ECAS version is well suited to identify those ALS patients with substantial cognitive impairment.
(Sorate & Chhajed, 2017) | To study the behavior of the infant | Head-mounted eye tracker, image processing algorithms | The most frequent recurrences of the infants' gaze were in the direction of hand movements and crawling, but the least frequent were in the direction of leg movements.
(Robinson, 1963) | Age-related macular degeneration | Multimodal retinal imaging including OCT-A | Using active eye-tracking technology improved image quality in OCT-A imaging regarding presence of motion artifacts, at the expense of higher acquisition time.
(Eibenberger, Eibenberger, & Rucci, 2016) | ASD due to genetics of their early development | NA | Using eye tracking involves advantages such as automated testing sessions, registering more visual attention measures, and providing concrete answers to the role of some areas of interest.
(McCamy et al., 2015) | Attention deficit hyperactivity disorder (ADHD) | Leap Motion and Tobii X1 Light | Helps children with attention deficit and learning issues; teachers may utilize this system to track the progression of their students and see their behavior.
(Chen et al., 2018) | Strabismus recognition | A gaze deviation (GaDe) image | The authors proposed a CNN-based method for strabismus recognition; CNNs are a powerful alternative for feature extraction from eye-tracking data.

The current literature has shown that combining both head- and eye-movement data will provide a better understanding of the gaze patterns, visual information, and visual scanning strategies, along with the necessity to fill in the lack of information (Paraskevoudi & Pezaris, 2018). However, further research needs to be conducted to study the effects of head versus eye movements and their impact on several applications for elderly people.

The human eye generates different patterns of eye movement across a scene, which is called the scan path (Kurzhals, Burch, & Weiskopf, 2016). Gaze plots for elderly people tend to cause visual clutter as the number of people in a scene and the length of the scan path increase. Therefore, better visual analysis techniques are required for the data generated from a long eye tracking sequence. Furthermore, a comprehensive comparison between different visual analysis techniques that combine an automatic and visual scan path is needed because standard approaches do not work well (Kurzhals et al., 2016) and have not been used for an extended evaluation of visualization techniques. For modeling a scan path, Coutrot et al. used hidden Markov models and discriminant analysis models to suggest a method focusing on the scan path of interest and the state of mind of people depending on the variations. The evaluation results showed that the suggested algorithm achieves a classification accuracy of 81.2% (Coutrot, Hsiao, & Chan, 2018).

Eye tracking is a highly useful tool in virtual reality (VR) (Clay, König, & König, 2019), and provides an innovative way to interact with VR content. It achieves a high level of privacy and security using biometric identifiers through retinal scanning and identification systems, as well as adding the extra functions of connection and feedback to an experience. In addition, it reduces the GPU load and generates high-quality content through foveated rendering, which is a process that renders the specific part of the screen where the user is currently looking in full resolution. Additionally, it provides more natural interactions between a human and applications, which helps developers in understanding how their application is being used. However, it has certain limitations, such as that subjects wearing glasses are not always able to participate in the experiment. Moreover, additional time is needed to calibrate and validate the eye tracker, which may hinder the experiments, particularly in longer sessions for elderly people who have slow natural movements and experience motion sickness. Owing to the rapid developments in VR technologies, however, we expect that solutions to these limitations will soon be addressed.
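A scan path such as those discussed above is usually built by first collapsing raw gaze samples into fixations. A dispersion-threshold pass in the style of the classic I-DT algorithm is one common choice; the sketch below is a generic illustration with assumed thresholds, not the method of any one cited study:

```python
# Dispersion-threshold fixation detection (I-DT style): a window of gaze
# samples counts as a fixation while its spread stays under max_dispersion
# and it lasts at least min_samples; movement between fixations is saccadic.
def detect_fixations(samples, max_dispersion=25.0, min_samples=5):
    """samples: list of (x, y) points; returns (start_idx, end_idx, cx, cy)."""
    fixations, i, n = [], 0, len(samples)
    while i < n:
        j = i + 1
        while j < n:
            xs = [x for x, _ in samples[i:j + 1]]
            ys = [y for _, y in samples[i:j + 1]]
            # dispersion = (max x - min x) + (max y - min y)
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        if j - i >= min_samples:  # long enough to count as a fixation
            xs = [x for x, _ in samples[i:j]]
            ys = [y for _, y in samples[i:j]]
            fixations.append((i, j - 1, sum(xs) / len(xs), sum(ys) / len(ys)))
            i = j
        else:
            i += 1
    return fixations

# Two stable point clusters joined by a fast sweep yield two fixations,
# whose centroids, in order, form the scan path.
trace = [(100, 100)] * 6 + [(300, 120), (500, 140)] + [(600, 150)] * 6
scan_path = [(cx, cy) for _, _, cx, cy in detect_fixations(trace)]
```

The two stable clusters collapse to fixation centroids at (100, 100) and (600, 150); the ordered centroids are the scan path that visualization techniques then plot or compare.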


In contrast to the use of eye tracking in many healthcare and medical applications, there are still some limitations owing to eye movements reflecting the cognitive processes, although cognitive processes cannot be directly inferred from eye tracking data. To interpret eye tracking data properly, theoretical models must be used as the design basis of the experiments as well as for analyzing and interpreting the eye tracking data (Königa & Buffalo, 2017).

7.2. Human–computer interaction (HCI) and accessibility

Using eye tracking technology to interact with computers implies extracting information from the area the user is gazing or looking at, which goes back at least to the early 1990s. The most common tasks applied through eye tracking in the HCI field are pointing, moving screen objects, selecting menus, dragging, and dropping (Bulling & Gellersen, 2010).

To infer facial expressions from partially occluded faces while the user is involved in a VR experience, Hickson et al. presented an algorithm that uses an infrared gaze tracking camera inside VR equipment with an HMD to capture a set of images of the user's eye. Experiments showed that these images are sufficient to decide which subset of facial expressions the images belong to without the use of any fixed external camera. The accuracy of classifying an expression was 74%, and the accuracy of classifying a facial action unit was 70% (Hickson, Dufour, Sud, Kwatra, & Essa, 2019).

Researchers have investigated the applicability of using eye tracking technology with a small smartphone or tablet screen. Using eye movements to interact only with a smartphone was investigated by Dybdal, Agustin, and Hansen (2012). The authors used finger-strokes and accelerometer-based interactions with gaze-gestures and dwell-time selection strategies. The results showed that touch interaction is better than gaze interaction in terms of completion time and error rate.

There has been a shift from studies on tools that focus on desktop graphical applications to the development of end-user development (EUD) research for web environments (Tzafilkou & Protogeros, 2017). Currently, eye tracking and mouse technologies are being used to monitor end-user behaviors in real time. Tzafilkou and Protogeros have explored the correlation between eye movements and end-user perception and acceptance in modern web-based EUD environments. The authors tried to see whether end-user sense and acceptance are reflected in eye behavior when interacting with a web-based and database-driven EUD system. The authors developed a prototype based on a natural language approach to help end-users create mobile-based database applications. Evaluation results showed significant associations between eye behavior, perception, and acceptance (Tzafilkou & Protogeros, 2017).

The iPad's front camera and eye/head tracking technology were used to enhance the lives of people with special needs. The Haar Cascade algorithm was used to detect facial features. Light colored eyes were found to gain the best results with 90% accuracy. Face detection accuracy was 100% in both light and dark environments (Lopez-Basterretxea, Mendez-Zorrilla, & Garcia-Zapirain, 2015).

Activity recognition based on eye movement was investigated by Bulling et al. The authors recorded eye movement using an EOG system and then assessed the repetitive patterns of eye movements. The authors used eight study groups to evaluate their proposed method using five activity classes. Two classifiers were used: a person-independent (leave-one-person-out) classifier and an SVM. Evaluation results showed a precision of 76.1% and a recall of 70.5% (Bulling et al., 2011).

Visualization techniques for eye tracking data were investigated by Blascheck et al., who reviewed a large number of research papers focusing on visualization techniques and classified them into point-based methods and methods based on AOI, to help researchers apply qualitative analysis methods based on their classification. In addition, they contacted the main experts of eye tracking techniques to check their use of visualizing techniques in analyzing eye tracking data and, as a result, identified the main obstacles that can be avoided, which led to an increased use of visualizations in eye tracking research (Blascheck et al., 2017).

Aspects and issues related to eye-movement technology, guidelines for developing eye tracking applications, and various opportunities and challenges for developing systems using eye tracking technology were introduced by Chandra, Sharma, Malhotra, Jha, and Mittal (2015). Bulling et al. showed that subjects with or without bifocal glasses show similar fixation results if they do not have any vision issues, but the results changed and incurred a small error if corrected using lenses. In addition, they showed that input from Eye Tribe© requires less time compared to input from a mouse (Bulling et al., 2011).

With an increase in the number of smartphones worldwide and the human disdain for wearing extra special hardware (Khamis, Alt, & Bulling, 2018), we expect that the development of special smartphones or handheld devices for tracking human eyes to enhance the interaction between them will further improve in the coming years. Such inexpensive specialty devices should be designed to make the eye tracking systems more robust and comfortable and allow the camera to see the users' eyes even if they do not hold them in a suitable way or when a part of their faces is occluded. In addition, using gaze interaction techniques that leverage eye behaviors is a more promising direction for handheld devices, particularly techniques that do not require a manual calibration, such as gaze-gestures. Moreover, training systems should be developed that use eye tracking to measure user interactions with programs to evaluate where, how, and when the instruction can be altered. A complete holistic view on the past, present, and future of eye tracking techniques applied on handheld mobile devices can be found in Rosch and Vogel-Walcutt (2013) and Khamis, Alt, and Bulling (2018). Table 5 shows a comparison between recent studies using eye tracking technology in HCI along with the algorithm, device, and datasets applied.

HCI allows people to interface with systems, whereas assistive systems use HCI to help people with special needs who cannot move their arms or who have special needs in different areas such as the work environment, social interaction, normal activities, and education. Eye tracking techniques detect the direction of the eyes to control the cursor movements on the screen. Technologies such as an intelligent wheelchair (Gautam, Sumanth, Karthikeyan, Sundar, & Venkataraman, 2014), head control (Cognolato, Atzori, & Müller, 2018), voice recognition software, single-switch access, sip-and-puff switch, oversized trackball mouse, adaptive keyboard, and joystick have been developed as different control interfaces to help take care of people with special needs. However, some people have severe motor disabilities and are unable to move their limbs freely or speak clearly. Therefore, eye tracking technology can be used and represents a promising source of data in an assistive system (Koester & Arthanat, 2018; Deepika & Murugesan, 2015).

Gautam et al. proposed an EOG-based method with an optical-type eye tracking system to support a mechanical wheelchair for physically challenged people (Gautam et al., 2014). User eye movements have also been translated into the screen position. The computer input system sends an instruction to the software when the user looks at a convenient angle of the pupil by applying Daugman's segmentation algorithm. An electric powered wheelchair that combines electroencephalography (EEG) and eye movements was developed by Taher et al. to help people with severe disabilities. The developed wheelchair does not affect the security of the patients and achieved a success rate of 88.66% (Taher, Amor, & Jallouli, 2015). Meena et al. developed another tool, called EMOHEX, to help people in need of a wheelchair. It was built using a dual control panel to determine the intention of the user, such that it can identify the selection of any command button of an eye tracking system (Meena et al., 2016). The evaluation results demonstrated the feasibility and stability of the system.
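Gaze-gestures, mentioned above as a calibration-free interaction technique, can be recognized by discretizing successive gaze displacements into directional strokes and matching the stroke string against a small vocabulary. The sketch below is an illustrative assumption of how such a matcher might look; the thresholds and the gesture vocabulary are invented for the example:

```python
# Illustrative gaze-gesture matcher: gaze displacements are discretized into
# L/R/U/D strokes; a command triggers when the collapsed stroke sequence
# equals a predefined template (here "LRL" stands for a "back" command).
def strokes(points, min_step=50):
    """Collapse a gaze trace into a string of directional strokes."""
    out = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if max(abs(dx), abs(dy)) < min_step:
            continue  # ignore small jitter below the step threshold
        s = ("R" if dx > 0 else "L") if abs(dx) >= abs(dy) else ("D" if dy > 0 else "U")
        if not out or out[-1] != s:  # merge repeated strokes in one direction
            out.append(s)
    return "".join(out)

GESTURES = {"LRL": "back", "UD": "scroll"}  # hypothetical gesture vocabulary

def recognize(points):
    return GESTURES.get(strokes(points))

trace = [(400, 300), (200, 300), (500, 310), (250, 300)]  # left, right, left
cmd = recognize(trace)
```

Because the gesture is defined by relative displacements rather than absolute screen positions, this style of matching tolerates the calibration drift that plagues handheld gaze interfaces.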


An algorithm having five stages, applied using a personal computer and a low-cost webcam, was developed by Ghude et al., allowing users to easily handle a computer. It can detect eye movements efficiently and accurately in real time (Ghude, Tembe, & Patil, 2014). Another study was proposed by Deepika and Murugesan to facilitate the interactions between computers and users with special needs using video-based eye tracking technology (Deepika & Murugesan, 2015). VOG and EOG methods along with a webcam have been used to control the movements of the cursor on the screen using the human eye. Videos gained from a webcam were processed to locate the center of the iris to estimate the eye gaze. The proposed system performed well under good lighting conditions with an accuracy of 97%, but in dark areas the accuracy was 87%.

Some research focused on people with hand disabilities who cannot use a mouse or keyboard while controlling a computer. Dagnew and Kaur proposed a system that calculates the pixel value and sclera detection to apply eye detection (Dagnew & Kaur, 2016). Another approach proposed by Haritha used eye detection and a tracking camera to capture images using the isophote center applied in eye tracking and eyeblink detection methods (Haritha et al., 2016). An eye tracking system using steady-state visual evoked potentials technology was used for the initial selection and to activate the exact target (Stawicki, Gembler, Rezeika, & Volosyak, 2017).

Another system was developed by Utaminingrum et al. to support people with special needs by adjusting and recognizing eye movements when considering the decision of both the left and right eyes (Utaminingrum, Fauzi, Sari, & Primaswara, 2016). The Haar Cascade algorithm has also been applied for monitoring the area of the eyes. In addition, a Hough Circle Transform was used to make a decision regarding the handling of the eye movement position. The accuracy reached more than 80%.

To enhance and facilitate the lives of individuals with upper-limb disabilities, an eye-control approach was proposed by Tunhua et al. based on Kinect 2.0. This approach has been utilized to express the critical points of the eyelid. Indirect control was used to ensure a smooth and gradual movement of the cursor to a fixation point, which reduced the difficulty of calibration and the interference from eye jitter on the control. This system provided good stability and high efficiency using a real-time control speed that reached a frame rate of 19.9 fps (Wu, Wang, Lin, & Zhou, 2017). Beibin et al. modified the DBSCAN algorithm to be able to recognize fixations and saccades; it can handle noise, fixed structure, and smooth continuation movements that traditional fixation identification algorithms cannot, although it still needs some arbitrary thresholds (Li et al., 2016). Königa and Buffalo proposed the Cluster Fix algorithm, which uses a K-means cluster analysis to measure the variation between fixations and saccades. It can detect even small saccades that are often difficult to detect and distinguish them from noisy fixations. In addition, it can detect the beginning and end of saccades using the transition times between the fixations and saccades along with four criteria: the distance, velocity, acceleration, and angular velocity (Königa & Buffalo, 2014).

Training special needs people to control their eye gaze under real scenarios could be the next step toward helping them to master this method. A simulation-based environment, such as VR, may be a solution that can be widely used in training for real-task scenarios, and it provides a completely immersive environment for medical skills training and for people with intellectual disabilities (Zhang & Hansen, 2019).

Table 5
A comparison between recent studies that use eye tracking technology in HCI.

Reference | Algorithm | Goal | Device | Dataset/Users
(Păsărică, Bozomitu, Cehan, Lupu, & Rotariu, 2015) | Supervised learning using a CNN | Classifying facial expressions | IR gaze tracking camera inside VR equipment with an HMD | The authors collected data from participants in a controlled setting.
(Picanço & Tonneau, 2018) | Finger-strokes and accelerometer-based interactions with gaze-gestures and dwell-time selection strategies | Is using only eye movements feasible for interacting with the small screen of a smartphone? | Cameras of small screens of smartphones and tablets | 11 subjects in a controlled environment
(Martinikorena et al., 2018) | Prototype EUD tool based on a natural language approach, called 'simple talking' | Explore the correlation between eye movements and end-user perception and acceptance in modern web-based EUD environments | Explored the correlation between eye movements and end-user perception and acceptance in modern web-based EUD environments | NA
(Lupu & Ungureanu, 2013) | Person-independent (leave-one-person-out) and support vector machine classifiers | Investigate the eye-movement data to recognize activity | EOG system using saccades, fixations, and blinks | Eight study groups using five activity classes.

7.3. Education and E-learning

The use of eye tracking techniques in education and learning has several applications, such as assessing the level of understanding of readers and measuring the interest, attention, and other aspects of users. Most studies have indicated that the use of eye tracking data in such applications succeeds in detecting changes in the cognitive progress and assessing the concentration of learners, which allows the use of this information to enhance the educational process (Rosch & Vogel-Walcutt, 2013; Sun, Li, Zhang, & Zou, 2017).

Eye tracking technology has been used by Inoue and Paracha as a tool to assess the extent to which fluent and non-fluent readers complete the processing of texts and images to improve their reading outcomes (Inoue & Paracha, 2016). Reading progress has also been studied by Rasmussen and Tan using an eye gaze tracker with speech recognition to generate word probabilities to increase the likelihood of the language model. Their experimental results showed that the tracking error rate for speech recognition alone was 34.9%, whereas the error rate for combining eye gaze tracking with speech recognition was 31.2%, an improvement of 10.6% (Rasmussen & Tan, 2013).

Software developers work with programs that consist of different source code files, each of which typically contains thousands of lines of code. They spend most of their development time accomplishing activities such as file switching, code folding, and content scrolling (Minelli & Mocci, 2015). While they read, understand, and debug code, they constantly move their eyes between multiple files and elements, which increases the development time and makes the process of capturing the context of what developers are looking at difficult. Sharafi et al. evaluated the use of eye trackers in software engineering to study model comprehension, code comprehension, debugging, collaborative interaction, and traceability. Their results have provided evidence on the uses and contributions of eye trackers in empirical software engineering studies, using Engineering Village rather than applying a manual search. Finally, they reported the limitations of current eye tracking technology threatening the validity of previous studies, along with suggestions for mitigating these limitations (Zohreh Sharafi, 2015). Similarly, Obaidellah and Al Haek focused on expressing how different experiments using eye tracking are steered and used to collect different information. They used 63 studies focusing on program comprehension and debugging, non-code comprehension, collaborative programming, and requirements traceability research. Their results showed that most of the participants used programming languages and materials, and that eye trackers and attention tracking tools have been utilized (Obaidellah et al., 2018).
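The Cluster Fix approach described above separates fixations from saccades by clustering per-sample motion parameters with K-means. The sketch below reduces the idea to a two-means split on velocity alone; the full algorithm clusters distance, velocity, acceleration, and angular velocity, so this is only an illustration of the clustering step, not the published method:

```python
# Two-means split on per-sample velocity: the low-velocity cluster is labeled
# fixation and the high-velocity cluster saccade (a 1-D reduction of the
# Cluster Fix idea, which applies K-means to four motion parameters).
import math

def velocities(samples):
    """Per-step speed of a gaze trace given as (x, y) points."""
    return [math.dist(a, b) for a, b in zip(samples, samples[1:])]

def two_means_labels(vals, iters=20):
    lo, hi = min(vals), max(vals)  # initialize cluster centers at the extremes
    for _ in range(iters):
        groups = ([], [])
        for v in vals:
            groups[abs(v - lo) > abs(v - hi)].append(v)  # nearest center wins
        lo = sum(groups[0]) / len(groups[0]) if groups[0] else lo
        hi = sum(groups[1]) / len(groups[1]) if groups[1] else hi
    return ["fixation" if abs(v - lo) <= abs(v - hi) else "saccade" for v in vals]

# Two slow clusters of samples separated by a fast jump (the saccade).
samples = [(0, 0), (1, 0), (2, 1), (60, 40), (120, 80), (121, 80), (122, 81)]
labels = two_means_labels(velocities(samples))
```

Because the threshold between the two clusters is learned from the data rather than fixed in advance, this style of classification adapts to per-subject differences in eye-movement speed, which is the main appeal the authors claim for Cluster Fix over fixed-threshold methods.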


Guarnera et al. developed an eye tracking infrastructure, called iTrace, that enables eye tracking in some integrated development environments and allows software developers to work under more realistic conditions (Guarnera, Bryant, Mishra, Maletic, & Sharif, 2018). In addition, the authors presented several use cases using different gaze patterns along with a video demonstration of where iTrace can be used (Guarnera, Bryant, Mishra, Maletic, & Sharif, 2018). Najar et al. investigated the use of eye tracking data to compare the behaviors of novices and advanced students while studying examples to increase the learning level using an intelligent tutoring system (Najar, Mitrovic, & Neshatian, 2014). They found that advanced students paid more attention to the database schema than novices. Video-based pupil monitoring eye trackers have been used to investigate the allocation of attention generated from the eye tracker regarding an individual's locus of attention in a digital assessment game.

Eye tracking technology has proven its efficiency and effectiveness in terms of measuring and evaluating the cognitive level, inspiring interface design, discovering behavioral responses according to the user data, improving the teaching quality by improving the teaching framework, promoting the application of technology, and improving educational standards. However, some areas lack significant contributions, particularly the development of adaptive systems based on the user's cognitive load levels calculated from eye tracking data. Having such a personalized system will improve training by increasing knowledge transfer and retention while utilizing the time of the trainee. In addition, current eye tracking systems and analysis tools have some issues related to system cost and software integration that limit the development of commercialized devices or tools for general use (Rosch & Vogel-Walcutt, 2013).

7.4. Car assistant

Most eye tracking systems have been applied indoors and have shown convenient performance when the surrounding light is controlled and the eye region of interest (ROI) can be detected using focused images. When these techniques are applied to a driving scenario, the light can change quickly when the car is moving, and thus it cannot be guaranteed that there will be continuous reflection of lights on the eye. In addition, tracking eye-movement behaviors is important for driver fatigue detection (Xu, Min, & Hu, 2018), drowsiness monitoring (Neshov, 2017), real-time drowsiness detection (Murphy et al., 2001), driving-assistance systems (Said et al., 2018), and accident prevention (Anjali, Thampi, Vijayaraman, Francis, James, & Rajan, 2016). Many researchers have shown interest in developing intelligent methods for activating alarm systems when the eyes are not exposed for several seconds while driving.

In general, the Viola-Jones face detector and tracking-learning-detection algorithms have been used to detect changes in the appearance of a scene (Bengoechea, Villanueva, & Cabeza, 2012). Neshov and Manolova suggested an algorithm that combines these two algorithms (Neshov, 2017). Another system was developed by Anjali et al. to prevent car accidents by detecting closed eyes for driver fatigue and to alert drivers using a buzzer and vibration upon a positive detection (Anjali et al., 2016). The system was developed using a camera mounted in front of the driver, which captures real-time video using OpenCV and Raspberry Pi. It can detect the eyes at medium and high illumination. In addition, grayscale image processing and PERCLOS have been used by Yan et al. to determine if the driver suffers from fatigue or drowsiness in a timely manner (Yan, Kuo, & Lin, 2016). The quick sort method is used to confirm the distribution range of black pixels. This study can be used even when the driver is wearing glasses or a respiratory mask. Eriksson et al. explored the usefulness of gaze time on screen as an engagement measure in a screen-mediated learning context, and the generated data showed very little about the engagement of the drivers when using gaze trackers while driving an expensive car (Eriksson, Swenberg, Zhao, & Eriksson, 2018).

For driver drowsiness or fatigue, Khushaba et al. developed a feature extraction algorithm, called fuzzy mutual-information-based wavelet packet transform (FMIWPT), to classify the driver drowsiness state into one of the predefined drowsiness levels. FMIWPT extracted the most relevant features required to identify the driver drowsiness/fatigue states and maximized the amount of drowsiness-related information extracted from a set of electroencephalogram (EEG), electrooculogram (EOG), and electrocardiogram (ECG) signals during a simulated driving test. The authors also compared EOG, ECG, and EEG signals for feature extraction and found that EEG and ECG signals are more suitable than EOG signals for this purpose (Khushaba, Kodagoda, Lal, & Dissanayake, 2011). Other attempts were made to monitor the driver's eyes using a camera and detect symptoms of driver fatigue early enough to avoid an accident (Singh, Bhatia, & Kaur, 2011; Nguyen & Chew, 2015; Muñoz-Leiva, Hernández-Méndez, & Gómez-Carmona, 2019).

Researchers from the University of Missouri have announced two new applications of eye tracking technology: the first focused on designing a better collision avoidance warning system, whereas the second focused on a real-time evaluation of the driver's physical behavior, using the driver's eyes, as a crash occurs. They observed that the alert generated from the collision avoidance system itself creates a distraction that may be a hindrance to the driver and cause an accident or crash. Therefore, they used a vehicle-assisted safety system to watch how people's pupils changed in response to their physical reactions and built a two-way communication channel between a driver and a vehicle to avoid such collisions (Missouri-Columbia, 2019).

Moreover, automakers have been widely looking for ways to incorporate IoT in the vehicle for entertainment and particularly for the safety of elderly users and special needs drivers. Eye tracking in smart cars has been a goal of many companies, including Volkswagen, General Motors, and Audi. Sensors have been used to detect eye blinks to guarantee the safety of drivers on the road. Eye tracking has been used to detect drowsiness and sleepiness (Anusha & Ahmed, 2017; Park, Subramaniyam, Hong, & Kim, 2017).

Noland et al. used a survey to quantitatively evaluate the ways in which individuals process and rank images used in public settings for urban planning, using eye tracking techniques. Their results have shown different ranking levels for urbanist components such as images, people, pedestrian features, greenery, buildings, cars, and parking. In addition, they showed a way to extract greater value from visual preference surveys for transportation and urban planners, which can be used in the future to reduce motor vehicle traffic in cities (Noland, Weiner, Gao, & Cook, 2017).

Different methods have been explored for making driving a safer experience for elderly people, using IoT to quickly gain assistance during an emergency. Park et al. (Park, Subramaniyam, Hong, & Kim, 2017) showed how a stroke can be detected and how a medical doctor or relative can be informed when such an incident occurs while driving. For example, the authors showed that elderly people can use wearables that are connected to sensors to help them stay safe when experiencing a stroke.

The US National Highway Traffic Safety Administration classifies three types of driver distraction: visual, manual, and cognitive distractions (Le, Suzuki, & Aoki, 2020). Visual distraction occurs when the driver looks away from the roadway, for example, when looking at the dashboard. Manual distraction occurs when the driver takes a hand off the steering wheel to eat or drink, for example. Finally, cognitive distraction is defined as a task that involves thinking about something other than driving. Using eye tracking approaches to detect the types of distraction is a common task. However, observing a cognitive distraction from an external behavior, such as eye tracking, is harder, and it is easier to discover visual distraction using eye tracking approaches. Moreover, combining different types of distractions makes the process of identifying the distraction more difficult.

15
A.F. Klaib et al. Expert Systems With Applications 166 (2021) 114037

from an external behavior, such as eye tracking, is harder, and it is easier to discover the visual distraction using eye tracking approaches. Moreover, combining different types of distractions makes the process of identifying the distraction more difficult. Several researchers have investigated the use of eye tracking to detect the type of distraction (Liang & Lee, 2007b; Hurtado & Chiasson, 2016; Gazepoint, 2020). In a real driving environment, cognitive distraction often occurs with visual and manual distraction (Liang & Lee, 2007a). Thus, we believe that distinguishing a cognitive distraction from a manual or visual distraction might be a future direction for researchers. In addition, checking the level of visual and manual distraction may help in discovering the level of cognitive distraction.

7.5. Choice modeling, consumer psychology, and marketing

New marketing and advertising strategies require involving users in the design and evaluation of products and services. This gives insight into customer satisfaction, engagement, and the determination of design decisions. In addition, the ability to open new possibilities in both lab and real-world neuromarketing tests allows researchers to collect objective data coming from the customers' eyes and brains when they interact with a product or service. Portable eye tracking devices allow individuals to interact with their environment normally. This helps in understanding how specific features are perceived by humans and in using that knowledge to make better decisions.

One of the metrics for measuring the impact of ads is the number of clicks, i.e., the number of times users have clicked on a digital advertisement to reach an online property (Orquin & Wedel, 2020; Hang et al., 2018; MetricHQ, 2020). However, it does not precisely reflect the effectiveness of an ad campaign. This is because some non-human resources, such as software tools and applications, can generate automatic clicks, which may make this metric inaccurate. With eye tracking technology, online advertisers can measure attention more precisely, which can be achieved by calculating the number of ad views when they appear on the page using measurements such as the ROI. In fact, using eye tracking to assess the interest of an audience and to give insight into how users interact with ads requires every computer and mobile device to be embedded with eye tracking technology (Wedel, Pieters, & Lans, 2019; Bulling & Wedel, 2019a, 2019b). We believe that, in the future, devices will be equipped with such technology and many applications will benefit from such features.

Khushaba et al. conducted a pilot study that explores the nature of decision-making by examining the associated brain activity and EEG of people, to understand how the brain responds while undertaking choices designed to elicit the subjects' preferences (Khushaba et al., 2012). Similarly, Khushaba et al. (2013) helped in improving the design and presentation of products according to consumer preferences by using an Emotiv EPOC wireless EEG headset to collect EEG signals from participants and a Tobii Studio eye tracker system to relate the EEG data to the specific choice of options. Their study differs from others in that they did not concentrate on what customers like or dislike, but observed and evaluated the cortical activity of the different brain regions and the interdependencies among the EEG signals from these regions, and provided a way to quantify, based on mutual information, the importance of different features that contribute to the product design.

Uggeldahl et al. (Uggeldahl, Jacobsen, Lundhede, & BøyeOlsen, 2016) investigated eye movement during the completion of choice sets based on a sample of 200 respondents. They interpreted the respondents' stated certainty of choice as a measure of their "true" certainty of choice. Their evaluation results confirmed that the frequency of the respondents' gaze shifts between alternatives is related to the stated certainty of choice and can be used to explain systematic variations in the error variance. The error variance is larger for choices in which the respondents more frequently shift their gaze between the alternatives. Moreover, they found that using eye movement information in the choice model improves the model performance as compared to other models such as a benchmark approach.

Another study, conducted by Rasch, Louviere, and Teichert (2015), combined eye tracking with facial electromyography to provide a better understanding of the choice process, which should lead to an improved prediction performance. Their evaluation results indicate the existence of an effect on the stated choice experiments for trivial product categories and provide insight into drivers and the contexts of affective choice processes. Among others, the best and worst task frames show the influence of an integral affect in discrete choice experiments. Their recommendations express the need for future joint investigations of cognitive and affective processes in consumer choice tasks. A better understanding of these processes should lead to valuable insight into how real-time marketing actions influence decisions, ways to improve the predictive performance of choice models, and novel ways to help consumers and organizations make better decisions.

Yegoryan, Guhl, and Klapper (2020) related the visual attention derived from eye tracking to the probability of attribute non-attendance (ANA) and preference heterogeneity to test, understand, and validate ANA in a marketing context. Their model concentrated on two applications in which individuals chose durable products and made product choices in durable categories owing to higher stakes. Their findings show that the use of eye tracking to augment the ANA models is informative in uncovering individual-level behavior and that most respondents indeed ignore some attributes, which has implications for willingness-to-pay estimates, segmentation, and targeting.

Meißner, Pfeiffer, Pfeiffer, and Oppewal (2019) concentrated on using VR settings with eye tracking to provide a unique opportunity for shopper research regarding the use of augmented reality to provide shopper assistance. They outlined how research in retailing and decision-making can benefit from the use of mobile eye tracking in VR. Furthermore, the study exemplified how the environment interacts with the user in real time by showing additional product information as reactions to the user's pure eye movements. Similar to other researchers, the authors expected that, in the near future, eye tracking technology will be integrated into many electronic devices, such as mobile phones, tablets, or laptops, and will generate large quantities of eye tracking data that can be used to obtain insights into the underlying search and choice processes of consumers (Meißner et al., 2019; Wedel et al., 2019; Meißner, Oppewal, & Huber, 2020).

Current consumers and marketplaces are different from those of the past, and they are likely to be different in the future. These differences are due to major societal forces, customer needs, purchasing behavior, and changing lifestyles that have created new opportunities and challenges and have significantly changed marketing management. Learning more about consumers and the ways to use neuroscientific tools to discover the brain or physiological reactions to reach a better understanding of consumer needs is the key to implementing and exercising marketing concepts and imagination, and these will be exciting areas for a marketing evolution in the future.

8. Discussion and future directions

Eye tracking has proven itself to be valuable in many areas, particularly medical and diagnostic fields. In this paper, we discuss eye tracking techniques, tools, and applications to further focus and entice researchers to find suitable techniques for their specific applications. When choosing an eye tracking system, one should pay attention to different factors, such as the gaze tracking system, its features, the accompanying software and accessories, eye tracking manufacturing devices, software utilizing raw eye data, and the suitability of the system for a specific purpose, including the spatial and temporal resolution (accuracy), camera angle, freedom of head movements, tolerance to ambient light, tolerance to eye glasses and contact lenses, and the possibility to track only one or two eyes. Here, we classify the above studies into


four areas shown below.

8.1. Future directions of techniques and applications

Techniques. There are various techniques that can be used for building an eye tracker system. We provide a clear description of electrooculography, the scleral search coil, infrared oculography, and VOG. There is no perfect method that can be used for solving all problems in different life sectors and environments because each method has advantages and limitations, and thus we should choose the best candidate method based on the problem specifications, environment, cost, and usability, among other factors. For example, we can use the scleral search coil method if accuracy is a must. The electrooculography method can be used if the available resources are limited. Infrared oculography and VOG methods provide flexibility, whereas VOG is extremely expensive in terms of time complexity. Table 6 presents the eye tracking applications, techniques, and the algorithms and criteria used for each method.

Applications. Different areas of eye tracking applications have been presented in this paper, including the healthcare and medical fields, HCI and assistive technologies, learning and education, and vehicle manufacturing. A list of comprehensive review papers on the use of eye tracking in different applications is shown in Table 7. In our review of the literature, we noted that eye tracking appears to be a valuable tool for medical diagnoses, providing automatic hints to observers during an imaging checkup. The use of eye movements along with ML algorithms can help predict most diagnostic errors before their occurrence (Brunyé, Drew, Weaver, & Elmore, 2019). However, each person brings a unique set of individual differences to the diagnostics, clearly impacting the visual search and decision-making processes. In addition, most current studies have used static eye tracking images during image interpretation. Many medical images are becoming more complex and dynamic (Brunyé et al., 2019), and the use of dynamic video frames may indeed help in providing valuable information for tracking a ROI moving across a free observation scene during playback. Furthermore, using a combination of different attributes generated from eye movements may become the focus of future research. More information regarding the applications of eye tracking in healthcare along with some future directions can be found in Liu et al. (2018) and Harezla and Kasprowski (2018).

Eye tracking technology has been and will continue to be one of the main parts of next-generation vehicles, significantly improving driving safety. ABI reported that "the global shipments of factory-installed driver monitoring systems (DMS) based on interior facing cameras reached 6.7 million by the end of 2019." In addition, DMS solutions are expected to play an important role in developing human–machine interactions (HMIs) that will allow smart dashboards and contextual HMIs in an interior vehicle environment to be used for safer driving. Moreover, we believe that, to develop high-accuracy car-assistance systems and models, high-quality eye tracking data should be generated and certain factors should therefore be considered. Some of these factors are the quality of the capturing device or camera, the eye detection and eye tracking algorithms, the lighting conditions, the training models, and the specialized fixation- or saccade-classification algorithms applied. Therefore, we encourage researchers to consider these factors and explore their effectiveness and efficiency to develop more reliable and efficient car-assistance systems and models.

A popular feature in new vehicles that generates an alert from a collision avoidance warning system before a crash can also create a distraction that may itself cause an accident. For instance, there is no need to generate an alert if the driver knows about the crash prior to its occurrence. Therefore, a multi-way communication channel needs to exist between the driver and vehicle. In addition, observing how the driver's eyes and behaviors are changing while they respond to an alert

Table 6
List of eye tracking applications and techniques along with algorithms and criteria used.

Applications | Techniques | Algorithms and criteria used
Healthcare and Medical Applications | EOG, IOG, Machine Learning | Deep learning algorithm, KNN, Random Forest, SVM, Naïve Bayes
HCI and Accessibility | EOG, IOG, Machine Learning | Boosted Deep Belief Network, Haar Cascade algorithm, SVM
Education | VOG | Duplicate, Irrelevant, Improper format, Unavailable, Foreign language
Car Assistant | VOG | Viola Jones face detection algorithm, Haar Cascade algorithm, Tracking-Learning-Detection algorithm

Table 7
A list of eye tracking applications clustered based on the field, research goals, and major findings.

Reference | Area of application – survey | Overall Research Goals/Major Findings
(Brunyé et al., 2019) | Understanding and improving diagnostic interpretation – medical | • Provided an overview of how the eye-tracking technology has been employed in the perceptual and cognitive processes involved in medical interpretation. • Provided an overview of how the eye-tracking technology has been used to understand medical interpretation and enhance medical education and training. • Provided some future directions and challenges of using eye tracking technology in medical applications.
(Bobić & Graovac, 2016), (Lai, Tsai, Yang, Hsu, Liu, Lee, & Tsai, 2013; Ashraf et al., 2018) | Tools for training, exploring learning, medical education – education | • Provided a clearer solution for implementing eye-tracking systems into a training environment. • Suggested future studies based on the review findings. • Discussed how the eye movement measures can be used to indicate the processes of learning.
(Alemdag & Cagiltay, 2018) | Multimedia learning – education | • Provided a systematic review of employing the eye-tracking technology in the area of multimedia learning.
(Borys, 2014; Wedel, 2013; Higgins, Leinenger, & Rayner, 2014) | Advertising and Marketing | • Showed on-going trends, metrics, and technical challenges of employing the eye tracking method in research. • Presented a systematic review of the state-of-the-art research employing eye tracking in digital marketing, online advertising, advertisements, and shopper marketing. • Integrated recent findings in the field and serves as a practical guide for online and traditional marketers for future research planning and development. • Examined key findings on eye movements when viewing advertisements.
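Several of the driver-fatigue systems surveyed above (e.g., Yan et al. in Section 7) rely on PERCLOS, the percentage of time the eyes are closed over a recent time window. The following is a minimal sketch of that measure; the per-frame openness input, the 0.2 closure cutoff, and the 40% alarm level are common illustrative defaults assumed here, not values taken from the surveyed papers.

```python
# Minimal PERCLOS sketch (percentage of eye closure over a window of
# video frames). Thresholds below are assumed defaults, not values
# reported in the surveyed driver-fatigue studies.

def perclos(eye_openness, closed_threshold=0.2):
    """Fraction of frames in which the eye is considered closed.

    eye_openness: per-frame openness ratios in [0, 1], where 1.0 is a
    fully open eye (e.g., derived from an eye-aspect-ratio estimate).
    """
    if not eye_openness:
        return 0.0
    closed = sum(1 for v in eye_openness if v <= closed_threshold)
    return closed / len(eye_openness)

def is_drowsy(eye_openness, perclos_alarm=0.4):
    """Flag drowsiness when the eyes are closed in >40% of recent frames."""
    return perclos(eye_openness) > perclos_alarm

# Example: eyes closed in 7 of 10 frames -> PERCLOS 0.7 -> alarm.
window = [0.9, 0.8, 0.1, 0.05, 0.1, 0.9, 0.1, 0.0, 0.1, 0.05]
print(perclos(window), is_drowsy(window))
```

In a real system the openness ratio would come from an eye detector running on each camera frame, and the window would slide over the last few seconds of video.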


generated by a vehicle collision avoidance warning may help engineers design safer systems that decrease driver distractions.

It is believed that the symptoms of driver fatigue can be detected sufficiently early to avoid car accidents by monitoring the movements of the eyes and the blink pattern (Sommer et al., 2009). Some of the symptoms are an inability to keep the eyes open or eye burn. Other symptoms were described by Lees et al. (2018) and could be measured using an EEG. However, using such a device to measure driver fatigue is an uncomfortable and intrusive technique that should be addressed in future research (Rost et al., 2015; Jap, Lal, & Fischer, 2011). In addition, it is important to calculate the delay between when the driver becomes fatigued and the time for the symptoms to start showing in the human eyes. For instance, generating a crash alert within an acceptable delay allows the driver to take an action that may help prevent an accident. We believe that combining data generated from eye tracking along with other facial behavior such as yawning, head movements, and positioning may help in detecting the fatigue level of the driver.

8.2. Future directions of using ML with eye tracking

ML algorithms have been used to solve problems and benefit different companies by making predictions and helping them make better decisions. Future studies using ML with eye tracking can be conducted in different areas. First, the performance of existing ML algorithms can be enhanced and their accuracy verified using larger datasets and various learning techniques. Second, along with the use of a communication protocol, the implementation can be optimized to achieve higher processing rates. Third, focus can be given to detecting other eye-movement events (such as smooth pursuit and nystagmus in medical applications), which may be achieved by including other training sets and using other ML approaches.

The use of ML techniques with eye tracking algorithms can predict most diagnostic errors prior to their occurrence, making it possible to automatically calibrate the input images and integrate the feedback of the trainees. Using eye tracking for the detection task, on top of the diagnosis and segmentation in medical applications, for example, represents a promising future direction for handling cases that are missed by radiologists. In addition, a real, larger, and suitable public dataset with richer features is needed to learn the relationship between fixation patterns and confusion because certain features may be impactful. Such a dataset utilizes two head-mounted eye-facing cameras for both eyes, which may be necessary to learn further details regarding the challenges of eye tracking in AR/VR environments and systems. In addition, it will help in implementing deep learning models for augmented spaces. Moreover, high-quality dataset images make the training of deep learning systems easier.

Therefore, more eye tracking analyses using ML algorithms in the context of a user study are needed. Another future agenda will be to develop technologies that focus on delivering fast and accurate predictions and handling voluminous data. In addition, the transfer of successful learning of similar tasks from other domains into the eye tracking domain also represents an area of future research. A real promise of ML is that it will deliver a single algorithm that can detect and distinguish between all events that exist in the literature on psychological and neurological eye movement. We encourage other researchers to validate and further develop new ML-based architectures to improve techniques for prediction from eye tracking data to make better decisions.

ML algorithms are extremely powerful tools that hold the potential to revolutionize the way things work. However, applying ML algorithms to eye tracking systems has certain limitations. Some of these limitations are the high dependency of accurate results on the selected features, sample size, and data quality; it has been proven that if the quality of the images used for training and/or testing CNNs is low, the results will be inferior compared to the results using higher quality images (Stember et al., 2019). In addition, the calibration of operational parameters of the camera of a handheld device will affect the quality of the captured images. Moreover, the nature and structure of the collected data along with the successful labeling of training data help with the pre- and post-processing to generate higher quality data.

Incorporating ML algorithms, such as an SVM and neural networks, with eye tracking techniques is important owing to their ability to independently adapt and learn from previous data, to identify patterns, and to produce reliable decisions with minimal human intervention. This incorporation is certainly going to yield more accurate and trustworthy eye tracking results. Algorithms relying on ML methods can solve some of the eye tracking issues. Some of these issues include synching between head-movement and head-rotation data. Noisy data represent another issue that should be addressed, especially when different IoT devices are applied together.

8.3. Future directions of using IoT with eye tracking

Eye tracking technology is producing high data rates with different data types generated from different sensors and cameras. If this extensive amount of data continues to be compiled, eye tracking will be the new direction for big data. This will require developing new aggregation models, understanding different techniques, and applying evaluation approaches beyond a statistical analysis and a visual inspection, to extract patterns and behaviors from the data. Challenges facing big data on eye movements, the techniques useful to address such challenges, and the potential scenarios for such big data were described in Blascheck, Burch, Raschke, and Weiskopf (2015).

With the decreasing costs of gaze trackers, pervasive eye tracking is likely to become a reality and has been widely used to improve the design and usability of different systems and applications, such as web pages, and to explore and understand how users are guided by such applications (Shokishalov & Wang, 2019). However, collecting such data can reveal personal attributes, and privacy will be an issue.

Most of the existing literature on IoT smart home platforms focused on the functionalities provided by smarter connected devices; however, this does not address the privacy concerns or management. Some attempts have focused on dealing with only the data necessary to accomplish a task, which has moderated the privacy issue. Dasgupta, Gill, and Hussain (2019) proposed a framework to assist smart home users and manufacturers of IoT devices to make informed privacy management decisions. In addition, this framework has helped practitioners and researchers interested in the privacy of IoT-enabled smart systems. Moreover, Dasgupta et al. (2019) showed the current emerging trends in IoT in different industry sectors and discussed the key privacy challenges that limit the growth of IoT to reach its potential in the context of smart homes. We believe that pervasive eye tracking has numerous benefits; however, as with most technologies, care must be taken moving forward for it to benefit the public in the coming years (Liebling & Preibusch, 2014).

We also focus on the current major IoT issues that lead to a lack of trust in eye tracking techniques, such as privacy and usability concerns for the elderly and people with special needs. Another issue facing eye tracking research is the limited collaboration among different wireless devices in several fields.

Products with more powerful GPUs accelerate image recognition speeds, which allows the progress in eye tracking technology to be significantly accelerated. Researchers and developers benefit from such developments, and their implementations and experimentations are increasing in frequency. In addition, this development enables the collection of larger and broader datasets to train the recognition algorithms more quickly.

Eye tracking has increased the communication bandwidth between the user and device, changing the way devices and humans communicate with each other. Eye tracking allows a device to be aware of what the user is interested in at any given point in time. In addition, it provides easier and faster ways to obtain data to train algorithm models without taking anything else away. We believe that eye


tracking will be a standard feature of a new generation of devices, such as smartphones, laptops, and desktop monitors. These products will be less expensive and more potent. In addition, new open source software platforms will be adopted in commercial products in a wide array of consumer segments, which have driven the progress of eye tracking technology.

We believe that a new paradigm of eye tracking will make personal computers much more productive and intuitive. Because vision is the most used sense among human beings, eye gaze always presents a smarter user interaction in comparison with a mouse, keyboard, or voice. By being able to digitally track and measure gaze-based data, there will be a significant impact on how we make our intentions known to computers.

The accuracy of eye tracker systems generally depends on the quality of the collected data along with the number of subjects involved to capture more heterogeneity. In general, a high sampling rate generates high-quality data. In this regard, focusing on manufacturing and working with better-quality instruments represents a future direction to enhance the accuracy of the results. In addition, a robust eye tracking technology with advanced data analysis tools is needed to better understand eye movement patterns and improve the mobility and quality of life of elderly people. Table 8 shows a list of currently used eye tracking devices along with their characteristics, whereas Table 9 shows a list of eye tracking software currently used along with their characteristics and applications.

8.4. Other future directions of eye tracking

We identified other interesting areas where eye tracking is being used. Here, we provide some future eye tracking directions that may be interesting to researchers and the gaming, marketing, advertising, and VR industries.

Gaming: In the past decade, researchers on computer games have indicated that eye tracking provides better input control and task completion compared with traditional input devices (e.g., mouse, keyboard, and gamepad) in terms of action accuracy and responsiveness (Santana, 2018). Using eye tracking in games has several advantages. First, it lets game devices know where the player is visually focused. Second, it adds some reality to the player's interaction and provides the player with a richer experience and an ability to extend himself/herself into the game. Third, it identifies the player's strengths and weaknesses according to the player's fundamental skills and learns how to improve them. Finally, it allows the player's map to provide more awareness and enables vision-related skills to accumulate. As a future direction, we expect researchers to study how eye tracking can improve a game instead of replacing a traditional input. In addition, we plan to study the player's attention and integrate it with the core to enhance the games.

Marketing and Advertising: Recent research recommends that the ad industry will focus more on attention metrics in the future, and not on impressions, and the best way to track attention is using eye tracking technologies (Orquin & Wedel, 2020; Hang et al., 2018). Orquin and Wedel (Orquin & Wedel, 2020; Bulling & Wedel, 2019a, 2019b; Orquin & Loose, 2013) and Zuschke (2020) provided some other research directions: (1) The relationship between attention and consumer decision-making was studied. (2) The ways to improve eye-movement metrics that consider visual attention, pupil dilation, blinks, and facial expressions as a coordinating mechanism of the body, head, and eye movements were determined. (3) How social media and video marketing will change the potential role of eye tracking in the attribution of advertising was considered. (4) Privacy issues of consumer data were explored, such as the identity parameter, gaze point, age, gender, and duration of view. Finally, (5) how mobile devices will be used to track human eyes in a virtual environment for observing shopping behavior will be studied. Moreover, it is interesting to see whether the active role of visual attention that has been observed in a testing environment will occur similarly in a real-life environment. Consumers can obtain more information regarding the healthiness of a product using the nutrition label, and thus the design properties can increase the chance that consumers will visually pay attention to this label. Therefore, future research can analyze how different design strategies will affect the selection of healthy food.

Virtual Reality: The current evolution in hardware manufacturing, software development, end-user requirements and specifications, as well as research professionals, has allowed the interaction of VR and eye tracking technology to become an area of future research. Tracking eye movements with sufficient precision provides researchers with a significant amount of information on a subject's behavior in a virtual environment. In addition, it provides an opportunity to analyze a subject's behavior in relation to the objects the subject is looking at, as well as the location of these objects. We see a significant potential in combining VR with eye tracking in the coming years (Meißner et al., 2019; Clay et al., 2019).

9. Conclusion

Eye tracking has proven itself to be valuable in many research areas. This study discussed eye tracking techniques and applications with a focus on modern approaches such as ML, IoT, and cloud computing.

A classification of different ML algorithms that use eye tracking data, along with the research goal and the strengths and limitations of each algorithm, is presented. In addition, we summarize different lists that cover eye tracking applications and techniques along with the

Table 8
A list of eye tracking devices currently used along with their characteristics.

Devices | Synchronization with Head movements | Different options/comfortable | Head mount | Integrating with other products | Gaming | Mobile processing | Real-time Data Analysis | Multi-sensor integration | Support VR
EyeSeeCam ✓ ✓ ✓ ✓ ✓
Tobii Pro Glasses 2 ✓ ✓ ✓ ✓ ✓
iMotion ✓ ✓ ✓ ✓ ✓
LooxidVR ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
FOVE ✓ ✓ ✓ ✓ ✓ ✓ ✓
Varjo ✓ ✓ ✓ ✓ ✓ ✓ ✓
OmniView-TX™ ✓ ✓ ✓ ✓ ✓ ✓
OmniView-RD™ ✓ ✓ ✓ ✓ ✓ ✓ ✓
Eyegaze Edge® ✓ ✓ ✓ ✓ ✓
EyeTech ✓ ✓ ✓
TM5 mini ✓ ✓ ✓ ✓
Ergoneers ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
Blickshift ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓ ✓
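All of the devices in Table 8 ultimately emit streams of raw gaze samples; the "specialized fixation- or saccade-classification algorithms" mentioned in Sections 8.1 and 8.2 turn those samples into events. As a hedged illustration, the following is a minimal sketch of the standard dispersion-threshold (I-DT) approach, a textbook method and not code from this paper or from any listed device; the thresholds are illustrative.

```python
# Minimal I-DT (dispersion-threshold) fixation classification sketch.
# Assumptions (not from the paper): gaze samples are (x, y) tuples in
# degrees of visual angle, recorded at a fixed sampling rate.

def idt_fixations(samples, max_dispersion=1.0, min_duration_samples=6):
    """Return (start, end) index pairs of detected fixations.

    max_dispersion: dispersion threshold in the same units as the samples.
    min_duration_samples: minimum fixation length, e.g. 100 ms at 60 Hz.
    """
    def dispersion(window):
        xs = [p[0] for p in window]
        ys = [p[1] for p in window]
        return (max(xs) - min(xs)) + (max(ys) - min(ys))

    fixations = []
    i = 0
    n = len(samples)
    while i < n - min_duration_samples + 1:
        j = i + min_duration_samples
        if dispersion(samples[i:j]) <= max_dispersion:
            # Grow the window while dispersion stays under the threshold.
            while j < n and dispersion(samples[i:j + 1]) <= max_dispersion:
                j += 1
            fixations.append((i, j - 1))
            i = j
        else:
            i += 1  # Slide past the sample; it belongs to a saccade.
    return fixations

# Example: a short fixation followed by a saccade to a new location.
gaze = [(10.0, 10.0)] * 8 + [(25.0, 3.0)] * 8
print(idt_fixations(gaze))  # -> [(0, 7), (8, 15)]
```

Velocity-threshold (I-VT) classifiers follow the same idea but threshold sample-to-sample speed instead of window dispersion; either way, the chosen thresholds interact with the device's sampling rate and accuracy discussed above.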


algorithms and criteria used, diseases that can be identified using eye tracking technology, and the main findings. A comparison between recent research that uses eye tracking technology in HCI, eye tracking software, and devices that are currently applied along with their characteristics is given, with a further focus on helping researchers find a suitable technique for their specific application. Moreover, we present eye tracking applications such as healthcare, medicine, HCI, accessibility, education, e-learning, next-generation car assistants, driver drowsiness or fatigue, consumer research, and neuromarketing.

This study revealed that ML and IoT are important parts in evolving eye tracking applications owing to their ability to learn from existing data, make better decisions, offer flexibility of use, and eliminate the need for manually re-calibrating the tracker during the eye tracking process. It was also discovered that these eye tracking techniques achieve more accurate detection results than traditional event-detection methods. Furthermore, various motives and factors in the use of a specific eye tracking technique or application were explored and recommended. Finally, the study provides some limitations and future directions related to the use of eye tracking techniques in the development of several applications using ML, IoT, and other recent research areas such as gaming, marketing, advertising, and VR, as a key that might open new opportunities and spark new ideas for researchers in this area to move forward in the future.

Funding

This research was funded by the Faculty of Scientific Research and Graduate Studies, Yarmouk University, Jordan, grant number 2017/48.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Table 9
A list of eye tracking software currently used along with their characteristics and their applications.

EyeSeeCam — Hardware: goggles. Supported platforms: —. Applications: medical.

Tobii Pro Glasses 2 — Hardware: mobile portable device, such as a laptop. Supported platforms: Unity. Applications: professional performance, consumer behavior, assistive technology, and scientific research.

LooxidVR — Hardware: headset foam pads, mask, dry EEG electrodes, and built-in eye-tracking cameras. Supported platforms: Unity. Applications: researchers focusing on acquisition of synchronized eye and brain data for healthcare, education/training, marketing, and brain–computer interface research.

FOVE — Hardware: headset foam pads and mask. Supported platforms: Unity, Unreal, Xenko. Applications: the first VR headset with embedded eye tracking; gaming, education, healthcare and medical, movies, social communication, and development.

Varjo — Hardware: headset foam pads and mask. Supported platforms: Unreal, OpenXR, and Unity. Applications: design, training, simulation, and research.

Eyegaze Edge — Hardware: screen with infrared camera and sensors. Supported platforms: NA. Applications: enables locked-in users to communicate with the world via robust, accurate eye tracking devices and software.

TM5 mini — Hardware: screen with infrared camera and sensors. Supported platforms: NA. Applications: ALS, cerebral palsy, muscular dystrophy, spinal cord injury, traumatic brain injury, stroke, and spinal muscular atrophy.

Ergoneers — Hardware: goggles. Supported platforms: NA. Applications: market research, ergonomics, vehicle research, airplane research, usability, behavioral research, perception research, design clinics, sports and biomechanics research, medical research, real-time gaze control, behavioral studies in virtual reality, games testing (Maria et al., 2019), evaluation of virtual design concepts, pre-tests of new product ideas, tests of early interface designs, eye tracking integration in helmets, and design of individual head-mounted eye trackers.

Blickshift — Hardware: headset foam pads and mask. Supported platforms: NA. Applications: scientists, cognition scientists (MK, B, AT, & SA, 2017; Bueno, Sato, & Hornberger, 2019), automotive engineers, AI developers, usability experts, and market researchers.

References

Abdulin, E., Friedman, L., & Komogortsev, O. V. (2017). Method to detect eye position noise from video-oculography when detection of pupil or corneal reflection position fails.
Abraham, L., Urru, A., Wilk, M. P., Tedesco, S., Walsh, M., & O'Flynn, B. (2017). Point tracking with lensless smart sensors. 2017 IEEE SENSORS. Glasgow, UK: IEEE.
Ahmad, G. (2018). A study of eye tracking data based on multiple regression analysis. Journal of Next Generation Information Technology (JNIT), 9(1).
Ahuja, K., & Tuli, P. (2013). Object recognition by template matching using correlations and phase angle method. International Journal of Advanced Research in Computer and Communication Engineering, 2(3), 1368–1373.
Alemdag, E., & Cagiltay, K. (2018). A systematic review of eye tracking research on multimedia learning. Computers & Education, 125, 413–428.
Al-Moteri, M. O., Symmons, M., Plummer, V., & Cooper, S. (2017). Eye tracking to investigate cue processing in medical decision-making: A scoping review. Computers in Human Behavior, 66, 52–66.
Anjali, K. U., Thampi, A. K., Vijayaraman, A., Francis, M. F., James, N. J., & Rajan, B. K. (2016). Real-time nonintrusive monitoring and detection of eye blinking in view of accident prevention due to drowsiness. International Conference on Circuit, Power and Computing Technologies (ICCPCT). Nagercoil.
Anusha, A., & Ahmed, S. M. (2017). Vehicle tracking and monitoring system to enhance the safety and security driving using IoT. International Conference on Recent Trends in Electrical, Electronics and Computing Technologies. Ras Al Khaimah.
Ashraf, H., Sodergren, M. H., Merali, N., Mylonas, G., Singh, H., & Darzi, A. (2018). Eye-tracking technology in medical education: A systematic review. Medical Teacher, 40(1), 62–69.
Barr, D. J. (2008). Analyzing 'visual world' eyetracking data using multilevel logistic regression. Journal of Memory and Language, 59(4), 457–474.
Bazrafkan, S., Kar, A., & Costache, C. (2015). Eye gaze for consumer electronics: Controlling and commanding intelligent systems. IEEE Consumer Electronics Magazine, 4(4), 65–71.
Beach, P., & McConnel, J. (2019). Eye tracking methodology for studying teacher learning: A review of the research. International Journal of Research & Method in Education, 42(5), 485–501.
Bellet, M. E., Bellet, J., Nienborg, H., Hafed, Z. M., & Berens, P. (2019). Human-level saccade detection performance using deep neural networks. Journal of Neurophysiology, 121(2), 646–661.
Bengoechea, J. J., Villanueva, A., & Cabeza, R. (2012). Hybrid eye detection algorithm for outdoor environments. The 2012 ACM Conference on Ubiquitous Computing. New York.
Bhattarai, R., & Phothisonothai, M. (2019). Eye-tracking based visualizations and metrics analysis for individual eye movement patterns. 16th International Joint Conference on Computer Science and Software Engineering (JCSSE). Chonburi, Thailand.
Bissoli, A., Lavino-Junior, D., Sime, M. M., Encarnação, L. F., & Bastos, T. (2019). A human–machine interface based on eye tracking for controlling and monitoring a smart home using the Internet of Things. Sensors, 19(4), 859.


Blascheck, T., Burch, M., Raschke, M., & Weiskopf, D. (2015). Challenges and perspectives in big eye-movement data visual analytics. Big Data Visual Analytics (BDVA). Hobart, TAS, Australia.
Blascheck, T., Kurzhals, K., Raschke, M., Burch, M., Weiskopf, D., & Ertl, T. (2017). Visualization of eye tracking data: A taxonomy and survey. Computer Graphics Forum, 1–25.
Bobić, V., & Graovac, S. (2016). Development, implementation and evaluation of new eye tracking methodology. 2016 24th Telecommunications Forum (TELFOR). Belgrade.
Borys, M. (2014). Eye tracking in marketing research: A review of recent available literature. International Conference on Knowledge and Learning Management. Portorož, Slovenia.
Brousseau, B., Rose, J., & Eizenman, M. (2020). Hybrid eye-tracking on a smartphone with CNN feature extraction and an infrared 3D model. Sensors, 20(2), 543.
Brunyé, T. T., Drew, T., Weaver, D. L., & Elmore, J. G. (2019). A review of eye tracking for understanding and improving diagnostic interpretation. Cognitive Research: Principles and Implications, 4(7), 1–16.
Bueno, A., Sato, J., & Hornberger, M. (2019). Eye tracking – The overlooked method to measure cognition in neurodegeneration? Neuropsychologia, 133, Article 107191.
Bulling, A., & Gellersen, H. (2010). Toward mobile eye-based human-computer interaction. IEEE Pervasive Computing, 9(4), 8–12.
Bulling, A., & Wedel, M. (2019). Pervasive eye tracking for real-world consumer behavior analysis. In M. Schulte-Mecklenbeck et al. (Eds.), A Handbook of Process Tracing Methods (pp. 27–44). New York: Taylor and Francis Group.
Bulling, A., Ward, J. A., Gellersen, H., & Troster, G. (2011). Eye movement analysis for activity recognition using electrooculography. IEEE Transactions on Pattern Analysis and Machine Intelligence, 33(4), 741–753.
Bunz, M., & Janciute, L. (2018). Artificial Intelligence and the Internet of Things. University of Westminster Press.
Cabrera-Quirós, L. C. (2014). Eye gaze estimation using SVM for regression from face and eye detection results. Master Thesis. Cartago: Tecnológico de Costa Rica.
Cai, Y., Sharma, H., Chatelain, P., & Noble, J. A. (2018). SonoEyeNet: Standardized fetal ultrasound plane detection informed by eye tracking. IEEE 15th International Symposium on Biomedical Imaging (ISBI). Washington, DC, USA.
Catrysse, L., Gijbels, D., Donche, V., Maeyer, S. D., Lesterhuis, M., & Bossche, P. V. (2017). How are learning strategies reflected in the eyes? Combining results from self-reports and eye-tracking. British Journal of Educational Psychology. https://doi.org/10.1111/bjep.12181
Chandra, S., Sharma, G., Malhotra, S., Jha, D., & Mittal, A. P. (2015). Eye tracking based human computer interaction: Applications and their uses. International Conference on Man and Machine Interfacing (MAMI). Bhubaneshwar.
Chen, H., Lai, H., & Chiu, F. (2010). Eye tracking technology for learning and education. Journal of Research in Education Sciences, 55(4), 39–68.
Chen, Z., Fu, H., Lo, W.-L., & Chi, Z. (2018). Strabismus recognition using eye-tracking data and convolutional neural networks. Journal of Healthcare Engineering. https://doi.org/10.1155/2018/7692198
Chennamma, H., & Yuan, X. (2013). A survey on eye-gaze tracking techniques. Indian Journal of Computer Science and Engineering, 4(5), 388–393.
Clay, V., König, P., & König, S. (2019). Eye tracking in virtual reality. Journal of Eye Movement Research, 12(1), 1–18.
Cognolato, M., Atzori, M., & Müller, H. (2018). Head-mounted eye gaze tracking devices: An overview of modern devices and recent advances. Journal of Rehabilitation and Assistive Technologies Engineering, 5(1).
Cohn, J. F., Xiao, J., Moriyama, T., Ambadar, Z., & Kanade, T. (2003). Automatic recognition of eye blinking in spontaneously occurring behavior. Behavior Research Methods, Instruments, and Computers, 35(3), 1–39.
Colliot, T., & Jamet, É. (2018). Understanding the effects of a teacher video on learning from a multimedia document: An eye-tracking study. Educational Technology Research and Development, 66(6), 1415–1433.
Constable, P. A., Bach, M., Frishman, L. J., Jeffrey, B. G., Robson, A. G., & International Society for Clinical Electrophysiology of Vision (2017). ISCEV standard for clinical electro-oculography. Documenta Ophthalmologica, 134(1), 1–9.
Coutrot, A., Hsiao, J. H., & Chan, A. B. (2018). Scanpath modeling and classification with hidden Markov models. Behavior Research Methods, 50, 362–379.
Dagnew, G., & Kaur, G. (2016). Human computer interaction using eye movements for hand disabled people. International Journal of Engineering Trends and Technology, 33, 142–150.
Dalrymple, K. A., Jiang, M., Zhao, Q., & Elison, J. T. (2019). Machine learning accurately classifies age of toddlers based on eye tracking. Scientific Reports. https://doi.org/10.1038/s41598-019-42764-z
Dao, T.-C. (2014). Real-time eye tracking analysis from large scale dataset using cloud computing. Master Thesis. Joensuu, Finland: School of Computing, University of Eastern Finland.
Dasgupta, A., Gill, A. Q., & Hussain, F. (2019). Privacy of IoT-enabled smart home systems. In Internet of Things (IoT) for automated and smart applications (pp. 1–16).
Deepika, S. S., & Murugesan, G. (2015). A novel approach for human computer interface based on eye movements for disabled people. IEEE International Conference on Electrical, Computer and Communication Technologies (ICECCT). Mysore.
Dickson, B. (2016). How fog computing pushes IoT intelligence to the edge. TechTalks.
Dietz, M., Schork, D., & André, E. (2016). Exploring eye-tracking-based detection of visual search for elderly people. 12th International Conference on Intelligent Environments (IE). London.
Dietz, M., Schork, D., Damian, I., Steinert, A., Haesner, M., & André, E. (2017). Automatic detection of visual search for the elderly using eye and head tracking data. KI – Künstliche Intelligenz, 31(4), 339–348.
Ding, Z., Luo, J., & Deng, H. (2018). Accelerated exhaustive eye glints localization method for infrared video oculography. 33rd Annual ACM Symposium on Applied Computing. Pau.
Dong, Y., Zhang, Y., Yue, J., & Hu, Z. (2016). Comparison of random forest, random ferns and support vector machine for eye state classification. Multimedia Tools and Applications, 75(19), 11763–11783.
Drewes, H. (2010). Eye gaze tracking for human computer interaction.
Dybdal, M. L., Agustin, J. S., & Hansen, J. P. (2012). Gaze input for mobile devices by dwell and gestures. Proceedings of the Symposium on Eye Tracking Research and Applications, 1(1), 225–228.
Eibenberger, K., Eibenberger, B., & Rucci, M. (2016). Design, simulation and evaluation of uniform magnetic field systems for head-free eye movement recordings with scleral search coils. 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). Florida.
Eibenberger, K., Eibenberger, B., Roberts, D. C., Haslwanter, T., & Carey, J. P. (2015). A novel and inexpensive digital system for eye movement recordings using magnetic scleral search coils. Medical & Biological Engineering & Computing, 54(2–3), 421–430.
Eriksson, P. E., Swenberg, T., Zhao, X., & Eriksson, Y. (2018). How gaze time on screen impacts the efficacy of visual instructions. Heliyon, 4(6).
Finke, E. H., Wilkinson, K. M., & Hickerson, B. D. (2017). Social referencing gaze behavior during a videogame task: Eye tracking evidence from children with and without ASD. Journal of Autism and Developmental Disorders, 47(2), 415–423.
Franchak, J. M., Kretch, K. S., Soska, K. C., & Adolph, K. E. (2011). Head-mounted eye tracking: A new method to describe infant looking. Child Development, 82(6), 1738–1750.
Fuhl, W., Tonsen, M., Bulling, A., & Kasneci, E. (2016). Pupil detection for head-mounted eye tracking in the wild: An evaluation of the state of the art. Machine Vision and Applications, 27(8), 1275–1288.
Garcia-Zapirain, B., de la Torre Díez, I., & López-Coronado, M. (2017). Dual system for enhancing cognitive abilities of children with ADHD using leap motion and eye-tracking technologies. Journal of Medical Systems, 41(7), 111.
Gautam, G., Sumanth, G., Karthikeyan, K. C., Sundar, S., & Venkataraman, D. (2014). Eye movement based electronic wheel chair for physically challenged persons. International Journal of Scientific & Technology Research, 3(2), 206–212.
Gazepoint. (2020, June 16). Eye tracking applications: Distracted driving studies. Retrieved from https://www.gazept.com/blog/visual-tracking/eye-tracking-applications-distracted-driving-studies/.
George, A. (2012). Design and implementation of real-time algorithms for eye tracking and PERCLOS measurement for on board estimation of alertness of drivers. Master Thesis. Kharagpur: Department of Electrical Engineering, Indian Institute of Technology.
Ghude, P., Tembe, A., & Patil, S. (2014). Real-time eye tracking system for people with several disabilities using single web cam. International Journal of Computing and Technology, 1(2), 173–176.
Gill, P., & Vogelsang, T. (2016). Lensless smart sensors: Optical and thermal sensing for the Internet of Things. IEEE Symposium on VLSI Circuits (VLSI-Circuits). Honolulu.
Goni, S. E. (2004). Robust algorithm for pupil-glint vector detection in a video-oculography eye tracking system. 17th International Conference on Pattern Recognition. Cambridge.
Guarnera, D. T., Bryant, C. A., Mishra, A., Maletic, J. I., & Sharif, B. (2018). iTrace: Eye tracking infrastructure for development environments. The 2018 ACM Symposium on Eye Tracking Research & Applications. Warsaw.
Han, Y.-J., Kim, W., & Park, J.-S. (2018). Efficient eye-blinking detection on smartphones: A hybrid approach based on deep learning. Mobile Information Systems. https://doi.org/10.1155/2018/6929762
Hang, Y., Yi, X., & Xianglan, C. (2018). Eye-tracking studies in visual marketing: Review and prospects. Foreign Economics & Management, 40(12), 98–108.
Hansen, D., & Ji, Q. (2010). In the eye of the beholder: A survey of models for eyes and gaze. IEEE Transactions on Pattern Analysis and Machine Intelligence, 32(3), 478–500.
Harezlak, K., & Kasprowski, P. (2018). Application of eye tracking in medicine: A survey, research issues and challenges. Computerized Medical Imaging and Graphics, 65(1), 176–190.
Haritha, Z. S., Raveena, P. V., Arun, K. S., Nithya, P. N., Balan, M. V., & Krishnan, S. (2016). Eye tracking system using isophote eye center detection with blink perception. International Conference on Signal Processing, Communication, Power and Embedded System (SCOPES). Odisha.
Haslgrübler, M., Fritz, P., Gollan, B., & Ferscha, A. (2017). Getting through: Modality selection in a multi-sensor-actuator industrial IoT environment. The Seventh International Conference on the Internet of Things. Linz.
Hassoumi, A., Peysakhovich, V., & Hurter, C. (2019). Improving eye-tracking calibration accuracy using symbolic regression. PLoS ONE, 14(3). https://doi.org/10.1371/journal.pone.0213675
Haywood, C. (2019). iMotion eye-tracking glasses. (iMotion) Retrieved April 26, 2020, from https://imotions.com/biosensor/eye-tracking-glasses/.
He, H., She, Y., Xiahou, J., Yao, J., Li, J., Hong, Q., & Ji, Y. (2018). Real-time eye-gaze based interaction for human intention prediction and emotion analysis. Computer Graphics International. Bintan Island, Indonesia.
He, Q., Li, W., Fan, X., & Fei, Z. (2016). Evaluation of driver fatigue with multi-indicators based on artificial neural network. IET Intelligent Transport Systems, 10(8), 555–561.


Hickson, S., Dufour, N., Sud, A., Kwatra, V., & Essa, I. (2019). Eyemotion: Classifying facial expressions in VR using eye-tracking cameras. IEEE Winter Conference on Applications of Computer Vision (WACV). Colorado.
Higgins, E., Leinenger, M., & Rayner, K. (2014). Eye movements when viewing advertisements. Frontiers in Psychology, 5, 210. https://doi.org/10.3389/fpsyg.2014.00210
Hotrakool, W., Siritanawan, P., & Kondo, T. (2010). A real-time eye-tracking method using time-varying gradient orientation patterns. Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON) (pp. 492–496). Chiang Mai.
Hristozova, N., Ozimek, P., & Siebert, J. P. (2018). Efficient egocentric visual perception combining eye-tracking, a software retina and deep learning. EPIC Workshop at the European Conference on Computer Vision. Munich, Germany.
Hu, K., Gui, Z., Cheng, X., Qi, K., Zheng, J., You, L., & Wu, H. (2016). Content-based discovery for web map service using support vector machine and user relevance feedback. PLoS ONE. https://doi.org/10.1371/journal.pone.0166098
Huang, C. H., Luh, W. M., Sheu, C.-F., Tzeng, Y., & Chen, M. (2012). Analysis of eye movement data using mixed effects modeling with a Poisson regression model.
Hurtado, S., & Chiasson, S. (2016). An eye-tracking evaluation of driver distraction and unfamiliar road signs. 8th International Conference on Automotive User Interfaces and Interactive Vehicular Applications (pp. 153–160).
Ibrahim, F. N., Zin, Z. M., & Ibrahim, N. (2018). Eye center detection using combined Viola-Jones and neural network algorithms. International Symposium on Agent, Multi-Agent Systems and Robotics (ISAMSR). Putrajaya, Malaysia.
Inoue, A., & Paracha, S. (2016). Identifying reading disorders via eye-tracking. International Conference on Advanced Materials for Science and Engineering (ICAMSE). Tainan.
Jap, B. T., Lal, S., & Fischer, P. (2011). Comparing combinations of EEG activity in train drivers during monotonous driving. Expert Systems with Applications, 38(1), 996–1003.
Jiang, L., Xu, M., Liu, T., Qiao, M., & Wang, Z. (2018). DeepVS: A deep learning based video saliency prediction approach. The European Conference on Computer Vision (ECCV). Munich, Germany.
Kacete, A., Seguier, R., Royan, J., Collobert, M., & Soladie, C. (2016). Real-time eye pupil localization using Hough regression forest. IEEE Winter Conference on Applications of Computer Vision. Lake Placid, NY, United States.
Kangas, J., Špakov, O., Isokoski, P., Akkil, D., Rantala, J., & Raisamo, R. (2016). Feedback for smooth pursuit gaze tracking based control. The 7th Augmented Human International Conference. New York.
Kasneci, E., Black, A. A., & Wood, J. M. (2017). Eye-tracking as a tool to evaluate functional ability in everyday tasks in glaucoma. Journal of Ophthalmology, 2017(1), 1–10.
Keller, J., Krimly, A., Bauer, L., Schulenburg, S., Böhm, S., Aho-Özhan, H. E., & Abrahams, S. (2017). A first approach to a neuropsychological screening tool using eye-tracking for bedside cognitive testing based on the Edinburgh Cognitive and Behavioural ALS Screen. Amyotrophic Lateral Sclerosis and Frontotemporal Degeneration, 18(5–6), 443–450.
Khamis, M., Alt, F., & Bulling, A. (2018). The past, present, and future of gaze-enabled handheld mobile devices: Survey and lessons learned. The 20th International Conference on Human-Computer Interaction with Mobile Devices and Services. Barcelona, Spain.
Khamis, M., Hoesl, A., Klimczak, A., Reiss, M., Alt, F., & Bulling, A. (2017). EyeScout: Active eye tracking for position and movement independent gaze interaction with large public displays. 30th ACM User Interface Software and Technology Symposium (UIST). Quebec.
Khosravan, N., Celik, H., Turkbey, B., Jones, E. C., Wood, B., & Bagci, U. (2019). A collaborative computer aided diagnosis (C-CAD) system with eye-tracking, sparse attentional model, and deep learning. Medical Image Analysis, 51(1), 101–115.
Khushaba, R. N., Greenacre, L., Kodagoda, S., Louviere, J., Burke, S., & Dissanayake, G. (2012). Choice modeling and the brain: A study on the electroencephalogram (EEG) of preferences. Expert Systems with Applications, 39(16), 12378–12388.
Khushaba, R. N., Kodagoda, S., Lal, S., & Dissanayake, G. (2011). Driver drowsiness classification using fuzzy wavelet-packet-based feature-extraction algorithm. IEEE Transactions on Biomedical Engineering, 58(1), 121–131.
Khushaba, R. N., Wise, C., Kodagoda, S., Louviere, J., Kahn, B. E., & Townsend, C. (2013). Consumer neuroscience: Assessing the brain response to marketing stimuli using electroencephalogram (EEG) and eye tracking. Expert Systems with Applications, 40(9), 3803–3812.
Kiefer, P., Giannopoulos, I., Raubal, M., & Duchowski, A. (2017). Eye tracking for spatial research: Cognition, computation, challenges. Spatial Cognition & Computation, 17(1–2), 1–19. https://doi.org/10.1080/13875868.2016.1254634
Kit, D., & Sullivan, B. (2016). Classifying mobile eye tracking data with hidden Markov models. 18th International Conference on Human-Computer Interaction with Mobile Devices and Services Adjunct. New York, NY, United States.
Kitchenham, B. (2004). Procedures for performing systematic reviews. Keele University Technical Report TR/SE-0401.
Kitchenham, B., & Charters, S. (2007). Guidelines for performing systematic literature reviews in software engineering. Keele University and Durham University Joint Report.
Klaib, A. F., Alsrehin, N., Melhem, W., & Bashtawi, H. (2019). IoT smart home using eye tracking and voice interfaces for elderly and special needs people. Journal of Communications, 14(7), 1–8.
Koester, H. H., & Arthanat, S. (2018). Text entry rate of access interfaces used by people with physical disabilities: A systematic review. Assistive Technology, 30(3), 151–163.
Kong, Y., Lee, S., Lee, J., & Nam, Y. (2018). A head-mounted goggle-type video-oculography system for vestibular function testing. EURASIP Journal on Image and Video Processing, 2018(1), 1–10.
König, S. D., & Buffalo, E. A. (2014). A nonparametric method for detecting fixations and saccades using cluster analysis: Removing the need for arbitrary thresholds. Journal of Neuroscience Methods, 227, 121–131.
Krafka, K., Khosla, A., Kellnhofer, P., Kannan, H., Bhandarkar, S., Matusik, W., & Torralba, A. (2016). Eye tracking for everyone. The IEEE Conference on Computer Vision and Pattern Recognition (CVPR). Las Vegas, Nevada, United States.
Kurzhals, K., Fisher, B., Burch, M., & Weiskopf, D. (2016). Eye tracking evaluation of visual analytics. Information Visualization, 15(4), 340–358.
Lauermann, J. L., Treder, M., Heiduschka, P., Clemens, C. R., Eter, N., & Alten, F. (2017). Impact of eye-tracking technology on OCT-angiography imaging quality in age-related macular degeneration. Graefe's Archive for Clinical and Experimental Ophthalmology, 255(8), 1535–1542.
Le, A. S., Suzuki, T., & Aoki, H. (2020). Evaluating driver cognitive distraction by eye tracking: From simulator to driving. Transportation Research Interdisciplinary Perspectives, 4.
Lees, T., Chalmers, T., Burton, D., Zilberg, E., Penzel, T., Lal, S., & Lal, S. (2018). Electroencephalography as a predictor of self-report fatigue/sleepiness during monotonous driving in train drivers. Physiological Measurement.
Lemley, J., Kar, A., & Corcoran, P. (2018). Eye tracking in augmented spaces: A deep learning approach. IEEE Games, Entertainment, Media Conference (GEM). Galway, Ireland.
Li, B., Wang, Q., Barney, E., Urabain, I. S., Smith, T. J., & Shic, F. (2016). Modified DBSCAN algorithm on oculomotor fixation identification. The Ninth Biennial ACM Symposium on Eye Tracking Research & Applications. New York, NY, United States.
Li, P., Hou, X., Wei, L., Song, G., & Duan, X. (2018). Efficient and low-cost deep-learning based gaze estimator for surgical robot control. IEEE International Conference on Real-time Computing and Robotics (RCAR). Kandima, Maldives.
Li, S., Zhang, X., & Webb, J. D. (2017). 3-D-gaze-based robotic grasping through mimicking human visuomotor function for people with motion impairments. IEEE Transactions on Biomedical Engineering, 64(12), 2824–2835.
Li, X., Li, Z., & Qin, J. (2014). An improved gaze tracking technique based on eye model. 2014 33rd Chinese Control Conference (CCC). Nanjing.
Liang, Y., & Lee, J. D. (2007). Comparing support vector machines (SVMs) and Bayesian networks (BNs) in detecting driver cognitive distraction using eye movements.
Liang, Y., & Lee, J. D. (2007). Driver cognitive distraction detection using eye movements. In R. I. Hammoud (Ed.), Passive eye monitoring: Algorithms, applications and experiments (pp. 285–300).
Liebling, D. J., & Preibusch, S. (2014). Privacy considerations for a pervasive eye tracking world. International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct Publication. Seattle, Washington, USA.
Liu, C. H., Chang, P. Y., & Huang, C. Y. (2013). Using eye-tracking and support vector machine to measure learning attention in e-learning. Applied Mechanics and Materials: Information, Communication and Engineering, 311, 9–14.
Liu, H., & Liu, Q. (2010). Robust real-time eye detection and tracking for rotated facial images under complex conditions. Sixth International Conference on Natural Computation (ICNC). Yantai.
Liu, S. S., Rawicz, A., Ma, T., Zhang, C., Lin, K., Rezaei, S., & Wu, E. (2018). An eye-gaze tracking and human computer interface system for people with ALS and other locked-in diseases. CMBES Proceedings, 33(1), 1.
Lopez-Basterretxea, A., Mendez-Zorrilla, A., & Garcia-Zapirain, B. (2015). Eye/head tracking technology to improve HCI with iPad applications. Sensors, 15(2), 2244–2264.
Lou, Y., Liu, Y., Kaakinen, J. K., & Li, X. (2017). Using support vector machines to identify literacy skills: Evidence from eye movements. Behavior Research Methods, 49, 887–895.
Lozano, I., Campos, R., & Belinchón, M. (2018). Eye-tracking measures in audiovisual stimuli in infants at high genetic risk for ASD: Challenging issues. The 2018 ACM Symposium (p. 73). Carlsbad.
Lukander, K. (2016). A short review and primer on eye tracking in human computer interaction applications.
Lupu, R., & Ungureanu, F. (2013). A survey of eye tracking methods and applications. Buletinul Institutului Politehnic din Iasi, Automatic Control and Computer Science Section, 3(1), 72–86.
Lupu, R., Bozomitu, R., & Cehan, V. (2014). Detection of gaze direction by using improved eye-tracking technique. 2014 37th ISSE International Spring Seminar on Electronics Technology. Dresden.
Majaranta, P., & Bulling, A. (2014). Eye tracking and eye-based human-computer interaction. Advances in Physiological Computing, 39–65. https://doi.org/10.1007/978-1-4471-6392-3_3
Malakhova, E. Y., Shelepin, E. Y., & Malashin, R. O. (2018). Temporal data processing from webcam eye tracking using artificial neural networks. Journal of Optical Technology, 85(3), 186–188.
Marandi, R. Z., & Gazerani, P. (2019). Aging and eye tracking: In the quest for objective biomarkers. Future Neurology, 14(4).
Marcos-Ramiro, A. (2014). Automatic blinking detection towards stress discovery. 16th ACM International Conference on Multimodal Interaction. Istanbul, Turkey.
Maria, C., Krystle-Lee, T., Tasbire, S., Steven, C., Marion, G. E., Rob, M., & Vasyl, K. (2019). Eye tracking the feedback assigned to undergraduate students in a digital assessment game. Frontiers in Psychology, 10.
Markuš, N., Frljak, M., Pandžić, I. S., Ahlberg, J., & Forchheimer, R. (2014). Eye pupil localization with an ensemble of randomized trees. Pattern Recognition, 47(2), 578–587.


Martinikorena, I., Cabeza, R., Villanueva, A., Urtasun, I., & Larumbe, A. (2018). Fast and robust ellipse detection algorithm for head-mounted eye tracking systems. Machine Vision and Applications, 29(1), 845–860.
McCamy, M. B., Otero-Millan, J., Leigh, R. J., King, S. A., Schneider, R. M., Macknik, S. L., & Martinez-Conde, S. (2015). Simultaneous recordings of human microsaccades and drifts with a contemporary video eye tracker and the search coil technique. PLoS ONE, 10(6), Article e0128428.
Meena, Y. K., Chowdhury, A., Cecotti, H., Wong-Lin, K., Nishad, S. S., Dutta, A., & Prasad, G. (2016). EmoHex: An eye tracker based mobility and hand exoskeleton device for assisting disabled people. IEEE International Conference on Systems, Man, and Cybernetics (SMC). Budapest.
Meißner, M., Oppewal, H., & Huber, J. (2020). Surprising adaptivity to set size changes in multi-attribute repeated choice tasks. Journal of Business Research, 163–175.
Meißner, M., Pfeiffer, J., Pfeiffer, T., & Oppewal, H. (2019). Combining virtual reality and mobile eye tracking to provide a naturalistic experimental environment for shopper research. Journal of Business Research, 100, 445–458.
Meng-Lung Lai, Tsai, M.-J., Yang, F.-Y., Hsu, C.-Y., Liu, T.-C., Lee, S. W.-Y., … Tsai, C.-C. (2013). A review of using eye-tracking technology in exploring learning from 2000 to 2012. Educational Research Review, 10, 90–115.
Mestre, C., Gautier, J., & Pujol, J. (2018). Robust eye tracking based on multiple corneal reflections for clinical applications. Journal of Biomedical Optics, 23(3), 1–9.
MetricHQ. (2020, August 31). Retrieved from https://www.klipfolio.com/metrics/marketing/ad-clicks.
Migliaccio, A. A., MacDougall, H. G., Minor, L. B., & Della Santina, C. C. (2005).
Park, S. J., Subramaniyam, M., Hong, S., Kim, D., & Yu, J. (2017). Conceptual design of the elderly healthcare services in-vehicle using IoT. WCX™ 17: SAE World Congress Experience. Detroit.
Păsărică, A., Bozomitu, R. G., Cehan, V., Lupu, R. G., & Rotariu, C. (2015). Pupil detection algorithms for eye tracking applications. 2015 IEEE 21st International Symposium for Design and Technology in Electronic Packaging (SIITME). Cluj-Napoca.
Perera, C., Qin, Y., Estrella, J. C., Reiff-Marganiec, S., & Vasilakos, A. V. (2017). Fog computing for sustainable smart cities: A survey. ACM Computing Surveys, 50(3), Article 32.
Pian, W., Khoo, C. S., & Chi, J. (2017). Automatic classification of users' health information need context: Logistic regression analysis of mouse-click and eye-tracker data. Journal of Medical Internet Research, 19(12).
Picanço, C. R., & Tonneau, F. (2018). A low-cost platform for eye-tracking research: Using Pupil in behavior analysis. Journal of the Experimental Analysis of Behavior, 110(2), 157–170.
Poole, A., & Ball, L. J. (2006). Eye tracking in human-computer interaction and usability research: Current status and future prospects. In Encyclopedia of Human Computer Interaction. Idea Group Reference.
Rasch, C., Louviere, J. J., & Teichert, T. (2015). Using facial EMG and eye tracking to study integral affect in discrete choice experiments. Journal of Choice Modelling, 14, 32–47.
Rasmussen, M. H., & Tan, Z.-H. (2013). Fusing eye-gaze and speech recognition for tracking in an automatic reading tutor: A step in the right direction? Grenoble, France: Speech
Inexpensive system for real-time 3-dimensional video-oculography using a and Language Technology in Education.
fluorescent marker array. Journal of Neuroscience Methods, 143(2), 141–150. Rello, L., & Ballesteros, M. (2015). Detecting readers with dyslexia using machine
Minelli, R., Mocci, A., & Lanza, M. (2015). I know what you did last summer: an learning with eye tracking measures. 12th Web for All. Florence, Italy.
investigation of how developers spend their time. IEEE 23rd International Robinson, D. (1963). A method of measuring eye movemnent using a scieral search coil
Conference on Program Comprehension. Florence, Italy. in a magnetic field. Bio-medical Electronics, 48(3), 137–145.
Missouri-Columbia, U. o. (2019, September 24). Seeing is believing: Eye-tracking Rosch, J. L., & Vogel-Walcutt, J. J. (2013). A review of eye-tracking applications as tools
technology could help make driving safer Scientists take a new look at the for training. Cognition Technology and Work, 15(3), Article 313327.
importance of keeping your eyes on the road. Retrieved December 12, 2019, from Rost, M., Zilberg, E., Xu, Z. M., Feng, Y., Burton, D., & Lal, S. (2015). Comparing
https://www.sciencedaily.com/releases/2019/09/190924133248.htm. contribution of algorithm based physiological indicators for characterisation of
MK, E., B, G.-C., AT, M. S., & SA, B. (2017). Beyond eye gaze: What else can eyetracking driver drowsiness. Journal of Medical and Bioengineering.
reveal about cognition and cognitive development? Developmental Cognitive Rupanagudi, S. R., Bhat, V. G., Ranjani, B. S., Srisai, A., Gurikar, S. K., Pranay, M. R., &
Neuroscience, 69–91. Chandana, S. (2018). A simplified approach to assist motor neuron disease patients
Mohan, P., Goh, W. B., Fu, C.-W., & Yeung, S.-K. (2018). DualGaze: Addressing the Midas to communicate through video oculography. International Conference on
Touch Problem in Gaze Mediated VR Interaction. IEEE International Symposium on Communication information and Computing Technology (ICCICT). Mumbai.
Mixed and Augmented Reality Adjunct (ISMAR-Adjunct). Munich, Germany, Said, S., AlKork, S., Beyrouthy, T., Hassan, M., Abdellatif, O., & Abdraboo, M. F. (2018).
Germany. Real time eye tracking and detection-a driving assistance system. Advances in Science,
Moreno-Esteva, E. G., White, S. L., Wood, J. M., & Black, A. (2018). Application of Technology and Engineering Systems Journal, 3(6), 446–454.
mathematical and machine learning techniques to analyse eye tracking data Sakatani, T., & Isa, T. (2004). PC-based high-speed video-oculography for measuring
enabling better understanding of children’s visual cognitive behaviours. Frontline rapid eye movements in mice. Neuroscience Research, 49(1), 123–131.
Learning Research, 6(3), 72–84. Salminen, J., Nagpal, M., Kwak, H., An, J., Jung, S.-g., & Jansen, B. J. (2019). Confusion
Morozkin, P., Swynghedauw, M., & Trocan, M. (2017). Neural network based eye Prediction from Eye-Tracking Data: Experiments with Machine Learning. 9th
tracking. International Conference on Computational Collective Intelligence. Cyprus. International Conference on Information Systems and Technologies, (pp. 1–9). Cairo
Muñoz-Leiva, F., Hernández-Méndez, J., & Gómez-Carmona, D. (2019). Measuring Egypt.
advertising effectiveness in Travel 2.0 websites through eye-tracking technology. Santana, J. A. (2018). A study on the use of eye tracking to adapt gameplay and
Physiology and Behavior, 200, 83–95. procedural content generation in first-person shooter games. Multimodal Technologies
Murauer, M., Haslgrübler, M., & Ferscha, A. (2017). Natural pursuit calibration: using and Interact, 23(2), 1–20.
motion trajectories for unobtrusive calibration of mobile eye trackers. The Seventh Santini, T., Fuhl, W., Geisler, D., & Kasneci, E. (2017). EyeRecToo: Open-source Software
International Conference on the Internet of Things. Linz. for Real-time Pervasive Head-mounted Eye Tracking. International Conference on
Murphy, P. J., Duncan, A. L., Glennie, A. J., & Knox, P. C. (2001). The effect of scleral Computer Vision Theory and Applications. Porto.
search coil lens wear on the eye. British Journal of Ophthalmology, 85(3), 332–335. Sarkar, A., Sanyal, G., & Majumder, S. (2017). Performance evaluation of an eye tracking
Naha, R. K., Garg, S., Georgakopoulos, D., Jayaraman, P. P., & Gao, L. (2018). Fog system under varying conditions. International Journal of Computer Science and
computing: Survey of trends, architectures, requirements, and research directions. Network Security (IJCSNS), 17(4), 182.
IEEE Access, 6, 47980–48009. Scott, N., Zhang, R., Le, D., & Moyle, B. (2017). A review of eye-tracking research in
Najar, A. S., Mitrovic, A., & Neshatian, K. (2014). Utilizing eye tracking to improve tourism. Current Issues in Tourism, 22(10), 1244–1261.
learning from examples. Lecture Notes in Computer Science, 8514, 410–418. Sesma-Sanchez, L., & Hansen, D. W. (2018). Binocular model-based gaze estimation with
Neshov, N., & Manolova, A. (2017). Drowsiness monitoring in real-time based on a camera and a single infrared light source. The 2018 ACM Symposium (p. 47). New
supervised descent method. 2017 9th IEEE International Conference on Intelligent York: In Proceedings of the 2018 ACM Symposium on eye tracking Research &
Data Acquisition and Advanced Computing Systems: Technology and Applications Applications.
(IDAACS). Bucharest. Sharafia, Z., Soha, Z., & Gúeh́eneuc, Y.-G. (2015). A systematic literature review on the
Nguyen, T. P., Chew, M. T., & Demidenko, S. (2015). Eye Tracking System to Detect usage of eye-tracking in software engineering. Journal of Information and Software
Driver Drowsiness. 6th International Conference on Automation, Robotics and Technology.
Applications, (pp. 472-477). Queenstown, New Zealand. Shimata, R., Mitani, Y., & Ochiai, T. (2015). A study of pupil detection and tracking by
Noland, R. B., Weiner, M. D., Gao, D., Cook, M. P., & Nelessen, A. (2017). Eye-tracking image processing techniques for a human eye–computer interaction system. 16th
technology, visual preference surveys, and urban design: preliminary evidence of an International Conference on Software Engineering, Artificial Intelligence,
effective methodology. Journal of Urbanism: International Research on Placemaking Networking and Parallel/Distributed Computing (SNPD). Takamatsu, Japan.
and Urban Sustainability, 10(1), Journal. Shokishalov, Z., & Wang, H. (2019). Applying eye tracking in information security.
Obaidellah, U., Al Haek, M., & Cheng, P. C. (2018). A survey on the usage of eye-tracking Procedia Computer Science, 150, 347–351.
in computer programming. ACM Computing Surveys, 51(1), 1–58. Singh, C., Mittal, N., & Walia, E. (2011). Face recognition using Zernike and complex
Orquin, J. L., & Loose, S. M. (2013). Attention and choice: A review on eye movements in Zernike moment features. Pattern Recognition and Image Analysis, 21(1), 71–81.
decision making. Acta Psychologica, 190–206. Singh, H., & Singh, J. (2012). Human eye tracking and related issues: A review.
Orquin, J. L., & Wedel, M. (2020). Contributions to attention based marketing: International Journal of Scientific and Research Publications, 2(9).
Foundations, insights, and challenges. Journal of Business Research, 111, 85–90. Singh, H., Bhatia, J., & Kaur, J. (2011). Eye tracking based driver fatigue monitoring and
Paraskevoudi, N., & Pezaris, J. S. (2018). Eye movement compensation and spatial warning system. India International Conference on Power Electronics (IICPE). New
updating in visual prosthetics: Mechanisms, limitations and future directions. Delhi, India.
Frontiers in Systems Neuroscience, 73(12). Sommer, D., Golz, M., Schnupp, T., Krajewski, J., Trutschel, U., & Edwards, D. (2009). A
Parikh, S. (2018). Eye Gaze Feature Classification for Predicting Levels of Learning. measure of strong driver fatigue. Fifth International Driving Symposium on Human
International Conference on Emerging eLearning Technologies and Applications. Factors in Driver Assessment, Training and Vehicle Design. Big Sky MT, United
High Tatras, Slovakia. States.
Park, J., Jung, T., & Yim, K. (2015). Implementation of an eye gaze tracking system for Sorate, P. P., Vpkbiet, B., & Chhajed, G. J. (2017). Survey paper on eye gaze tracking
the disabled people. 29th International Conference on Advanced Information methods and techniques. International Research Journal of Engineering and Technology,
Networking and Applications. Gwangju. 4(6), 465–468.

23
A.F. Klaib et al. Expert Systems With Applications 166 (2021) 114037

Sorate, p., & Chhajed, G. J. (2017). Survey paper on eye gaze tracking methods and Wang, K., & Ji, Q. (2017). Real time eye gaze tracking with 3d deformable eye-face
techniques. International Research Journal of Engineering and Technology, 4(6), model. IEEE International Conference on Computer Vision (ICCV). Venice.
5612–5616. Wang, X., Cai, B., Cao, Y., Zhou, C., Yang, L., Liu, R., … Bao, B. (2016). Objective method
Stawicki, P., Gembler, F., Rezeika, A., & Volosyak, I. (2017). A novel hybrid mental for evaluating orthodontic treatment from the lay perspective: An eye-tracking
spelling application based on eye tracking and SSVEP-based BCI. Brain Sciences, 7(4), study. American Journal of Orthodontics and Dentofacial Orthopedics, 150(4), 601–610.
35. Wang, X., Thome, N., & Cord, M. (2016). Gaze latent support vector machine for image
Stember, J. N., Celik, H., Krupinski, E., Chang, P. D., Mutasa, S., Wood, B. J., … Bagci, U. classification. IEEE International Conference on Image Processing (ICIP). Phoenix,
(2019). Eye tracking for deep learning segmentation using convolutional neural AZ, USA.
networks. Journal of Digital Imaging, 32, 597–604. https://doi.org/10.1007/s10278- Wang, X., Thome, N., & Corda, M. (2017). Gaze latent support vector machine for image
019-00220-4 classification improved by weakly supervised region selection. Pattern Recognition,
Sun, Y., Li, Q., Zhang, H., & Zou, J. (2017). The application of eye tracking in education. 72, 59–71.
Advances in Intelligent Information Hiding and Multimedia Signal Processing – Smart Wedel, M. (2013). Attention Research in Marketing: A Review of Eye Tracking Studies.
Innovation, Systems and Technologies, 82, 27–33. Robert H. Smith School Research Paper No. Available at SSRN: https://ssrn.com/
Taher, F. B., Amor, N. B., & Jallouli, M. (2015). A multimodal wheelchair control system abstract=2460289 or http://dx.doi.org/10.2139/ssrn.2460289, RHS 2460289.
based on EEG signals and eye tracking fusion. International Symposium on Wedel, M., Pieters, R., & Lans, R.v. (2019). Eye tracking methodology for research in
Innovations in Intelligent SysTems and Applications (INISTA). Madrid. consumer psychology. In P. M. In, & Frank R. Kardes (Eds.), Handbook of research
Thapaliya, S., Jayarathna, S., & Jaime, M. (2018). Evaluating the EEG and Eye methods in consumer psychology (pp. 276–292). Taylor & Francis.
Movements for Autism Spectrum Disorder. EEE International Conference on Big Wilson, P. I., & Fernandez, J. (2006). Facial feature detection using Haar classifiers.
Data. Seattle, WA, USA. Journal of Computing Sciences in Colleges, 21(4), 127–133.
Thevenot, J., López, M. B., & Hadid, A. (2018). A survey on computer vision for assistive Wu, T., Wang, P., Lin, Y., & Zhou, C. (2017). A robust noninvasive eye control approach
medical diagnosis from faces. IEEE Journal of Biomedical and Health Informatics, 22 for disabled people based on Kinect 2.0 sensor. IEEE Sensors Letters, 1(4), 1–4.
(5), 1497–1511. Wu, Y., Liu, Z., Jia, M., Tran, C. C., & Yan, S. (2019). Using Artificial Neural Networks for
Tong, S., & Chang, E. (2001). Support vector machine active learning for image retrieval. Predicting Mental Workload in Nuclear Power Plants Based on Eye Tracking. Nuclear
9th ACM International Conference on Multimedia. Ottawa. Technology, 206, 94–106.
Tsui, C. S., Jia, P., Gan, J. Q., Hu, H., & Yuan, K. (2007). EMG-based hands-free Xinfang, D., Xinxin, Y., Ruic, Z., Cheng, B., Dai, L., & Guizhong, Y. (2019). Classifying
wheelchair control with EOG attention shift detection. IEEE International major depression patients and healthy controls using EEG, eye tracking and galvanic
Conference on Robotics and Biomimetics (ROBIO). Sanya. skin response data. Journal of Affective Disorders, 251, 156–161.
Tzafilkou, K., & Protogeros, N. (2017). Diagnosing user perception and acceptance using Xu, J., Min, J., & Hu, J. (2018). Real-time eye tracking for the assessment of driver
eye tracking in web-based end-user development. Computers in Human Behavior, 72 fatigue. Healthc Technol Letters, 5(2), 54–58.
(1), 23–37. Yan, J. J., Kuo, H. H., Lin, Y. F., & Liao, T. L. (2016). Real-time driver drowsiness
Uggeldahl, K., Jacobsen, C., Lundhede, T. H., & BøyeOlsen, S. (2016). Choice certainty in detection system based on PERCLOS and grayscale image processing. International
Discrete Choice Experiments: Will eye tracking provide useful measures? Journal of Symposium on Computer, Consumer and Control (IS3C). xian.
Choice Modelling, 20, 35–48. Yegoryan, N., Guhl, D., & Klapper, D. (2020). Inferring attribute non-attendance using
Ulutas, B. H., Özkan, N. F., & Michalski, R. (2020). Application of hidden Markov models eye tracking in choice-based conjoint analysis. Journal of Business Research, 111,
to eye tracking data analysis of visual quality inspection operations. Central European 290–304.
Journal of Operations Research, 28, 761–777. Yin, Y., Juan, C., Chakraborty, J., & McGuire, M. P. (2018). Classification of Eye Tracking
Utaminingrum, F., Fauzi, M. A., Sari, Y. A., Primaswara, R., & Adinugroho, S. (2016). Eye Data Using a Convolutional Neural Network. 17th IEEE International Conference on
Movement as Navigator for Disabled Person. The 2016 International Conference on Machine Learning and Applications (ICMLA). Orlando, FL, USA.
Communication and Information Systems . Bangkok. Yoo, S., Yeon, H., & Jeong, S. (2016). Personalized control system using eye tracking and
Van Renswoude, D. R., Raijmakers, M. E., Koornneef, A., Johnson, S. P., & Hunnius, S. SVM classifier. Jeongseon, Gangwon, Korea: Human Computer Interaction Korea.
(2018). Gazepath: An eye-tracking analysis tool that accounts for individual Zemblys, R. (2016). Eye-movement event detection meets machine learning. Biomedical
differences and data quality. Behavior Research Methods, 50(2), 834–852. Engineering.
Verma, S. P. (2011). An eye-tracking based wireless control system. The 26th Zemblys, R., Niehorster, D. C., Komogortsev, O., & Holmqvist, K. (2018). Using machine
International Conference on CAD/CAM, Robotics and Factories of the Future. Kuala learning to detect events in eye-tracking data. Behavioural Research, 50, 160–181.
Lumpur, Malaysia. Zhang, G., & Hansen, J. P. (2019). Accessible control of telepresence robots based on eye
Villamor, M., & Rodrigo, M. M. (2018). Predicting Successful Collaboration in a Pair tracking. Denver, Colorado: Eye Tracking Research and Applications.
Programming Eye Tracking Experiment. The 26th Conference on User Modeling, Zhang, X., Kulkarni, H., & Morris, M. R. (2017). Smartphone-based gaze gesture
Adaptation and Personalization (UMAP), (pp. 263–268). Singapore. communication for people with motor disabilities. In Proceedings of the 2017 CHI
Vincent, S. J., Alonso-Caneiro, D., & Collins, M. J. (2017). Evidence on scleral contact Conference on Human Factors in Computing Systems. New York.
lenses and intraocular pressure. Clinical and Experimental Optometry, 100(1), 87–88. Zohreh Sharafi, Z. S.-G. (2015). A systematic literature review on the usage of
Vrzakova, H., & Bednarik, R. (2013). EyeCloud: Cloud Computing for Pervasive Eye- eyetracking in software engineering. Information and Software Technology, 67,
Tracking. 3rd International Workshop on Pervasive Eye Tracking and Mobile Eye- 79–107.
Based Interaction. Lund, Sweden. Zuschke, N. (2020). An analysis of process-tracing research on consumer decision-
making. Journal of Business Research, 111, 305–320.