


Education Sciences
Article
Using Synchronized Eye Movements to Predict Attention in
Online Video Learning
Caizhen Su 1, Xingyu Liu 1, Xinru Gan 1 and Hang Zeng 2,*

1 Faculty of Education, Beijing Normal University, Beijing 100875, China


2 Center for Educational Science and Technology, Beijing Normal University at Zhuhai, Zhuhai 519087, China
* Correspondence: [email protected] or [email protected]

Abstract: Concerns persist about attentional engagement in online learning. The inter-subject
correlation of eye movements (ISC) has shown promise as an accessible and effective method for
attention assessment in online learning. This study extends previous studies investigating ISC of eye
movements in online learning by addressing two research questions. Firstly, can ISC predict students’
attentional states at a finer level beyond a simple dichotomy of attention states (e.g., attending and
distracted states)? Secondly, do learners’ learning styles affect ISC’s prediction rate of attention
assessment in video learning? Previous studies have shown that learners of different learning styles
have different eye movement patterns when viewing static materials. However, limited research
has explored the impact of learning styles on viewing patterns in video learning. An eye tracking
experiment with participants watching lecture videos demonstrated a connection between ISC and
self-reported attention states at a finer level. We also demonstrated that learning styles did not
significantly affect ISC’s prediction rate of attention assessment in video learning, suggesting that
ISC of eye movements can be effectively used without considering learners’ learning styles. These
findings contribute to the ongoing discourse on optimizing attention assessment in the evolving
landscape of online education.

Keywords: eye tracking; intersubject correlation; online learning; learning styles; attentional assessment

Citation: Su, C.; Liu, X.; Gan, X.; Zeng, H. Using Synchronized Eye Movements to Predict Attention in Online Video Learning. Educ. Sci. 2024, 14, 548. https://doi.org/10.3390/educsci14050548

Academic Editor: Luca Tateo

Received: 1 April 2024; Revised: 16 May 2024; Accepted: 16 May 2024; Published: 19 May 2024

Copyright: © 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

1. Introduction
1.1. Online Learning and Attention Assessment

Online learning, or e-learning, has revolutionized education by increasing flexibility, accessibility, and inclusivity. The pervasive influence of technology has significantly contributed to the widespread adoption of online learning, especially during the COVID-19 pandemic when educational institutions quickly transitioned to digital platforms [1]. Despite its numerous advantages, concerns have been raised regarding attentional engagement in online learning.

Attentional engagement refers to directing and maintaining attention on a task, stimulus, or activity, involving cognitive resource allocation and sustaining focus over time [2]. Attentional engagement, or the lack of it, has been shown to be highly correlated with learning performance [3–5]. In traditional classroom settings, teachers can monitor students’ attention in real time and make timely adjustments to their pedagogical strategies to attract students’ attention. However, in the context of online learning, especially asynchronous online learning (e.g., MOOCs and recorded lectures), educators lack the immediate feedback needed to gauge and adapt to students’ attention levels. Consequently, there is a pressing need for a real-time and effective attention assessment method to address this pedagogical gap. Researchers have used various methods to tackle this challenge. For example, some researchers used wearable devices such as headsets to detect attention-related brain signals [6–8] or wrist devices to assess students’ attention levels via photoplethysmogram

Educ. Sci. 2024, 14, 548. https://doi.org/10.3390/educsci14050548 https://www.mdpi.com/journal/education



signals [9]. Other researchers developed attention-monitoring algorithms based on multi-


modal data including facial expressions, eye movements, and behaviors [10,11]. Notably,
recent advancements by Madsen and colleagues have demonstrated that the inter-subject
correlation of eye movements (ISC), specifically synchronized eye movements across sub-
jects, tracked through webcams, can effectively predict attentional levels in the context
of video learning [2]. This method is particularly promising as it obviates the need for
additional specialized devices, ensuring ease of computation and rendering ISC of eye
movement a viable solution to the attention assessment predicament in online learning.

1.2. Eye Tracking and Synchronized Eye Movements


Eye tracking technology, which captures information through the tracking of eye
movement trajectories and pupil sizes, serves as a valuable tool for revealing cognitive
and perceptual abilities during information processing. The application of eye tracking
technology in online learning has garnered increasing attention from researchers in recent
decades, with a notable focus on attention assessment (for reviews, refs. [12,13]). Attention,
a critical component in the learning process, is intricately linked to eye movements, gaze
direction, and visual fixation [14–16]. Previous studies typically examined the directions
in which learners gaze at learning materials or teachers’ instructions, as well as the time
they dwell on learning materials. However, this method necessitates meticulous content
analysis and is not easily applicable for routine evaluation of individual students. A more
efficient and convenient index is needed.
Building on the observed high correlation of eye movements across subjects dur-
ing video presentations [17,18], Madsen et al. [2] hypothesized that online instructional
videos synchronize eye movements across students, but that the level of synchrony depends on
whether students are paying attention. Using a research-grade eye tracker and standard
webcam, they successfully demonstrated that participants in an attentive condition exhib-
ited significantly higher ISC of eye movements compared to those in a distracted condition.
They also found that ISC of eye movements effectively predicted students’ performance in
exams. Liu et al. [19] replicated the results.
While the studies by Madsen et al. [2] and Liu et al. [19] focused on the binary
states of attention—attending and distracted—it is acknowledged that attention operates
on a continuum, fluctuating between optimal and suboptimal states from moment to
moment [20]. The question of whether ISC of eye movements can reliably predict learners’
attention at a more nuanced level remains unexplored.

1.3. Learning Styles and Eye Movements


Although ISC of eye movements serves as a valid indicator of attentional engagement,
many factors may influence the synchronization of eye movements across subjects. For
example, individual differences in learning styles present a pertinent consideration, as
elucidated by the well-known Felder-Silverman learning style model [21]. This model considers learning in structured education as a two-step process involving the reception
and processing of information. The learning style model classifies students according to
where they fit on several scales describing the ways they receive and process information. Dimensions of learning styles include Sensing/Intuitive, Visual/Verbal, Active/Reflective, and Sequential/Global.
• Sensing/Intuitive continuum refers to the way individuals prefer to take in informa-
tion. Sensing learners prefer concrete and practical information, relying on facts and
details. Intuitive learners prefer conceptual and innovative information, concerned
with theories and meanings.
• Visual/Verbal continuum describes how learners prefer to receive information. Visual
learners learn best through visual aids such as diagrams, charts, and graphs. Verbal
learners prefer written and spoken explanations.

• Active/Reflective continuum refers to the way individuals prefer to process infor-


mation. Active learners prefer to learn through doing or discussing, while reflective
learners prefer to think about and consider information before acting on it.
• Sequential/Global continuum determines how learners prefer to organize and progress
toward understanding information. Sequential learners prefer organized and linear
presentations, while global learners prefer a broader context and see the overall
structure before focusing on details.
Research has consistently demonstrated that learners with distinct learning styles
(or cognitive styles) exhibit divergent viewing patterns when engaging with learning materials, as summarized in Nugrahaningsih et al. [22]. For example, Al-Wabil et al. [23] observed that visual learners, classified by Felder and Silverman’s learning style model, focused more on the multimedia areas, while verbal learners focused more on the text areas. Mehigan et al. [24] also showed that visual learners made more fixations on the graphic slide area than verbal learners (no statistical significance test was conducted). Luo [25] tracked learners’ eye movements while they studied different learning materials and showed that (1) compared with active learners, reflective learners spent more time on example areas; (2) compared with intuitive learners, sensing learners spent more time on reading; (3) visual learners mainly fixated on images, while verbal learners fixated on words more often; and (4) sequential learners skipped fewer learning objects and spent less time on the navigation pane.
However, it is noteworthy that most prior studies primarily focused on static learning
materials, such as slides or web pages, leaving a notable gap in our understanding of how
learning styles impact viewing patterns in video learning. Only a limited number of studies
have utilized videos as stimuli for eye-tracking experiments [26,27]. For example, Cao and
Nishihara [26] conducted an eye-tracking experiment using slide videos. Results showed
that the mean viewing time for visual group learners in picture parts was longer than the
intermediate group, but the difference in the mean viewing time on each slide was not
significant. Importantly, the mean viewing time for text parts on each slide for the strong
visual group was longer than that of the intermediate visual group, contrary to the hypothesis.
They also found that global learners tended to have shorter fixation durations and moved
their eyes faster and with larger degrees than sequential learners, but the difference was
again not significant. Therefore, we do not have enough information on how learning styles
affect viewing patterns in video learning.
On the one hand, learners with different learning styles may also have different
viewing patterns in video learning. Therefore, categorizing learners based on their learning
styles and calculating ISC within homogeneous groups may improve ISC’s predictive
accuracy of attention assessment. In other words, it reduces interference from dissimilar
eye movement patterns of students with distinct learning styles, ultimately increasing
prediction efficiency. On the other hand, learning styles may not significantly affect learners’
viewing patterns in video learning and ISC of eye movements can be effectively used
without considering learners’ learning styles. Unlike textual learning materials, lecture videos usually do not present much content on a single frame/slide. Instead, presentation contents are broken down into several parts and presented sequentially [28], limiting the
complicated viewing paths. Also, when designing lecture videos, we use methods such
as animation or teacher gestures to guide students’ attention [29,30]. Therefore, students’
viewing paths are relatively clear and consistent, unlike textual learning materials, which
produce different view paths. Empirical studies are indispensable to test these contrasting
hypotheses and advance our understanding of the intricate interplay between learning
styles and eye movement patterns in the dynamic context of video-based learning.

1.4. The Current Study


To further explore the ISC of eye movements in assessing attention states in online
learning, we conducted an eye tracking experiment examining learners’ attention in a
video-viewing context. Specifically, we aimed to answer two research questions:

RQ 1: Can ISC of eye movements reliably predict learners’ academic performance as well as their attention at a more nuanced level?
We followed Madsen et al.’s [2] study to evaluate the validity and stability of ISC as an
indicator for assessing learners’ academic performance as well as their attentional states in a
video learning context. More importantly, in terms of the attentional states, we extended our
inquiry beyond the binary attending and distracted conditions to incorporate intermediate
levels of attention. To achieve this, subjects were asked to self-report their attention levels
on a scale from 1 to 9 after watching each video. Subsequently, we conducted correlation
analyses to test the significance of the relationship between ISC of eye movements and
reported attention levels, with the objective of determining whether ISC can effectively
predict attention across a spectrum of attentional states.
RQ 2: Do learners’ learning styles affect ISC’s prediction rate of attention assessment in video learning?
Expanding our exploration into the practical application of ISC in attention assessment,
we considered the potential impact of learners’ individualized learning styles. Specifically,
we asked subjects to fill out the Felder-Silverman learning style questionnaire to identify
their learning style. We then correlated learners’ attention levels with ISC of eye move-
ments calculated within the subgroup. Importantly, we compared the two correlation
coefficients (correlations between attention levels and ISCstyles or ISCall ), seeking insights
into whether differentiating ISC based on learning styles affects ISC’s prediction rate of at-
tention assessment. This comparison served as a crucial step in determining whether ISC of
eye movements can be effectively employed without considering learners’ individualized
learning styles or if calculating ISC within distinct learner groups would yield improved
prediction rates.

2. Method
2.1. Participants
Thirty participants took part in the experiment. One participant was excluded due
to bad data quality, resulting in 29 participants (14 females, age range from 18 to 27 years,
M = 22.31, SD = 2.33). The sample size was chosen based on previous eye-tracking studies
for learning [19,25]. None of the participants reported a history of neurological or psy-
chiatric disorders. All participants included had normal or corrected to normal vision.
Written informed consent was obtained before the experiment following the Declaration
of Helsinki. Participants were remunerated for their time. The ethics committee of the
Education Department at XX University approved the study.

2.2. Stimuli and Procedure


The video stimuli for the eye-tracking experiment were 4 videos from the MOOC course “Digital Photography Fundamentals”. The duration of the videos ranged from 8.5 min to 13.5 min with a mean of 10 min. The video style was the classic “presenter and animation” format, which shows a presenter while text and picture information is displayed. Video materials
contain the content or elements expected by different learning styles. For example, videos
include not only theories about the relationship between aperture, shutter, and ISO, but
also illustrative examples and concrete data to demonstrate the relationship. Diagrams
and charts as well as texts are integrated to cater to varied learning styles. Stimuli were
presented on a 27-inch LCD monitor with a screen refresh rate of 60 Hz and a resolution set
at 1920 × 1080 pixels. A chin rest was used to hold the participant’s head throughout the
experiment. The experiment was programmed using PsychoPy 2022.2.1.
The participants first filled out the Felder-Silverman learning style questionnaire to
identify their learning styles (Figure 1). They were then asked to answer 20 questions
(16 four-alternative forced choice and 4 multiple choice questions) to test their prior knowl-
edge of photography. After answering the pre-test questions, the eye-tracking experiment
started. Eye positions were monitored using the monocular eye-tracking system EyeLink
1000 (SR Research, Mississauga, ON, Canada) at a sampling rate of 1000 Hz. Eye movement
data were recorded from the left eye. A nine-point calibration and validation procedure were performed to map the eye positions to the screen coordinates. Drift correction was performed before every video. Participants watched four videos in the order of the original MOOC course while their eye movements were tracked. They rated their attention level on a scale from 1 to 9 after watching each video. After finishing the video watching, they answered the 20 questions again to assess their learning outcomes. They were then asked to watch the first video again but in a distracted condition. In this condition, participants counted backward silently in their minds, from a randomly chosen number between 800 and 1000, in decrements of 7 [2]. This task distracted the subjects from the stimulus without requiring overt responses. They had to report the final number when the video finished.

Figure 1. Experimental procedure.
2.3. Data Analyses
2.3.1. Preprocessing of Eye Movement Data

EDF (EyeLink Data Format) files were converted to ASCII files for later analyses. Horizontal and vertical coordinates, as well as pupil sizes from the first frame to the last frame of the video, were extracted. Blinks were detected using the SR Research blink detection algorithm. The blinks and 100 ms before and after were filled with linearly interpolated values for horizontal and vertical coordinates, as well as pupil sizes.
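The blink-handling step described above can be sketched as follows. This is an illustrative Python reimplementation, not the authors’ code: the function name, array layout, and sampling-rate argument are assumptions, while the 100 ms padding and linear interpolation follow the text.

```python
import numpy as np

def interpolate_blinks(signal, blink_mask, pad_ms=100, fs=1000):
    """Linearly interpolate over blinks plus pad_ms before/after each blink.

    signal: 1-D array of gaze coordinates or pupil sizes (one sample per ms at fs=1000).
    blink_mask: boolean array, True where the tracker flagged a blink.
    """
    pad = int(pad_ms * fs / 1000)          # 100 ms at 1000 Hz -> 100 samples
    bad = blink_mask.copy()
    # widen each blink by the padding window on both sides
    for i in np.flatnonzero(blink_mask):
        bad[max(0, i - pad):i + pad + 1] = True
    good = ~bad
    t = np.arange(len(signal))
    out = signal.astype(float).copy()
    # fill bad samples by linear interpolation from surrounding good samples
    out[bad] = np.interp(t[bad], t[good], signal[good])
    return out
```

Applied separately to the horizontal, vertical, and pupil-size traces, this yields continuous signals suitable for the correlation analyses.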
2.3.2. Intersubject Correlation of Eye Movement Data
Intersubject correlation of eye movements was calculated using the method from
Madsen et al. [2]. Specifically, we first computed the Pearson’s correlation coefficient
between a single participant’s vertical coordinates and those of all other participants. Second,
we calculated a single ISC value for a participant by averaging the correlation values
between that participant and all other participants. Third, we then repeated steps 1 and
2 for all participants, resulting in a single ISC value for each participant. We repeated these
three steps for the horizontal coordinates and pupil size. Finally, we averaged ISCvertical
and ISChorizontal to obtain a single ISC to represent the ISC of eye movements or the three
ISC values (i.e., ISCvertical , ISChorizontal , and ISCpupil ) to represent the ISC of eye movements
and pupil sizes for each participant, to assess their correlations with other participants. The
ISC values for the attending and distracted conditions were computed on the data for the
two conditions separately.
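The three steps above can be sketched as follows (an illustrative reimplementation; the function names and the subjects × samples array layout are assumptions):

```python
import numpy as np

def isc_per_subject(signals):
    """ISC for one channel: signals has shape (n_subjects, n_samples).

    Step 1: correlate each subject's signal with every other subject's.
    Step 2: average those correlations into one ISC value per subject.
    """
    n = signals.shape[0]
    r = np.corrcoef(signals)               # n x n Pearson correlation matrix
    return (r.sum(axis=1) - 1.0) / (n - 1)  # drop self-correlation (diagonal = 1)

def isc_eye_movements(vertical, horizontal, pupil=None):
    """Average ISC_vertical and ISC_horizontal (optionally also ISC_pupil)."""
    parts = [isc_per_subject(vertical), isc_per_subject(horizontal)]
    if pupil is not None:
        parts.append(isc_per_subject(pupil))
    return np.mean(parts, axis=0)          # one ISC value per participant
```

Averaging the off-diagonal correlations row by row gives one ISC value per participant, matching the per-participant values used in the analyses.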

2.3.3. Learning Style Questionnaire


The Felder-Silverman Learning Style Questionnaire (FSLSQ) assesses participants’ learning styles across four main categories: (1) Active/
Reflective, (2) Sensing/Intuitive, (3) Visual/Verbal, and (4) Sequential/Global. There are
11 questions in each category with answer a or answer b. Answer a corresponds to the
preference for the first type of each category (Active, Sensing, Visual, or Sequential), and
answer b to the second type of each category (Reflective, Intuitive, Verbal, or Global). When
answering a question, for instance, with an active preference, 1 point is added to the value
of the type Active. We then calculated the learning style results by subtracting the points of
the second type from the first type (e.g., 7 Active − 4 Reflective = 3 Active). According to
Felder and Soloman, the interpretation of the score is as follows:
• If the score for a dimension is 1 or 3, learners are fairly well balanced on the two
categories of that dimension, with only a mild preference for one or the other.
• If the score for a dimension is 5 or 7, learners have a moderate preference for one
category of that dimension. Learners may learn less easily in an environment that
fails to address that preference at least some of the time than they would in a more
balanced environment.
• If the score for a dimension is 9 or 11, learners have a strong preference for one category
of that dimension. Learners may have difficulty learning in an environment that fails
to address that preference at least some of the time.
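The scoring rule can be sketched as a small helper (hypothetical, assuming answers are given as a list of 'a'/'b' strings; the mild/moderate/strong bands follow the interpretation above):

```python
def fslsq_score(answers, poles=("Active", "Reflective")):
    """Score one FSLSQ dimension from its 11 'a'/'b' answers.

    'a' counts toward the first pole, 'b' toward the second; the reported
    score is the absolute difference, which is always odd for 11 questions.
    """
    if len(answers) != 11:
        raise ValueError("each FSLSQ dimension has 11 questions")
    n_a = answers.count("a")
    score = abs(n_a - (11 - n_a))
    pole = poles[0] if n_a >= 11 - n_a else poles[1]
    # Interpretation bands from Felder and Soloman:
    # 1-3 mild, 5-7 moderate, 9-11 strong preference.
    strength = "mild" if score <= 3 else "moderate" if score <= 7 else "strong"
    return score, pole, strength
```

For example, 7 “a” answers against 4 “b” answers yield a score of 3, a mild preference for the first pole.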

3. Results
3.1. Scores of the Pretest and Posttest
We first performed a one-sample t-test to examine the prior knowledge level of participants. Results showed that the scores of the pretest (M: 4.76 ± 1.62) were not significantly different from 5 (the chance-level score for 20 items with a 25% probability of guessing correctly), t (28) = −0.80, p = 0.429, 95%CI = [−0.53, 0.23], suggesting participants were naïve to the learning material.
Paired t-test of the pretest and post-test showed that the scores of the posttest (M: 15.59 ± 2.34)
were significantly larger than the pretest, t (28) = 19.59, p < 0.001, 95%CI = [9.70, 11.96],
Cohen’s d = 3.64, suggesting that learning was successfully implemented during the experiment.
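Both checks can be reproduced with standard routines. The sketch below uses scipy.stats on simulated placeholder scores (the real data are not reproduced here), so the printed statistics will not match the paper’s values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
pretest = rng.normal(4.76, 1.62, 29)      # placeholder scores, not the real data
posttest = pretest + rng.normal(10.8, 2.0, 29)

# One-sample t-test: is prior knowledge at the chance level of 5?
t_pre, p_pre = stats.ttest_1samp(pretest, popmean=5)

# Paired t-test: did posttest scores exceed pretest scores?
t_gain, p_gain = stats.ttest_rel(posttest, pretest)

print(f"pretest vs. chance: t = {t_pre:.2f}, p = {p_pre:.3f}")
print(f"posttest vs. pretest: t = {t_gain:.2f}, p = {p_gain:.3g}")
```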

3.2. ISC Difference between Attending and Distracted Conditions


ISC differences in participants’ eye movement between attending and distracted
conditions were examined. ISC values from the two attentional conditions were used to
perform paired t-tests. Results showed that the ISC of eye movements (i.e., horizontal
and vertical coordinates of the gaze point) were significantly higher in the attending (M:
0.40 ± 0.10) condition than the distracted condition (M: 0.13 ± 0.07), t (28) = 17.36, p < 0.001,
95%CI = [0.24, 0.30], Cohen’s d = 3.22. ISC of eye movements and pupil sizes showed a
similar pattern, being significantly higher in the attending condition (M: 0.51 ± 0.07) than
in the distracted condition (M: 0.28 ± 0.07), t (28) = 18.96, p < 0.001, 95%CI = [0.21, 0.26],
Cohen’s d = 3.52, (Figure 2).

Figure 2. Eye movement data from two subjects in horizontal and vertical direction and pupil size in attending and distracted conditions.

3.3. ISC Prediction on Test Score and Attention Level

To test whether correlated eye movements predict the test score and the attention level, we performed correlation analyses. The Pearson correlation coefficient showed that ISC of eye movements was not significantly correlated with test scores (r = 0.09, p = 0.65, Figure 3A). However, when we correlated the self-report attention level (from 1–9) after watching each video with ISC values, Spearman correlation analysis showed significant correlation effects (r = 0.57, p < 0.001, Figure 3B). Results suggest that ISC of eye movements predicts the attention level of the learners but not necessarily the test scores. This is also true for ISC of eye movements and pupil sizes (rscores = 0.12, p = 0.54; rattention = 0.59, p < 0.001).

Figure 3. (A): Correlations between test scores and ISC. (B): Correlations between attention levels and ISC.
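The attention-level analysis amounts to a rank correlation between per-video ISC values and the 1–9 self-reports. A sketch with placeholder data (the variable names and the simulated relationship are assumptions):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
attention = rng.integers(1, 10, 80)                # self-reports on the 1-9 scale
isc = 0.05 * attention + rng.normal(0, 0.05, 80)   # placeholder ISC values

# Spearman rank correlation between ISC and self-reported attention
rho, p = stats.spearmanr(isc, attention)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
```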

3.4.ISC
3.4. ISCofofDifferent
DifferentLearning
Learning Styles
WeWeset
setthe
thescore
scoredifference
difference threshold
threshold to to3,3,which
whichmeans
meansthat
thatparticipants
participantswith
witha score
a score
largerororequal
larger equaltoto33were
were categorized
categorized to to the
thecorresponding
correspondinglearning
learningstyle. The
style. distribution
The distribution
ofofthe
theparticipant’s
participant’slearning
learning style
style is listed
listed in
in Table
Table11

Table
Table 1.1.Distribution
Distributionof
ofthe
theparticipants’
participants’ learning
learningstyle.
style.
Dimension Active
Dimension Active Reflective Sensing Intuitive
Reflective Sensing Intuitive Visual
Visual Verbal
Verbal Sequential
Sequential Global
Global
No.
No. 4
4 16
16
14
14 6
6 15
15
55 55 1616

To investigate whether categorizing learners into their learning styles can change ISC’s prediction rate of learners’ attention, we re-calculated ISC for different learning styles. Specifically, we first computed the Pearson’s correlation coefficient between a single participant’s vertical/horizontal coordinates or pupil sizes and those of all other participants who belonged to the same learning style. We then calculated a single ISC value for a participant by averaging the correlation values between that participant and all other participants of the same learning style. The rest of the steps remained unchanged. In other words, we utilized information only from the participants who belonged to the same learning style to compute ISCstyles, whereas the conventional ISCall utilizes information from all the participants regardless of their learning styles. We performed Spearman correlation analyses between ISCstyles and attention levels. The r coefficients of different learning styles are listed in Table 2.
To test whether ISCstyles is significantly different from ISCall, we performed a bootstrap analysis [31]. Specifically, we randomly sampled the ISCstyles and ISCall datasets with replacement, each time taking out the same amount of data as the original sample as a new sample. We then calculated the difference between the two new ISCstyles and ISCall samples, repeating this 5000 times. This allowed us to construct the 95% confidence interval of the difference. We then determined the significance of the difference based on the position of 0 in the confidence interval. If the 95% confidence interval includes 0, it suggests that the bootstrap procedure does not provide sufficient evidence to discard the null hypothesis, and we cannot conclude that ISCstyles is really a better predictor of attention than ISCall. We performed bootstrapping with 5000 resamples, and the results are summarized in Table 3. The results suggest that categorizing learners according to their learning styles did not significantly change ISCstyles’s prediction rate of learners’ attention.

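The within-style ISC computation described above can be sketched as follows. This is an illustrative reconstruction, not the authors' released analysis code; the array names (`gaze`, `styles`) and the toy data are assumptions.

```python
import numpy as np
from scipy.stats import pearsonr

def isc_within_style(gaze, styles):
    """Compute one ISC value per participant, using only peers sharing
    the same learning style.

    gaze   : (n_participants, n_samples) array, e.g. vertical gaze coordinates
    styles : sequence of learning-style labels, one per participant
    """
    n = gaze.shape[0]
    isc = np.full(n, np.nan)
    for i in range(n):
        # correlate participant i with every *other* same-style participant
        peers = [j for j in range(n) if j != i and styles[j] == styles[i]]
        if peers:
            isc[i] = np.mean([pearsonr(gaze[i], gaze[j])[0] for j in peers])
    return isc
```

Replacing `styles` with a single shared label for everyone reproduces the conventional ISCall, which pools all participants regardless of style.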
Table 2. Correlation coefficients between ISCstyles and attention levels.

Dimension   Active   Reflective   Sensing   Intuitive   Visual   Verbal   Sequential   Global
rEM         0.68     0.55         0.49      0.73        0.46     0.73     0.55         0.63
rEM + PS    0.69     0.58         0.56      0.74        0.48     0.67     0.70         0.59
EM: eye movements, EM + PS: eye movements and pupil sizes, all p < 0.001.

Table 3. Bootstrapping results.

Dimension   Active          Reflective      Sensing         Intuitive       Visual          Verbal          Sequential      Global
95% CI      [−0.02, 0.02]   [−0.15, 0.13]   [−0.04, 0.03]   [−0.10, 0.01]   [−0.05, 0.03]   [−0.06, 0.06]   [−0.15, 0.13]   [−0.01, 0.04]
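The bootstrap comparison summarized in Table 3 can be sketched as below. This is a minimal sketch under the assumption that participants' ISC values are resampled and their means compared; the exact statistic the authors compared (and their use of cocor [31]) may differ.

```python
import numpy as np

def bootstrap_diff_ci(isc_styles, isc_all, n_boot=5000, seed=1):
    """95% bootstrap confidence interval for the mean difference between
    two paired ISC samples, resampling participants with replacement."""
    isc_styles = np.asarray(isc_styles)
    isc_all = np.asarray(isc_all)
    rng = np.random.default_rng(seed)
    n = len(isc_styles)
    diffs = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)  # resample with replacement
        diffs[b] = isc_styles[idx].mean() - isc_all[idx].mean()
    return np.percentile(diffs, [2.5, 97.5])
```

If the returned interval contains 0, the difference is not significant at the 5% level, matching the decision rule described in the text.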

4. Discussion
In the current study, we investigated the effectiveness of using synchronized eye
movements and pupil sizes to predict learners' attentional states. Specifically, we asked
participants to watch four lecture videos in the attending condition and one lecture video
in the distracted condition while their eye movements were tracked. The results showed
that ISC of eye movements was significantly higher in the attending condition than in the
distracted condition. Notably, the ISC demonstrated predictive capabilities not only for the
binary attending versus distracted states but also at a more nuanced level of self-reported
attention states. However, there was no significant correlation between ISC and learners'
test scores. Lastly, learners with different learning styles exhibited similar viewing patterns
when engaging with video content. Differentiating learners by their learning styles did
not change the predictive efficiency of ISC for attention assessment. This finding suggests
that ISC of eye movements can be effectively employed without considering learners'
individualized learning styles.

4.1. ISC of Eye Movements Predicts Learners' Attention at a More Nuanced Level but Not Their
Academic Performance
Consistent with the findings of Madsen et al. [2] and Liu et al. [19], our study reaffirms
that synchronized eye movements across learners serve as a robust indicator capable of
distinguishing between attending and distracted states. However, our research extends
beyond these prior results by showing that the ISC of eye movements not only excels at
discerning binary attending and distracted states but also exhibits proficiency in capturing
nuanced shifts within attentional states. This nuanced capability allows for a more precise
assessment of students’ attention, offering a finer-grained understanding of the magnitude
of their attentional engagement. It contributes to the field by highlighting the practical
utility of ISC in facilitating attention assessment. Specifically, it allows for the establishment
of customizable thresholds for distraction detection, providing flexibility in tailoring the
assessment to specific needs. This adaptability in setting thresholds enhances the applicability of ISC of eye movements as a versatile tool in gauging attention states within the
dynamic context of online learning.
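As a concrete illustration of the customizable-threshold idea, a distraction detector could flag learners whose ISC falls below a chosen cutoff. The cutoff value 0.2 below is an arbitrary assumption for illustration, not a value from the study.

```python
import numpy as np

def flag_distracted(isc_values, threshold=0.2):
    """Return a boolean mask: True where a learner's ISC falls below the
    chosen distraction threshold."""
    return np.asarray(isc_values) < threshold
```

Lowering the threshold makes the detector more conservative (fewer false alarms); raising it catches milder attention lapses.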
Contrary to the previous findings [2], we were not able to observe significant correlations between ISC of eye movements (or pupil sizes) and test scores. Although
Liu et al. [19] also demonstrated that ISC of eye movements significantly correlated with
test scores, the magnitude of the coefficient was much smaller than what was found in
Madsen et al. [2] (~0.25 vs. ~0.50). Furthermore, ISC of pupil sizes was not found to be
significantly correlated with test scores. Another study that used a webcam-based eye
tracking system also failed to replicate the relationship between ISC of eye movements
and learning performance [32]. Several factors may contribute to these discrepancies, including variations in learning materials and exam difficulty. For instance, Madsen et al. [2]
utilized short videos from YouTube channels, while Liu et al. [19] and our study employed
lecture videos from MOOCs, which typically follow a different structure and have longer
durations. Sauter et al. [32] utilized recorded conference videos. The variances in learning
materials and exam characteristics highlight the need for additional research to thoroughly
investigate and confirm the relationship between ISC of eye movements and test scores. It
is crucial to consider these contextual factors to discern the generalizability and robustness
of the observed correlations, emphasizing the necessity for further exploration in diverse
educational settings.

4.2. Learners’ Learning Styles Do Not Affect ISC’s Prediction Rate of Attention Assessment in
Video Learning
Importantly, we tested whether learners’ learning styles affect ISC’s prediction rate
of attention assessment in video learning. The outcomes revealed that distinguishing
among learners’ learning styles did not yield notable changes in the effectiveness of ISC
in predicting attention. As highlighted in our introduction, lecture videos commonly
employ strategies like animation and teacher's gestures to guide viewer attention, potentially resulting in more consistent viewing patterns. This finding aligns with the work of
Mu et al. [33], who explored learners’ attention preferences in the context of online learning.
Their study, conducted during video learning, found no significant differences in attention
preferences among students with varying visual-verbal preferences.
To further explore this finding, we conducted a focused analysis of three segments of
the lecture video stimuli (around 200 s in total, with a mean segment duration of 65.85 s)
in which the content remained relatively static. Each segment presented images and text
simultaneously. We investigated the dwell time of visual and verbal
learners on images and text contents. We found that visual learners spent significantly more
time on images (mean ± SD: 30.71 s ± 10.12) than textual information (23.71 s ± 11.62),
t (44) = 2.27, p = 0.03, while verbal learners dedicated more time to texts (33.95 s ± 12.01)
than to images (23.65 s ± 10.22), t (14) = 2.25, p = 0.04. This implies that learners
with distinct learning styles indeed exhibit divergent viewing patterns and preferences
when confronted with static materials. However, these distinctions seem less pronounced
in the context of lecture videos, where the dynamic presentation and additional guiding
elements might contribute to more uniform viewing behaviors.
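The dwell-time comparison reported above can be sketched with SciPy; the reported df = n − 1 suggests a paired test, though the authors do not name the exact procedure, and the toy dwell times below are fabricated for illustration, not the study data.

```python
import numpy as np
from scipy import stats

def compare_dwell(image_dwell, text_dwell):
    """Paired t-test of per-learner dwell times (seconds) on image vs.
    text regions of the same static segment."""
    return stats.ttest_rel(image_dwell, text_dwell)
```

A positive statistic with p < 0.05 would indicate significantly longer dwell on images than on text for that group of learners.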
There are a few limitations in the current study that need to be mentioned. Firstly, the
distribution of learning styles was not balanced among the current participants.
The representation of active, verbal, sensing, and sequential learners is notably lower
compared to their counterparts, introducing potential bias into the results. Future studies
should aim for a more balanced representation to ensure a comprehensive understanding of the relationship between learning styles and eye movements. Another limitation
pertains to the examination of a specific style of lecture video: presenter with animation.
Generalizing the findings to other lecture video formats, such as presenter and glass board
or recorded lectures, may not be warranted. Future investigations should encompass a
broader spectrum of video formats and diverse subjects to enhance the external validity of
the findings. An additional limitation relates to the ISC itself. If we use ISC of eye movements
to assess attention, we need to guarantee that the content presented to learners is
synchronized. While this is not a problem in synchronous online learning, it is at odds with
asynchronous learning, where learners can pause or replay a video section they did not
understand before continuing, which is an important advantage of asynchronous learning.
In summary, the current study verifies and extends previous studies by demonstrating
that ISC of eye movements can reliably predict learners’ attention at a more nuanced level.
Such prediction is not affected by learners' learning styles, and ISC of eye movements
can be effectively employed without considering learners' individualized learning styles.

Author Contributions: Conceptualization, H.Z.; methodology, H.Z.; software, H.Z.; validation, H.Z.;
formal analysis, H.Z.; investigation, C.S.; resources, H.Z.; data curation, H.Z.; writing—original
draft preparation, H.Z., C.S., X.L. and X.G.; writing—review and editing, H.Z., C.S., X.L. and X.G.;
visualization, H.Z.; supervision, H.Z.; project administration, H.Z.; funding acquisition, H.Z. All
authors have read and agreed to the published version of the manuscript.
Funding: This research was funded by Guangdong Province Educational Science Planning under
Higher Education Special Projects (grant number: 2023GXJK664), Guangdong Province Philosophy
and Social Science Planning Project (grant number: GD20XJY61), and the Beijing Normal University
at Zhuhai (grant number: 111032101).
Institutional Review Board Statement: The study was conducted in accordance with the Declaration
of Helsinki, and approved by the Ethical committee of Education Department at Beijing Normal
University (IRB Number: BNU202302100001) on 1 March 2023.
Informed Consent Statement: Informed consent was obtained from all subjects involved in the
study.
Data Availability Statement: The data and scripts of analyses of the current study are available at
https://osf.io/fw3pz/ (accessed on 1 May 2024).
Conflicts of Interest: The authors declare no conflicts of interest.

References
1. Pokhrel, S.; Chhetri, R. A Literature Review on Impact of COVID-19 Pandemic on Teaching and Learning. High. Educ. Future
2021, 8, 133–141. [CrossRef]
2. Madsen, J.; Júlio, S.U.; Gucik, P.J.; Steinberg, R.; Parra, L.C. Synchronized Eye Movements Predict Test Scores in Online Video
Education. Proc. Natl. Acad. Sci. USA 2021, 118, e2016980118. [CrossRef] [PubMed]
3. Newmann, F.M. Student Engagement and Achievement in American Secondary Schools; Teachers College Press: New York, NY, USA,
1992.
4. Polderman, T.J.C.; Boomsma, D.I.; Bartels, M.; Verhulst, F.C.; Huizink, A.C. A Systematic Review of Prospective Studies on
Attention Problems and Academic Achievement. Acta Psychiatr. Scand. 2010, 122, 271–284. [CrossRef] [PubMed]
5. Reynolds, R.E.; Shirey, L.L. The Role of Attention in Studying And Learning. In Learning and Study Strategies; Weinstein, C.E.,
Goetz, E.T., Alexander, P.A., Eds.; Academic Press: San Diego, CA, USA, 1988; pp. 77–100. [CrossRef]
6. Chen, C.-M.; Wang, J.-Y.; Yu, C.-M. Assessing the Attention Levels of Students by Using a Novel Attention Aware System Based
on Brainwave Signals. Br. J. Educ. Technol. 2017, 48, 348–369. [CrossRef]
7. Chen, C.-M.; Wang, J.-Y. Effects of Online Synchronous Instruction with an Attention Monitoring and Alarm Mechanism on
Sustained Attention and Learning Performance. Interact. Learn. Environ. 2018, 26, 427–443. [CrossRef]
8. Lin, C.-H.; Chen, C.-M.; Lin, Y.-T. Improving Effectiveness of Learners’ Review of Video Lectures by Using an Attention-Based
Video Lecture Review Mechanism Based on Brainwave Signals. In 2018 7th International Congress on Advanced Applied Informatics
(IIAI-AAI); IEEE: Yonago, Japan, 2018; pp. 152–157. [CrossRef]
9. Li, Q.; Ren, Y.; Wei, T.; Wang, C.; Liu, Z.; Yue, J. A Learning Attention Monitoring System via Photoplethysmogram Using
Wearable Wrist Devices. In Artificial Intelligence Supported Educational Technologies; Pinkwart, N., Liu, S., Eds.; Advances in
Analytics for Learning and Teaching; Springer International Publishing: Cham, Switzerland, 2020; pp. 133–150. [CrossRef]
10. Abate, A.F.; Cascone, L.; Nappi, M.; Narducci, F.; Passero, I. Attention Monitoring for Synchronous Distance Learning. Future
Gener. Comput. Syst. 2021, 125, 774–784. [CrossRef]
11. Terraza Arciniegas, D.F.; Amaya, M.; Piedrahita Carvajal, A.; Rodriguez-Marin, P.A.; Duque-Muñoz, L.; Martinez-Vargas, J.D.
Students’ Attention Monitoring System in Learning Environments Based on Artificial Intelligence. IEEE Lat. Am. Trans. 2022, 20,
126–132. [CrossRef]
12. Alemdag, E.; Cagiltay, K. A Systematic Review of Eye Tracking Research on Multimedia Learning. Comput. Educ. 2018, 125,
413–428. [CrossRef]
13. Jamil, N.; Belkacem, A.N.; Lakas, A. On Enhancing Students’ Cognitive Abilities in Online Learning Using Brain Activity and
Eye Movements. Educ. Inf. Technol. 2023, 28, 4363–4397. [CrossRef]
14. Pouta, M.; Lehtinen, E.; Palonen, T. Student Teachers’ and Experienced Teachers’ Professional Vision of Students’ Understanding
of the Rational Number Concept. Educ. Psychol. Rev. 2021, 33, 109–128. [CrossRef]
15. Sharma, K.; Giannakos, M.; Dillenbourg, P. Eye-Tracking and Artificial Intelligence to Enhance Motivation and Learning. Smart
Learn. Environ. 2020, 7, 13. [CrossRef]
16. Tsai, M.-J.; Hou, H.-T.; Lai, M.-L.; Liu, W.-Y.; Yang, F.-Y. Visual Attention for Solving Multiple-Choice Science Problem: An
Eye-Tracking Analysis. Comput. Educ. 2012, 58, 375–385. [CrossRef]
17. Hasson, U.; Landesman, O.; Knappmeyer, B.; Vallines, I.; Rubin, N.; Heeger, D.J. Neurocinematics: The Neuroscience of Film.
Projections 2008, 2, 1–26. [CrossRef]
18. Yang, F.-Y.; Chang, C.-Y.; Chien, W.-R.; Chien, Y.-T.; Tseng, Y.-H. Tracking Learners’ Visual Attention during a Multimedia
Presentation in a Real Classroom. Comput. Educ. 2013, 62, 208–220. [CrossRef]
19. Liu, Q.; Yang, X.; Chen, Z.; Zhang, W. Using Synchronized Eye Movements to Assess Attentional Engagement. Psychol. Res. 2023,
87, 2039–2047. [CrossRef]
20. Esterman, M.; Rosenberg, M.D.; Noonan, S.K. Intrinsic Fluctuations in Sustained Attention and Distractor Processing. J. Neurosci.
2014, 34, 1724–1730. [CrossRef]
21. Felder, R.M.; Silverman, L.K. Learning and Teaching Styles in Engineering Education. Eng. Educ. 1988, 78, 674–681.
22. Nugrahaningsih, N.; Porta, M.; Klasnja-Milicevic, A. Assessing Learning Styles through Eye Tracking for E-Learning Applications.
ComSIS 2021, 18, 1287–1309. [CrossRef]
23. Al-Wabil, A.; ElGibreen, H.; George, R.P.; Al-Dosary, B. Exploring the Validity of Learning Styles as Personalization Parameters
in eLearning Environments: An Eyetracking Study. In Proceedings of the 2010 2nd International Conference on Computer
Technology and Development, Cairo, Egypt, 2–4 November 2010; pp. 174–178. [CrossRef]
24. Mehigan, T.J.; Barry, M.; Kehoe, A.; Pitt, I. Using Eye Tracking Technology to Identify Visual and Verbal Learners. In Proceedings
of the 2011 IEEE International Conference on Multimedia and Expo, Barcelona, Spain, 11–15 July 2011; pp. 1–6. [CrossRef]
25. Luo, Z. Using Eye-Tracking Technology to Identify Learning Styles: Behaviour Patterns and Identification Accuracy. Educ. Inf.
Technol. 2021, 26, 4457–4485. [CrossRef]
26. Cao, J.; Nishihara, A. Understanding Learning Style by Eye Tracking in Slide Video Learning. J. Educ. Multimed. Hypermedia 2012,
21, 335–358.
27. Cao, J.; Nishihara, A. Viewing Behaviors Affected by Slide Features and Learning Style in Slide Video from a Sequence Analysis
Perspective. J. Inf. Syst. Educ. 2013, 12, 1–12. [CrossRef]
28. Ou, C.; Joyner, D.A.; Goel, A.K. Designing and Developing Videos for Online Learning: A Seven-Principle Model. OLJ 2019, 23,
82–104. [CrossRef]
29. Mayer, R.E.; DaPra, C.S. An Embodiment Effect in Computer-Based Learning with Animated Pedagogical Agents. J. Exp. Psychol.
Appl. 2012, 18, 239–252. [CrossRef] [PubMed]
30. Wang, F.; Li, W.; Mayer, R.E.; Liu, H. Animated Pedagogical Agents as Aids in Multimedia Learning: Effects on Eye-Fixations
during Learning and Learning Outcomes. J. Educ. Psychol. 2018, 110, 250–268. [CrossRef]
31. Diedenhofen, B.; Musch, J. Cocor: A Comprehensive Solution for the Statistical Comparison of Correlations. PLoS ONE 2015, 10,
e0121945. [CrossRef]
32. Sauter, M.; Hirzle, T.; Wagner, T.; Hummel, S.; Rukzio, E.; Huckauf, A. Can Eye Movement Synchronicity Predict Test Performance
With Unreliably-Sampled Data in an Online Learning Context? In 2022 Symposium on Eye Tracking Research and Applications; ACM:
Seattle, WA, USA, 2022; pp. 1–5. [CrossRef]
33. Mu, L.; Cui, M.; Wang, X.; Qiao, J.; Tang, D. Learners’ Attention Preferences of Information in Online Learning: An Empirical
Study Based on Eye-Tracking. Interact. Technol. Smart Educ. 2019, 16, 186–203. [CrossRef]

Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.
