Using Synchronized Eye Movements to Predict Attention in Online Video Learning
Abstract: Concerns persist about attentional engagement in online learning. The inter-subject
correlation of eye movements (ISC) has shown promise as an accessible and effective method for
attention assessment in online learning. This study extends previous studies investigating ISC of eye
movements in online learning by addressing two research questions. Firstly, can ISC predict students’
attentional states at a finer level beyond a simple dichotomy of attention states (e.g., attending and
distracted states)? Secondly, do learners’ learning styles affect ISC’s prediction rate of attention
assessment in video learning? Previous studies have shown that learners of different learning styles
have different eye movement patterns when viewing static materials. However, limited research
has explored the impact of learning styles on viewing patterns in video learning. An eye tracking
experiment with participants watching lecture videos demonstrated a connection between ISC and
self-reported attention states at a finer level. We also demonstrated that learning styles did not
significantly affect ISC’s prediction rate of attention assessment in video learning, suggesting that
ISC of eye movements can be effectively used without considering learners’ learning styles. These
findings contribute to the ongoing discourse on optimizing attention assessment in the evolving
landscape of online education.
Keywords: eye tracking; intersubject correlation; online learning; learning styles; attentional assessment
2. Method
2.1. Participants
Thirty participants took part in the experiment. One participant was excluded due
to poor data quality, resulting in 29 participants (14 females, age range from 18 to 27 years,
M = 22.31, SD = 2.33). The sample size was chosen based on previous eye-tracking studies
for learning [19,25]. None of the participants reported a history of neurological or
psychiatric disorders. All participants had normal or corrected-to-normal vision.
Written informed consent was obtained before the experiment following the Declaration
of Helsinki. Participants were remunerated for their time. The ethics committee of the
Education Department at XX University approved the study.
Eye movement data were recorded from the left eye. A nine-point calibration and validation
procedure was performed to map the eye positions to the screen coordinates. Drift correction
was performed before every video. Participants watched four videos in the order of the original
MOOC course while their eye movements were tracked. They rated their attention level on a scale
from 1 to 9 after watching each video. After finishing the video watching, they answered the
20 questions again to assess their learning outcomes.
They were then asked to watch the first video again, but in a distracted condition. In this
condition, participants counted backward silently in their minds from a randomly chosen number
between 800 and 1000, in decrements of 7 [2]. This task distracted the subjects from the stimulus
without requiring overt responses. They had to report the final number when the video finished.
we calculated a single ISC value for a participant by averaging the correlation values
between that participant and all other participants. Third, we repeated steps 1 and
2 for all participants, resulting in a single ISC value for each participant. We repeated these
three steps for the horizontal coordinates and pupil size. Finally, we averaged ISCvertical
and ISChorizontal to obtain a single ISC to represent the ISC of eye movements, or the three
ISC values (i.e., ISCvertical, ISChorizontal, and ISCpupil) to represent the ISC of eye movements
and pupil sizes for each participant, to assess their correlations with other participants. The
ISC values for the attending and distracted conditions were computed on the data for the
two conditions separately.
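The leave-one-out averaging described above can be sketched in a few lines. This is a minimal illustration on toy gaze traces (not the study’s data), assuming each participant’s coordinate trace has already been resampled onto a common timeline:

```python
import numpy as np

def isc_per_participant(traces):
    """Leave-one-out ISC: for each participant, average the Pearson
    correlations between their trace and every other participant's trace.

    traces: array of shape (n_participants, n_samples) holding one gaze
    coordinate (e.g., vertical position) on a common timeline.
    """
    n = traces.shape[0]
    r = np.corrcoef(traces)            # n x n Pearson correlation matrix
    # Row sums include the self-correlation (exactly 1.0 on the diagonal),
    # so subtract it before averaging over the n - 1 other participants.
    return (r.sum(axis=1) - 1.0) / (n - 1)

# Toy traces: two participants follow a shared signal, one does not
rng = np.random.default_rng(0)
shared = rng.standard_normal(500)
traces = np.vstack([
    shared + 0.1 * rng.standard_normal(500),
    shared + 0.1 * rng.standard_normal(500),
    rng.standard_normal(500),          # a "distracted" viewer
])
isc = isc_per_participant(traces)      # higher for the two synchronized viewers
```

Running the same function on the vertical coordinates, horizontal coordinates, and pupil sizes, then averaging the resulting values, yields the per-participant summary used in the analyses.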
3. Results
3.1. Scores of the Pretest and Posttest
We first performed a one-sample t-test to examine the prior knowledge level of participants.
Results showed that the pretest scores (M: 4.76 ± 1.62) were not significantly different
from 5 (the chance level for 20 four-option items, each with a 25% probability of a correct guess),
t (28) = −0.80, p = 0.429, 95%CI = [−0.53, 0.23], suggesting participants were naïve to the learning material.
A paired t-test of the pretest and posttest showed that the posttest scores (M: 15.59 ± 2.34)
were significantly larger than the pretest scores, t (28) = 19.59, p < 0.001, 95%CI = [9.70, 11.96],
Cohen’s d = 3.64, suggesting the learning process was successfully implemented during
the experiment.
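As a sanity check, the reported pretest statistic can be reproduced from the summary values alone; a quick sketch using only the mean, SD, sample size, and chance level given above:

```python
import math
from scipy import stats

# Summary statistics reported in the text: pretest mean, SD, sample
# size, and the chance level for 20 four-option items (20 * 0.25 = 5)
m, sd, n, mu = 4.76, 1.62, 29, 5.0

# One-sample t statistic computed from the summary values alone
t = (m - mu) / (sd / math.sqrt(n))
# Two-sided p-value from the t distribution with n - 1 = 28 df
p = 2 * stats.t.sf(abs(t), df=n - 1)
```

The computed t rounds to −0.80, matching the value reported in the text.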
Figure 2. Eye movement data from two subjects in horizontal and vertical direction and pupil size in attending and distracted conditions.
Educ. Sci. 2024, 14, 548
Figure 3. (A): Correlations between test scores and ISC. (B): Correlations between attention levels and ISC.
3.4. ISC of Different Learning Styles

We set the score difference threshold to 3, which means that participants with a score
larger than or equal to 3 were categorized into the corresponding learning style. The
distribution of the participants’ learning styles is listed in Table 1.
Table 1. Distribution of the participants’ learning style.

Dimension   Active   Reflective   Sensing   Intuitive   Visual   Verbal   Sequential   Global
No.         4        16           14        6           15       5        5            16
To investigate whether categorizing learners into their learning styles can change
ISC’s prediction rate of learners’ attention, we re-calculated the ISC of different learning
styles. Specifically, we first computed the Pearson’s correlation coefficient between a
single participant’s vertical/horizontal coordinates or pupil sizes and those of all other
participants who belonged to the same learning style. We then calculated a single ISC
value for a participant by averaging the correlation values between that participant and all
other participants of the same learning style. The rest of the steps remained unchanged. In
other words, we utilized information only from the participants who belonged to the same
learning style to compute ISCstyles, whereas the conventional ISCall utilizes information
from all the participants regardless of their learning styles. We performed Spearman
correlation analyses between the ISCstyles and their attention levels. The r coefficients of
different learning styles are listed in Table 2.
Table 2. Correlation coefficients between ISCstyles and attention levels.

Dimension   Active   Reflective   Sensing   Intuitive   Visual   Verbal   Sequential   Global
rEM         0.68     0.55         0.49      0.73        0.46     0.73     0.55         0.63

To test whether ISCstyles is significantly different from ISCall, we performed a bootstrap
analysis [31]. Specifically, we randomly sampled the ISCstyles and ISCall datasets with
replacement, each time taking out the same amount of data as the original sample as a new
sample. We then calculated the difference between the two new ISCstyles and ISCall samples,
repeating it for 5000 times. It allowed us to construct the 95% confidence interval of the
difference. We then determined the significance of the difference based on the position of 0
in the confidence interval. If the 95% confidence interval includes the 0, it suggests that the
bootstrap procedure does not provide sufficient evidence to discard the null hypothesis,
and we cannot conclude that ISCstyles is really a better predictor of attention than ISCall . We
performed bootstrapping with 5000 resamples, and the results are summarized in Table 3.
The results suggest that categorizing learners according to their learning styles did not
significantly change ISCstyles ’s prediction rate of learner’s attention.
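The resampling procedure above can be sketched as follows; a minimal illustration on toy data (the per-participant ISC arrays, sample sizes, and seed are placeholders, not the study’s values):

```python
import numpy as np

def bootstrap_diff_ci(isc_styles, isc_all, n_boot=5000, seed=0):
    """95% bootstrap confidence interval for the difference in mean ISC
    between the style-specific and conventional computations."""
    rng = np.random.default_rng(seed)
    diffs = np.empty(n_boot)
    for i in range(n_boot):
        # Resample each dataset with replacement, keeping the original size
        s = rng.choice(isc_styles, size=len(isc_styles), replace=True)
        a = rng.choice(isc_all, size=len(isc_all), replace=True)
        diffs[i] = s.mean() - a.mean()
    return np.percentile(diffs, [2.5, 97.5])

# Toy data: both samples drawn from the same distribution, so the
# interval should typically contain 0 (no significant difference)
rng = np.random.default_rng(1)
lo, hi = bootstrap_diff_ci(rng.normal(0.5, 0.1, 29), rng.normal(0.5, 0.1, 29))
```

If the resulting interval contains 0, the null hypothesis of no difference cannot be rejected, mirroring the decision rule described in the text.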
4. Discussion
In the current study, we investigated the effectiveness of using synchronized eye
movements and pupil sizes to predict learner’s attentional states. Specifically, we asked
participants to watch four lecture videos in the attending condition and one lecture video
in the distracted condition while their eye movements were tracked. The results showed
that ISC of eye movements was significantly higher in the attending condition than in the
distracted condition. Notably, the ISC demonstrated predictive capabilities not only for the
binary attending versus distracted states but also at a more nuanced level of self-reported
attention states. However, there was no significant correlation between ISC and learner’s
test scores. Lastly, learners with different learning styles exhibited similar viewing patterns
when engaging with video content. The differentiation of learners’ learning styles did
not change the predictive efficiency of ISC for attention assessment. This finding suggests
that ISC of eye movements can be effectively employed without considering learners’
individualized learning styles.
4.1. ISC of Eye Movements Predicts Learners’ Attention at a More Nuanced Level but Not Their
Academic Performance
Consistent with the findings of Madsen et al. [2] and Liu et al. [19], our study reaffirms
that synchronized eye movements across learners serve as a robust indicator capable of
distinguishing between attending and distracted states. However, our research extends
beyond these prior results by showing that the ISC of eye movements not only excels at
discerning binary attending and distracted states but also exhibits proficiency in capturing
nuanced shifts within attentional states. This nuanced capability allows for a more precise
assessment of students’ attention, offering a finer-grained understanding of the magnitude
of their attentional engagement. It contributes to the field by highlighting the practical
utility of ISC in facilitating attention assessment. Specifically, it allows for the establishment
of customizable thresholds for distraction detection, providing flexibility in tailoring the
assessment to specific needs. This adaptability in setting thresholds enhances the appli-
cability of ISC of eye movements as a versatile tool in gauging attention states within the
dynamic context of online learning.
Contrary to the previous findings [2], we were not able to observe significant cor-
relations between ISC of eye movements (or pupil sizes) and test scores. Although
Liu et al. [19] also demonstrated that ISC of eye movements significantly correlated with
test scores, the magnitude of the coefficient was much smaller than what was found in
Madsen et al. [2] (~0.25 vs. ~0.50). Furthermore, ISC of pupil sizes was not found to be
significantly correlated with test scores. Another study that used a webcam-based eye
tracking system also failed to replicate the relationship between ISC of eye movements
and learning performance [32]. Several factors may contribute to these discrepancies, in-
cluding variations in learning materials and exam difficulty. For instance, Madsen et al. [2]
utilized short videos from YouTube channels, while Liu et al. [19] and our study employed
lecture videos from MOOCs, which typically follow a different structure and have longer
durations. Sauter et al. [32] utilized recorded conference videos. The variances in learning
materials and exam characteristics highlight the need for additional research to thoroughly
investigate and confirm the relationship between ISC of eye movements and test scores. It
is crucial to consider these contextual factors to discern the generalizability and robustness
of the observed correlations, emphasizing the necessity for further exploration in diverse
educational settings.
4.2. Learners’ Learning Styles Do Not Affect ISC’s Prediction Rate of Attention Assessment in
Video Learning
Importantly, we tested whether learners’ learning styles affect ISC’s prediction rate
of attention assessment in video learning. The outcomes revealed that distinguishing
among learners’ learning styles did not yield notable changes in the effectiveness of ISC
in predicting attention. As highlighted in our introduction, lecture videos commonly
employ strategies like animation and teacher’s gestures to guide viewer attention, poten-
tially resulting in more consistent viewing patterns. This finding aligns with the work of
Mu et al. [33], who explored learners’ attention preferences in the context of online learning.
Their study, conducted during video learning, found no significant differences in attention
preferences among students with varying visual-verbal preferences.
To further explore this finding, we conducted a focused analysis on three segments of
the lecture video stimuli in which the content remained relatively static; the segments
lasted around 200 s in total, with a mean duration of 65.85 s. Each segment presented
images and text at the same time. We investigated the dwell times of visual and verbal
learners on the image and text content. We found that visual learners spent significantly more
time on images (mean ± SD: 30.71 s ± 10.12) than textual information (23.71 s ± 11.62),
t (44) = 2.27, p = 0.03, while verbal learners dedicated more time on texts (33.95 s ± 12.01)
than image information (23.65 s ± 10.22), t (14) = 2.25, p = 0.04. This implies that learners
with distinct learning styles indeed exhibit divergent viewing patterns and preferences
when confronted with static materials. However, these distinctions seem less pronounced
in the context of lecture videos, where the dynamic presentation and additional guiding
elements might contribute to more uniform viewing behaviors.
There are a few limitations in the current study that need to be mentioned. Firstly, the
distribution of learning styles is not balanced in the current sample.
The representation of active, verbal, sensing, and sequential learners is notably lower
compared to their counterparts, introducing potential bias into the results. Future studies
should aim for a more balanced representation to ensure a comprehensive understand-
ing of the relationship between learning styles and eye movements. Another limitation
pertains to the examination of a specific style of lecture videos—Presenter and animation.
Generalizing the findings to other lecture video formats, such as presenter and glass board
or recorded lectures, may not be warranted. Future investigations should encompass a
broader spectrum of video formats and diverse subjects to enhance the external validity of
the findings. An additional limitation relates to the ISC itself. Using ISC of eye movements
to assess attention requires that the contents presented to learners be synchronized. While
this is not a problem in synchronous online learning, in asynchronous learning it means
that learners could not pause or replay a video section they did not understand before
continuing, even though this flexibility is an important advantage of asynchronous learning.
In summary, the current study verifies and extends previous studies by demonstrating
that ISC of eye movements can reliably predict learners’ attention at a more nuanced level.
Such prediction is not affected by the learner’s learning style and ISC of eye movements
can be effectively employed without considering learners’ individualized learning styles.
Author Contributions: Conceptualization, H.Z.; methodology, H.Z.; software, H.Z.; validation, H.Z.;
formal analysis, H.Z.; investigation, C.S.; resources, H.Z.; data curation, H.Z.; writing—original
draft preparation, H.Z., C.S., X.L. and X.G.; writing—review and editing, H.Z., C.S., X.L. and X.G.;
visualization, H.Z.; supervision, H.Z.; project administration, H.Z.; funding acquisition, H.Z. All
authors have read and agreed to the published version of the manuscript.
Funding: This research was funded by Guangdong Province Educational Science Planning under
Higher Education Special Projects (grant number: 2023GXJK664), Guangdong Province Philosophy
and Social Science Planning Project (grant number: GD20XJY61), and the Beijing Normal University
at Zhuhai (grant number: 111032101).
Institutional Review Board Statement: The study was conducted in accordance with the Declaration
of Helsinki, and approved by the Ethical committee of Education Department at Beijing Normal
University (IRB Number: BNU202302100001) on 1 March 2023.
Informed Consent Statement: Informed consent was obtained from all subjects involved in the
study.
Data Availability Statement: The data and scripts of analyses of the current study are available at
https://osf.io/fw3pz/ (accessed on 1 May 2024).
Conflicts of Interest: The authors declare no conflicts of interest.
References
1. Pokhrel, S.; Chhetri, R. A Literature Review on Impact of COVID-19 Pandemic on Teaching and Learning. High. Educ. Future
2021, 8, 133–141. [CrossRef]
2. Madsen, J.; Júlio, S.U.; Gucik, P.J.; Steinberg, R.; Parra, L.C. Synchronized Eye Movements Predict Test Scores in Online Video
Education. Proc. Natl. Acad. Sci. USA 2021, 118, e2016980118. [CrossRef] [PubMed]
3. Newmann, F.M. Student Engagement and Achievement in American Secondary Schools; Teachers College Press: New York, NY, USA,
1992.
4. Polderman, T.J.C.; Boomsma, D.I.; Bartels, M.; Verhulst, F.C.; Huizink, A.C. A Systematic Review of Prospective Studies on
Attention Problems and Academic Achievement. Acta Psychiatr. Scand. 2010, 122, 271–284. [CrossRef] [PubMed]
5. Reynolds, R.E.; Shirey, L.L. The Role of Attention in Studying And Learning. In Learning and Study Strategies; Weinstein, C.E.,
Goetz, E.T., Alexander, P.A., Eds.; Academic Press: San Diego, CA, USA, 1988; pp. 77–100. [CrossRef]
6. Chen, C.-M.; Wang, J.-Y.; Yu, C.-M. Assessing the Attention Levels of Students by Using a Novel Attention Aware System Based
on Brainwave Signals. Br. J. Educ. Technol. 2017, 48, 348–369. [CrossRef]
7. Chen, C.-M.; Wang, J.-Y. Effects of Online Synchronous Instruction with an Attention Monitoring and Alarm Mechanism on
Sustained Attention and Learning Performance. Interact. Learn. Environ. 2018, 26, 427–443. [CrossRef]
8. Lin, C.-H.; Chen, C.-M.; Lin, Y.-T. Improving Effectiveness of Learners’ Review of Video Lectures by Using an Attention-Based
Video Lecture Review Mechanism Based on Brainwave Signals. In 2018 7th International Congress on Advanced Applied Informatics
(IIAI-AAI); IEEE: Yonago, Japan, 2018; pp. 152–157. [CrossRef]
9. Li, Q.; Ren, Y.; Wei, T.; Wang, C.; Liu, Z.; Yue, J. A Learning Attention Monitoring System via Photoplethysmogram Using
Wearable Wrist Devices. In Artificial Intelligence Supported Educational Technologies; Pinkwart, N., Liu, S., Eds.; Advances in
Analytics for Learning and Teaching; Springer International Publishing: Cham, Switzerland, 2020; pp. 133–150. [CrossRef]
10. Abate, A.F.; Cascone, L.; Nappi, M.; Narducci, F.; Passero, I. Attention Monitoring for Synchronous Distance Learning. Future
Gener. Comput. Syst. 2021, 125, 774–784. [CrossRef]
11. Terraza Arciniegas, D.F.; Amaya, M.; Piedrahita Carvajal, A.; Rodriguez-Marin, P.A.; Duque-Muñoz, L.; Martinez-Vargas, J.D.
Students’ Attention Monitoring System in Learning Environments Based on Artificial Intelligence. IEEE Lat. Am. Trans. 2022, 20,
126–132. [CrossRef]
12. Alemdag, E.; Cagiltay, K. A Systematic Review of Eye Tracking Research on Multimedia Learning. Comput. Educ. 2018, 125,
413–428. [CrossRef]
13. Jamil, N.; Belkacem, A.N.; Lakas, A. On Enhancing Students’ Cognitive Abilities in Online Learning Using Brain Activity and
Eye Movements. Educ. Inf. Technol. 2023, 28, 4363–4397. [CrossRef]
14. Pouta, M.; Lehtinen, E.; Palonen, T. Student Teachers’ and Experienced Teachers’ Professional Vision of Students’ Understanding
of the Rational Number Concept. Educ. Psychol. Rev. 2021, 33, 109–128. [CrossRef]
15. Sharma, K.; Giannakos, M.; Dillenbourg, P. Eye-Tracking and Artificial Intelligence to Enhance Motivation and Learning. Smart
Learn. Environ. 2020, 7, 13. [CrossRef]
16. Tsai, M.-J.; Hou, H.-T.; Lai, M.-L.; Liu, W.-Y.; Yang, F.-Y. Visual Attention for Solving Multiple-Choice Science Problem: An
Eye-Tracking Analysis. Comput. Educ. 2012, 58, 375–385. [CrossRef]
17. Hasson, U.; Landesman, O.; Knappmeyer, B.; Vallines, I.; Rubin, N.; Heeger, D.J. Neurocinematics: The Neuroscience of Film.
Projections 2008, 2, 1–26. [CrossRef]
18. Yang, F.-Y.; Chang, C.-Y.; Chien, W.-R.; Chien, Y.-T.; Tseng, Y.-H. Tracking Learners’ Visual Attention during a Multimedia
Presentation in a Real Classroom. Comput. Educ. 2013, 62, 208–220. [CrossRef]
19. Liu, Q.; Yang, X.; Chen, Z.; Zhang, W. Using Synchronized Eye Movements to Assess Attentional Engagement. Psychol. Res. 2023,
87, 2039–2047. [CrossRef]
20. Esterman, M.; Rosenberg, M.D.; Noonan, S.K. Intrinsic Fluctuations in Sustained Attention and Distractor Processing. J. Neurosci.
2014, 34, 1724–1730. [CrossRef]
21. Felder, R.M.; Silverman, L.K. Learning and Teaching Styles in Engineering Education. Eng. Educ. 1988, 78, 674–681.
22. Nugrahaningsih, N.; Porta, M.; Klasnja-Milicevic, A. Assessing Learning Styles through Eye Tracking for E-Learning Applications.
ComSIS 2021, 18, 1287–1309. [CrossRef]
23. Al-Wabil, A.; ElGibreen, H.; George, R.P.; Al-Dosary, B. Exploring the Validity of Learning Styles as Personalization Parameters
in eLearning Environments: An Eyetracking Study. In Proceedings of the 2010 2nd International Conference on Computer
Technology and Development, Cairo, Egypt, 2–4 November 2010; pp. 174–178. [CrossRef]
24. Mehigan, T.J.; Barry, M.; Kehoe, A.; Pitt, I. Using Eye Tracking Technology to Identify Visual and Verbal Learners. In Proceedings
of the 2011 IEEE International Conference on Multimedia and Expo, Barcelona, Spain, 11–15 July 2011; pp. 1–6. [CrossRef]
25. Luo, Z. Using Eye-Tracking Technology to Identify Learning Styles: Behaviour Patterns and Identification Accuracy. Educ. Inf.
Technol. 2021, 26, 4457–4485. [CrossRef]
26. Cao, J.; Nishihara, A. Understanding Learning Style by Eye Tracking in Slide Video Learning. J. Educ. Multimed. Hypermedia 2012,
21, 335–358.
27. Cao, J.; Nishihara, A. Viewing Behaviors Affected by Slide Features and Learning Style in Slide Video from a Sequence Analysis
Perspective. J. Inf. Syst. Educ. 2013, 12, 1–12. [CrossRef]
28. Ou, C.; Joyner, D.A.; Goel, A.K. Designing and Developing Videos for Online Learning: A Seven-Principle Model. OLJ 2019, 23,
82–104. [CrossRef]
29. Mayer, R.E.; DaPra, C.S. An Embodiment Effect in Computer-Based Learning with Animated Pedagogical Agents. J. Exp. Psychol.
Appl. 2012, 18, 239–252. [CrossRef] [PubMed]
30. Wang, F.; Li, W.; Mayer, R.E.; Liu, H. Animated Pedagogical Agents as Aids in Multimedia Learning: Effects on Eye-Fixations
during Learning and Learning Outcomes. J. Educ. Psychol. 2018, 110, 250–268. [CrossRef]
31. Diedenhofen, B.; Musch, J. Cocor: A Comprehensive Solution for the Statistical Comparison of Correlations. PLoS ONE 2015, 10,
e0121945. [CrossRef]
32. Sauter, M.; Hirzle, T.; Wagner, T.; Hummel, S.; Rukzio, E.; Huckauf, A. Can Eye Movement Synchronicity Predict Test Performance
With Unreliably-Sampled Data in an Online Learning Context? In 2022 Symposium on Eye Tracking Research and Applications; ACM:
Seattle, WA, USA, 2022; pp. 1–5. [CrossRef]
33. Mu, L.; Cui, M.; Wang, X.; Qiao, J.; Tang, D. Learners’ Attention Preferences of Information in Online Learning: An Empirical
Study Based on Eye-Tracking. Interact. Technol. Smart Educ. 2019, 16, 186–203. [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual
author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to
people or property resulting from any ideas, methods, instructions or products referred to in the content.