A Machine Learning Approach For Fall Detection and Daily Living Activity Recognition
This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/ACCESS.2019.2906693, IEEE Access
Abstract—The number of older people in western countries is constantly increasing. Most of them prefer to live independently and are susceptible to fall incidents. Falls often lead to serious or even fatal injuries, which are the leading cause of death for the elderly. To address this problem, it is essential to develop robust fall detection systems. In this context, we develop a machine learning framework for fall detection and daily living activity recognition. We use acceleration and angular velocity data from two public databases to recognize seven different activities, including falls and activities of daily living. From the acceleration and angular velocity data, we extract time and frequency domain features and provide them to a classification algorithm. In this work, we test the performance of four algorithms for classifying human activities: artificial neural network (ANN), K-nearest neighbors (KNN), quadratic support vector machine (QSVM), and ensemble bagged tree (EBT). New features that improve the performance of the classifier are extracted from the power spectral density of the acceleration. In a first step, only the acceleration data are used for activity recognition. Our results reveal that the KNN, ANN, QSVM, and EBT algorithms achieve an overall accuracy of 81.2%, 87.8%, 93.2%, and 94.1%, respectively. The accuracy of fall detection reaches 97.2% and 99.1% without any false alarms for the QSVM and EBT algorithms, respectively. In a second step, we extract features from the autocorrelation function and the power spectral density of both the acceleration and the angular velocity data, which improves the classification accuracy. By using the proposed features, we achieve an overall accuracy of 85.8%, 91.8%, 96.1%, and 97.7% for the KNN, ANN, QSVM, and EBT algorithms, respectively. The accuracy of fall detection reaches 100% for both the QSVM and EBT algorithms without any false alarm, which is the best achievable performance.

Index Terms—Fall detection, activity recognition, machine learning, acceleration data, angular velocity data, feature extraction.

I. INTRODUCTION

Advances in the diagnosis and treatment of diseases have led to an increase in life expectancy. In every country, the percentage of elderly people in society is increasing. The World Health Organization (WHO) estimates that by 2050 the number of people over 60 years will exceed two billion [1]. With increasing age, people become more susceptible to falls. In fact, as the age increases from 65 to over 70 years, the rate of falls and fall-related injuries rises from 28% to 42% according to the WHO [2]. For people over 65 years of age, fall-related injuries were the leading cause of death in 2013 [3]. Moreover, fall-related injuries cause significant costs for society, making falls a major public health problem worldwide. The WHO estimates the number of fatal falls at 420,000 per year [4]. After a fall, rapid medical care can significantly reduce the potential damage from fall injuries, resulting in a higher survival rate. For this reason, fall detection systems that can detect and report falls as fast as possible are of great importance.

During the last years, the development of fall detection systems has become a hot research topic, and a plethora of fall detection systems are being developed using different approaches. The existing fall detection systems can be categorized into two main classes: (i) wearable device-based systems and (ii) context-aware systems [5]. Wearable device-based systems utilize a device worn by the user to detect falls. These devices integrate a gyroscope and an accelerometer that measure the acceleration and the angular velocity. The movement and activity of the user result in a temporal variation of the measured acceleration and angular velocity data, leaving different fingerprints for different activities. By analyzing the measured acceleration and angular velocity data, it is possible to determine the type of activity performed by the user. Several studies have investigated the performance of wearable device-based systems [6]–[10]. A big advantage of wearable device-based fall detection systems is that they can recognize human activity without compromising the user's privacy. Widely used smartphones with built-in accelerometer and gyroscope can also be used to measure the acceleration and angular velocity as the user moves and performs various activities. The measured data can be analyzed in real time to detect falls. This fall detection approach is very attractive because it requires no new equipment and is therefore cost-effective. However, if the user forgets to wear the device, it becomes impossible to monitor the person's activity. This represents the major limitation of wearable device-based systems.

Context-aware systems represent the second main category of fall detection systems. These systems are based on sensors placed in the area around the user to be monitored. The sensors used for monitoring encompass floor sensors, pressure sensors, microphones, and cameras. Context-aware systems can include a single type or many types of sensors, which are deployed in specific areas. This makes fall detection impossible if the user leaves the monitored area. The most common type of context-aware system is video surveillance. To detect falls, a camera is used to capture a series of images, which is subsequently processed by a classification algorithm to determine whether a fall has occurred or not [11]. The use of video surveillance for activity recognition and fall detection has been extensively investigated in the literature [12]–[17]. The main shortcoming

A. Chelli and M. Pätzold are with the Faculty of Engineering and Science, University of Agder, 4898 Grimstad, Norway (e-mails: {ali.chelli, matthias.paetzold}@uia.no).
This work is an extended version of a paper submitted to the IEEE Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC 2018), Bologna, Italy, September 2018.
2169-3536 (c) 2018 IEEE. Translations and content mining are permitted for academic research only. Personal use is also permitted, but republication/redistribution requires IEEE permission. See
http://www.ieee.org/publications_standards/publications/rights/index.html for more information.
of video surveillance systems is that they can compromise the user's privacy. For this reason, video surveillance is considered illegal in some countries [18]. Moreover, context-aware systems are susceptible to external events (e.g., changes in illuminance) and have high installation costs.

To evaluate the performance of fall detection systems, we need records of actual falls. However, it is very difficult to collect real-world fall data, especially for older people. Generally, we need to monitor people for several weeks to obtain records of a few actual falls. In the end, these few falls are not enough to accurately evaluate the performance of the developed fall detection system. Therefore, only a few studies have adopted this approach [19]–[21]. In the absence of data of actual falls, most researchers utilize simulated falls performed by volunteers. In addition to falls, these volunteers carry out activities of daily living (ADL) to check the accuracy of the developed fall detection system and its ability to differentiate between falls and ADL.

In the literature, several activity datasets are publicly available which allow evaluating fall detection methods and assessing their performance on real-world data. An ADL database which comprises acceleration and angular velocity data is provided in [6], where a script describing the set of activities to be carried out was provided to the participants. A total of 30 participants of different genders, ages, and weights contributed to this experiment. The experiment consisted in performing ADL activities including: standing, sitting, walking, walking upstairs, walking downstairs, and lying. To collect the acceleration and angular velocity data, a smartphone was attached to the waist of each participant. On average, the total recording time for each participant was 192 seconds. It is worth mentioning that the dataset in [6] does not include fall data, but only ADL activities. Fall-related data can be found in some public databases [7]–[10]. The authors of [7] provide a fall dataset which was collected from 42 participants. Both acceleration and angular velocity data were recorded during this experiment. The participants in this experiment were young healthy adults who performed planned falls. This fact makes the collected data different from that of real falls of elderly people. Due to the difficulty of gathering enough real fall data from older people, the use of mimicked fall data for testing the performance of fall detection systems is a well-accepted approach among researchers on this topic.

In this paper, we propose a machine learning framework for fall detection and activity recognition. Our first main contribution is related to the features used for fall detection. More specifically, we use the mean value of the triaxial acceleration and achieve a fall detection accuracy and precision of 96.8% and 100%, respectively. Even though the mean value of the triaxial acceleration is not intrinsically a new feature, since it was used in previous work [6] to classify ADLs, it was not utilized as a feature in the classification of falls [22], [23]. Note that by extracting only the mean value of the triaxial acceleration, we construct a feature vector of size 3. In [22], a feature vector of length 4 is used for fall detection. This resulted in a fall detection accuracy of 92% and a precision of 81%, while in [23] a feature vector of length 23 is utilized, leading to a fall detection accuracy of 93.5% and a precision of 94.2%. For our solution, we use a feature vector of length 3 and achieve a fall detection accuracy and precision of 96.8% and 100%, respectively. Thus, we outperform the fall detection systems in [22] and [23] in terms of accuracy and precision by using fewer features.

Our second main contribution consists in proposing new features that improve the classification accuracy of ADLs. For instance, we have proposed new power spectral density (PSD) features that enhance the classification accuracy, especially for the activities walking, walking upstairs, and walking downstairs. In the literature, several features were extracted from the PSD, such as the largest frequency value [6] and the mean frequency value [24]. However, in this paper, we extract the main peaks of the PSD and use them as a feature for activity classification. To the best of our knowledge, this feature has never been utilized before in activity classification. Moreover, we extract additional novel features, such as the peaks of the autocorrelation function (ACF) and the peaks of the cross-correlation function (CCF), which are extracted from the triaxial acceleration and the triaxial angular velocity signals. These proposed new features allow a more accurate distinction between different activities.

In this work, we combine the fall and ADL data from the datasets provided in [7] and [6]. These real-world data are then utilized to evaluate the performance of the proposed machine learning framework in human activity recognition. The acceleration and angular velocity signals are divided into buffers of 2.56 s length. From each buffer, we extract a feature vector of length 66, in a first step. To improve the accuracy of the classification, more features are extracted from each buffer, such that the length of the feature vector increases to 328. Note that the lengths of the considered feature vectors (66 and 328) are smaller than the number of features used in existing baseline solutions. We utilize 70% of the data to train the classifier, while 30% of the data are used to test the trained classifier. For a feature vector of length 66, we achieve a performance similar to that of existing solutions [24], while for a feature vector of length 328, our approach outperforms existing solutions.

In this paper, we assess the performance of four different classification algorithms, namely, the artificial neural network (ANN), K-nearest neighbors (KNN), quadratic support vector machine (QSVM), and ensemble bagged tree (EBT). In a first step, only the acceleration data are used for feature extraction. A feature vector of length 66 is built and provided as input to the classification algorithm. Our results reveal that the KNN algorithm has the worst performance, with an overall accuracy of 81.2%. The EBT algorithm has the best performance, with an overall accuracy of 94.1%. The ANN and QSVM algorithms achieve an overall accuracy of 87.8% and 93.2%, respectively. The accuracy of fall detection reaches 97.2% and 99.1% for the QSVM and EBT algorithms, respectively, without any false alarm. In a second step, we extract features from both the acceleration and the angular velocity data and construct a feature vector of length 328. This increase in the number of features improves the performance of the four classification algorithms. The KNN, the ANN, the QSVM, and the EBT algorithms achieve an overall accuracy of 85.8%, 91.8%, 96.1%, and 97.7%, respectively.
Fig. 4. PSDs Sabx(f) and Saby(f) of the body accelerations abx(t) and aby(t) pertaining to the activities walking and walking upstairs.
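The PSD-peak features used in this work can be sketched with plain NumPy: estimate the PSD of a buffer with a windowed periodogram and keep the largest local maxima. The estimator, the Hann window, and the number of peaks are illustrative assumptions, not the authors' exact method.

```python
import numpy as np

def psd_peaks(x, fs=50.0, n_peaks=3):
    """Frequencies and heights of the n_peaks largest local maxima of a
    simple periodogram estimate of the PSD of x."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    X = np.fft.rfft(x * np.hanning(len(x)))
    psd = (np.abs(X) ** 2) / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    # local maxima: strictly greater than both neighbours
    idx = [i for i in range(1, len(psd) - 1)
           if psd[i] > psd[i - 1] and psd[i] > psd[i + 1]]
    idx.sort(key=lambda i: psd[i], reverse=True)
    idx = idx[:n_peaks]
    return freqs[idx], psd[idx]

# A 2 Hz oscillation (roughly a walking cadence) plus a weaker 6 Hz component
t = np.arange(0, 2.56, 1 / 50.0)
x = np.sin(2 * np.pi * 2.0 * t) + 0.1 * np.sin(2 * np.pi * 6.0 * t)
f_pk, p_pk = psd_peaks(x, n_peaks=2)
print(f_pk)
```

The dominant peak frequency lands near 2 Hz, which is exactly the kind of rate-of-oscillation information that separates walking from walking upstairs or downstairs in Fig. 4.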
by the number of actual falls, i.e.,

FN Rate = FN / Number of actual falls = FN / (TP + FN) = 4/125 = 3.2%. (12)

The FN rate indicates the percentage of falls undetected by the system. It is desirable that the fall detection system has a very low FN rate.

Using Table II, we can compute the accuracy and precision for fall detection as follows:

Accuracy = TP / (TP + FN) = 121 / (121 + 4) = 96.8% (13)
Precision = TP / (TP + FP) = 121 / 121 = 100%. (14)

TABLE II
CONFUSION MATRIX OF THE BINARY CLASSIFICATION PROBLEM OF THE ANN ALGORITHM OBTAINED USING THE FEATURES FROM SUBSET A.

                     Actual Non-Fall   Actual Fall
Predicted Non-Fall   3075 (TN)         4 (FN)
Predicted Fall       0 (FP)            121 (TP)

The precision of the ANN algorithm achieved with the features of the Subsets A, B, and C is provided in Table IV. This table shows that the precision of the predicted falls reaches 100% regardless of whether we use the features of Subset A, B, or C. This implies that there are no false alarms and that all fall events detected by the algorithm are real falls. In contrast, as the number of features increases, the false alarm rate for walking decreases. For example, if we use the features of Subset B instead of Subset A, the classification precision for walking is enhanced by 48.2%. Additionally, if we use the features of Subset C instead of Subset A, the classification precision is improved by 28.6%, 28.1%, 8.3%, and 47.2%, respectively, for the activities walking upstairs, walking downstairs, sitting, and standing.

TABLE IV
PRECISION OF THE ANN CLASSIFIER FOR VARIOUS ACTIVITIES AND DIFFERENT FEATURE SUBSETS.

           Precision %
Features   Wal.   Up.    Dow.   Sit.   Sta.   Ly.    Fal.
Subset A   29.4   56.1   57.9   75.5   36.3   99.5   100
Subset B   77.6   79.9   80.8   83.7   82     99.8   100
Subset C   84.2   84.7   86     83.8   83.5   99.8   100
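The metrics in (12)-(14) follow directly from the entries of Table II, as the short sketch below shows (note that the "accuracy" of (13) is what is more commonly called recall or sensitivity; the function name is ours, not the paper's).

```python
def fall_metrics(tp, fn, fp, tn):
    """Fall-detection metrics as defined in (12)-(14):
    FN rate = FN/(TP+FN), accuracy = TP/(TP+FN), precision = TP/(TP+FP)."""
    fn_rate = fn / (tp + fn)
    accuracy = tp / (tp + fn)
    precision = tp / (tp + fp)
    return fn_rate, accuracy, precision

# Values from Table II (ANN classifier, feature Subset A)
fn_rate, acc, prec = fall_metrics(tp=121, fn=4, fp=0, tn=3075)
print(f"FN rate = {fn_rate:.1%}, accuracy = {acc:.1%}, precision = {prec:.1%}")
```

Running this reproduces the 3.2% FN rate, 96.8% accuracy, and 100% precision computed in (12)-(14).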
The classifier distinguishes lying and falling from the other activities. For the remaining five activities, the classifier confuses standing and sitting, since these two activities are static. Besides, the classifier does not differentiate well between the dynamic activities walking, walking upstairs, and walking downstairs. However, the misclassification rate among dynamic activities drops significantly by using the features from Subset B instead of those from Subset A, as shown in Table III. This demonstrates that the PSD features allow achieving a higher accuracy in recognizing dynamic activities, since each of these activities has its own rate and shape of oscillations, as discussed in Section III. Note that the classifier rarely misclassifies dynamic activities as static and vice versa. For example, the number of misclassifications of the activity walking as standing drops from 182 to 3 by using the feature Subset C instead of Subset A. This reveals that the features in Subset C allow distinguishing static and dynamic activities.

In Table V, we provide the confusion matrix of the binary classification problem, where we classify the data into fall and non-fall classes. Table V is obtained when using the ANN classifier with the features from Subset C. From Table V, we see that the number of FP equals 0 and the number of FN is 4. The FP rate and FN rate can be computed using (11) and (12), which results in 0% and 3.2%, respectively. Utilizing (13) and (14), we can compute the accuracy and precision of fall detection, which are equal to 96.8% and 100%, respectively.

Fig. 7. Confusion matrix of the QSVM algorithm obtained using the features from Subset C.

In Table VI, we provide the confusion matrix of the binary classification problem, where we classify the data into fall and non-fall classes. Table VI is obtained when using the QSVM algorithm with the features from Subset C. From Table VI, we see that the number of FP equals 0 and the number of FN is 3. The FP rate is equal to 0%, while the FN rate is 2.75%. The accuracy and precision for fall detection are equal to 97.25% and 100%, respectively.

TABLE VI
CONFUSION MATRIX OF THE BINARY CLASSIFICATION PROBLEM OF THE QSVM ALGORITHM OBTAINED USING THE FEATURES FROM SUBSET C.
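Subset C also contains the autocorrelation-based features introduced among the contributions above. The intuition is that periodic (dynamic) activities produce a pronounced secondary ACF peak, while static activities do not; the following NumPy sketch illustrates this (our illustration, not the authors' pipeline).

```python
import numpy as np

def acf(x):
    """Biased sample autocorrelation, normalized so that acf[0] == 1."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    r = np.correlate(x, x, mode="full")[len(x) - 1:]
    return r / r[0]

def first_acf_peak(x):
    """Height of the first local maximum of the ACF after lag 0:
    close to 1 for strongly periodic (dynamic) signals, small for static ones."""
    r = acf(x)
    for lag in range(1, len(r) - 1):
        if r[lag] > r[lag - 1] and r[lag] > r[lag + 1]:
            return r[lag]
    return 0.0

t = np.arange(0, 2.56, 1 / 50.0)
walking_like = np.sin(2 * np.pi * 2.0 * t)           # periodic, "dynamic"
rng = np.random.default_rng(1)
standing_like = 0.01 * rng.normal(size=t.size)       # aperiodic, "static"
print(first_acf_peak(walking_like), first_acf_peak(standing_like))
```

The periodic buffer yields a large ACF peak at the lag of one stride period, while the static buffer yields only a small, noise-level peak, which is why such features help separate static from dynamic activities.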
…algorithms, while the EBT algorithm has the best performance in terms of the overall accuracy. By using the EBT algorithm …

Fig. 8. Confusion matrix of the KNN algorithm obtained using the features from Subset C.

In Table VII, we provide the confusion matrix of the binary classification problem. Table VII is obtained when using the KNN algorithm with the features from Subset C. From Table VII, we see that the number of FP equals 4 and the number of FN is 8. The FP rate is equal to 3.77%, while the FN rate is 7.27%. The accuracy and precision of fall detection are equal to 92.73% and 96.23%, respectively.

TABLE VII
CONFUSION MATRIX OF THE BINARY CLASSIFICATION PROBLEM OF THE KNN ALGORITHM OBTAINED USING THE FEATURES FROM SUBSET C.

                     Actual Non-Fall   Actual Fall
Predicted Non-Fall   3085 (TN)         8 (FN)
Predicted Fall       4 (FP)            102 (TP)

Table VIII illustrates the confusion matrix of the binary classification problem resulting from using the EBT algorithm …

B. Comparison

In this work, we obtained the acceleration data for ADL activities and falls from two different databases. This fact makes it difficult to compare our results to existing work in the literature. In [24], the authors use the support vector machine (SVM) algorithm to classify six ADL activities using the same acceleration data that we use in this paper. Therefore, we can roughly compare our results to those obtained in [24]. The classification accuracies in [24] for the activities walking, walking upstairs, walking downstairs, standing, sitting, and lying are equal to 95.6%, 69.8%, 83.2%, 93%, 96.4%, and 100%, respectively, while the overall accuracy reaches 89.3%. In our case, we achieve a better overall accuracy of 93.2% if we use the QSVM algorithm and the features extracted only from the acceleration signal. Note that in our case we classify seven different activities compared to six activities in [24]. Our solution improves the classification accuracy for the activities walking, walking upstairs, and walking downstairs by 0.3%, 26.1%, and 6.6% compared to the method proposed in [24]. On the other hand, the solution in [24] outperforms our method …
…x, y, z), the triaxial angular velocity ωi(t) (i = x, y, z), the magnitude of the body acceleration ‖ab(t)‖, and the …

It is important to mention that machine learning algorithms have a nested and non-linear structure, which makes it difficult to understand how classifiers can achieve a high recognition …
TABLE IX TABLE X
C ONFUSION MATRIX OF THE BINARY CLASSIFICATION PROBLEM OF THE C ONFUSION MATRIX OF THE BINARY CLASSIFICATION PROBLEM OF THE
KNN ALGORITHM OBTAINED USING 328 FEATURES . ANN ALGORITHM OBTAINED USING 328 FEATURES .
0 458 4 1 0 0 0 98.9%
2
0.0% 14.3% 0.1% 0.0% 0.0% 0.0% 0.0% 1.1%
498 6 19 4 4 0 0 93.8%
1
15.6% 0.2% 0.6% 0.1% 0.1% 0.0% 0.0% 6.2% 1 2 416 0 0 0 0 99.3%
3
0.0% 0.1% 13.0% 0.0% 0.0% 0.0% 0.0% 0.7%
12 443 21 4 1 0 0 92.1%
2
0.4% 13.8% 0.7% 0.1% 0.0% 0.0% 0.0% 7.9% 0 1 0 488 66 0 0 87.9%
4
0.0% 0.0% 0.0% 15.3% 2.1% 0.0% 0.0% 12.1%
15 16 390 0 0 0 0 92.6%
3
0.5% 0.5% 12.2% 0.0% 0.0% 0.0% 0.0% 7.4% 0 1 0 45 505 0 0 91.7%
5
0.0% 0.0% 0.0% 1.4% 15.8% 0.0% 0.0% 8.3%
0 1 0 445 60 0 0 87.9%
4
0.0% 0.0% 0.0% 13.9% 1.9% 0.0% 0.0% 12.1% 0 0 0 0 0 583 0 100%
6
0.0% 0.0% 0.0% 0.0% 0.0% 18.2% 0.0% 0.0%
0 0 0 96 490 0 0 83.6%
5
0.0% 0.0% 0.0% 3.0% 15.3% 0.0% 0.0% 16.4% 0 0 0 0 0 0 110 100%
7
0.0% 0.0% 0.0% 0.0% 0.0% 0.0% 3.4% 0.0%
0 0 0 1 0 561 1 99.6%
6
0.0% 0.0% 0.0% 0.0% 0.0% 17.5% 0.0% 0.4% 99.8% 98.9% 98.6% 91.4% 88.4% 100% 100% 96.1%
0.2% 1.1% 1.4% 8.6% 11.6% 0.0% 0.0% 3.9%
0 0 0 0 0 0 112 100%
7
0.0% 0.0% 0.0% 0.0% 0.0% 0.0% 3.5% 0.0% 1 2 3 4 5 6 7
1 2 3 4 5 6 7
Fig. 13. Confusion matrix of the QSVM algorithm obtained using 328
features.
Fig. 12. Confusion matrix of the ANN algorithm obtained using 328 features. Table XI illustrates the confusion matrix of the binary clas-
sification problem obtained when using the QSVM algorithm
Table X provides the confusion matrix of the binary classifi- with 328 features. From Table XI, we observe that the number
cation problem resulting from using the ANN algorithm with of FP equals 0 and the number of FN is 0. The FP rate and FN
328 features. From Table X, we see that the number of FP rate are both equal to 0%, this implies that the fall detection
equals 0 and the number of FN is 1. The FP rate is equal to system has zero false alarm and has zero undetected falls.
0%, while the FN rate is 0.88%. The accuracy and precision TABLE XI
for fall detection are equal to 99.12% and 100%, respectively. C ONFUSION MATRIX OF THE BINARY CLASSIFICATION PROBLEM OF THE
QSVM ALGORITHM OBTAINED USING 328 FEATURES .
In Figs. 13 and 14, we provide the confusion matrices
Actual Non-Fall Actual Fall
for the QSVM and the EBT algorithms, respectively. These Predicted Non-Fall 3089 (TN) 0 (FN)
two algorithms have a better performance compared to the Predicted Fall 0 (FP) 110 (TP)
KNN and the ANN algorithms. The QSVM and the EBT
algorithms achieve an overall accuracy of 96.1% and 97.7%, Table XII shows the confusion matrix of the binary classifi-
2169-3536 (c) 2018 IEEE. Translations and content mining are permitted for academic research only. Personal use is also permitted, but republication/redistribution requires IEEE permission. See
http://www.ieee.org/publications_standards/publications/rights/index.html for more information.
This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/ACCESS.2019.2906693, IEEE Access
16
Table XII shows the confusion matrix of the binary classification problem obtained when using the EBT algorithm with 328 features. From Table XII, we see that the number of FP equals 0 and the number of FN is 0. The FP rate and the FN rate are both equal to 0%; thus, the fall detection system has an accuracy of 100% and generates zero false alarms.

Predicted\Actual      1      2      3      4      5      6      7   Precision
1                   511      0      4      0      0      0      0     99.2%
2                     2    457      3      0      0      0      0     98.9%
3                     3      5    415      0      0      0      0     98.1%
4                     0      0      0    505     26      0      0     95.1%
5                     0      1      0     29    545      0      0     94.8%
6                     0      0      0      0      0    584      0      100%
7                     0      0      0      0      0      0    109      100%
Recall             99.0%  98.7%  98.3%  94.6%  95.4%   100%   100%    97.7%

Fig. 14. Confusion matrix of the EBT algorithm obtained using 328 features.

TABLE XII
CONFUSION MATRIX OF THE BINARY CLASSIFICATION PROBLEM OF THE EBT ALGORITHM OBTAINED USING 328 FEATURES.

                     Actual Non-Fall   Actual Fall
Predicted Non-Fall   3090 (TN)         0 (FN)
Predicted Fall       0 (FP)            109 (TP)

V. CONCLUSION

A robust fall detection system is essential to support the independent living of elderlies. In this paper, we have proposed a machine learning approach for fall detection and ADL recognition. We have tested the performance of four algorithms in recognizing the activities falling, walking, walking upstairs, walking downstairs, sitting, standing, and lying based on the acceleration and the angular velocity data. We have proposed new time and frequency domain features and have demonstrated their importance and their positive impact on the accuracy and precision of the classifier.

Moreover, we have tested the performance of the KNN, ANN, QSVM, and EBT classification algorithms on real-world acceleration data obtained from public databases. The internal parameters of these algorithms have been optimized using the training data. Afterwards, the performance of the trained algorithms has been assessed using the test data. In a first step, only the acceleration data have been used for activity recognition. A feature vector of size 66 has been obtained and provided as an input to the classification algorithms. Our results reveal that the KNN, ANN, QSVM, and EBT algorithms achieve an overall accuracy of 81.2%, 87.8%, 93.2%, and 94.1%, respectively.

In a second step, we have extracted new features from both the acceleration and the angular velocity data, which has significantly improved the performance of the four classification algorithms. The constructed feature vector has a size of 328. Using the proposed feature vector, we have shown that the KNN, ANN, QSVM, and EBT algorithms achieve an overall accuracy of 85.8%, 91.8%, 96.1%, and 97.7%, respectively. It is worth mentioning that the accuracy of fall detection for QSVM and EBT reaches 100% with no false alarms, which is the best achievable performance.
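As a sanity check on the binary results, the 7-class EBT confusion matrix of Fig. 14 can be collapsed into the fall/non-fall matrix of Table XII. The sketch below assumes rows index predicted classes and columns actual classes, and that the fall class is the one with 109 test samples (class 7 in Fig. 14); both are inferences from the reported counts, not statements from the paper.

```python
# Collapse the 7-class EBT confusion matrix (counts from Fig. 14) into
# the binary fall/non-fall matrix of Table XII. Rows: predicted class,
# columns: actual class. Treating class 7 as the fall class is an
# assumption inferred from the 109 fall samples reported in Table XII.

EBT_CM = [
    [511,   0,   4,   0,   0,   0,   0],
    [  2, 457,   3,   0,   0,   0,   0],
    [  3,   5, 415,   0,   0,   0,   0],
    [  0,   0,   0, 505,  26,   0,   0],
    [  0,   1,   0,  29, 545,   0,   0],
    [  0,   0,   0,   0,   0, 584,   0],
    [  0,   0,   0,   0,   0,   0, 109],
]
FALL = 6  # zero-based index of the assumed fall class (class 7)

tp = EBT_CM[FALL][FALL]
fn = sum(EBT_CM[i][FALL] for i in range(7) if i != FALL)  # falls predicted as ADL
fp = sum(EBT_CM[FALL][j] for j in range(7) if j != FALL)  # ADLs predicted as fall
tn = sum(EBT_CM[i][j] for i in range(7) for j in range(7)
         if i != FALL and j != FALL)

print(tn, fp, fn, tp)  # 3090 0 0 109, matching Table XII
```

The diagonal sum (3126 of 3199 samples) also reproduces the 97.7% overall accuracy quoted for the EBT classifier.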
[7] O. Ojetola, E. Gaura, and J. Brusey, "Data set for fall events and daily activities from inertial sensors," in 6th ACM Multimedia Systems Conference - MMSys '15, Portland, OR, USA, Mar. 2015, pp. 243–248.
[8] A. Wertner, P. Czech, and V. Pammer-Schindler, "An open labelled dataset for mobile phone sensing based fall detection," in 12th EAI International Conference on Mobile and Ubiquitous Systems: Computing, Networking and Services (MOBIQUITOUS 2015), Coimbra, Portugal, Jul. 2015, pp. 277–278.
[9] A. Sucerquia, J. D. López, and J. F. Vargas-Bonilla, "SisFall: A fall and movement dataset," Sensors, vol. 17, no. 1, pp. 1–14, Jan. 2017.
[10] E. Casilari, J. A. Santoyo-Ramón, and J. M. Cano-García, "Analysis of a smartphone-based architecture with multiple mobility sensors for fall detection," PLoS ONE, vol. 11, pp. 1–17, Dec. 2016.
[11] O. P. Popoola and K. Wang, "Video-based abnormal human behavior recognition–A review," IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews), vol. 42, no. 6, pp. 865–878, Nov. 2012.
[12] C. Rougier, J. Meunier, A. St-Arnaud, and J. Rousseau, "Robust video surveillance for fall detection based on human shape deformation," IEEE Transactions on Circuits and Systems for Video Technology, vol. 21, no. 5, pp. 611–622, May 2011.
[13] C. Zhang, Y. Tian, and E. Capezuti, "Privacy preserving automatic fall detection for elderly using RGBD cameras," in International Conference on Computers for Handicapped Persons (ICCHP 2012), Linz, Austria: Springer, Berlin, Heidelberg, Jul. 2012, pp. 625–633.
[14] I. Charfi, J. Miteran, J. Dubois, M. Atri, and R. Tourki, "Definition and performance evaluation of a robust SVM based fall detection solution," in 2012 Eighth International Conference on Signal Image Technology and Internet Based Systems, Naples, Italy: IEEE, Nov. 2012, pp. 218–224.
[15] H. A. Nguyen and J. Meunier, "Gait analysis from video: camcorders vs. Kinect," in International Conference Image Analysis and Recognition (ICIAR 2014), Vilamoura, Portugal: Springer, Oct. 2014, pp. 66–73.
[16] R. K. Tripathy, L. N. Sharma, S. Dandapat, B. Vanrumste, and T. Croonenborghs, "Bridging the gap between real-life data and simulated data by providing a highly realistic fall dataset for evaluating camera-based fall detection algorithms," Healthcare Technology Letters, vol. 3, no. 1, pp. 6–11, Mar. 2016.
[17] K. Sehairi, F. Chouireb, and J. Meunier, "Comparative study of motion detection methods for video surveillance systems," Journal of Electronic Imaging, vol. 26, no. 2, pp. 26–29, Apr. 2017.
[18] J. Klonovs et al., Distributed Computing and Monitoring Technologies for Older Patients, 1st ed. London, UK: SpringerBriefs in Computer Science, 2016.
[19] A. Bourke, P. Van de Ven, A. Chaya, G. OLaighin, and J. Nelson, "Testing of a long-term fall detection system incorporated into a custom vest for the elderly," in 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBS 2008), Vancouver, BC, Canada, Aug. 2008, pp. 2844–2847.
[20] P. Barralon, I. Dorronsoro, and E. Hernandez, "Automatic fall detection: Complementary devices for a better fall monitoring coverage," in IEEE 15th International Conference on e-Health Networking, Applications and Services (Healthcom 2013), Lisbon, Portugal, Oct. 2013, pp. 590–593.
[21] P. Kostopoulos, T. Nunes, K. Salvi, M. Deriaz, and J. Torrent, "F2D: A fall detection system tested with real data from daily life of elderly people," in 17th International Conference on e-Health Networking, Application & Services (HealthCom), Boston, MA, USA, Oct. 2015, pp. 397–403.
[22] O. Ojetola, E. I. Gaura, and J. Brusey, "Fall detection with wearable sensors–Safe (smart fall detection)," in Seventh International Conference on Intelligent Environments, Jul. 2011, pp. 318–321.
[23] P. Putra, J. Brusey, and E. Gaura, "A cascade-classifier approach for fall detection," in 5th EAI International Conference on Wireless Mobile Communication and Healthcare (MOBIHEALTH'15), London, UK, Oct. 2015, pp. 94–99.
[24] D. Anguita, A. Ghio, L. Oneto, X. Parra, and J. L. Reyes-Ortiz, "Human activity recognition on smartphones using a multiclass hardware-friendly support vector machine," in International Workshop of Ambient Assisted Living, Vitoria-Gasteiz, Spain, Dec. 2012, pp. 216–223.
[25] A. B. Williams and F. J. Taylor, Electronic Filter Design Handbook, 4th ed. New York, USA: McGraw-Hill, 2006.
[26] J.-Y. Yang, J.-S. Wang, and Y.-P. Chen, "Using acceleration measurements for activity recognition: An effective learning algorithm for constructing neural classifiers," Pattern Recognition Letters, vol. 29, no. 16, pp. 2213–2220, 2008.
[27] A. M. Khan, Y.-K. Lee, S. Y. Lee, and T.-S. Kim, "Human activity recognition via an accelerometer-enabled-smartphone using kernel discriminant analysis," in 2010 5th International Conference on Future Information Technology, Busan, South Korea, May 2010, pp. 1–6.
[28] C. M. Bishop, Pattern Recognition and Machine Learning, 1st ed. Cambridge, UK: Springer, 2006.
[29] I. Steinwart and A. Christmann, Support Vector Machines, 1st ed. New York, USA: Springer, 2008.
[30] T. G. Dietterich, "An experimental comparison of three methods for constructing ensembles of decision trees: Bagging, boosting, and randomization," Machine Learning, vol. 40, no. 2, pp. 139–157, Aug. 2000.
[31] R. W. Schafer, "What is a Savitzky-Golay filter? [Lecture notes]," IEEE Signal Processing Magazine, vol. 28, no. 4, pp. 111–117, Jul. 2011.
[32] D. Bailey and P. Swarztrauber, "A fast method for the numerical evaluation of continuous Fourier and Laplace transforms," SIAM Journal on Scientific Computing, vol. 15, no. 5, pp. 1105–1110, Sep. 1994.
[33] J. M. Knudsen and P. G. Hjorth, Elements of Newtonian Mechanics: Including Nonlinear Dynamics, 3rd ed. Springer-Verlag Berlin Heidelberg, 2000.
[34] M. Vallejo, C. Isaza, and J. López, "Artificial Neural Networks as an alternative to traditional fall detection methods," in 35th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Osaka, Japan, Jul. 2013, pp. 1648–1651.
[35] A. Vellido, J. D. Martín-Guerrero, and P. J. G. Lisboa, "Making machine learning models interpretable," in European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning (ESANN 2012), Bruges, Belgium, Apr. 2012, pp. 163–172.
[36] W. Samek, T. Wiegand, and K. Müller, "Explainable artificial intelligence: Understanding, visualizing and interpreting deep learning models," arXiv e-prints, arXiv:1708.08296, Aug. 2017.