Automated IoT Device Identification Based on Full Packet
Information Using Real-Time Network Traffic
Narges Yousefnezhad 1, Avleen Malhi 1,2 and Kary Främling 1,3,*
If certificates are accessed by an unauthorized entity, identity theft becomes possible, allowing the transmission of false data to other devices in the network. This affects decision making, because decisions are usually based on data aggregated from different devices, so a single falsification can compromise the entire system. As a countermeasure, hardware-based and software-based approaches can be applied. The Root of Trust (RoT) is one such solution: it can be provided either in hardware, such as a Trusted Platform Module (TPM) chip, or in software, such as a Trusted Execution Environment (TEE). With a RoT embedded on an IoT device, the authenticity of other certificates can be verified by the root to which they are chained [5].
Apart from hardware-based countermeasures against such identity theft, the system should also be capable of identifying these threats itself. The identity claimed by a user is validated and proven by authentication techniques, including certificates, local user/password settings, or a connection to an OAuth server (shown as client-side identification in Figure 1). This problem has been addressed using Facebook authentication with OAuth2 together with an Access Control List (ACL) approach, where access rules are specified in a tree-based information structure [6].
In contrast, there are very few options for verifying device identity, shown as device-side identification in Figure 1. The main challenge on the device side is to authenticate the origin of the messages received by the server so that identity theft can be detected. One solution is a certificate, but certificates can easily be spoofed. Device fingerprinting is a more robust solution, with which network administrators can automate the identification of connected devices [7]. Similarly, device profiling enables continuous device identification by monitoring behavioral features.
Paper Contribution: This work addresses the challenge of IoT device identification within a network by analyzing and classifying network traffic data from arriving packets with high accuracy using machine learning (ML) approaches. We present an identification approach based on sensor measurements, statistics, and header information for device profiling, monitoring the data packets coming from smart devices to protect the server from receiving and spreading false data. Each device's behavior is defined by its features, which are characterized by the seven profiling models specified in the proposed framework. We adopt ML methods to learn the unique features of each IoT device without expert supervision and evaluate them in terms of network performance and algorithmic accuracy. As a proof of concept, we implement a demonstrator system comprising temperature and humidity sensors integrated through the open standards Open Messaging Interface (O-MI) and Open Data Format (O-DF). The present paper is an extended version of our paper presented at the INDIN conference [8]. It significantly expands the feature sets and develops realistic results by running a real use case, collecting real data, and evaluating the system under various attack scenarios.
2. Literature Review
Since multiple users and devices need to authenticate each other using trustable services, identity authentication must be managed in IoT networks. The idea of device identification for handling data privacy was first coined by Sarma and Girão [9] in 2009; however, most identity management systems based on Service-Oriented Architecture (SOA) in IoT, such as Shibboleth, CardSpace, and Liberty Alliance, rarely considered device identity in their frameworks [10]. To verify a device's identity, a security module can employ either the identity itself or the device's specific features. Recently, researchers have adopted various types of communication-network features for device identification. The most commonly applied features include location based on the Global Positioning System (GPS) and Wi-Fi [11,12]; the familiarity of devices derived from Bluetooth; time [12]; identity (MAC—Media Access Control, IP—Internet Protocol, or RFID—Radio-Frequency Identification tag) [12]; unique hardware-specific characteristics such as clock skew [13–15]; device fingerprinting using the MAC address and packet properties such as client and server addresses and ports [16]; inter-arrival time [17]; and padding, packet size, and destination IP counters [18]. During attack detection, real-time and environmental factors also need to be considered to avoid false alarms [19].
Header information has been adopted extensively for device identification in the literature. One technique identifies the device model and type by calculating the similarity of header information; it is built for factory-used devices and network cameras and relies on general communication information [20]. Another method employs device fingerprinting for authentication and identification by training an ML model on features extracted from network traffic to detect devices of similar types [21]. An automated classification system for device characteristics, known as System Identifier (SysID), was also developed [4]; its authors applied Genetic Algorithms (GA) to select the most distinctive features and various ML methods to classify the device type. ML algorithms were used in another approach on network traffic data to identify the devices connected in an IoT network [22]. Labeled data collected from nine unique IoT devices, smartphones, and personal computers were employed to train and validate the classifier. A multi-stage meta-classifier was trained with supervised learning: the first stage distinguishes traffic generated by IoT and non-IoT devices, and the second stage associates a unique class with each IoT device.
In some use cases, devices outside the white list (i.e., not allowed within any premises of the organization) need to be detected among trustworthy IoT device types. For this purpose, a supervised ML algorithm, Random Forest (RF), was trained on features extracted from network traffic data with the objective of correctly identifying the IoT device types on the white list [23]. A multi-stage classification algorithm based on network activity demonstrated its ability to identify particular IoT devices [24]. Another device identification approach analyzes packet sequences from high-level network-flow traffic with supervised ML techniques, extracting distinct flow-based features to create device fingerprints [25]. That approach automatically identifies white-listed device types as well as individual device instances in the IoT network. Furthermore, a system security model was designed to enforce rules that constrain IoT device communications according to their privileges; this helps identify suspicious devices with abnormal behavior and restrict their communication pending further monitoring. A similar ML approach based on sent and received packet streams was proposed to recognize connected IoT device types in an experimental smart home network, and the designed model helps describe IoT device network behaviors [26]. Another system, AuDI (Autonomous Device Identification) [27], identifies the device type from IoT network traffic by analyzing device-related network communications. An unsupervised learning algorithm models the periodic communication traffic of IoT devices for identification. Hence, after the initial learning phase, AuDI operates fully automatically, identifying previously unfamiliar devices and device types in any operational mode or device lifecycle stage.
Table 1 presents a summary of the state of the art from various perspectives. Most current identification methods focus on identifying IoT device types from a predefined list of legitimate devices, whereas our method verifies device identities by building a profiling model for each device. The method proposed in this paper is a traffic analysis method, with certain differences. Because payloads in traffic analysis are mostly encrypted, payload information is normally unusable for constructing fingerprints. In our implementation, however, the messaging standards used (O-MI/O-DF) make unencrypted payload data accessible on the server. We therefore adopt payload data for device identification alongside header information and generated statistical features. The idea of using payload data, or sensor measurements, for device identification was initially proposed in [8]. That identification framework employed only payload data and some statistical features extracted from a public dataset, without considering any attack scenarios. In the current paper, by collecting data from a real use case, header information can also be extracted and combined with payload data, creating a diverse feature set. In addition, this paper extends the experimental results by evaluating the classification results under various attack situations.
Table 1. Summary of the state of the art.

Ref. | Objective | Method | Features | Devices/Dataset | Attack Scenario
[20] | Identify the device type and model | Calculating the similarity of features | Communication features extracted from headers | Network cameras and factory-used devices | No attack
[21] | Employ behavioral fingerprinting for identification and authentication | K-nearest neighbors (K-NN), Decision Trees (DT), gradient boosting, and majority voting | Header features and payload-based features | 14 home IoT devices | No attack
[4] | Automatically classify IoT devices using TCP/IP packets | ML algorithms (DT, J48, OneR, PART) to classify device type | GA to determine the most unique features from the network, transport, and application layers | A database from [18] | No attack
[22] | Identify IoT devices using ML algorithms on network traffic data | Two-stage classifier: I. distinguish IoT vs. non-IoT; II. determine device class | Features from the network, transport, and application layers, plus data from Alexa Rank and GeoIP | 9 distinct IoT devices, plus PCs and smartphones | No attack
[23] | Identify IoT device types from the white list | Multi-class classifier using RF | Features from Transmission Control Protocol/Internet Protocol (TCP/IP) sessions | 17 IoT devices (9 device types) by different vendors | Violations of local organizational security policies
[24] | Classify IoT devices using traffic characteristics | Multi-stage ML: Stage 0, Naïve Bayes; Stage 1, RF | Statistical attributes: activity cycles, port numbers, signaling patterns, and cipher suites | A living lab with 28 IoT devices | User Datagram Protocol (UDP) reflection and TCP SYN attacks
[26] | Recognize IoT devices by analyzing the generated network traffic | RF, DT, Support Vector Machine (SVM), k-NN, Artificial Neural Network, and Gaussian Naïve Bayes | Size of the first 10 packets sent/received and interval times | Experimental smart home network of 4 devices | No attack
[25] | Automatically identify white-listed device types | ML classifiers (e.g., SVM and K-NN) | Behavioural and flow-based features | 31 off-the-shelf IoT devices (27 device types) | Adversaries compromising devices on the network
[27] | Identify the device type without human intervention | Unsupervised learning method | 4 types of features: periodic flaws, periodic accuracy, period duration, and period stability | A dataset comprising 33 typical commercial IoT devices | Spoofing device fingerprints
Our work | Identify the device using device profiling | ML methods (RF, SVM, and Logistic Regression (LR)) | Header information, sensor measurements, and statistical features | 2 types of sensors in an office | Physical and remote attacks (object emulation and Botnet attacks)
The framework is data driven: data are obtained from sensor devices via GPS (Global Positioning System), Wi-Fi, and Bluetooth at the beginning of the Data Collection layer. Simultaneously, a packet analyzer such as Pyshark captures the traffic features extracted from packet headers. The Feature Extraction module then applies all the features to calculate the feature vector that describes the current observation based on the values of the extracted features. Subsequently, the Data Preparation module sends a data query to the Sensor and Header DB to retrieve the necessary sensor and header information. During a device's learning phase, the sensor data and extracted features are used to train the device's model on a training and testing set. The Classifier DB stores the trained model produced by the Classification Engine.
Once the learning phase is completed, the prediction phase follows (red lines in Figure 2) as the next step in model processing. After the features are extracted from new input, the Classification Engine uses them together with the classifier models loaded from the Classifier DB to classify new observations. Once the classification result is ready, the engine assigns a security level (e.g., a binary value) to the result and forwards it to the Security Management layer, which considers the security level when verifying the device identity. The Identification module verifies the identity and labels the available feature set of the current observation as malicious or legitimate. The feature set in this last module can be composed of header features, sensor measurements, statistical features, or any combination of them.
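As a rough illustration of this prediction phase, the following sketch loads a stored profiling model and derives a binary security level; the storage format (joblib), paths, and function names are assumptions for the example, not the exact implementation.

```python
# Illustrative prediction-phase sketch; model persistence via joblib and the
# names below are assumptions, not the authors' exact code.
import joblib
import numpy as np

def classify_observation(feature_vector, model_path="classifier_db/device.joblib"):
    """Load a trained profiling model and derive a binary security level."""
    clf = joblib.load(model_path)              # model stored by the Classification Engine
    x = np.asarray(feature_vector, dtype=float).reshape(1, -1)
    legitimate = bool(clf.predict(x)[0])       # 1 = matches the device profile
    security_level = 1 if legitimate else 0    # binary level for Security Management
    return security_level
```

The Security Management layer would then label the observation as legitimate or malicious based on this level.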
4. Enforcement
The implementation details, including the system methodology and steps, model creation and selection, and feature extraction, are discussed in the following subsections.
4.1. Methodology
Figure 3 depicts the workflow of implementation steps for the device identification framework. As seen in the figure, the first step is to set up the environment, including the O-MI node, the security modules on top of the O-MI node, IoT gateways such as the Raspberry Pi (RPi) and ESP microcontroller boards, and the sensors connected to those gateways. From there on, all modules run on the O-MI server, which is deployed on a virtual machine at Aalto University. Together, these modules form a concrete security module that, thanks to its modularity, is easy to plug into various servers. Moreover, running all the security modules in a single place turns them into a product that can be reused in any other smart environment with minor patches.
Once all the required modules are running, in the data capturing and preparation step, the sensor measurements are stored in the sensor-DB on the O-MI server. Simultaneously, header information from the HTTP messages arriving from the sensors is captured and extracted into the header-DB. Python was selected as the core programming language for this phase and the remaining phases. To allow packet capturing and parsing in Python, a Python wrapper called Pyshark, which uses Wireshark dissectors (tshark), has been employed.
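As an illustration, a minimal Pyshark sketch for capturing header fields from live HTTP traffic might look as follows; the interface name and the selected fields are assumptions for the example, not the exact fields stored in our header-DB.

```python
# Minimal Pyshark sketch for capturing HTTP header features.
import pyshark

capture = pyshark.LiveCapture(interface="eth0", display_filter="http")
for pkt in capture.sniff_continuously(packet_count=10):
    features = {
        "src_ip": pkt.ip.src,
        "dst_ip": pkt.ip.dst,
        "src_port": pkt.tcp.srcport,
        "frame_len": pkt.length,
        "timestamp": pkt.sniff_time,
    }
    print(features)   # in the framework, rows like this go to the header-DB
```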
Alongside the sensor measurements and header features, a few time-based features are calculated from the date-time attribute of the sensor measurements. These features constitute the third category, called statistical features, which is also stored in the sensor-DB.
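For illustration, such time-based statistics can be derived from the timestamps alone; the following pandas sketch is a minimal example, and the feature names are assumptions rather than the exact statistical features used in our system.

```python
# Hedged sketch of time-based statistical features from sensor timestamps.
import pandas as pd

def time_features(timestamps: pd.Series) -> dict:
    """Derive simple time-based statistics from a series of sensor timestamps."""
    ts = pd.to_datetime(timestamps).sort_values()
    deltas = ts.diff().dt.total_seconds().dropna()   # inter-arrival times (s)
    return {
        "inter_arrival_mean": deltas.mean(),
        "inter_arrival_std": deltas.std(),
        "dominant_hour": ts.dt.hour.mode().iloc[0],  # most frequent sending hour
    }
```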
Therefore, for each sensor reading sent to the server, the three feature categories (sensor measurements, header features, and statistical features) can be incorporated in seven combinations as input vectors for the training phase. Accordingly, seven classifiers, each with its best estimator, are defined for each gateway device. These classifiers are stored in the classifier-DB and later fetched from the same DB during the testing phase. Once the classifiers are loaded in the testing phase, seven input vectors analogous to the training vectors, but containing testing data, are created, and the real-time data are evaluated with the previously learned classifiers. The best classifier is selected based on the evaluation results. Finally, to investigate the effect of attacks on the evaluation results and whether the best classifier can reliably detect an attack, some attacks are implemented and the performance results compared (a condensed training sketch follows).
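The seven profiling models correspond to the seven non-empty combinations of the three feature categories. A condensed scikit-learn sketch of this training phase is shown below; the column names, the choice of SVC, and the hyperparameter grid are illustrative assumptions, not the exact configuration used in the experiments.

```python
# Sketch: one classifier per non-empty combination of feature categories.
from itertools import combinations

import pandas as pd
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC

# Three feature categories; the column names are hypothetical examples.
CATEGORIES = {
    "measurement": ["temperature", "humidity"],
    "header": ["src_port", "frame_len", "ttl"],
    "statistic": ["inter_arrival_mean", "inter_arrival_std"],
}

def train_profiling_models(df: pd.DataFrame, labels):
    """Train one best-estimator classifier per feature-category combination."""
    classifiers = {}
    names = list(CATEGORIES)
    for r in range(1, len(names) + 1):
        for combo in combinations(names, r):          # 7 combinations in total
            cols = [c for name in combo for c in CATEGORIES[name]]
            X_train, X_test, y_train, y_test = train_test_split(
                df[cols], labels, test_size=0.3, random_state=0)
            grid = GridSearchCV(SVC(), {"C": [0.1, 1, 10]}, cv=5)
            grid.fit(X_train, y_train)                # keep the best estimator
            classifiers["-".join(combo)] = (grid.best_estimator_,
                                            grid.score(X_test, y_test))
    return classifiers
```

In the framework, each best estimator would be persisted to the classifier-DB and reloaded in the testing phase.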
[Table fragment — Measurements (2): Temperature 0.2409; Humidity 0.0597]
5. Adversary Model
As IoT devices spread around the world, cyber-attacks are becoming more sophisticated and multi-staged, coordinating various attacks from different places and thus requiring more complex defense mechanisms and vulnerability assessments. To assess vulnerabilities properly, the network needs to face a simulated attack. In other words, a synthetic attack scenario is run in the network so that potential breaches or vulnerabilities, and the related defense mechanisms, can be identified [28]. From two general attack categories, physical attacks and remote access attacks, we simulate two attacks: an object emulation attack and a Botnet attack. However, only one set of results is presented in this paper, since both attacks have the same effect on the system and consequently yield similar evaluation results. These two attack scenarios are common in IoT environments and can be detected by our device identification approach; we want to show that our model not only identifies the device identity but also detects attacks originating from identity theft.
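For illustration, an object emulation attack can be approximated by a script that assumes a legitimate gateway's identity while injecting fabricated readings; the server URL, headers, and payload format below are purely hypothetical and stand in for the actual O-MI node configuration.

```python
# Hypothetical object-emulation attack script, used only to exercise the
# identification models; endpoint and payload are illustrative assumptions.
import random
import time

import requests

SERVER = "http://omi-node.example:8080/"        # hypothetical O-MI node URL
HEADERS = {"User-Agent": "ESP-gateway/1.0"}      # mimics the legitimate gateway

for _ in range(100):
    fake_reading = {"temperature": round(random.uniform(20.0, 25.0), 2)}
    # Post fabricated measurements under the stolen device identity.
    requests.post(SERVER, json=fake_reading, headers=HEADERS, timeout=5)
    time.sleep(60)   # imitate the legitimate device's sending interval
```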
smart campuses are impractical, exploring network-layer security control is a more feasible solution.
6. Evaluation
In this section, the testing scenario and performance metrics are first defined. Then, according to the introduced metrics, the classification results are analyzed in two circumstances: normal operation of the network with no attack, and operation while an attack is running on the network.
6.2.2. Accuracy
Accuracy is the proportion of correct classifications among all classifications made by the network. As shown in Equation (2), accuracy is the ratio of true predictions to all true and false predictions combined.

\text{Accuracy} = \frac{TP + TN}{TP + FP + FN + TN} \qquad (2)
6.2.3. Recall

Recall is the ratio of correctly predicted positives to all actual positives, as shown in Equation (3).

\text{Recall} = \frac{TP}{TP + FN} \qquad (3)
6.2.4. Precision
Precision is the ratio of correctly predicted positives to all predicted positives, as shown in Equation (4). Precision is about predicting frames correctly, whereas Recall is about predicting all the positive frames [30]. Hence, to minimize false negatives, we must make Recall as high as possible while maintaining a decent, acceptable Precision value. Both Precision and Recall can be monitored through a single-value performance metric called the F1-score.

\text{Precision} = \frac{TP}{TP + FP} \qquad (4)
6.2.5. F-Score
To account for both precision and recall, the F1-score is computed. As presented in Equation (5), the F1-score is simply the harmonic mean of precision and recall. For datasets with an unbalanced class distribution, the F1-score is a better evaluation metric than accuracy. A low F1-score indicates a problem: it occurs when either Precision or Recall is low, and in that case the F1-score lies closer to the smaller of the two values.

\text{F1-score} = \frac{2 \times \text{Precision} \times \text{Recall}}{\text{Precision} + \text{Recall}} \qquad (5)
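The four metrics in Equations (2)-(5) follow directly from the confusion-matrix counts, as the following snippet illustrates (the example numbers are arbitrary).

```python
# Equations (2)-(5) computed from confusion-matrix counts.
def metrics(tp: int, tn: int, fp: int, fn: int):
    accuracy = (tp + tn) / (tp + fp + fn + tn)           # Equation (2)
    recall = tp / (tp + fn)                              # Equation (3)
    precision = tp / (tp + fp)                           # Equation (4)
    f1 = 2 * precision * recall / (precision + recall)   # Equation (5)
    return accuracy, precision, recall, f1

# Example: tp=90, tn=85, fp=10, fn=15 gives accuracy 0.875,
# recall ~0.857, precision 0.9, and F1 ~0.878.
```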
Considering all the tables, SVM achieves the highest accuracy and F1-score of the ML methods in most of the profiling models. Moreover, although both SVM and LR are generally fast, SVM has the shortest model-building time (in seconds) in our scenario. Hence, we nominate SVM for the attack analysis in the next step. Our data have two fundamental properties: (I) they include a relatively small level of sparsity, and (II) the classes are not linearly separable. SVM handles binary classification tasks with non-linearly separable classes well, whereas LR cannot cope with such problems. We attribute SVM's advantage over RF to the first property: although the sparsity level is not large, it can still degrade RF's performance. Furthermore, SVM is intrinsically suited to binary classification tasks such as ours, while RF is mostly suited to multi-class classification. Regarding the profiling system, the measurement-only model reports the best results for all the ML methods, followed by the measurement-statistic and measurement-header models. We can therefore conclude that models that include measurement features generally perform better than the rest of the profiling models.
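As a toy illustration of this argument (synthetic data, not our sensor traffic), an RBF-kernel SVM separates concentric classes that defeat a linear model such as Logistic Regression:

```python
# Non-linearly separable binary task: linear LR fails, RBF-kernel SVM succeeds.
from sklearn.datasets import make_circles
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

X, y = make_circles(n_samples=500, noise=0.05, factor=0.5, random_state=0)
print("LR accuracy: ", LogisticRegression().fit(X, y).score(X, y))  # ~0.5
print("SVM accuracy:", SVC(kernel="rbf").fit(X, y).score(X, y))     # ~1.0
```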
Figure 6. False Positive (FP) rate for attack scenario 1 for all the profiling models.
Figure 7. FP rate for attack scenario 2 for all the profiling models.
Since measurement features have the most profound effect on the results, they are retained, and the performance metrics for the measurement-only and measurement-header models are calculated using the SVM classifier and presented in Table 10. The values are calculated for different inter-arrival times under both attack scenarios. The measurement-header model outperforms the measurement-only model for all inter-arrival times.
Table 10. Support Vector Machine (SVM) results for measurement-only and measurement-header
during attack scenarios.
Author Contributions: Conceptualization, N.Y.; methodology, N.Y.; software, N.Y.; validation, N.Y.;
formal analysis, N.Y.; investigation, N.Y.; resources, N.Y.; data curation, N.Y.; writing—original
draft preparation, N.Y. and A.M.; writing—review and editing, N.Y.; visualization, N.Y. and A.M.;
supervision, K.F. and A.M.; funding acquisition, K.F. All authors have read and agreed to the
published version of the manuscript.
Funding: This research was funded by the European Union’s Horizon 2020 research and innovation
program (grant 688203), H2020 project FINEST TWINS (grant No. 856602), and Academy of Finland
(Open Messaging Interface; grant 296096).
Institutional Review Board Statement: Not applicable.
Informed Consent Statement: Not applicable.
Data Availability Statement: The data used to generate the plots and figures can be accessed by
contacting the author at [email protected].
Acknowledgments: The research leading to this publication is supported by the European Union’s
Horizon 2020 research and innovation program (grant 688203), H2020 project FINEST TWINS (grant
No. 856602), and Academy of Finland (Open Messaging Interface; grant 296096).
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Gartner Newsroom. 2015. Available online: https://gartner.com/en/newsroom/id/283971 (accessed on 4 June 2020).
2. Bertino, E.; Islam, N. Botnets and internet of things security. Computer 2017, 50, 76–79. [CrossRef]
3. Shah, T.; Venkatesan, S. Authentication of IoT device and IoT server using secure vaults. In Proceedings of the 2018 17th IEEE In-
ternational Conference On Trust, Security And Privacy In Computing And Communications/12th IEEE International Conference
On Big Data Science And Engineering (TrustCom/BigDataSE), New York, NY, USA, 31 July–3 August 2018; pp. 819–824.
4. Aksoy, A.; Gunes, M.H. Automated IoT Device Identification using Network Traffic. In Proceedings of the 2019 IEEE International
Conference on Communications, ICC 2019, Shanghai, China, 20–24 May 2019; pp. 1–7. [CrossRef]
5. Raval, M.; Sunkireddy, R. Hardware Root of Trust on IoT Gateway. 2019. Available online: https://www.techdesignforums.com/practice/technique/hardware-roots-of-trust-for-iot-security/ (accessed on 8 April 2021).
6. Yousefnezhad, N.; Filippov, R.; Javed, A.; Buda, A.; Madhikermi, M.; Främling, K. Authentication and Access Control for Open
Messaging Interface Standard. In Proceedings of the 14th EAI International Conference on Mobile and Ubiquitous Systems:
Computing, Networking and Services, Melbourne, Australia, 7–10 November 2017; Gu, T., Kotagiri, R., Liu, H., Eds.; ACM: New
York, NY, USA, 2017; pp. 20–27. [CrossRef]
7. Li, B.; Gunes, M.H.; Bebis, G.; Springer, J. A supervised machine learning approach to classify host roles on line using sflow.
In Proceedings of the First Edition Workshop on High Performance and Programmable Networking, New York, NY, USA, 18
June 2013; pp. 53–60.
8. Yousefnezhad, N.; Madhikermi, M.; Främling, K. MeDI: Measurement-based Device Identification Framework for Internet of
Things. In Proceedings of the 16th IEEE International Conference on Industrial Informatics, INDIN 2018, Porto, Portugal, 18–20
July 2018; pp. 95–100. [CrossRef]
9. Sarma, A.C.; Girão, J. Identities in the future internet of things. Wirel. Pers. Commun. 2009, 49, 353–363. [CrossRef]
10. Mahalle, P.; Babar, S.; Prasad, N.R.; Prasad, R. Identity management framework towards internet of things (IoT): Roadmap and
key challenges. In International Conference on Network Security and Applications; Springer: Berlin/Heidelberg, Germany, 2010;
pp. 430–439.
11. Miettinen, M.; Heuser, S.; Kronz, W.; Sadeghi, A.; Asokan, N. ConXsense: automated context classification for context-aware
access control. In Proceedings of the 9th ACM Symposium on Information, Computer and Communications Security, ASIA CCS
’14, Kyoto, Japan, 3–6 June 2014; pp. 293–304. [CrossRef]
12. Perera, C.; Zaslavsky, A.B.; Christen, P.; Georgakopoulos, D. Context Aware Computing for The Internet of Things: A Survey.
IEEE Commun. Surv. Tutorials 2014, 16, 414–454. [CrossRef]
13. Kohno, T.; Broido, A.; Claffy, K.C. Remote Physical Device Fingerprinting. In Proceedings of the 2005 IEEE Symposium on
Security and Privacy (S&P 2005), Oakland, CA, USA, 8–11 May 2005; pp. 211–225. [CrossRef]
14. Arackaparambil, C.; Bratus, S.; Shubina, A.; Kotz, D. On the reliability of wireless fingerprinting using clock skews. In Proceedings
of the Third ACM Conference on Wireless Network Security, WISEC 2010, Hoboken, NJ, USA, 22–24 March 2010; pp. 169–174.
[CrossRef]
15. Jana, S.; Kasera, S.K. On Fast and Accurate Detection of Unauthorized Wireless Access Points Using Clock Skews. IEEE Trans.
Mob. Comput. 2010, 9, 449–462. [CrossRef]
16. Barbosa, R.R.R.; Sadre, R.; Pras, A. Flow whitelisting in SCADA networks. Int. J. Crit. Infrastruct. Prot. 2013, 6, 150–158.
[CrossRef]
17. Radhakrishnan, S.V.; Uluagac, A.S.; Beyah, R. GTID: A technique for physical device and device type fingerprinting. IEEE Trans.
Dependable Secur. Comput. 2015, 12, 519–532. [CrossRef]
18. Miettinen, M.; Marchal, S.; Hafeez, I.; Asokan, N.; Sadeghi, A.; Tarkoma, S. IoT SENTINEL: Automated Device-Type Identification
for Security Enforcement in IoT. In Proceedings of the 37th IEEE International Conference on Distributed Computing Systems,
ICDCS 2017, Atlanta, GA, USA, 5–8 June 2017; pp. 2177–2184. [CrossRef]
19. Sharaf-Dabbagh, Y.; Saad, W. On the authentication of devices in the Internet of things. In Proceedings of the 17th IEEE
International Symposium on A World of Wireless, Mobile and Multimedia Networks, WoWMoM 2016, Coimbra, Portugal, 21–24
June 2016; pp. 1–3. [CrossRef]
20. Noguchi, H.; Kataoka, M.; Yamato, Y. Device Identification Based on Communication Analysis for the Internet of Things. IEEE
Access 2019, 7, 52903–52912. [CrossRef]
21. Bezawada, B.; Bachani, M.; Peterson, J.; Shirazi, H.; Ray, I.; Ray, I. Behavioral Fingerprinting of IoT Devices. In Proceedings of
the 2018 Workshop on Attacks and Solutions in Hardware Security, ASHES@CCS 2018, Toronto, ON, Canada, 19 October 2018;
Chang, C., Rührmair, U., Holcomb, D., Guajardo, J., Eds.; ACM: New York, NY, USA, 2018; pp. 41–50. [CrossRef]
22. Meidan, Y.; Bohadana, M.; Shabtai, A.; Guarnizo, J.D.; Ochoa, M.; Tippenhauer, N.O.; Elovici, Y. ProfilIoT: A machine learning
approach for IoT device identification based on network traffic analysis. In Proceedings of the Symposium on Applied Computing,
SAC 2017, Marrakech, Morocco, 3–7 April 2017; Seffah, A., Penzenstadler, B., Alves, C., Peng, X., Eds.; ACM: New York, NY, USA,
2017; pp. 506–509. [CrossRef]
23. Meidan, Y.; Bohadana, M.; Shabtai, A.; Ochoa, M.; Tippenhauer, N.O.; Guarnizo, J.D.; Elovici, Y. Detection of Unauthorized IoT
Devices Using Machine Learning Techniques. arXiv 2017, arXiv:1709.04647.
24. Sivanathan, A.; Gharakheili, H.H.; Loi, F.; Radford, A.; Wijenayake, C.; Vishwanath, A.; Sivaraman, V. Classifying IoT Devices in
Smart Environments Using Network Traffic Characteristics. IEEE Trans. Mob. Comput. 2019, 18, 1745–1759. [CrossRef]
25. Hamad, S.A.; Zhang, W.E.; Sheng, Q.Z.; Nepal, S. IoT Device Identification via Network-Flow Based Fingerprinting and
Learning. In Proceedings of the 18th IEEE International Conference On Trust, Security And Privacy In Computing And
Communications/13th IEEE International Conference On Big Data Science And Engineering, TrustCom/BigDataSE 2019,
Rotorua, New Zealand, 5–8 August 2019; pp. 103–111. [CrossRef]
26. Shahid, M.R.; Blanc, G.; Zhang, Z.; Debar, H. IoT Devices Recognition Through Network Traffic Analysis. In Proceedings of the
IEEE International Conference on Big Data, Big Data 2018, Seattle, WA, USA, 10–13 December 2018; pp. 5187–5192. [CrossRef]
27. Marchal, S.; Miettinen, M.; Nguyen, T.D.; Sadeghi, A.R.; Asokan, N. Audi: Toward autonomous iot device-type identification
using periodic communication. IEEE J. Sel. Areas Commun. 2019, 37, 1402–1412. [CrossRef]
28. Eom, J.h.; Han, Y.J.; Park, S.H.; Chung, T.M. Active cyber attack model for network system’s vulnerability assessment. In Pro-
ceedings of the 2008 International Conference on Information Science and Security (ICISS 2008), Jammu, India, 16–20 December
2008; pp. 153–158.
29. SolarWinds. Detecting and Preventing Rogue Devices. 2017. Available online: https://www.solarwinds.com/-/media/solarwinds/swdcv2/licensed-products/user-device-tracker/resources/whitepaper/udt_wp_detect_prevent_rogue_devices.ashx?rev=7a261fece0b940ceb814f4985e1d74ff (accessed on 8 April 2021).
30. Buckland, M.K.; Gey, F.C. The Relationship between Recall and Precision. J. Am. Soc. Inf. Sci. 1994, 45, 12–19. [CrossRef]