Unit 5
IoT Data Analytics and Security: OLAP and OLTP, NoSQL databases, Row and Column Oriented
databases, Introduction to Columnar DBMS C-Store, Run-Length and Bit Vector Encoding, IoT Data
Analytics, Cryptographic algorithms, Analysis of Lightweight Cryptographic solutions for IoT
security, Key exchange using Elliptic Curve Cryptography, Comparative analysis of Cryptographic
Libraries for IoT.
Unit 6 [6 Hrs]
IoT Applications IoT applications like Home Automation, Precision Agriculture, Smart vehicles,
Smart Grid, Industry 5.0.
––
OLAP Full Form
OLAP stands for Online Analytical Processing. It is a technology used to organize large business
databases and support business intelligence. OLAP databases are divided into one or more cubes,
and each cube is organized and designed by a cube administrator to fit the way you retrieve and
analyze data, so that it is easier to create and use the PivotTable and PivotChart reports that you
need.
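To make the cube idea concrete, here is a minimal sketch of an OLAP-style roll-up in Python using the pandas library; the sales data, dimension names and the revenue measure are hypothetical examples, not part of any particular OLAP product.

# Minimal OLAP-style roll-up sketch (hypothetical sales data).
import pandas as pd

sales = pd.DataFrame({
    "region":  ["East", "East", "West", "West", "West"],
    "product": ["Tea",  "Coffee", "Tea", "Coffee", "Tea"],
    "year":    [2023,   2023,     2023,  2024,     2024],
    "revenue": [120.0,  90.0,     75.0,  200.0,    60.0],
})

# Roll revenue up along two dimensions (region x product), the way a
# PivotTable report summarizes a cube; margins=True adds grand totals.
cube = sales.pivot_table(index="region", columns="product",
                         values="revenue", aggfunc="sum", margins=True)
print(cube)

# Slice of the cube along a single dimension: total revenue per year.
print(sales.groupby("year")["revenue"].sum())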
Characteristics of OLAP
Fast: The system is targeted to deliver most responses to the user within about five seconds, with
the simplest analysis taking no more than one second and very few taking more than 15 seconds.
Analysis: The system should cope with any business logic and statistical analysis that is relevant
for the application and the user, and keep it easy enough for the target user. Although some
pre-programming may be needed, it is not acceptable if all application definitions have to be done
by a programmer; the user should be able to define new ad hoc calculations as part of the analysis
and report on the data in any desired way, without having to program, so products that do not
allow this (like Oracle Discoverer) are excluded.
Share: The system implements all the security requirements for confidentiality and, if multiple
write access is required, concurrent update locking at an appropriate level. Not all applications
require users to write data back, but for the growing number that do, the system should be able to
handle multiple updates in a timely, secure way.
Multidimensional: The system must provide a multidimensional conceptual view of the data,
including full support for hierarchies, as this is certainly the most logical way to analyse businesses
and organizations.
Information: The system should be able to hold all the data needed by the applications. Data
sparsity should be handled in an efficient way.
Advantages of OLAP
Fast query performance due to optimized storage, multidimensional indexing and caching.
Smaller on-disk size of data compared to data stored in a relational database, due to compression
techniques.
Automated computation of higher-level aggregates of the data.
It is very compact for low-dimension data sets. Array models provide natural indexing.
Effective data extraction achieved through the pre-structuring of aggregated data.
Disadvantages of OLAP
Within some OLAP solutions the processing step (data load) can be quite long, especially on
large data volumes. This is usually remedied by doing only incremental processing, i.e.,
processing only the data which have changed (usually new data) instead of reprocessing the
entire data set.
Some OLAP techniques present data redundancy.
OLTP stands for Online Transaction Processing. It is a kind of data processing approach
which involves the use of transaction-oriented tasks. OLTP has a major role in business
intelligence which helps in the decision making process. The database systems are used to
perform online transactions and query processing. It consists of real-life operations such as
banking, accounting, purchasing, etc. It has high performance and is used by IT professionals.
OLTP was first made operational in 1970 and was made by IBM for American Airlines. It first ran
on IBM 7090 computers. Then gradually OLTP came into use for fields like banks, hotels, credit
card companies, etc.
Characteristics
High availability: OLTP systems have high availability requirements. It is well integrated with
high availability offerings that SQL server provides.
Data storage: The data in OLTP systems is stored at transaction level.
Normalized database design: It has a normalized database design.
Useful for small transactions: OLTP can work properly on small amounts of data and is useful
for small transactions.
Less response time: OLTP systems have less response time which is very useful.
Advantages
It is faster and has more accuracy.
It can handle large amounts of data and complex calculations.
It is very user friendly and easy to use.
It has high efficiency and excellent response time.
Disadvantages
There are chances of the system getting hacked by unethical users.
In case of hardware or power failure the data can be lost.
Multiple requests at a time are difficult to handle.
The server may hang for some time due to excess load.
OLAP stands for Online Analytical Processing. OLAP systems have the capability to analyse
database information of multiple systems at the current time. The primary goal of OLAP Service is
data analysis and not data processing.
OLTP stands for Online Transaction Processing. OLTP has the work to administer day-to-day
transactions in any organization. The main goal of OLTP is data processing not data analysis.
Online Analytical Processing (OLAP)
Online Analytical Processing (OLAP) consists of a type of software tool that is used for data
analysis for business decisions. OLAP provides an environment to get insights from the database
retrieved from multiple database systems at one time.
OLAP Examples
Any type of Data Warehouse System is an OLAP system. The uses of the OLAP System are
described below.
Spotify analysed songs by users to come up with a personalized homepage of their songs and
playlist.
Netflix movie recommendation system.
Drawbacks of OLAP Service
OLAP services require professionals to handle the data because of the complex modelling
procedure.
OLAP services are expensive to implement and maintain when datasets are large.
We can perform analysis of data only after extraction and transformation of the data in the case
of OLAP, which delays the system.
OLAP services are not efficient for real-time decision-making, as the data is updated only on a
periodic basis.
Online Transaction Processing (OLTP)
Online transaction processing provides transaction-oriented applications in a 3-tier architecture.
OLTP administers the day-to-day transactions of an organization.
OLTP Examples
An example of an OLTP system is an ATM center: the person who authenticates first receives the
amount first, and the condition is that the amount to be withdrawn must be present in the ATM.
The uses of the OLTP System are described below.
ATM center is an OLTP application.
OLTP handles the ACID properties during data transactions via the application (a minimal
transaction sketch follows this list).
It is also used for online banking, online airline ticket booking, sending a text message, and
adding a book to the shopping cart.
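As a minimal illustration of the ACID behaviour mentioned above, the following Python sketch uses the built-in sqlite3 module to run a small fund-transfer transaction; the accounts table and the amounts are hypothetical.

# OLTP-style atomic transaction sketch using Python's built-in sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 500.0), (2, 100.0)])
conn.commit()

try:
    # Transfer 50 from account 1 to account 2 atomically: either both
    # updates are committed or, on any error, both are rolled back.
    with conn:  # the connection context manager commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance - 50 WHERE id = 1")
        conn.execute("UPDATE accounts SET balance = balance + 50 WHERE id = 2")
except sqlite3.Error as exc:
    print("transaction rolled back:", exc)

print(conn.execute("SELECT id, balance FROM accounts").fetchall())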
Benefits of OLTP Services
OLTP services allow users to read, write and delete data operations quickly.
OLTP services help in increasing users and transactions, which helps in real-time access to data.
OLTP services help to provide better security by applying multiple security features.
OLTP services help in making better decisions by providing accurate and current data.
OLTP services provide Data Integrity, Consistency, and High Availability to the data.
Drawbacks of OLTP Services
OLTP has limited analysis capability, as it is not intended for complex analysis or reporting.
OLTP has high maintenance costs because of frequent maintenance, backups, and recovery.
OLTP services get hampered whenever there is a hardware failure, which leads to the failure of
online transactions.
OLTP services many times experience issues such as duplicate or inconsistent data.
Difference between OLAP and OLTP
Definition: OLAP (Online Analytical Processing) is well-known as an online database query
management system, while OLTP (Online Transaction Processing) is well-known as an online
database modifying system.
Application: OLAP is subject-oriented and is used for data mining, analytics, decision making,
etc., while OLTP is application-oriented and is used for business tasks.
Backup and Recovery: OLAP only needs backup from time to time, while in OLTP the backup
and recovery process is maintained rigorously.
Introduction to NoSQL
NoSQL is a type of database management system (DBMS) that is designed to handle and store large
volumes of unstructured and semi-structured data. Unlike traditional relational databases that use
tables with pre-defined schemas to store data, NoSQL databases use flexible data models that can
adapt to changes in data structures and are capable of scaling horizontally to handle growing
amounts of data.
The term NoSQL originally referred to “non-SQL” or “non-relational” databases, but the term has
since evolved to mean “not only SQL,” as NoSQL databases have expanded to include a wide range
of different database architectures and data models.
NoSQL databases are generally classified into four main categories:
1. Document databases: These databases store data as semi-structured documents, such as JSON
or XML, and can be queried using document-oriented query languages.
2. Key-value stores: These databases store data as key-value pairs, and are optimized for simple
and fast read/write operations.
3. Column-family stores: These databases store data as column families, which are sets of
columns that are treated as a single entity. They are optimized for fast and efficient querying of
large amounts of data.
4. Graph databases: These databases store data as nodes and edges, and are designed to handle
complex relationships between data.
NoSQL databases are often used in applications where there is a high volume of data that needs to
be processed and analyzed in real-time, such as social media analytics, e-commerce, and gaming.
They can also be used for other applications, such as content management systems, document
management, and customer relationship management.
However, NoSQL databases may not be suitable for all applications, as they may not provide the
same level of data consistency and transactional guarantees as traditional relational databases. It is
important to carefully evaluate the specific needs of an application when choosing a database
management system.
NoSQL, originally referring to "non-SQL" or "non-relational", is a database that provides a
mechanism for storage and retrieval of data. This data is modelled in means other than the tabular
relations used in relational databases. Such databases came into existence in the late 1960s, but did
not obtain the NoSQL moniker until a surge of popularity in the early twenty-first century. NoSQL
databases are used in real-time web applications and big data, and their use is increasing over time.
NoSQL systems are also sometimes called "Not only SQL" to emphasize the fact that they may
support SQL-like query languages. A NoSQL database offers simplicity of design, simpler
horizontal scaling to clusters of machines, and finer control over availability. The data structures
used by NoSQL databases are different from those used by default in relational databases, which
makes some operations faster in NoSQL. The suitability of a given NoSQL database depends on
the problem it should solve.
NoSQL databases, also known as "not only SQL" databases, are a type of database
management system that has gained popularity in recent years. Unlike traditional relational
databases, NoSQL databases are designed to handle large amounts of unstructured or semi-
structured data, and they can accommodate dynamic changes to the data model. This makes
NoSQL databases a good fit for modern web applications, real-time analytics, and big data
processing.
Data structures used by NoSQL databases are sometimes also viewed as more flexible than
relational database tables. Many NoSQL stores compromise consistency in favor of availability,
speed, and partition tolerance. Barriers to the greater adoption of NoSQL stores include the use
of low-level query languages, lack of standardized interfaces, and huge previous investments in
existing relational databases.
Most NoSQL stores lack true ACID(Atomicity, Consistency, Isolation, Durability) transactions
but a few databases, such as MarkLogic, Aerospike, FairCom c-treeACE, Google Spanner
(though technically a NewSQL database), Symas LMDB, and OrientDB have made them
central to their designs.
Most NoSQL databases offer a concept of eventual consistency in which database changes are
propagated to all nodes so queries for data might not return updated data immediately or might
result in reading data that is not accurate which is a problem known as stale reads.
Also, some NoSQL systems may exhibit lost writes and other forms of data loss. Some
NoSQL systems provide concepts such as write-ahead logging to avoid data loss.
One simple example of a NoSQL database is a document database. In a document database,
data is stored in documents rather than tables. Each document can contain a different set of
fields, making it easy to accommodate changing data requirements
For example, take a database that holds data regarding employees. In a
relational database, this information might be stored in tables, with one table for employee
information and another table for department information. In a document database, each
employee would be stored as a separate document, with all of their information contained
within the document.
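The employee example can be sketched with a document store such as MongoDB. The Python snippet below uses the PyMongo driver and assumes a MongoDB server is reachable on localhost; the database, collection and field names are illustrative only.

# Document-database sketch with PyMongo (assumes a local MongoDB server).
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017/")
employees = client["company_db"]["employees"]

# Each document can carry a different set of fields (flexible schema).
employees.insert_one({"name": "Asha", "department": "IoT", "skills": ["python", "mqtt"]})
employees.insert_one({"name": "Ravi", "department": "Analytics", "manager": "Asha"})

# Query by a field; only documents containing that field are matched.
for doc in employees.find({"department": "IoT"}):
    print(doc["name"], doc.get("skills", []))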
NoSQL databases are a relatively new type of database management system that has gained
popularity in recent years due to their scalability and flexibility. They are designed to handle
large amounts of unstructured or semi-structured data and can handle dynamic changes to the
data model. This makes NoSQL databases a good fit for modern web applications, real-time
analytics, and big data processing.
Key Features of NoSQL:
1. Dynamic schema: NoSQL databases do not have a fixed schema and can accommodate
changing data structures without the need for migrations or schema alterations.
2. Horizontal scalability: NoSQL databases are designed to scale out by adding more nodes to a
database cluster, making them well-suited for handling large amounts of data and high levels of
traffic.
3. Document-based: Some NoSQL databases, such as MongoDB, use a document-based data
model, where data is stored in a semi-structured format, such as JSON or BSON.
4. Key-value-based: Other NoSQL databases, such as Redis, use a key-value data model, where
data is stored as a collection of key-value pairs.
5. Column-based: Some NoSQL databases, such as Cassandra, use a column-based data model,
where data is organized into columns instead of rows.
6. Distributed and high availability: NoSQL databases are often designed to be highly available
and to automatically handle node failures and data replication across multiple nodes in a
database cluster.
7. Flexibility: NoSQL databases allow developers to store and retrieve data in a flexible and
dynamic manner, with support for multiple data types and changing data structures.
8. Performance: NoSQL databases are optimized for high performance and can handle a high
volume of reads and writes, making them suitable for big data and real-time applications.
Advantages of NoSQL: There are many advantages of working with NoSQL databases such as
MongoDB and Cassandra. The main advantages are high scalability and high availability.
1. High scalability: NoSQL databases use sharding for horizontal scaling. Sharding is the
partitioning of data and placing it on multiple machines in such a way that the order of the data
is preserved. Vertical scaling means adding more resources to the existing machine, whereas
horizontal scaling means adding more machines to handle the data. Vertical scaling is not that
easy to implement, but horizontal scaling is easy to implement. Examples of horizontally
scaling databases are MongoDB, Cassandra, etc. NoSQL can handle a huge amount of data
because of this scalability: as the data grows, NoSQL scales itself to handle that data in an
efficient manner (a minimal sharding sketch follows this list).
2. Flexibility: NoSQL databases are designed to handle unstructured or semi-structured data,
which means that they can accommodate dynamic changes to the data model. This makes
NoSQL databases a good fit for applications that need to handle changing data requirements.
3. High availability: The auto-replication feature in NoSQL databases makes them highly
available, because in case of any failure the data replicates itself back to the previous consistent
state.
4. Scalability: NoSQL databases are highly scalable, which means that they can handle large
amounts of data and traffic with ease. This makes them a good fit for applications that need to
handle large amounts of data or traffic
5. Performance: NoSQL databases are designed to handle large amounts of data and traffic,
which means that they can offer improved performance compared to traditional relational
databases.
6. Cost-effectiveness: NoSQL databases are often more cost-effective than traditional relational
databases, as they are typically less complex and do not require expensive hardware or
software.
7. Agility: Ideal for agile development.
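The sharding idea from advantage 1 can be sketched as follows: a hypothetical hash-based router that assigns each record key to one of several shard nodes. Real NoSQL systems use more elaborate schemes (consistent hashing, range partitioning), so this is only a toy illustration.

# Toy hash-based sharding (horizontal partitioning) sketch.
import hashlib

NODES = ["shard-0", "shard-1", "shard-2"]  # hypothetical shard nodes

def shard_for(key: str) -> str:
    """Route a record key to a shard deterministically."""
    digest = int(hashlib.md5(key.encode()).hexdigest(), 16)
    return NODES[digest % len(NODES)]

for user_id in ["u101", "u102", "u103", "u104"]:
    print(user_id, "->", shard_for(user_id))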
Disadvantages of NoSQL: NoSQL has the following disadvantages.
1. Lack of standardization: There are many different types of NoSQL databases, each with its
own unique strengths and weaknesses. This lack of standardization can make it difficult to
choose the right database for a specific application
2. Lack of ACID compliance: NoSQL databases are not fully ACID-compliant, which means that
they do not guarantee the consistency, integrity, and durability of data. This can be a drawback
for applications that require strong data consistency guarantees.
3. Narrow focus: NoSQL databases have a very narrow focus, as they are mainly designed for
storage and provide very little other functionality. Relational databases are a better choice in the
field of transaction management than NoSQL.
4. Open-source: NoSQL is an open-source database. There is no reliable standard for
NoSQL yet. In other words, two database systems are likely to be unequal.
5. Lack of support for complex queries: NoSQL databases are not designed to handle complex
queries, which means that they are not a good fit for applications that require complex data
analysis or reporting.
6. Lack of maturity: NoSQL databases are relatively new and lack the maturity of traditional
relational databases. This can make them less reliable and less secure than traditional databases.
7. Management challenge: The purpose of big data tools is to make the management of a large
amount of data as simple as possible. But it is not so easy. Data management in NoSQL is much
more complex than in a relational database. NoSQL, in particular, has a reputation for being
challenging to install and even more hectic to manage on a daily basis.
8. GUI is not available: GUI mode tools to access the database are not flexibly available in the
market.
9. Backup: Backup is a great weak point for some NoSQL databases like MongoDB. MongoDB
has no approach for the backup of data in a consistent manner.
10. Large document size: Some database systems like MongoDB and CouchDB store data in
JSON format. This means that documents are quite large (costing network bandwidth and speed),
and having descriptive key names actually hurts since they increase the document size.
Types of NoSQL database: Types of NoSQL databases and the name of the database system that
falls in that category are:
1. Graph Databases: Examples – Amazon Neptune, Neo4j
2. Key-value store: Examples – Memcached, Redis, Coherence
3. Column: Examples – HBase, Bigtable, Accumulo
4. Document-based: Examples – MongoDB, CouchDB, Cloudant
When should NoSQL be used:
1. When a huge amount of data needs to be stored and retrieved.
2. The relationship between the data you store is not that important
3. The data changes over time and is not structured.
4. Support of Constraints and Joins is not required at the database level
5. The data is growing continuously and you need to scale the database regularly to handle the
data.
In conclusion, NoSQL databases offer several benefits over traditional relational databases, such as
scalability, flexibility, and cost-effectiveness. However, they also have several drawbacks, such as a
lack of standardization, lack of ACID compliance, and lack of support for complex queries. When
choosing a database for a specific application, it is important to weigh the benefits and drawbacks
carefully to determine the best fit.
When it comes to choosing a database the biggest decision is picking a relational (SQL) or non-
relational (NoSQL) data structure. While both databases are viable options still there are certain key
differences between the two that users must keep in mind when making a decision.
Main differences between NoSQL and SQL
Type
SQL databases are primarily called Relational Databases (RDBMS); whereas NoSQL databases are
primarily called non-relational or distributed databases.
Language
SQL databases define and manipulate data using structured query language (SQL). On one hand,
this language is extremely powerful. SQL is one of the most versatile and widely used options
available, which makes it a safe choice, especially for complex queries. On the other hand, it
can be restrictive. SQL requires you to use predefined schemas to determine the structure of your data
before you work with it. Also, all of your data must follow the same structure. This can require
significant up-front preparation which means that a change in the structure would be both difficult and
disruptive to your whole system.
A NoSQL database has a dynamic schema for unstructured data. Data is stored in many ways which
means it can be document-oriented, column-oriented, graph-based, or organized as a key-value store.
This flexibility means that documents can be created without having a defined structure first. Also,
each document can have its own unique structure. The syntax varies from database to database, and
you can add fields as you go.
Scalability
In almost all situations SQL databases are vertically scalable. This means that you can increase the
load on a single server by increasing things like RAM, CPU, or SSD. But on the other hand, NoSQL
databases are horizontally scalable. This means that you handle more traffic by sharding, or adding
more servers to your NoSQL database. It is similar to adding more floors to the same building versus
adding more buildings to the neighbourhood. Thus NoSQL can ultimately become larger and more
powerful, making these databases the preferred choice for large or ever-changing data sets.
Structure
SQL databases are table-based on the other hand NoSQL databases are either key-value pairs,
document-based, graph databases, or wide-column stores. This makes relational SQL databases a
better option for applications that require multi-row transactions such as an accounting system or for
legacy systems that were built for a relational structure.
Here is a simple example of how structured data with rows and columns vs non-structured data
without a definition might look. A product table in a SQL DB might accept data looking like this:
{
  "id": "101",
  "category": "food",
  "name": "Apples",
  "qty": "150"
}
Whereas an unstructured NoSQL DB might save the products in many variations, without needing
to change any underlying table structure:
Products = [
  {
    "id": "101",
    "category": "food",
    "name": "California Apples",
    "qty": "150"
  },
  {
    "id": "102",
    "category": "electronics",
    "name": "Apple MacBook Air",
    "qty": "10",
    "specifications": {
      "storage": "256GB SSD",
      "cpu": "8 Core",
      "camera": "1080p FaceTime HD camera"
    }
  }
]
Property followed
SQL databases follow ACID properties (Atomicity, Consistency, Isolation, and Durability) whereas
the NoSQL database follows the Brewers CAP theorem (Consistency, Availability, and Partition
tolerance).
Support
Great support is available for all SQL databases from their vendors. Also, a lot of independent
consultants are there who can help you with SQL databases for very large-scale deployments but for
some NoSQL databases you still have to rely on community support and only limited outside experts
are available for setting up and deploying your large-scale NoSQL deploy.
When To Use: SQL vs NoSQL
SQL is a good choice when working with related data. Relational databases are efficient, flexible, and
easily accessed by any application. A benefit of a relational database is that when one user updates a
specific record, every instance of the database automatically refreshes, and that information is
provided in real-time.
SQL and a relational database make it easy to handle a great deal of information, scale as necessary
and allow flexible access to data only needing to update data once instead of changing multiple files,
for instance. It’s also best for assessing data integrity. Since each piece of information is stored in a
single place, there’s no problem with former versions confusing the picture.
While NoSQL is good when the availability of big data is more crucial, SQL is valued for ensuring
data validity. It’s also a wise decision when a business needs to expand in response to shifting
customer demands. NoSQL offers high performance, flexibility, and ease of use.
NoSQL is also a wise choice when dealing with large or constantly changing data sets, flexible data
models, or requirements that don’t fit into a relational model. Document databases, like CouchDB,
MongoDB, and Amazon Document DB, are useful for handling large amounts of unstructured data.
Key Highlights on SQL vs NoSQL
SQL: These databases are not suited for hierarchical data storage, and are best suited for complex
queries.
NoSQL: These databases are best suited for hierarchical data storage, and are not so good for
complex queries.
SQL stands for Structured Query Language. It is based on relational algebra, and the schema is
fixed, which means data is stored in the form of columns and tables. SQL follows the ACID
properties, which means Atomicity, Consistency, Isolation, and Durability are maintained. There are
three types of languages present in SQL :
Data Definition Language
Data Manipulation Language
Data Control Language
Examples:
Commercial: Oracle, IBM
Open Source: MySQL
NoSQL stands for "not only SQL" and can be related to non-relational databases. In a NoSQL
database, storage and retrieval of data are done using means other than the tabular relations used in
relational databases; therefore it is schema-free. NoSQL is also an open-source and distributed type
of database.
There are four categories depending on the type of data model they support:
Document Stores
Graph Database
Wide Column Databases
Key-value stores
Examples: Cassandra, MongoDB, CouchDB
NewSQL is a modern relational database whose goal is to combine ACID guarantees of SQL with
the scalability and high performance of NoSQL. It is made using modern programming languages
and technology that wasn’t available before.
Examples: VoltDB, Clustrix
Schema: SQL is schema-fixed; NoSQL is schema-free; NewSQL is both schema-fixed and
schema-free.
Databases: SQL is not a distributed database; NoSQL is a distributed database; NewSQL is a
distributed database.
Query Handling: SQL can handle simple queries; NoSQL is highly efficient for complex queries;
NewSQL can handle complex queries better than SQL.
The Columnar Data Model is an important NoSQL data model. NoSQL databases differ from SQL
databases because they use data models with a different structure than the row-and-column table
model used in relational database management systems (RDBMS). NoSQL databases use a flexible
schema model, are designed to scale horizontally across many servers, and are used for large
volumes of data.
Columnar Data Model of NoSQL :
Basically, a relational database stores data in rows and also reads the data row by row, while a
column store is organized as a set of columns. So if someone wants to run analytics on a small
number of columns, one can read those columns directly without consuming memory with the
unwanted data. Columns are often of the same type and benefit from more efficient compression,
which makes reads faster. Examples of the Columnar Data Model: Cassandra and Apache HBase.
Working of Columnar Data Model:
In the Columnar Data Model, information is organized into columns instead of rows; otherwise it
functions much the same way tables work in relational databases. This type of data model is much
more flexible, since it is a type of NoSQL database. The example below will help in understanding
the Columnar Data Model:
Column-Oriented Tables (each attribute stored as its own table, keyed by ID):
S.No. Name ID
01. Tanmay 2
02. Abhishek 5
03. Samriddha 7
04. Aditi 8
S.No. Course ID
01. B-Tech 2
02. B-Tech 5
03. B-Tech 7
04. B-Tech 8
S.No. Branch ID
01. Computer 2
02. Electronics 5
03. IT 7
04. E & TC 8
Columnar Data Model uses the concept of keyspace, which is like a schema in relational models.
Advantages of Columnar Data Model :
Well structured: Since these data models are good at compression, the data is very structured
and well organized in terms of storage.
Flexibility: A large amount of flexibility, as it is not necessary for the columns to look like each
other, which means one can add new and different columns without disrupting the whole
database.
Aggregation queries are fast: Most importantly, aggregation queries are quite fast because the
values of a column are stored together; an example would be adding up the total number of
students enrolled in one year (see the sketch after this list).
Scalability: It can be spread across large clusters of machines, even numbering in thousands.
Load times: Since one can easily load a row table in a few seconds, load times are nearly
excellent.
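Here is a minimal Python sketch of why column-wise storage makes aggregation cheap: the same student records are held once as rows and once as columns, and summing a single attribute only touches that column. The records and field names are hypothetical.

# Row layout vs column layout for the same (hypothetical) student records.
rows = [
    {"id": 2, "name": "Tanmay",    "fees": 50000},
    {"id": 5, "name": "Abhishek",  "fees": 52000},
    {"id": 7, "name": "Samriddha", "fees": 51000},
]

# Column-oriented layout: each column stored contiguously, aligned by position.
columns = {
    "id":   [2, 5, 7],
    "name": ["Tanmay", "Abhishek", "Samriddha"],
    "fees": [50000, 52000, 51000],
}

# A row store must scan every whole record to aggregate one field ...
total_row_store = sum(record["fees"] for record in rows)
# ... while the column store reads just the 'fees' column.
total_column_store = sum(columns["fees"])
print(total_row_store, total_column_store)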
Disadvantages of Columnar Data Model:
Designing indexing schema: Designing an effective and working schema is very difficult and
time-consuming.
Suboptimal data loading: incremental data loading is suboptimal and must be avoided, but
this might not be an issue for some users.
Security vulnerabilities: If security is one of the priorities then it must be known that the
Columnar data model lacks inbuilt security features in this case, one must look into relational
databases.
Online Transaction Processing (OLTP): Online Transaction Processing (OLTP) applications
are also not compatible with columnar data models because of the way data is stored.
Applications of Columnar Data Model:
Columnar Data Model is very much used in various Blogging Platforms.
It is used in Content management systems like WordPress, Joomla, etc.
It is used in Systems that maintain counters.
It is used in Systems that require heavy write requests.
It is used in Services that have expiring usage.
In our contemporary online era, data and the Internet of Things (IoT) are inherently
interconnected, playing vital roles in shaping our digital existence.
In this article, we will explore what IoT analytics is, how it works, and why it is important to
explore IoT analytics.
Table of Contents
What is IoT Analytics?
The Significance of Data Analytics in IoT
How Does Employing a Data Analyst in IoT Benefit Businesses?
The 7 Roles of Data Analysts in IoT
Why is IoT data analytics important?
What can IoT analytics do?
IoT Analytics – Use case
IoT Analytics Tools
Conclusion
What is IoT Analytics?
As the internet's usage continues to grow among individuals and businesses, there is an exponential
increase in both data consumption and production. By the close of 2020, it was anticipated that
nearly 30.73 billion IoT devices would be in operation. The Internet of Things represents a digital
ecosystem where various elements (human resources, technologies, networks, and tools) interact
and connect to achieve shared objectives.
IoT analytics refers to collecting, processing and analyzing the data generated by IoT devices. As
more devices are connected to the internet, they generate large amounts of data that provide
valuable insights and valuable information. IoT data can be seen as a subset of Big Data: it
consists of heterogeneous streams that must be combined and transformed into consistent, correct
information.
The Significance of Data Analytics in IoT
Data analytics is the process of analysing data, including unstructured data, to draw meaningful
conclusions. Many of the methods and processes of data analytics are automated, with
algorithms designed to process raw data into a form humans can understand.
IoT devices generate large volumes of valuable data that are used for multiple applications. The
main goal is to use this data in a comprehensive and precise way so that it is organized and
structured into a more usable format.
Data analytics uses methods to process large data sets of varying sizes and characteristics,
identifies meaningful patterns, and extracts useful outputs from raw data.
Manually analyzing these large data sets is very time-consuming, resource-intensive, and
expensive. Data analytics saves time, energy and resources, and gives valuable
information in the form of statistics, patterns, and trends.
Organizations use this information to improve their decision-making processes, apply more
effective strategies, and achieve desired outcomes.
How Does Employing a Data Analyst in IoT Benefit Businesses?
Data analysts give substantial value to their organizations by interpreting, analyzing, and presenting
findings grounded in specific data sets. In the context of IoT, data analysis offers valuable
data services that enable businesses to harness the potential of IoT alongside effective data
analytics methods, driving growth and competitive advantage.
Generally, businesses enlist the services of data analysts to streamline primary operations
related to the optimal application of valuable datasets. Data analysis helps in sifting through
expansive data clusters, recognizing user patterns, and structuring valuable insights into a
comprehensible format for the business team's interpretation and use.
Well-organized and structured data yields insight into consumer preferences and choices,
enabling businesses to boost profit and gain a competitive edge. A data analyst's primary goal in
IoT is to identify ongoing industry trends and offer competitive analysis. In this capacity, data
analysts take on seven pivotal responsibilities and roles within organizations.
The 7 Roles of Data Analysts in IoT
The roles of data analysts within organizations are contingent on their knowledge, skills, and
expertise. Here are seven prominent roles that data analysts fulfil:
1. Determining Organizational Goals: A data analyst's most crucial role is helping a business
define its primary organizational objectives. This initial step is vital for setting a business
apart, outperforming competitors, and attracting the right audience. Data analysts collaborate
with staff and team members to monitor, track, gather, and analyse data, necessitating access to
all available data used within the organization.
2. Data Mining: Data analysts gather and mine data from internet sources and company
databases, conducting analysis and research. This research helps businesses understand market
dynamics, current trends, competitor activities, and consumer preferences.
3. Data Cleaning: Data analysts play an essential part in data cleansing, a critical aspect of data
preparation. Data cleansing involves correcting, identifying, and analyzing raw data,
significantly improving decision making by providing accurate and precise data.
4. Data Analysis: Data analysts offer data services that include data analysis. They employ
techniques to efficiently explore data, extract relevant information, and give accurate answers to
business-specific questions. Data analysts bring statistical and logical tools to the table,
enhancing a business's competitive advantage.
5. Recognizing Patterns and Identifying Trends: Data analysts excel in recognizing trends
within industries and making sense of vast datasets. Their expertise in identifying industry
trends enables businesses to enhance performance, evaluate strategy effectiveness, and more.
6. Reporting: Data analysts convert essential insights from raw data into reports that drive
advancements in business operations. Reporting is vital for monitoring online business
performance and safeguarding against data misuse. It serves as the primary means to measure
overall business performance.
7. Data and System Maintenance: Data analysts also contribute to maintaining data systems and
databases, ensuring data coherence, availability, and storage align with organizational
requirements. Data analysts employ techniques to enhance data gathering, structuring, and
evaluation across various datasets.
Why is IoT data analytics important?
IoT( Internet of Things) data analytics is pivotal for several reasons:
1. Actionable Insights: IoT devices generate massive amounts of data from various sources.
Analysing this data allows organizations to extract valuable insights and make informed
decisions. By understanding patterns and trends, businesses can optimize processes, improve
efficiency, and enhance overall performance.
2. Real-Time Decision-Making: IoT data analytics enables real-time processing and analysis of
data streams. This is particularly important in applications where quick decisions are
essential, such as in industrial settings, healthcare monitoring, and smart city
infrastructure. Real-time insights empower organizations to respond instantly to changing
conditions.
3. Predictive Maintenance: IoT data analytics can be used to predict when equipment or devices
are likely to fail. By monitoring and analyzing performance data, organizations can apply
predictive maintenance strategies, reducing downtime and minimizing the costs associated with
unanticipated failures.
4. Cost Efficiency: Analyzing IoT data helps identify areas for optimization and cost
reduction. Whether it is streamlining operations, enhancing resource utilization, or minimizing
energy consumption, data analytics plays a crucial part in achieving cost efficiency.
5. Enhanced Customer Experience: In sectors like retail and healthcare, IoT data analytics can
be leveraged to understand customer behavior and preferences. This information can be used to
personalize services, enhance customer satisfaction, and tailor offerings to meet specific
requirements.
6. Security and Anomaly Detection: With the increasing number of connected devices, security
becomes a paramount concern. IoT data analytics can be applied to detect anomalies and
potential security threats. By continuously monitoring data streams, organizations can identify
unusual patterns that may indicate a security breach.
7. Scalability and Flexibility: As IoT ecosystems grow, traditional techniques of data analysis
may become inadequate. IoT data analytics platforms are designed to handle the scalability and
diversity of data generated by a multitude of devices. This ensures that analytics capabilities
can evolve alongside expanding IoT infrastructures.
8. Regulatory Compliance: In certain industries, there are regulatory requirements regarding data
collection, storage, and privacy. IoT data analytics platforms can help organizations adhere to
these regulations by delivering tools for secure data management and compliance reporting.
9. Innovation and Product Development: Understanding how customers interact with IoT
devices can inform the development of new products and services. Analytics on usage patterns
and user feedback can guide innovation and lead to the creation of more effective and
user-friendly solutions.
What can IoT analytics do?
IoT analytics can do the following:
1. Data Processing and Integration: Handle large volumes of data generated by IoT devices,
integrating and processing different data types for meaningful insights.
2. Real-Time Decision-Making: Provide continuous monitoring of IoT data streams, enabling
immediate responses to changing conditions or events.
3. Predictive Analytics: Forecast trends and potential issues based on historical data, facilitating
proactive decision-making and preventive measures.
4. Operational Efficiency: Optimize processes, resource allocation, and energy usage by
identifying patterns and inefficiencies in IoT-generated data.
5. Predictive Maintenance: Anticipate equipment failures or maintenance requirements, reducing
downtime and minimizing operational disruptions.
6. Cost Optimization: Identify areas for cost reduction and efficiency enhancement by analysing
IoT data for resource utilization and process optimization.
7. Security and Anomaly Detection: Monitor IoT data for irregularities and potential security
threats, enabling timely detection and response to cybersecurity issues.
8. Customer Insights: Analyze user behavior and preferences from IoT data to enhance
customer experiences, personalize services, and tailor offerings.
9. Supply Chain Optimization: Improve supply chain visibility and effectiveness by analysing
data from connected devices throughout the supply chain process.
10. Regulatory Compliance: Help organizations adhere to data privacy and regulatory
requirements by providing tools for secure data management and compliance reporting.
IoT Analytics – Use case
Here are several IoT analytics use cases across various industries
Smart Agriculture
Use Case: Monitoring crop conditions
How IoT Analytics Helps: Analysing data from sensors measuring soil moisture,
temperature, and weather conditions to optimize irrigation and enhance crop yield.
Healthcare
Use Case: Remote patient monitoring
How IoT Analytics Helps: Using wearable devices to collect and analyse patient health data,
enabling proactive healthcare interventions and reducing hospital admissions.
Manufacturing
Use Case: Quality control and predictive maintenance
How IoT Analytics Helps: Monitoring production line data to identify defects, ensure product
quality, and predict equipment maintenance needs to minimize downtime.
Smart Cities
Use Case: Traffic management
How IoT Analytics Helps: Analysing data from connected traffic cameras, sensors, and GPS to
optimize traffic flow, reduce congestion, and enhance overall urban mobility.
Retail
Use Case: Customer behaviour analysis
How IoT Analytics Helps: Analysing data from in- store sensors and beacons to understand
customer preferences, optimize product placements, and personalize the shopping experience.
Energy Management
Use Case: Smart grid optimization
How IoT Analytics Helps: Analysing data from smart meters and grid sensors to optimize
energy distribution, predict and prevent outages, and encourage energy efficiency.
Supply Chain
Use Case: Inventory management
How IoT Analytics Helps: Monitoring and analysing data from connected devices throughout
the supply chain to optimize inventory levels, reduce carrying costs, and minimize
stockouts.
Environmental Monitoring
Use Case: Air quality management
How IoT Analytics Helps: Analysing data from air quality sensors to monitor pollution
levels, assess environmental impact, and implement measures to enhance air quality.
Building Automation
Use Case: Energy efficiency in smart buildings
How IoT Analytics Helps: Analysing data from sensors to optimize heating, ventilation, and
air conditioning (HVAC) systems for energy efficiency and occupant comfort.
Logistics
Use Case: Fleet management
How IoT Analytics Helps: Analysing data from GPS trackers and sensors on vehicles to
optimize routes, monitor fuel consumption, and improve overall fleet efficiency.
IoT Analytics Tools
There are several IoT analytics tools available that cater to different aspects of data processing,
analysis, and visualization in the context of the Internet of Things( IoT). Here are some notable
ones:
1. Microsoft Azure IoT Analytics: Part of the Azure IoT Suite, it offers capabilities
for processing and analysing large quantities of IoT data. It includes tools for data storage,
transformation, and querying.
2. AWS IoT Analytics: A service provided by Amazon Web Services (AWS), it
allows users to clean, process, store, and analyze IoT data. It integrates with other AWS
services for comprehensive IoT solutions.
3. IBM Watson IoT Platform: Offers analytics and AI capabilities for IoT data,
allowing organizations to derive actionable insights. It includes features for real-time data
analysis and predictive maintenance.
4. Google Cloud IoT Core and Cloud IoT Analytics: Google Cloud offers IoT Core
for device management and Cloud IoT Analytics for processing and analysing IoT data. It
integrates with other Google Cloud services for comprehensive data solutions.
5. ThingSpeak: An IoT analytics platform by MathWorks, it allows users to collect,
analyse, and visualize IoT data in real time. It is well suited for applications involving sensor
data and monitoring.
6. C3 AI: C3.ai provides an IoT analytics platform that enables organizations to build and deploy
AI-driven applications for various use cases, including predictive maintenance and energy
management.
7. Predix (by GE Digital): Predix is a platform specifically designed for industrial
IoT applications. It provides tools for data analytics, machine learning, and application
development in the industrial sector.
8. Ubidots: Ubidots is a cloud-based IoT platform that offers analytics and
visualization tools. It is designed to simplify the process of building IoT applications and
dashboards.
9. Particle: Particle provides an IoT platform that includes tools for device
management, connectivity, and data visualization. It is suitable for IoT systems ranging from
prototypes to production.
10. Kaa IoT Platform: Kaa is an open-source IoT platform that offers features for
data analytics, device management, and application development. It provides flexibility for
customization based on specific IoT project requirements.
Conclusion
In our digital world, where data plays a very important role in every aspect of our day-to-day lives,
the capability to harness data's potential and leverage it for organizational benefit offers
huge advantages. These advantages empower modern businesses to enhance their operations
and secure sustainable growth.
Cryptography plays a vital role in securing information and data through the use of codes that
transform the data into an unreadable format without a proper key thus preventing unauthorized
access. In simple words, we can say that organizations use cryptography to secure communication
channels.
What is IoT?
The Internet of Things (IoT) describes a network of physical devices, like appliances and vehicles,
equipped with sensors, and software. These environmentally aware objects can gather data about
their surroundings and communicate with other devices to share information. Sensors are a major
part of any IoT application. Sensors gather real-time information from the environment, such as
temperature, pressure, and more. IoT is massively scalable and efficient, operating with low power.
Let’s look at a real-world example, Modern refrigerators often come with in-built sensors that detect
if the door has been left open for an extended period. These sensors trigger an alert system, such as an
audible beep to notify the user to close the door so that optimal temperature and energy efficiency can
be maintained.
Why Do We Need Cryptography in IoT?
IoT devices pump out large amounts of sensitive data which is not safe in this digital world. Hackers
can easily steal valuable information that they shouldn’t have access to. Even worse, they might
hijack these devices and use them to spread misinformation or perform malicious attacks.
Cryptography can keep IoT devices and the data they transfer secure. Messages come with a special
key and they can only be decoded with that particular key.
Here comes the role of encryption, which significantly reduces the potential entry points for hackers
that target the data exchanged between IoT devices. Although most data undergoes in-transit
encryption during web transmission, device-level and end-to-end encryption are often lacking on
IoT devices due to resource limitations.
Figure: Diagrammatic representation of symmetric key encryption in cryptography.
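A minimal sketch of symmetric-key encryption for an IoT message is shown below, using AES-GCM from the third-party Python 'cryptography' package; the sensor payload, device identifier and key handling are illustrative assumptions rather than a prescribed IoT protocol.

# Symmetric (shared-key) encryption sketch for an IoT reading using AES-GCM.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=128)   # shared between device and server
aesgcm = AESGCM(key)

payload = b'{"device": "th-01", "temp_c": 27.4}'   # hypothetical sensor message
nonce = os.urandom(12)                              # must be unique per message
ciphertext = aesgcm.encrypt(nonce, payload, b"th-01")  # b"th-01" = associated data

# Only a holder of the same key (and the nonce) can recover the reading.
plaintext = aesgcm.decrypt(nonce, ciphertext, b"th-01")
print(plaintext)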
2.1 Introduction:
Internet of Things applications span a wide range of domains including homes, cities, environment,
energy systems, retail, logistics, industry, agriculture and health.
2.2 Home Automation:
2.2.1 Smart Lighting: Smart lighting for homes helps in saving energy by adapting the lighting to
the ambient conditions and switching on/off or dimming the lights when needed. Key enabling
technologies for smart lighting include solid-state lighting (such as LED lights) and IP-enabled
lights. For solid-state lighting solutions, both spectral and temporal characteristics can be configured
to adapt illumination to various needs. Smart lighting solutions for homes achieve energy savings by
sensing human movements and their environments and controlling the lights accordingly.
Wireless-enabled and internet-connected lights can be controlled remotely from IoT applications
such as a mobile or web application. Smart lights with sensors for occupancy, temperature, lux level,
etc., can be configured to adapt the lighting based on the ambient conditions sensed, in order to
provide a good ambience. A controllable LED lighting system has been presented that is embedded
with ambient intelligence gathered from a distributed smart wireless sensor network to optimize and
control the lighting system to be more efficient and user oriented. A solid-state lighting model
implemented over a wireless sensor network provides services for sensing illumination
changes and dynamically adjusting luminaire brightness according to user preferences.
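The control behaviour described above can be sketched as a simple rule: switch the lights off when the room is unoccupied, otherwise dim the LEDs just enough to top up the ambient light to a desired level. The threshold values and function below are hypothetical, not part of any specific product.

# Smart-lighting control sketch: occupancy + ambient lux -> dimming level.
def target_brightness(occupied: bool, ambient_lux: float,
                      desired_lux: float = 300.0) -> int:
    """Return a dimming level (0-100 %) for the luminaire."""
    if not occupied:
        return 0                                    # switch off in empty rooms
    deficit = max(desired_lux - ambient_lux, 0.0)   # top up only what daylight lacks
    return min(100, round(100 * deficit / desired_lux))

# Occupied room with 120 lux of daylight -> dim LEDs to roughly 60 %.
print(target_brightness(occupied=True, ambient_lux=120.0))
print(target_brightness(occupied=False, ambient_lux=10.0))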
2.2.2 Smart Appliances: Modern homes have a number of appliances such as TVs, refrigerators,
music systems, washers/dryers, etc. Managing and controlling these appliances can be cumbersome,
with each appliance having its own controls or remote controls. Smart appliances make the
management easier and also provide status information to the user remotely. Examples: smart
washers/dryers that can be controlled remotely and notify when the washing/drying cycle is
complete; smart thermostats that allow controlling the temperature remotely and can learn the user
preferences; smart refrigerators that keep track of the items stored and send updates to the user when
an item is low on stock; smart TVs that allow users to search and stream videos and movies from the
internet or a local storage drive, search the TV channel schedule, and fetch news, weather updates
and other content from the internet. OpenRemote is an open-source automation platform for homes
and buildings. OpenRemote is platform agnostic and works with standard hardware. With
OpenRemote, users can control various appliances using mobile or web applications. OpenRemote
comprises three components: a controller that manages scheduling and runtime integration between
devices, a designer that allows you to create both the configuration for the controller and the user
interface design, and control panels that allow you to interact with the devices and control them.
2.2.3 Intrusion Detection: Home intrusion detection systems use security cameras and sensors, such
as PIR sensors and door sensors, to detect intrusions and raise alerts. Alerts can be in the form of an
SMS or an email sent to the user. Advanced systems can even send detailed alerts, such as an image
grab or a short video clip, sent as an email attachment. A cloud-controlled intrusion detection system
has been described that uses location-aware services, where the geo-location of each node of the
home automation system is independently detected and stored in the cloud. In the event of an
intrusion, the cloud services alert the appropriate neighbors or the local police. An intrusion
detection system based on UPnP technology has also been described. The system uses image
processing to recognize intrusions, extract the intruding subject, and generate Universal Plug and
Play instant messages for alerts.
2.2.4 Smoke / Gas Detector: Smoke detectors are installed in homes and buildings to detect smoke
that is typically an early sign of fire. Smoke detectors use optical detection, ionization or air
sampling techniques to detect smoke. Alerts raised by smoke detectors can be in the form of signals
to a fire alarm system. Gas detectors can detect the presence of harmful gases such as carbon
monoxide or liquefied petroleum gas (LPG). A smoke/gas detector can raise alerts in
human-understandable form (describing where the problem is), send an SMS or email to the user or
the local fire safety department, and provide visual feedback on its status. The design of a system
that detects gas leakage and smoke and gives a visual level indication has been described.
2.7 logistics
2.7.1 Route Generation and Scheduling: Modern transportation systems are driven by data collected
from multiple sources, which is processed to provide new services to the stakeholders. By collecting
large amounts of data from various sources and processing the data into useful information,
data-driven transportation systems can provide new services such as advanced route guidance,
dynamic vehicle routing, and anticipating customer demand for pickup and delivery. For instance,
route generation and scheduling systems can generate end-to-end routes using combinations of route
patterns and transportation modes, and feasible schedules based on the availability of vehicles.
2.7.2 Fleet Tracking: Vehicle fleet tracking systems use GPS technology to track the locations of
vehicles in real time. Cloud-based fleet tracking systems can be scaled up on demand to handle a
large number of vehicles. Alerts can be generated in case of deviations from the planned routes. The
vehicle location and route data can be aggregated and analyzed for detecting bottlenecks in the
supply chain, such as traffic congestion on routes, for generation of alternative routes, and for
supply chain optimization. A fleet tracking system for commercial vehicles has been described; the
system can analyze messages sent from the vehicles to identify unexpected incidents and
discrepancies between the actual and planned data.
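A minimal sketch of one fleet-tracking analytic follows: computing the great-circle (haversine) distance between the planned waypoint and the latest GPS fix, and raising a route-deviation alert. The coordinates and the 5 km threshold are hypothetical.

# Route-deviation check sketch using the haversine distance between GPS fixes.
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

planned_waypoint = (18.5204, 73.8567)   # expected position on the planned route
reported_fix = (18.6298, 73.7997)       # latest GPS fix sent by the vehicle

deviation = haversine_km(*planned_waypoint, *reported_fix)
if deviation > 5.0:                     # alert if more than 5 km off route
    print(f"ALERT: vehicle is {deviation:.1f} km away from the planned route")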
2.7.3 Shipment Monitoring: Shipment monitoring solutions for transportation systems allow
monitoring of the conditions inside containers. For example, containers carrying fresh food produce
can be monitored to prevent spoilage of the food. IoT-based shipment monitoring systems use sensors
such as temperature, pressure, and humidity sensors to monitor the conditions inside the container and
send the data to the cloud, where it can be analyzed to detect food spoilage. The analysis and
interpretation of the data on the environmental conditions in the container, together with the food
truck's position, can enable more effective routing decisions in time. It is therefore possible to take
remedial measures: for instance, food that has a limited time budget before it spoils can be rerouted to
a closer destination, and alerts can be raised to the driver and the distributor about the transit
conditions (such as the container temperature exceeding the allowed limit, or humidity levels going
outside the allowed limit) so that corrective actions can be taken before the food gets damaged. A
cloud-based framework for real-time fresh food supply tracking and monitoring has been proposed.
For fragile products, vibration levels during shipment can be tracked using accelerometer and
gyroscope sensors attached to the IoT device. A system for monitoring container integrity and
operating conditions has also been described; it monitors the vibration patterns of the container and
its contents to reveal information related to its operating environment and integrity during transport,
handling, and storage.
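A minimal sketch of the container-monitoring loop just described: the device samples temperature and
humidity, packages a JSON reading, hands it to a hypothetical publish_to_cloud() uplink, and raises
alerts when readings leave the allowed band. The sensor drivers, limits, identifiers, and uplink are
illustrative assumptions.

import json, time

# illustrative limits for a refrigerated fresh-food container
TEMP_MAX_C = 8.0
HUMIDITY_MAX_PCT = 85.0

def read_temperature_c():        # hypothetical temperature sensor driver
    return 6.4

def read_humidity_pct():         # hypothetical humidity sensor driver
    return 78.0

def publish_to_cloud(payload):   # hypothetical uplink (e.g. MQTT/HTTP to a cloud endpoint)
    print("uplink:", payload)

def notify(message):             # hypothetical alert to the driver / distributor
    print("ALERT:", message)

while True:
    reading = {
        "container_id": "C-42",          # illustrative identifier
        "timestamp": int(time.time()),
        "temperature_c": read_temperature_c(),
        "humidity_pct": read_humidity_pct(),
    }
    publish_to_cloud(json.dumps(reading))
    if reading["temperature_c"] > TEMP_MAX_C:
        notify("Container temperature exceeds the allowed limit")
    if reading["humidity_pct"] > HUMIDITY_MAX_PCT:
        notify("Container humidity is outside the allowed limit")
    time.sleep(60)                       # report once a minute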
2.7.4 Remote Vehicle Diagnostics
Remote vehicle diagnostic systems can detect faults in vehicles or warn of impending faults. These
diagnostic systems use on-board IoT devices for collecting data on vehicle operation (such as speed,
engine RPM, coolant temperature, and fault code numbers) and the status of the various vehicle
subsystems. Such data can be captured by integrating on-board diagnostic systems with IoT devices
using protocols such as the CAN bus. Modern commercial vehicles support on-board diagnostics
(OBD) standards such as OBD-II. OBD systems provide real-time data on the status of vehicle
subsystems and diagnostic trouble codes, which allow faults in the vehicle to be identified rapidly.
IoT-based vehicle diagnostic systems can send the vehicle data to centralized servers or the cloud,
where it can be analyzed to generate alerts and suggest remedial actions.
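A minimal sketch of how an on-board device might decode a few standard OBD-II mode-01 responses
before forwarding them. The raw response bytes are illustrative, but the decoding formulas are the
standard OBD-II ones: RPM = (256*A + B)/4, vehicle speed = A km/h, coolant temperature = A - 40 degC.

# Decode a few standard OBD-II mode-01 parameter IDs (PIDs) from raw data bytes.
def decode_pid(pid, data):
    a = data[0]
    b = data[1] if len(data) > 1 else 0
    if pid == 0x0C:                 # engine RPM
        return ("engine_rpm", (256 * a + b) / 4.0)
    if pid == 0x0D:                 # vehicle speed
        return ("speed_kmh", a)
    if pid == 0x05:                 # engine coolant temperature
        return ("coolant_c", a - 40)
    return ("pid_%02X" % pid, data) # unknown PID: pass the raw bytes through

# Illustrative raw responses as they might arrive over the CAN bus / OBD adapter.
samples = [
    (0x0C, bytes([0x1A, 0xF8])),    # decodes to 1726 rpm
    (0x0D, bytes([0x3C])),          # decodes to 60 km/h
    (0x05, bytes([0x5A])),          # decodes to 50 degC
]

telemetry = dict(decode_pid(pid, data) for pid, data in samples)
print(telemetry)                    # this record would be sent to the centralized server / cloud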
2.9 Industry:
2.9.1 Machine Diagnosis and Prognosis: Machine prognosis refers to predicting the performance of a
machine by analyzing data on its current operating conditions and how much deviation exists from the
normal operating conditions. Machine diagnosis refers to determining the cause of a machine fault.
IoT plays a major role in both the prognosis and diagnosis of industrial machines. Industrial machines
have a large number of components that must function correctly for the machine to perform its
operations. Sensors in machines can monitor operating conditions such as temperature and vibration
levels. The sensor measurements are made on time scales of a few milliseconds to a few seconds,
which leads to the generation of massive amounts of data. Case-based reasoning (CBR) is a commonly
used method that finds solutions to new problems based on past experience. This past experience is
organized and represented as cases in a case base. CBR is an effective technique for problem solving
in fields in which it is hard to establish a quantitative mathematical model, such as machine diagnosis
and prognosis. Since data from a very large number of sensors is collected for each machine, using
such high-dimensional data for the creation of a case library reduces the case retrieval efficiency. Data
reduction and feature extraction methods are therefore used to find a representative set of features.
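A minimal sketch of case retrieval in a CBR-style diagnosis system, assuming each past case has
already been reduced (by feature extraction) to a small feature vector with a known fault label. The
case base, feature values, and fault labels are purely illustrative.

import math

# Illustrative case base: (feature vector, diagnosed fault) after feature extraction,
# e.g. (mean bearing temperature in degC, vibration RMS in g).
CASE_BASE = [
    ((62.0, 0.12), "normal"),
    ((78.5, 0.35), "bearing wear"),
    ((90.2, 0.80), "rotor imbalance"),
    ((65.0, 0.55), "loose mounting"),
]

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def retrieve(query, k=2):
    # return the k past cases most similar to the new operating-condition vector
    ranked = sorted(CASE_BASE, key=lambda case: euclidean(query, case[0]))
    return ranked[:k]

new_case = (80.0, 0.40)             # current (temperature, vibration) measurement
for features, fault in retrieve(new_case):
    print(f"similar case {features} -> {fault}")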
2.9.2 Indoor Air Quality Monitoring: Monitoring indoor air quality in factories is important for the
health and safety of the workers. Harmful and toxic gases such as carbon monoxide (CO), nitrogen
monoxide (NO), and nitrogen dioxide (NO2) can cause serious health problems. IoT-based gas
monitoring systems can help in monitoring the indoor air quality using various gas sensors. The indoor
air quality can vary for different locations. Wireless sensor network based IoT devices can identify the
hazardous zones, so that corrective measures can be taken to ensure proper ventilation.
2.8 Agriculture
2.8.1 Smart Irrigation
Smart irrigation systems can improve crop yields while saving water. Smart irrigation systems use IoT
devices with soil moisture sensors to determine the amount of moisture in the soil and release the flow
of water through the irrigation pipes only when the moisture levels go below a predefined threshold.
Smart irrigation systems also collect moisture level measurements on a server or in the cloud, where
the collected data can be analyzed to plan watering schedules.
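A minimal sketch of the threshold-based irrigation control just described, assuming a hypothetical
read_soil_moisture() sensor driver returning percent moisture, a hypothetical set_valve() actuator, and
a hypothetical log_to_cloud() uplink; the threshold value is illustrative.

import time

MOISTURE_THRESHOLD_PCT = 30.0      # illustrative: irrigate when moisture falls below 30 %

def read_soil_moisture():          # hypothetical soil-moisture sensor driver
    return 27.5

def set_valve(open_):              # hypothetical solenoid-valve actuator
    print("valve", "OPEN" if open_ else "CLOSED")

def log_to_cloud(moisture):        # hypothetical uplink used later for schedule planning
    print("logged moisture:", moisture)

while True:
    moisture = read_soil_moisture()
    log_to_cloud(moisture)
    # release water only while the soil is drier than the threshold
    set_valve(moisture < MOISTURE_THRESHOLD_PCT)
    time.sleep(600)                # re-check every 10 minutes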
2.8.2 Green House Control
Greenhouses are structures with glass or plastic roofs that provide a conducive environment for the
growth of plants. The climatological conditions inside a greenhouse can be monitored and controlled
to provide the best conditions for the growth of plants. Temperature, humidity, soil moisture, light, and
carbon dioxide levels are monitored using sensors, and the climatological conditions are controlled
automatically using actuation devices. IoT systems play an important role in greenhouse control and
help in improving productivity. The data collected from the various sensors is stored on centralized
servers or in the cloud, where analysis is performed to optimize the control strategies and to correlate
productivity with the different control strategies. One such system uses a wireless sensor network to
monitor and control agricultural parameters like temperature and humidity in real time for better
management and maintenance of agricultural production.
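A minimal sketch of automatic greenhouse actuation with a simple dead band so that the heater and
vent do not switch rapidly around a single setpoint. The sensor drivers, actuators, and setpoints are
illustrative assumptions.

import time

TEMP_LOW_C, TEMP_HIGH_C = 18.0, 26.0       # illustrative comfort band for the crop
HUMIDITY_HIGH_PCT = 80.0

def read_temperature_c():                   # hypothetical sensor drivers
    return 16.5

def read_humidity_pct():
    return 83.0

def set_heater(on):                         # hypothetical actuation devices
    print("heater", "ON" if on else "OFF")

def set_vent(open_):
    print("vent", "OPEN" if open_ else "CLOSED")

while True:
    temp = read_temperature_c()
    humidity = read_humidity_pct()
    # heat when too cold, vent when too hot or too humid;
    # inside the dead band the actuators keep their last state to avoid chattering
    if temp < TEMP_LOW_C:
        set_heater(True)
        set_vent(False)
    elif temp > TEMP_HIGH_C or humidity > HUMIDITY_HIGH_PCT:
        set_heater(False)
        set_vent(True)
    time.sleep(300)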
2.5 Energy
2.5.1 Smart Grid: A smart grid is a data communication network integrated with the electrical grid
that collects and analyzes data captured in near real time about power transmission, distribution, and
consumption. Smart grid technology provides predictive information and recommendations to
utilities, their suppliers, and their customers on how best to manage power.
Smart grids collect data regarding electricity generation (centralized or distributed), consumption
(instantaneous or predictive), storage (or conversion of energy into other forms), distribution, and
equipment health.
Smart grids use high-speed, fully integrated, two-way communication for real-time information and
power exchange. Smart meters can capture almost real-time consumption, remotely control the
consumption of electricity, and remotely switch off the supply when required.
Power theft can be prevented using smart metering. By analyzing the data on power generation,
transmission, and consumption, smart grids can improve efficiency throughout the electric system.
Storage, collection, and analysis of smart grid data in the cloud can help in dynamic optimization of
grid operations. Cloud-based analysis of smart grid data can improve energy usage levels via energy
feedback to users coupled with real-time pricing information.
Real-time demand response and management strategies can be used for lowering peak demand and
the overall load via appliance control and energy storage mechanisms.
Condition monitoring data collected from power generation and transmission systems can help in
detecting faults and predicting outages.
2.5.2 Renewable Energy Systems:
Due to the variability in the output from renewable energy sources such as solar and wind, integrating
them into the grid can cause grid stability and reliability problems. Variable output produces local
voltage swings that can impact power quality. Existing grids were designed to handle power flows
from centralized generation sources to the loads through transmission and distribution lines. When
distributed renewable energy sources are integrated into the grid, they create bidirectional power
flows for which the grid was not originally designed. IoT-based systems integrated with the
transformers at the points of interconnection can measure the electrical variables and how much
power is fed into the grid.
To ensure grid stability, one solution is to simply cut off the excess generation from renewable
sources when such problems occur. For wind energy systems, closed-loop controls can be used to
regulate the voltage at the point of interconnection.
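A minimal sketch of the closed-loop idea: a discrete PI controller that adjusts the inverter's
reactive-power setpoint to pull the measured voltage at the point of interconnection back toward its
nominal value. The gains, limits, and measurement/actuation functions are illustrative assumptions,
not a grid-certified design.

# Discrete PI controller for voltage regulation at the point of interconnection.
NOMINAL_V = 1.0                 # target voltage in per-unit
KP, KI = 2.0, 0.5               # illustrative controller gains
Q_MIN, Q_MAX = -0.3, 0.3        # reactive-power limits of the inverter (per-unit)
DT = 1.0                        # control period in seconds

def read_poi_voltage_pu():      # hypothetical measurement from the IoT device at the transformer
    return 1.04

def set_reactive_power_pu(q):   # hypothetical inverter setpoint command
    print(f"reactive power setpoint: {q:+.3f} pu")

integral = 0.0
for _ in range(5):              # a few control steps for illustration
    error = NOMINAL_V - read_poi_voltage_pu()      # positive when the voltage is too low
    integral += error * DT
    q = KP * error + KI * integral
    q = max(Q_MIN, min(Q_MAX, q))                  # saturate to the inverter limits
    set_reactive_power_pu(q)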
2.5.3 Prognostics:
Energy systems (smart grids, power plants, wind turbine farms) have a large number of critical
components that must function correctly so that the systems can perform their operations correctly.
For example, a wind turbine has a number of critical components (for instance, bearings and turning
gears) that must be monitored carefully, as wear and tear in such critical components or sudden
changes in the operating conditions of the machines can result in failures. In systems such as power
grids, real-time information is collected using specialized electrical sensors called phasor
measurement units (PMUs) at the substations.
The information received from the PMUs must be monitored in real time for estimating the state of
the system and for predicting failures. Energy systems have thousands of sensors that gather real-time
maintenance data continuously for condition monitoring and failure prediction purposes. IoT-based
prognostic real-time health management systems can predict the performance of machines or energy
systems by analyzing the extent of deviation of the system from its normal operating profiles.
Analyzing massive amounts of maintenance data collected from sensors in energy systems and
equipment can provide predictions of impending failures (potentially in real time) so that reliability
and availability can be improved. Prognostic health management systems have been developed for
different energy systems; for example, OpenPDC is a set of applications for processing streaming
time-series data collected from phasor measurement units (PMUs) in real time.
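A minimal sketch of deviation-based condition monitoring on a stream of sensor readings: it keeps a
sliding window of recent values, computes a z-score for each new sample against that window, and
flags samples that deviate strongly from the recent operating profile. The window size, threshold, and
example readings are illustrative assumptions.

from collections import deque
import statistics

WINDOW = 20            # number of recent samples that define the "normal" profile
Z_THRESHOLD = 3.0      # flag samples more than 3 standard deviations away

def monitor(stream):
    window = deque(maxlen=WINDOW)
    for t, value in enumerate(stream):
        if len(window) >= 5:                          # need a few samples before judging
            mean = statistics.fmean(window)
            stdev = statistics.pstdev(window) or 1e-9  # guard against a zero spread
            z = abs(value - mean) / stdev
            if z > Z_THRESHOLD:
                print(f"t={t}: reading {value:.2f} deviates (z={z:.1f}) - possible fault")
        window.append(value)

# illustrative vibration readings with a sudden shift at the end
readings = [0.21, 0.20, 0.22, 0.19, 0.21, 0.20, 0.23, 0.21, 0.20, 0.22,
            0.21, 0.20, 0.22, 0.21, 0.95]
monitor(readings)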
The paper aims to do a survey along with a comparative analysis of the various cryptography libraries
that are applicable in the field of Internet of Things (IoT). The first half of the paper briefly introduces
the various cryptography libraries available in the field of cryptography along with a list of all the
algorithms contained within the libraries. The second half of the paper deals with cryptography
libraries specifically aimed for application in the field of Internet of Things. The various libraries and
their performance analysis listed down in this paper are consolidated from various sources with the
aim of providing a single comprehensive repository for reference to the various cryptography libraries
and the comparative analysis of their features in IoT.
Keywords: ECC, wolfSSL, RELIC, AvrCryptoLib, TinyECC, WiseLib
1. INTRODUCTION: The implementation of encryption and decryption in the field of cryptography
provides a solid means of relaying messages to and fro between users without the added risk of the
message being compromised to unwanted personnel. Such encryption-decryption operations are
performed in various ways ([3], [7], [15]) through the use of specific sets of algorithms. A
cryptography library is a sort of repository of the various algorithms available for cryptographic
purposes, which provides the added function of categorising the multitudes of algorithms into specific
collections based on their performance capacities and functions. In the field of IoT, microprocessors
and embedded devices with low computational power play the vital role of exchanging information
using the internet infrastructure. Such constraints on computational capabilities and the necessity of
secure exchange of information call for the implementation of algorithms specifically optimized to
run in resource-constrained environments. As such, cryptography libraries aimed for use in
microprocessors and embedded devices play a very important role in providing the necessary security
layers to IoT devices and securing the overall IoT infrastructure.
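Elliptic curve cryptography is attractive for such constrained devices because it offers comparable
security with much shorter keys than RSA. Below is a minimal sketch of Elliptic Curve Diffie-Hellman
(ECDH) key agreement using the Python "cryptography" package; the package choice, curve, and
key-derivation parameters are illustrative assumptions for the sake of a runnable example, while the
libraries surveyed in this paper expose equivalent primitives in C, C++, or Java.

from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

# Each party generates an ephemeral key pair on the NIST P-256 curve.
device_private = ec.generate_private_key(ec.SECP256R1())
gateway_private = ec.generate_private_key(ec.SECP256R1())

# Each side combines its own private key with the peer's public key.
device_shared = device_private.exchange(ec.ECDH(), gateway_private.public_key())
gateway_shared = gateway_private.exchange(ec.ECDH(), device_private.public_key())
assert device_shared == gateway_shared          # both sides derive the same secret

# Derive a symmetric session key (e.g. for AES) from the shared secret.
session_key = HKDF(algorithm=hashes.SHA256(), length=16,
                   salt=None, info=b"iot-session").derive(device_shared)
print("128-bit session key:", session_key.hex())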
2. OVERVIEW: In this paper, Section 3 briefly introduces the various cryptography libraries available
for encryption in general and lists the encryption algorithms available in each of them. Section 4
discusses in detail the various cryptography libraries used in IoT. Section 5 presents a comparative
analysis of the cryptography libraries discussed in Section 4 based on their unique features. Section 6
concludes the paper.
3. CRYPTOGRAPHY LIBRARY: There exist numerous cryptography libraries encompassing
multitudes of encryption algorithms which can be implemented for the encryption of messages in
various fields. These cryptography libraries enable the implementation of various security measures
([11]) through the use of the contained algorithms. Some of the most prominent cryptography libraries
([5]), along with their encryption algorithms, are listed below:
i. borZoi: The "borZoi" cryptography library implements algorithms based on elliptic curves (and as
such is also known as an Elliptic Curve Cryptography library) ([4], [9], [10], [14], [36]). It implements
the following algorithms, which operate over a finite field of characteristic 2 (GF(2^m)) ([1]):
a. ECDSA (Elliptic Curve Digital Signature Algorithm)
b. Elliptic Curve Diffie-Hellman Key Agreement Scheme
c. ECIES (Elliptic Curve Integrated Encryption Scheme)
borZoi also implements the AES symmetric encryption scheme and the SHA-1 hash algorithm for
digital signatures ([1]):
a. AES (Rijndael) Symmetric Encryption Scheme
b. SHA-1 hash algorithm
ii. Crypto++: Written in C++, this cryptography library implements various algorithms ranging from
authenticated encryption schemes (like GCM, CCM, etc.) to algorithms based on elliptic curves (like
ECDSA, ECNR, etc.) ([13]). The various algorithms implemented by Crypto++ are as follows ([2]):
a. GCM, CCM, EAX
b. AES (Rijndael), RC6, MARS, CAST-256, Twofish, Serpent
c. Panama, Sosemanuk, Salsa20, XSalsa20
d. IDEA, Triple-DES, Camellia, SEED, XTEA, Skipjack, SHACAL-2, RC5, Blowfish
e. ECB, CBC, CTS, CFB, OFB, CTR
f. VMAC, HMAC, CBC-MAC, GMAC, TwoTrack-MAC
g. SHA-1, SHA-2, SHA-3, WHIRLPOOL, Tiger, RIPEMD-128, RIPEMD-256, RIPEMD-160, RIPEMD-320
h. ECDSA, ECNR, ECIES, ECMQV, ECDH
i. MD2, MD4, MD5, Panama Hash, Square, GOST, SAFER, SEAL 3.0, DES, ARC4, DESX, RC2,
3-WAY, WAKE-OFB, CAST-128, SHARK
j. Diffie-Hellman, XTR-DH, DH2, MQV, LUCDIF
k. PKCS#1 v2.0, OAEP, PSS, IEEE P1363 EMSA2-EMSA5, PSSR
l. ESIGN, LUC, RSA, DSA, ElGamal, RW, NR, DLIES
iii. Libmcrypt: The "libmcrypt" cryptography library provides encryption of data and is thread safe.
This library contains a set of encryption algorithms and modes which are modular in nature, allowing
the algorithms and encryption modes to operate in a much more efficient manner. The various
algorithms contained within the framework of this library are tabulated in Table 1.
Table 1: Algorithms in the Libmcrypt library
iv. Botan (formerly known as OpenCL): This cryptography library is written in C++ and licensed
under BSD-2 ([23], [28]). It was later extended with a "Card Verifiable Certificate" for ePassports,
and this modified version of Botan was named "InSiTO". This library contains a number of encryption
formats, algorithms, and protocols, which are tabulated in Table 2.
Table 2: Algorithms in the Botan library
v. Libgcrypt: Written in the C language, "libgcrypt" is a multi-platform cryptography library licensed
under the GNU Lesser General Public License / GNU General Public License ([32]). It features a
multiple-precision arithmetic implementation and an entropy-gathering utility ([37]). The
cryptography algorithms in this library are tabulated in Table 3.
Table 3: Algorithms in the Libgcrypt library
vi. Bouncy Castle: This cryptography library is written in Java and C# ([41]). Designed mainly for
use in devices with low computational memory, this library contains the algorithms listed in Table 4.
Table 4: Algorithms in the Bouncy Castle library
vii. Cryptlib: The "cryptlib" cryptography library is a library of algorithms which provides security
for communication and information exchange. Its simple interface makes it very user-friendly, and its
layered structure (the lower layers each providing a layer of abstraction, the higher layers covering up
the details of implementation of the algorithms) makes the whole library very secure and impermeable
to intrusion to a very high degree. The various algorithms within this library are tabulated in Table 5.
Table 5: Algorithms in the Cryptlib library
viii. Catacomb: Written using gcc, this cryptography library contains a set of cryptographic primitives
and is used on Linux operating systems ([9]). Some of the most prominent categories of algorithms
within this library, among its many others, are shown in Table 6.
Table 6: Categories of algorithms in the Catacomb library
ix. Cryptix: The "Cryptix" cryptography library was made to provide a library of cryptographic
algorithms for the Java platform, as there were a number of issues regarding the adoption of
cryptography in Java ([22]). With the removal of export controls on cryptography, the use of Cryptix
(last active development was in 2005) declined with the increasing availability of other, more secure
cryptography libraries. The list of algorithms under this library is shown in Table 7.
Table 7: Algorithms in the Cryptix library
x. Flexiprovider: This cryptography library is built for use in encryption of any application built upon
the JCA (Java