Introduction To Big Data: Types of Digital Data, History of Big Data Innovation
Introduction to Big Data: Types of digital data, history of Big Data innovation,
introduction to Big Data platform, drivers for Big Data, Big Data architecture and
characteristics, 5 Vs of Big Data, Big Data technology components, Big Data importance
and applications, Big Data features – security, compliance, auditing and protection, Big
Data privacy and ethics, Big Data Analytics, Challenges of conventional systems,
intelligent data analysis, nature of data, analytic processes and tools, analysis vs reporting,
modern data analytic tools.
DIGITAL DATA
Digital data is information stored on a computer system as a series of 0’s and 1’s in a
binary language. Digital data jumps from one value to the next in a step-by-step sequence.
Example: Whenever we send an email, read a social media post, or take pictures with our digital camera, we
are working with digital data.
Digital data can be classified into three forms:
a. Unstructured Data: The data which does not conform to a data model, or is not in a form that can be used
easily by a computer program, is categorized as unstructured data. About 80–90% of an organization's data is
in this format.
Example: Memos, chat rooms, PowerPoint presentations, images, videos, letters, researches, white papers,
the body of an email, etc.
b. Semi-Structured Data: The data which does not conform to a data model but has some structure is
categorized as semi-structured data. However, it is not in a form that can be used easily by a computer
program.
Example: Emails, XML, markup languages like HTML, etc. Metadata for this data is available but is not
sufficient.
c. Structured Data: The data which is in an organized form (i.e., in rows and columns) and can be easily used
by a computer program is categorized as structured data. Relationships exist between entities of data,
such as classes and their objects.
Example: Data stored in databases.
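To make the distinction concrete, the short Python sketch below (all data hypothetical) shows how each form is typically handled in code: structured rows parse directly, semi-structured JSON has keys but variable content, and unstructured text needs free-form processing.

```python
import csv, json, io

# Structured: rows and columns parse directly into records.
csv_text = "id,name,city\n1,Asha,Pune\n2,Ravi,Delhi\n"
rows = list(csv.DictReader(io.StringIO(csv_text)))
print(rows[0]["name"])                   # -> Asha

# Semi-structured: JSON carries some structure (keys), but fields
# may be missing or nested, so the program must handle variation.
email_json = '{"from": "a@example.com", "subject": "Hi", "body": "..."}'
email = json.loads(email_json)
print(email.get("subject", "(no subject)"))

# Unstructured: plain text has no data model; even simple questions
# (e.g., word frequency) require custom processing.
memo = "Quarterly results improved. Results exceed forecast."
words = memo.lower().replace(".", "").split()
print(max(set(words), key=words.count))  # most frequent word
```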
BIG DATA ARCHITECTURE
Data sources: All big data solutions start with one or more data sources.
Example,
o Application data stores, such as relational databases.
o Static files produced by applications, such as web server log files.
o Real-time data sources, such as IoT devices.
Data storage: Data for batch processing operations is stored in a distributed file store that can hold high
volumes of large files in various formats (often called a data lake).
Example,
Azure Data Lake Store or blob containers in Azure Storage.
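As a rough illustration of the idea (not a specific product's API), the sketch below uses pandas with the pyarrow package, assumed installed, to write hypothetical event files into a date-partitioned folder, the way a simple file-based data lake might be organized.

```python
import pandas as pd

# Hypothetical event data; in practice this would arrive as large
# files from applications, logs, or devices.
events = pd.DataFrame({
    "date": ["2024-01-01", "2024-01-01", "2024-01-02"],
    "device_id": [101, 102, 101],
    "reading": [21.5, 19.8, 22.1],
})

# A data lake commonly stores files partitioned by a key (here, date),
# so batch jobs can read only the partitions they need.
# Requires the pyarrow package for Parquet support.
events.to_parquet("datalake/events", partition_cols=["date"])

# A later batch job reads the whole partitioned store back.
all_events = pd.read_parquet("datalake/events")
print(all_events.shape)
```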
Batch processing: Because the data sets are so large, a big data solution must often process data files
using long-running batch jobs to filter, aggregate, and otherwise prepare the data for analysis.
Real-time message ingestion: If a solution includes real-time sources, the architecture must include a way to
capture and store real-time messages for stream processing.
Stream processing: After capturing real-time messages, the solution must process them by filtering,
aggregating, and preparing the data for analysis. The processed stream data is then written to an output sink.
We can use open-source Apache streaming technologies like Storm and Spark Streaming for this.
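As a minimal sketch of this stage, the PySpark Structured Streaming example below reads lines from a socket source (assumed to be available on localhost:9999), filters and aggregates them, and writes the running counts to a console sink; a production pipeline would use a durable source and sink instead.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("StreamSketch").getOrCreate()

# Read a continuous stream of text lines from a socket source
# (assumes something is writing lines to localhost:9999).
lines = spark.readStream.format("socket") \
    .option("host", "localhost").option("port", 9999).load()

# Filter and aggregate the incoming messages, then write the
# running counts to an output sink (the console, for this sketch).
counts = lines.filter(lines.value != "").groupBy("value").count()

query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```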
Analytical data store: Many big data solutions prepare data for analysis and then serve the processed data
in a structured format that can be queried using analytical tools. Example: Azure Synapse Analytics provides a
managed service for large-scale, cloud-based data warehousing.
Analysis and reporting: The goal of most big data solutions is to provide insights into the data through
analysis and reporting. To empower users to analyze the data, the architecture may include a data modelling
layer. Analysis and reporting can also take the form of interactive data exploration by data scientists or data
analysts.
Orchestration: Most big data solutions consist of repeated data processing operations that transform source
data, move data between multiple sources and sinks, load the processed data into an analytical data store, or
push the results straight to a report. To automate these workflows, we can use an orchestration technology
such as Azure Data Factory.
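Stripped of product specifics, orchestration is an ordered, repeatable workflow; the plain-Python sketch below (all step names hypothetical) shows the core idea that a tool like Azure Data Factory then wraps with scheduling, retries and monitoring.

```python
# A minimal sketch of an orchestrated workflow: each step is a task,
# and the orchestrator runs them in dependency order on a schedule.

def extract():
    print("pull source data")          # e.g., copy files from sources

def transform():
    print("filter and aggregate")      # e.g., run a batch job

def load():
    print("load into analytical store")

def report():
    print("refresh the report")

pipeline = [extract, transform, load, report]

def run_pipeline(steps):
    for step in steps:
        step()  # a real orchestrator would record status and retry on failure

run_pipeline(pipeline)
```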
Big Data Characteristics :
Big data can be described by the following five characteristics (the 5 Vs):
Volume
Variety
Velocity
Veracity
Value
1. Volume :
Volume refers to the vast amount of data generated every day from many sources, such as business
processes, machines, social media platforms, networks, human interactions, and so on.
Example: Each day, Facebook generates approximately a billion messages, records about 4.5 billion clicks of
the “Like” button, and receives more than 350 million new posts.
Big data technologies can handle large amounts of data.
2. Variety :
Big Data can be structured, unstructured, or semi-structured, and is collected from many different sources.
In the past, data was collected mainly from databases and spreadsheets; these days it arrives in an array
of forms, i.e. PDFs, emails, audio, social media posts, photos, videos, etc.
3. Velocity :
Velocity refers to the speed with which data is generated in real-time.
Velocity plays an important role compared to the other characteristics.
It covers the speed of incoming data sets, their rate of change, and bursts of activity.
A primary requirement of Big Data systems is to deliver in-demand data rapidly.
Example of data that is generated with high velocity - Twitter messages or Facebook posts.
4. Veracity :
Veracity refers to the quality and trustworthiness of the data that is being analyzed.
It also covers the ability to handle and manage such data efficiently.
Example: Facebook posts with hashtags.
5. Value :
Value is an essential characteristic of big data.
It is not just any data we process or store that matters, but the valuable and reliable data that we
store, process and analyse.
Big Data Technology Components :
1. Ingestion :
The ingestion layer is the very first step of pulling in raw data.
It comes from internal sources, relational databases, non-relational databases, social media, emails, phone
calls etc.
There are two kinds of ingestion :
Batch, in which large groups of data are gathered and delivered together.
Streaming, which is a continuous flow of data. This is necessary for real-time data analytics.
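A minimal Python sketch of the two styles, with hypothetical records: batch ingestion delivers a large group at once, while streaming ingestion handles records continuously as they arrive.

```python
import time

# Batch ingestion: a large group of records is gathered and
# delivered together, e.g., once per day.
def ingest_batch(records):
    print(f"loading {len(records)} records in one batch")

ingest_batch([{"id": i} for i in range(1000)])

# Streaming ingestion: records arrive continuously and are handled
# one (or a few) at a time, enabling real-time analytics.
def record_stream():
    for i in range(5):                 # stands in for an endless feed
        yield {"id": i, "ts": time.time()}

for record in record_stream():
    print("processing", record["id"])  # handle each record as it arrives
```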
2. Storage :
Storage is where the converted data is stored in a data lake or warehouse and eventually processed.
The data lake/warehouse is the most essential component of a big data ecosystem.
It needs to contain only thorough, relevant data to make insights as valuable as possible.
It must be efficient with as little redundancy as possible to allow for quicker processing.
3. Analysis :
In the analysis layer, data gets passed through several tools, shaping it into actionable insights.
There are four types of analytics on big data: descriptive, diagnostic, predictive, and prescriptive.
Importance of Big Data :
By analysing big data pools effectively, companies can realize benefits such as:
Cost Savings :
o Some tools of Big Data like Hadoop can bring cost advantages to business when large amounts of data are
to be stored.
o These tools help in identifying more efficient ways of doing business.
Time Reductions :
o The high speed of tools like Hadoop and in-memory analytics can quickly identify new sources of data, which
helps businesses analyze data immediately.
o This helps us to make quick decisions based on the learnings.
Understand market conditions :
o By analyzing big data we can get a better understanding of current market conditions.
o For example: By analyzing customers’ purchasing behaviours, a company can find out the products that are
sold the most and produce products according to this trend. By this, it can get ahead of its competitors.
Control online reputation :
o Big data tools can do sentiment analysis.
o Therefore, you can get feedback about who is saying what about your company.
o If you want to monitor and improve the online presence of your business, then big data tools can help in all
this.
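As a toy illustration of sentiment analysis (real big data tools use trained language models rather than word lists), the sketch below scores hypothetical posts about a brand as positive or negative.

```python
# A toy sentiment analysis sketch: score mentions of a brand by
# counting positive and negative words. Word lists and posts are
# hypothetical; production tools learn these signals from data.
POSITIVE = {"great", "love", "excellent", "fast"}
NEGATIVE = {"bad", "slow", "broken", "hate"}

def sentiment(post: str) -> int:
    words = set(post.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

posts = [
    "love the new app, support was fast",
    "checkout is broken and slow",
]
for p in posts:
    label = "positive" if sentiment(p) > 0 else "negative"
    print(f"{label}: {p}")
```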
Big Data Security :
Big data security is the collective term for all the measures and tools used to guard both the data and analytics
processes from attacks, theft, or other malicious activities that could harm or negatively affect them.
For companies that operate on the cloud, big data security challenges are multi-faceted.
When customers give their personal information to companies, they trust them with personal data which can
be used against them if it falls into the wrong hands.
Big Data Compliance :
Data compliance is the practice of ensuring that sensitive data is organized and managed in such a way as to
enable organizations to meet enterprise business rules along with legal and governmental regulations.
Organizations that don’t implement these regulations can be fined up to tens of millions of dollars and even
receive a 20-year penalty.
Big Data Auditing :
Auditors can use big data to expand the scope of their projects and draw comparisons over larger populations
of data.
Big data also helps financial auditors to streamline the reporting process and detect fraud.
These professionals can identify business risks in time and conduct more relevant and accurate audits.
Big Data Privacy and Protection :
Data privacy exists to protect customers, as well as companies and their employees, from security breaches.
When customers give their personal information to companies, they trust them with personal data which can
be used against them if it falls into the wrong hands.
Data protection is therefore essential, and one of the main reasons why companies comply with data privacy
regulations is to avoid fines: organizations that fail to implement these regulations can be fined up to tens of
millions of dollars and even receive a 20-year penalty.
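One common protection measure, sketched below under the assumption that a secret salt is managed outside the code, is to pseudonymize personal identifiers with a one-way hash before the data is stored or analyzed, so raw personal details never reach the analytics layer.

```python
import hashlib

# Pseudonymize personal identifiers before storage or analysis.
# The salt below is a placeholder; in practice it would be a secret
# kept in a key vault, not in source code.
SALT = b"replace-with-a-secret-salt"

def pseudonymize(value: str) -> str:
    return hashlib.sha256(SALT + value.encode("utf-8")).hexdigest()[:16]

record = {"email": "customer@example.com", "purchase": "laptop"}
safe_record = {"user": pseudonymize(record["email"]),
               "purchase": record["purchase"]}
print(safe_record)  # the raw email never leaves the ingestion step
```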
Big Data Analytics :
Big data analytics is a complex process of examining big data to uncover information such as hidden
patterns, correlations, market trends and customer preferences.
This can help organizations make informed business decisions.
Data Analytics technologies and techniques give organizations a way to analyze data sets and gather new
information.
Big Data Analytics enables enterprises to analyze their data in full context quickly and some also offer real-
time analysis.
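On a tiny, hypothetical data set, the pandas sketch below illustrates the kind of pattern such analysis uncovers, here the correlation between discounts and units sold; real analytics runs the same idea over vastly larger data.

```python
import pandas as pd

# Hypothetical order data; real analytics would run over far larger
# data sets, but the goal is the same: surface patterns and trends.
orders = pd.DataFrame({
    "discount_pct": [0, 5, 10, 15, 20, 25],
    "units_sold":   [40, 44, 55, 63, 78, 90],
})

# A hidden pattern: how strongly do discounts track sales volume?
corr = orders["discount_pct"].corr(orders["units_sold"])
print(f"discount vs. units sold correlation: {corr:.2f}")
```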
Importance of Big Data Analytics :
Organizations use big data analytics systems and software to make data-driven decisions that can improve
business-related outcomes.
The benefits include more effective marketing, new revenue opportunities, customer personalization and
improved operational efficiency.
With an effective strategy, these benefits can provide competitive advantages over
rivals.
Big Data Analytics tools also help businesses save time and money and aid in gaining insights to inform data-
driven decisions.
Big Data Analytics enables enterprises to narrow their Big Data to the most relevant information and analyze it
to inform critical business decisions.
The appropriate method of analysis depends on factors such as:
Algorithm principle
The scale
Type of the dataset
Intelligent Data Analysis (IDA) is one of the major areas of research in artificial intelligence and information
science.
Intelligent data analysis discloses hidden facts that were not previously known and provides potentially
important information or facts from large quantities of data.
It also helps in decision-making.
Based on machine learning, artificial intelligence, pattern recognition, and statistics and visualization
technology, IDA helps to obtain useful information, necessary data and interesting models from the large
amounts of data available online in order to make the right choices.
IDA includes three stages:
(1) Preparation of data
(2) Data mining
(3) Data validation and Explanation
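A minimal Python sketch of the three stages on hypothetical sensor readings: preparation cleans the raw input, mining extracts a candidate pattern, and validation checks the finding before it is used for a decision.

```python
# A minimal sketch of the three IDA stages on hypothetical readings.

def prepare(raw):
    # Stage 1: clean the data, e.g., drop missing values.
    return [x for x in raw if x is not None]

def mine(data):
    # Stage 2: extract a candidate pattern, here just the mean.
    return sum(data) / len(data)

def validate(data, model):
    # Stage 3: check the finding before acting on it, e.g., confirm
    # most observations fall near the mined value.
    near = sum(1 for x in data if abs(x - model) < 10)
    return near / len(data) >= 0.5

raw_readings = [52, None, 48, 55, 47, None, 51]
clean = prepare(raw_readings)
pattern = mine(clean)
print(f"mined mean: {pattern:.1f}, validated: {validate(clean, pattern)}")
```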
Analysis vs Reporting :
Reporting :
Once data is collected, it will be organized using tools such as graphs and tables.
The process of organizing this data is called reporting.
Reporting translates raw data into information.
Reporting helps companies to monitor their online business and be alerted when data falls outside of
expected ranges.
Good reporting should raise questions about the business from its end users.
Analysis :
Analytics is the process of taking the organized data and analyzing it.
This helps users to gain valuable insights on how businesses can improve their performance.
Analysis transforms data and information into insights.
The goal of the analysis is to answer questions by interpreting the data at a deeper level and
providing actionable recommendations.
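The small pandas sketch below (hypothetical web-shop data) contrasts the two: the report organizes raw numbers into information, while the analysis interprets them and points to an action.

```python
import pandas as pd

# Hypothetical weekly web-shop data.
weeks = pd.DataFrame({
    "week":   [1, 2, 3, 4],
    "visits": [1000, 1050, 980, 1100],
    "sales":  [50, 52, 30, 55],
})

# Reporting: organize raw data into information, e.g., a summary
# table that can alert the business to out-of-range values.
report = weeks.assign(conversion=weeks["sales"] / weeks["visits"])
print(report)

# Analysis: interpret the information to produce an insight, e.g.,
# flag the week whose conversion rate fell well below the norm and
# recommend investigating what changed that week.
baseline = report["conversion"].median()
outliers = report[report["conversion"] < 0.8 * baseline]
print("investigate week(s):", list(outliers["week"]))
```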
Conclusion :
These days, organizations are realising the value they get out of big data analytics and hence they
are deploying big data tools and processes to bring more efficiency to their work environment.
Many big data tools and processes are being utilised by companies these days in the processes of
discovering insights and supporting decision making.
Modern Data Analytic Tools :
Data Analytics tools are types of application software that retrieve data from one or more systems
and combine it in a repository, such as a data warehouse, to be reviewed and analysed.
Most organizations use more than one analytics tool, including spreadsheets with statistical functions,
statistical software packages, data mining tools, and predictive modelling tools.
Together, these Data Analytics Tools give the organization a complete overview of the company to
provide key insights and understanding of the market/business so smarter decisions may be made.
Data analytics tools not only report the results of the data but also explain why the results occurred
to help identify weaknesses, fix potential problem areas, alert decision-makers to unforeseen events
and even forecast future results based on decisions the company might make.
Below is a list of some data analytics tools :
R Programming (Leading Analytics Tool in the industry)
Python
Excel
SAS
Apache Spark
Splunk
RapidMiner
Tableau Public
KNIME