HCS 410 DECISION SUPPORT SYSTEMS

CHAPTER 6: BIG DATA ANALYTICS

6.1. INTRODUCTION
Today, many organisations are collecting, storing, and analysing massive
amounts of data. This data is commonly referred to as “big data” because of its
volume, the velocity with which it arrives, and the variety of forms it takes. Big
data is creating a new generation of decision support data management.
Businesses are recognising the potential value of this data and are putting the
technologies, people, and processes in place to capitalise on the opportunities. A
key to deriving value from big data is the use of analytics. Collecting and storing
big data creates little value; it is only data infrastructure at this point. It must be
analysed and the results used by decision makers and organisational processes in
order to generate value.

6.2. Big data


Big Data is a term used to denote collections of datasets so large and complex
that they are difficult to process using legacy data processing applications. The
term also describes data that is high volume, high velocity, and/or high variety;
that requires new technologies and techniques to capture, store, and analyse; and
that is used to enhance decision making, provide insight and discovery, and
support and optimise processes. In short, Big Data refers to volumes of data that
cannot be stored or processed by traditional data storage or processing units. It is
generated at a very large scale, and many multinational organisations process and
analyse it to uncover insights and improve their business.

6.3. Characteristics of big data


The characteristics of Big Data are commonly described by the following Vs:

Volume
Volume refers to the magnitude of the data being generated and collected. It is
growing rapidly, from terabytes into petabytes (one petabyte is 1,024 terabytes).
As storage capacities increase, what cannot be captured and stored today may
become possible in the future. What counts as Big Data on the basis of volume is
relative to the type of data generated and to time. In addition, the type of data,
often referred to as Variety, helps define Big Data: two datasets of the same
volume, for instance text and video, may require different data management
technologies.

Variety
Big Data is generated in multiple varieties. Compared to traditional data such as
phone numbers and addresses, much of today's data arrives as photos, videos,
audio, and other rich formats, and roughly 80% of it is completely unstructured.

Veracity
Veracity refers to the degree of reliability the data has to offer. Because a major
part of the data is unstructured and much of it is irrelevant, organisations need
ways to filter or transform it, since trustworthy data is crucial to business
decisions.

Value
Value is the issue that ultimately matters: what counts is not simply the amount
of data that is stored or processed, but the amount of valuable, reliable, and
trustworthy data that can be stored, processed, and analysed to yield insights.

Velocity
Last but not least, Velocity refers to the speed at which data is generated and
must be processed; there is little point in heavy investment if decision makers
end up waiting for the data. A major aspect of Big Data is therefore providing
data on demand and at a faster pace.

6.4. Big data analytics


The big data revolution has given rise to several types of big data analysis, and
the industry now offers enterprise-wide analytics solutions for business success.
By itself, stored data does not generate business value; this is true of traditional
databases, data warehouses, and newer technologies such as Hadoop for storing
big data. Once the data is appropriately stored, however, it can be analysed,
which can create tremendous value. A variety of analysis technologies,
approaches, and products have emerged that are especially applicable to big
data, including descriptive, predictive, and prescriptive analytics, and in-memory
analytics.

6.4.1. Descriptive Analysis: Insight into the past


Descriptive analysis does exactly what the name implies: it summarises or
describes raw data and turns it into something interpretable by humans. It
analyses past events, where a past event means any point in time at which an
event has occurred, whether one minute ago or one month ago. Descriptive
analytics is useful because it allows organisations to learn from past behaviour
and helps them understand how that behaviour might influence future outcomes.

With the help of descriptive analysis, we analyse and describe the features of a
dataset; it deals with the summarisation of information. Descriptive analysis,
when coupled with visual analysis, provides a comprehensive picture of the data.
In descriptive analysis we work with past data to draw conclusions, often
presenting the results in dashboards. In business, descriptive analysis is used to
compute Key Performance Indicators (KPIs) that evaluate the performance of
the organisation.
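
A minimal sketch of descriptive analysis in Python using pandas, assuming a
small hypothetical sales table (the column names and figures are invented for
illustration):

import pandas as pd

# Hypothetical daily sales records; in practice these would come from a database.
sales = pd.DataFrame({
    "region":  ["North", "South", "North", "South", "North"],
    "revenue": [1200.0, 950.0, 1430.0, 870.0, 1100.0],
})

# Summarise raw data into human-readable statistics (mean, spread, extremes).
print(sales["revenue"].describe())

# A simple KPI: total and average revenue per region, computed from past data.
print(sales.groupby("region")["revenue"].agg(["sum", "mean"]))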

6.4.2. Predictive Analysis: Understanding the future


Predictive analytics suggests what will occur in the future. It uses the latest
technologies, such as machine learning and artificial intelligence, to understand
the impact of future decisions and uses those scenarios to determine likely
outcomes. With predictive analytics it becomes possible to grasp future
opportunities or mitigate future risks, because predictions are continuously
updated as new data comes in. Predictive analytics essentially offers
organisations a crystal ball. It will become truly powerful once it develops to a
stage where decision makers can predict the future, and make prescriptions to
improve that predicted future, without the need for Big Data scientists.

With the help of predictive analysis, we estimate future outcomes: based on
analysis of historical data, we are able to forecast what is likely to happen.
Predictive analysis builds on descriptive analysis to generate predictions about
the future, and advances in technology and machine learning have made such
predictive insights increasingly attainable. Predictive analytics is nevertheless a
complex field: accurate predictions require large amounts of data and the skilled
implementation and tuning of predictive models, which in turn requires a
workforce well versed in machine learning.
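
As a minimal illustration, the sketch below fits a linear trend to hypothetical
historical monthly sales and forecasts the next month; the data and the choice of
model are assumptions for illustration, not a method prescribed by this chapter:

import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical historical data: month index versus units sold.
months = np.arange(1, 13).reshape(-1, 1)          # months 1..12
units = np.array([110, 115, 123, 130, 128, 140,
                  145, 150, 158, 162, 170, 175])  # past observations

# Fit a simple predictive model on the historical data.
model = LinearRegression().fit(months, units)

# Forecast the outcome for month 13.
print(model.predict(np.array([[13]])))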

BENEFITS OF PREDICTIVE ANALYSIS

Improve efficiency in production


The benefits of predictive analytics are particularly pronounced in the production
and manufacturing industries. Using predictive analytics, companies can
effectively forecast inventory needs and required production rates, while also
using past data to estimate potential production failures. They can then use this
to prevent the same errors from recurring.

Gain advantage over competitors


Why predictive analytics? Because it can give you insight into valuable
information that already exists. It just needs digging up. Tapping into the
customer data you have available can present you with insightful information as
to why customers chose you over your competitors, highlighting unique selling
points that you can then further promote to enhance leads.

Reduce risk
Depending on the industry, predictive analytics can contribute considerably to
risk reduction. Sectors such as finance and insurance use predictive analytics to
construct a valid depiction of the person or business they are screening, based on
all the data available to them. This yields a more reliable interpretation of that
person, business, or incident, which can then be used to make sensible, effective
decisions.

Detect fraud
One of the most beneficial uses of predictive analysis is fraud detection.
Predictive models are well suited to detecting and preventing fraud because they
recognise patterns in behaviour. By tracking changes in behaviour on a site or
network, they can spot anomalies that may indicate a threat or fraud, which can
then be flagged and prevented.
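
A minimal sketch of this idea using an unsupervised anomaly detector from
scikit-learn; the transaction features and contamination setting are hypothetical:

import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical behavioural features per transaction: [amount, hour of day].
transactions = np.array([
    [25.0, 14], [30.0, 15], [22.0, 13], [28.0, 16],
    [26.0, 14], [2500.0, 3],   # the last row deviates sharply from the pattern
])

# Train an anomaly detector on the observed behaviour patterns.
detector = IsolationForest(contamination=0.2, random_state=0).fit(transactions)

# -1 flags an anomaly that may indicate fraud; 1 flags normal behaviour.
print(detector.predict(transactions))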

Better marketing campaigns


Predictive analysis can look at consumer data for specific campaigns and suggest
what is working, what is not, and what can be done to cross-sell, up-sell and
increase revenue.

NB: Research more practical examples which explain how industry has benefited
from predictive analytics.

6.4.3. Prescriptive analysis: Advise on possible outcomes


Prescriptive analytics is a relatively new field that enables users to "prescribe"
different possible actions to implement and guides them towards a solution.
Prescriptive analysis is all about providing advice: it attempts to quantify the
effect of future decisions in order to advise on possible outcomes before those
decisions are actually made. Prescriptive analytics not only predicts what will
happen, but also explains why it will happen, and thereby provides
recommendations for actions that take advantage of these predictions.

These analytics go beyond descriptive and predictive analytics by recommending
one or more possible courses of action. Essentially, they predict multiple futures
and allow companies to assess a number of possible outcomes based upon their
actions. Prescriptive analytics uses a combination of techniques and tools such as
business rules, algorithms, machine learning, and computational modelling
procedures. These techniques are applied to input from many different data sets,
including historical and transactional data, real-time data feeds, and big data.
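
As a minimal sketch of the optimisation side of prescriptive analytics, the
example below uses linear programming to "prescribe" production quantities
under simple business rules; the profit figures and constraints are hypothetical:

from scipy.optimize import linprog

# Hypothetical predicted profit per unit for two products, A and B.
profit = [40, 30]

# Business rules: total output cannot exceed a machine capacity of 100 units,
# and product A is limited to 60 units by a supplier contract.
result = linprog(
    c=[-p for p in profit],       # linprog minimises, so negate the profits
    A_ub=[[1, 1], [1, 0]],
    b_ub=[100, 60],
    bounds=[(0, None), (0, None)],
)

# The recommended ("prescribed") production quantities for A and B.
print(result.x)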

6.4.4. In-Memory Analytics


An impediment to fast query response times is the time required to find data on
disk and read it into memory. This time can be reduced by a factor of 10 to 1,000
with in-memory analytics, where the data is stored in random access memory
(RAM) rather than on a physical disk, so there is no need to page data in and out
of disk storage. In-memory technology comes in two forms: on the platform or in
the BI tool. When implemented on the platform, the server stores the data in
memory and the BI tools access it there; up to a terabyte of data can be held on
computers with 64-bit operating systems.

In-memory BI tools also move data between disk (e.g., the data warehouse) and
local desktop memory so that the most frequently used data (so-called "hot
data") is available in memory. Some applications are especially well suited to
in-memory analytics. For example, with OLAP (often incorporated in reports and
dashboards/scorecards), users want to "slice and dice" data to look at the
business from different perspectives, such as comparing this year's and last
year's sales in different locations. When all of the data is in memory, this
analysis can be done very quickly, providing analysis at the speed of thought.
Not all applications require or are well suited to in-memory analytics, however:
applications such as a market basket analysis that runs weekly, or those that
require more data than current in-memory technologies can hold, are not good
candidates. And while the cost and reliability of in-memory technology continue
to improve, it is still relatively expensive and prone to failure.
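
A minimal sketch of OLAP-style "slice and dice" over a dataset held entirely in
RAM, using pandas; the locations, years, and figures are hypothetical:

import pandas as pd

# Hypothetical sales data held entirely in memory as a DataFrame.
sales = pd.DataFrame({
    "location": ["Harare", "Harare", "Bulawayo", "Bulawayo"],
    "year":     ["last", "this", "last", "this"],
    "sales":    [500.0, 620.0, 430.0, 480.0],
})

# Compare this year's and last year's sales per location. Because the data is
# already in memory, no disk reads are needed and the query returns quickly.
print(sales.pivot_table(index="location", columns="year", values="sales"))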

6.5. CHALLENGES IN BIG DATA ANALYTICS


Some of the most common big data challenges include the following:

Dealing with data growth


The most obvious challenge associated with big data is simply storing and
analysing all that information. In its Digital Universe report, IDC estimated that
the amount of information stored in the world's IT systems is doubling about
every two years, and that by the end of 2020 the total would be enough to fill a
stack of tablets reaching from the earth to the moon 6.6 times, with enterprises
having responsibility or liability for about 85 percent of that information.

Much of that data is unstructured, meaning that it does not reside in a database.
Documents, photos, audio, videos and other unstructured data can be difficult to
search and analyse. In order to deal with data growth, organisations are turning to
a number of different technologies. When it comes to storage, converged and
hyper-converged infrastructure and software-defined storage can make it easier
for companies to scale their hardware. And technologies like compression,
deduplication and tiering can reduce the amount of space and the costs associated
with big data storage.
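
As a minimal sketch of one of these techniques, the example below uses Python's
standard gzip module to compress a repetitive (hypothetical) log sample and
show the reduction in storage space:

import gzip

# Hypothetical, highly repetitive log data; real big data stores are far larger.
raw = ("2020-03-21 12:00:01 INFO user login ok\n" * 10000).encode("utf-8")

# Compression reduces the space needed to store the same information.
compressed = gzip.compress(raw)
print(len(raw), "bytes raw ->", len(compressed), "bytes compressed")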

Generating insights in a timely manner
Of course, organisations do not just want to store their big data — they want to
use that big data to achieve business goals. According to the NewVantage
Partners survey, the most common goals associated with big data projects
included the following:

 Decreasing expenses through operational cost efficiencies


 Establishing a data-driven culture
 Creating new avenues for innovation and disruption
 Accelerating the speed with which new capabilities and services are
deployed
 Launching new product and service offerings

All of those goals can help organisations become more competitive — but only if
they can extract insights from their big data and then act on those insights
quickly.

Recruiting and retaining big data talent


In order to develop, manage, and run the applications that generate insights,
organisations need professionals with big data skills. That has driven up demand
for big data experts, and big data salaries have increased dramatically as a result.
To deal with talent shortages, organisations have a few options. First, many are
increasing their budgets and their recruitment and retention efforts. Second, they
are offering more training opportunities to current staff members in an attempt to
develop the talent they need from within.
Third, many organisations are looking to technology. They are buying analytics
solutions with self-service and/or machine learning capabilities. Designed to be
used by professionals without a data science degree, these tools may help
organisations achieve their big data goals even if they do not have a lot of big
data experts on staff.

Integrating disparate data sources


The variety associated with big data leads to challenges in data integration. Big
data comes from many different places: enterprise applications, social media
streams, email systems, employee-created documents, and so on. Combining all
that data and reconciling it so that it can be used to create reports can be
incredibly difficult. Vendors offer a variety of Extract, Transform and Load
(ETL) and data integration tools designed to make the process easier, but many
enterprises say that they have not yet solved the data integration problem.
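
A minimal ETL sketch in pandas, combining two hypothetical sources whose key
columns disagree into one report-ready table (all names and figures invented):

import pandas as pd

# Extract: two hypothetical sources with different shapes and key names.
crm = pd.DataFrame({"customer_id": [1, 2], "name": ["Ada", "Ben"]})
orders = pd.DataFrame({"cust": [1, 1, 2], "amount": [100.0, 50.0, 75.0]})

# Transform: reconcile the differing key names, then aggregate order amounts.
orders = orders.rename(columns={"cust": "customer_id"})
totals = orders.groupby("customer_id", as_index=False)["amount"].sum()

# Load: a combined, report-ready table.
print(crm.merge(totals, on="customer_id", how="left"))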

In response, many enterprises are turning to new technology solutions. In the IDG
report, 89 percent of those surveyed said that their companies planned to invest in
new big data tools in the next 12 to 18 months. When asked which kind of tools
they were planning to purchase, integration technology was second on the list,
behind data analytics software.

Validating data
Closely related to the idea of data integration is the idea of data validation. Often
organisations are getting similar pieces of data from different systems, and the
data in those different systems does not always agree. For example, the
e-commerce system may show daily sales at a certain level while the enterprise
resource planning (ERP) system has a slightly different number. Or a hospital's
electronic health record (EHR) system may have one address for a patient, while
a partner pharmacy has a different address on record.
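
A minimal validation sketch that mirrors the sales example above: join the daily
figures from two hypothetical systems on their shared key and flag
disagreements (all figures invented):

import pandas as pd

# Hypothetical daily sales as reported by two separate systems.
ecommerce = pd.DataFrame({"date": ["2020-03-20", "2020-03-21"],
                          "sales": [1000.0, 1200.0]})
erp = pd.DataFrame({"date": ["2020-03-20", "2020-03-21"],
                    "sales": [1000.0, 1185.0]})

# Validation: join on the shared key and flag records that do not agree.
merged = ecommerce.merge(erp, on="date", suffixes=("_ecom", "_erp"))
print(merged[merged["sales_ecom"] != merged["sales_erp"]])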

The process of getting those records to agree, as well as making sure the records
are accurate, usable, and secure, is called data governance. Solving data
governance challenges is very complex and usually requires a combination of
policy changes and technology. Organisations often set up a group of people to
oversee data governance and write a set of policies and procedures. They may
also invest in data management solutions designed to simplify data governance
and help ensure the accuracy of big data stores, and of the insights derived from
them.

Securing big data


Security is also a big concern for organisations with big data stores. After all,
some big data stores can be attractive targets for hackers or advanced persistent
threats (APTs). However, most organisations seem to believe that their existing
data security methods are sufficient for their big data needs as well.

Organisational resistance
It is not only the technological aspects of big data that can be challenging;
people can be an issue too. Organisations are typically confronted with the
following issues in their bid to create a data-driven culture:
 Insufficient organisational alignment
 Lack of middle management adoption and understanding
 Business resistance or lack of understanding

In order for organisations to capitalise on the opportunities offered by big data,
they are going to have to do some things differently, and that sort of change can
be tremendously difficult for large organisations.
