Unit 1 FOBA

This document discusses the fundamentals of business analytics, including the need for data-driven decision making, the analytics cycle, and popular tools. It describes concepts such as business analytics, business intelligence, and data science, and covers the types of analytics and the roles of data professionals.

Uploaded by

Dr. Roohi DSU

FUNDAMENTALS OF BUSINESS ANALYTICS
Module I: Monetizing data to drive
business decisions

Need for data-driven decision making - Solving the business problem using Analytics - Overview of the Analytical cycle and Hierarchy of information users - The Complete Business Analytics Professional - Understand Business Analyst roles and responsibilities - Identify the popular Business Analytics tools.
A few concepts to understand
 Business Analytics
 Business Intelligence
 Data Analytics
 Data Science
 Big Data (the 4 V's discussion)
 Why the sudden demand for data analytics?
 Who is a Data Analyst / Data Scientist?
Analysis vs Analytics
 Analysis: a process done with information about what has already happened (the past).
 Analytics: the process of understanding what will happen in the future (forecasting).
Types of Analytics
 Descriptive: involves gathering data and describing its nature.
 Reports and summaries are based on it. It deals with past data and is reactive in nature.
 E.g. a manager collects information about sales of Reynolds pens and reports, "this is the sales data for the pen."
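Descriptive analytics of this kind can be sketched in a few lines of Python: given past sales figures, produce the summary a manager would report. The monthly figures below are hypothetical, used only to illustrate the idea.

```python
# Descriptive analytics: summarize past sales of a product (hypothetical data).
from statistics import mean, median, stdev

# Hypothetical monthly unit sales of Reynolds pens over one year
monthly_sales = [120, 135, 128, 150, 162, 158, 170, 165, 155, 148, 172, 180]

summary = {
    "total": sum(monthly_sales),
    "average": round(mean(monthly_sales), 1),
    "median": median(monthly_sales),
    "std_dev": round(stdev(monthly_sales), 1),
    "best_month": monthly_sales.index(max(monthly_sales)) + 1,  # 1-based month
}

for name, value in summary.items():
    print(f"{name}: {value}")
```

Note that nothing here predicts anything: the output only describes what already happened, which is exactly what makes descriptive analytics reactive.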
 Predictive: uses data from the past to tell or inform what will happen in the future.
 It is like a business forecast of future behaviour and results.
 It is proactive in nature.
 E.g. a manager collects information about sales of Reynolds pens and, based on those sales, projects how sales will develop over the next two years.
 It also deals with association or relationship phenomena: an increase in advertisement increases sales; an increase in exercise increases fitness.
 Prescriptive: helps in understanding the right course of action.
 Like a prescription from a doctor, it tries to offer a solution by considering what happens if a situation occurs or does not.
 Generally, a cause-and-effect relationship is established.
 E.g. identifying the factors affecting job satisfaction and recommending which ones to act on.
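Prescriptive analytics turns a prediction into a recommended course of action. The toy rule below illustrates the idea with hypothetical thresholds; real prescriptive models weigh costs, constraints, and the cause-and-effect relationships noted above.

```python
# Prescriptive analytics sketch: recommend an action from a demand forecast.

def recommend_action(predicted_demand, current_stock):
    """Suggest a course of action given forecast demand and stock on hand."""
    if predicted_demand > current_stock * 1.2:
        return "increase production"
    if predicted_demand < current_stock * 0.8:
        return "run a promotion to clear stock"
    return "maintain current plan"

print(recommend_action(predicted_demand=174, current_stock=120))
print(recommend_action(predicted_demand=90, current_stock=120))
```

Calling it with a forecast of 174 units against 120 in stock yields "increase production"; a forecast of 90 yields the promotion advice.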


Sources of Data
INDIAN ORGANIZATIONS:
 SEBI
 SIAM
 ISI, KOLKATA
 NITI AAYOG
 NASS
 NASSCOM
 MSME
 MINISTRY OF CORPORATE AFFAIRS
 NSSO
GLOBAL LEVEL - PRIVATE PLAYERS
 GOLDMAN SACHS
 ERNST & YOUNG
 GARTNER PUBLICATIONS
 MCKINSEY PUBLICATIONS
 KPMG
 PEW RESEARCH
 HBR RESEARCH
GLOBAL ORGANIZATIONS
 ADB (ASIAN DEVELOPMENT BANK), especially data related to the APAC region
 WORLD BANK
 IMF
THINGS TO REMEMBER WHEN LOOKING AT A DATASET
1. Never assume the dataset is clean
2. Start with a specific question and hypothesis
3. Don't let the hypothesis in mind bias you (stick to the objective)
4. Always investigate the whys
5. Documentation is key
* Source: Melissa Barton, HBS Online
Concept of DDDM
 Data Driven Decision Making:
Refers to the process of using data to
inform our decision-making process
and validate a course of action before
committing to it.
Need for data-driven decision making, or benefits of data in decision making
 Helps to identify business opportunities
 Helps to improve internal processes with data
 Enables better targeting of customers with business analytics
 Enables more confident decisions
 Makes the business more proactive
 Helps in realizing cost savings
Solving the business problem using Analytics
 Data-driven decision making (DDDM) is
defined as using facts, metrics, and data to
guide strategic business decisions that align
with your goals, objectives, and initiatives.
 When organizations realize the full value of
their data, that means everyone—whether it’s a
business analyst, sales manager, or human
resource specialist—is empowered to make
better decisions with data, every day.
A study conducted by NewVantage Partners
recently reported that 98.6 percent of executives indicate
that their organization aspires to a data-driven culture,
while only 32.4 percent report having success.
 A 2018 IDC study also noted that organizations have
invested trillions of dollars to modernize their business,
but 70 percent of these initiatives fail because they
prioritized technology investments without building a
data culture to support it.
 So organizations need to focus on three core capabilities:
 Data proficiency
 Analytics agility
 Community
Real Time Examples
 Lufthansa Group is a global aviation group that, at one point, had no uniformity in analytics reporting across its 550-plus subsidiaries. Using one analytics platform, they increased efficiency by 30 percent, gained greater flexibility in decision making, and increased departmental autonomy.
 “We’re in a stronger position to create and design our
analyses independently and a lot of people now
understand the central importance of data for the success
of Lufthansa,” shared Heiko Merten, Head of BI
Applications in Sales.
According to a recent Survey by McKinsey, an increasing share of organizations
report using analytics to generate growth. Here’s a look at how three companies
are aligning with that trend and applying data insights to their decision-making
processes.

Improving Productivity and Collaboration at Microsoft: At technology giant Microsoft, collaboration is key to a productive, innovative work environment. Following a 2015 move of its engineering group's offices, the company sought to understand how fostering face-to-face interactions among staff could boost employee performance and save money.
Microsoft's Workplace Analytics team hypothesized that moving the 1,200-person group from five buildings to four could improve collaboration by cutting down on the number of employees per building and by reducing the distance that staff needed to travel for meetings. This assumption was partially based on an earlier study by Microsoft, which found that people are more likely to collaborate when they are located closer to one another.

In an article for the Harvard Business Review, the company's analytics team shared the outcomes they observed as a result of the relocation. By examining metadata attached to employee calendars, the team found that the move resulted in a 46 percent decrease in meeting travel time. This translated into a combined 100 hours saved per week across all relocated staff members, and an estimated savings of $520,000 per year in employee time.
PHASES OF DATA ANALYTICS LIFECYCLE:
 Phase 1—Discovery: In Phase 1, the team learns the business domain, including relevant
history such as whether the organization or business unit has attempted similar projects in
the past from which they can learn. The team assesses the resources available to support
the project in terms of people, technology, time, and data. Important activities in this
phase include framing the business problem as an analytics challenge that can be
addressed in subsequent phases and formulating initial hypotheses (IHs) to test and begin
learning the data.

 Phase 2—Data preparation: Phase 2 requires the presence of an analytic sandbox, in which the team can work with data and perform analytics for the duration of the project. The team needs to execute extract, load, and transform (ELT) or extract, transform, and load (ETL) to get data into the sandbox; the combination is sometimes abbreviated as ETLT. Data should be transformed in the ETLT process so the team can work with it and analyze it. In this phase, the team also needs to familiarize itself with the data thoroughly and take steps to condition it.
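The ETL flow into an analytic sandbox can be sketched in miniature: extract raw records, transform them (normalize and drop bad rows), and load them into a database the team can query. The records and cleaning rules below are hypothetical; an in-memory SQLite database stands in for the sandbox.

```python
# Minimal ETL sketch for the data-preparation phase.
import sqlite3

# Extract: raw records as they might arrive from a source system (hypothetical)
raw = [("  Pen ", "120"), ("Notebook", "45"), ("pen", "x"), ("Marker", "30")]

# Transform: normalize product names, drop rows whose quantity is not numeric
clean = [
    (name.strip().lower(), int(qty))
    for name, qty in raw
    if qty.strip().isdigit()
]

# Load: write the conditioned data into the sandbox the team will analyze
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE sales (product TEXT, qty INTEGER)")
db.executemany("INSERT INTO sales VALUES (?, ?)", clean)

total = db.execute("SELECT SUM(qty) FROM sales").fetchone()[0]
print(f"rows loaded: {len(clean)}, total qty: {total}")
```

The row with quantity "x" is dropped during the transform step, which is the kind of conditioning work this phase is about.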

 Phase 3—Model planning: Phase 3 is model planning, where the team determines the
methods, techniques, and workflow it intends to follow for the subsequent model
building phase. The team explores the data to learn about the relationships between
variables and subsequently selects key variables and the most suitable models.
 Phase 4—Model building: In Phase 4, the team develops datasets for
testing, training, and production purposes. In addition, in this phase the
team builds and executes models based on the work done in the model
planning phase. The team also considers whether its existing tools will
suffice for running the models, or if it will need a more robust
environment for executing models and workflows (for example, fast
hardware and parallel processing, if applicable).
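Developing separate datasets for training and testing, as Phase 4 requires, is commonly done with a shuffled split. A minimal sketch with stand-in data and an illustrative 80/20 ratio:

```python
# Sketch of Phase 4 dataset preparation: shuffle, then split train/test.
import random

records = list(range(100))        # stand-in for 100 observations
random.seed(42)                    # fixed seed so the split is reproducible
random.shuffle(records)

split = int(len(records) * 0.8)    # 80/20 train/test split
train, test = records[:split], records[split:]

print(f"train: {len(train)} rows, test: {len(test)} rows")
assert not set(train) & set(test)  # no record leaks into both sets
```

Keeping the two sets disjoint is the point: the model is built on the training set and judged only on data it has never seen.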
 Phase 5—Communicate results: In Phase 5, the team, in collaboration
with major stakeholders, determines if the results of the project are a
success or a failure based on the criteria developed in Phase 1. The
team should identify key findings, quantify the business value, and
develop a narrative to summarize and convey findings to stakeholders.
 Phase 6—Operationalize: In Phase 6, the team delivers final reports,
briefings, code, and technical documents. In addition, the team may
run a pilot project to implement the models in a production
environment.
Role of a Business Analyst
 A business analyst’s primary objective is
helping businesses cost-effectively implement
technology solutions by precisely determining
the requirements of a project or a program, and
communicating them clearly to the key
stakeholders.
Pre-Requisites
 BAs typically require knowledge of statistics and of statistical software such as R. Companies prefer a BA who also possesses relevant SQL skills.
 The education and training requirements, though, may vary by employer, specific role, and industry.
 It is possible to enter the field with just a two-year degree and relevant work experience, but most employers require at least a bachelor's degree.
 Business analysts should be able to create solutions to problems for the business as a whole, and accordingly must be able to communicate effectively with a variety of business areas. Communication skills are therefore another major prerequisite.
 BAs should be able to understand the business needs of customers and translate them into application and operational requirements, with the help of solid analytical and product-management skills.
The Job of a Business Analyst:
 Documenting and translating customer business functions and processes.
 Ensuring the system design fits the needs of the customer.
 Participating in functionality testing and user acceptance testing of the new system.
 Helping technically in training and coaching professional and technical staff.
 Developing a training program and conducting formal training sessions covering designated system modules.
 Acting as a team lead on assigned projects and assignments, and providing work direction to developers and other project stakeholders.
Responsibilities of a Business Analyst
 Understanding the requirements of the business
 Analyzing information
 Communicating with a broad range of people
 Documenting the findings
 Evaluating and implementing the finest solution
Identify the Popular Business Analytics Tools
1. R Programming: R is one of the leading analytics tools in the industry, widely used for statistics and data modeling. It can easily manipulate data and present it in different ways.
2. Tableau Public: Tableau Public is free software that connects to any data source, be it a corporate data warehouse, Microsoft Excel, or web-based data, and creates data visualizations, maps, dashboards, etc., with real-time updates presented on the web. These can also be shared through social media or with a client, and the tool allows the underlying file to be downloaded in different formats.
[Sample Tableau dashboard screenshot]
3. Python: Python is an object-oriented scripting language that is easy to read, write, and maintain, and is a free open-source tool. It was developed by Guido van Rossum in the late 1980s and supports both functional and structured programming methods. Python is easy to learn, as it is similar in spirit to JavaScript, Ruby, and PHP. Python also has very good machine learning libraries, viz. scikit-learn, Theano, TensorFlow, and Keras. Another important feature of Python is that it can work with many data platforms, such as a SQL server, a MongoDB database, or JSON data. Python can also handle text data very well.
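As a small illustration of the text handling mentioned above, Python's standard library alone can tally word frequencies in free-text data; the customer-feedback string here is invented for the example.

```python
# Word-frequency count over text data using only the standard library.
from collections import Counter

feedback = (
    "great pen smooth writing great grip "
    "pen leaks sometimes but great value"
)
word_counts = Counter(feedback.split())

print(word_counts.most_common(2))   # the two most frequent words
```

For larger text-analytics tasks, the machine learning libraries listed above (e.g. scikit-learn) build on exactly this kind of token counting.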
4. SAS: SAS is a programming environment and language for data manipulation and a leader in analytics, first developed in 1966 and further developed by the SAS Institute in the 1980s and 1990s. SAS is easily accessible and manageable and can analyze data from any source. SAS introduced a large set of products in 2011 for customer intelligence, along with numerous SAS modules for web, social media, and marketing analytics that are widely used for profiling customers and prospects. It can also predict their behaviors and manage and optimize communications.
[Sample Python and SAS code screenshots]
5. Apache Spark: The University of California, Berkeley's AMP Lab developed Apache Spark in 2009. Apache Spark is a fast, large-scale data processing engine that executes applications in Hadoop clusters 100 times faster in memory and 10 times faster on disk. Spark is built with data science in mind, and its design makes data science work easier. Spark is also popular for building data pipelines and machine learning models. It includes a library, MLlib, that provides a progressive set of machine learning algorithms for repetitive data science techniques such as classification, regression, collaborative filtering, and clustering.
6. Excel: Excel is a basic, popular, and widely used analytical tool in almost all industries. Whether you are an expert in SAS, R, or Tableau, you will still need to use Excel. Excel becomes important when analytics is required on a client's internal data. It handles complex tasks such as summarizing data with pivot-table previews that help filter the data to client requirements. Excel also has advanced business analytics options that aid modelling, with prebuilt features such as automatic relationship detection, creation of DAX measures, and time grouping.
[Sample Excel screenshot]
7. RapidMiner: RapidMiner is a powerful integrated data science platform, developed by the company of the same name, that performs predictive analysis and other advanced analytics such as data mining, text analytics, machine learning, and visual analytics without any programming. RapidMiner can work with many data source types, including Access, Excel, Microsoft SQL, Teradata, Oracle, Sybase, IBM DB2, Ingres, MySQL, IBM SPSS, dBase, etc. The tool is powerful enough to generate analytics based on real-life data transformation settings, i.e. you can control the formats and data sets used for predictive analysis.

8. KNIME: KNIME was developed in January 2004 by a team of software engineers at the University of Konstanz. KNIME is a leading open-source reporting and integrated analytics tool that allows you to analyze and model data through visual programming; it integrates various components for data mining and machine learning via its modular data-pipelining concept.
[Sample QlikView and RapidMiner screenshots]
9. QlikView: QlikView has many unique features, such as patented technology and in-memory data processing, which delivers results to end users very fast and stores the data in the report itself. Data associations in QlikView are maintained automatically, and the data can be compressed to almost 10% of its original size. Data relationships are visualized using colors: one color is assigned to related data and another to non-related data.

10. Splunk: Splunk is a tool that analyzes and searches machine-generated data. Splunk pulls in all text-based log data and provides a simple way to search through it; a user can pull in all kinds of data, perform all sorts of interesting statistical analysis on it, and present it in different formats.
[Sample Splunk screenshot]
