
Managed Service Level Optimization

Major Project (CC4270) Report

Submitted in partial fulfillment of the requirements for the award of

Bachelor of Technology

in

Computer and Communication Engineering

By:

Shivang Sharma
199303196

Under the supervision of:

Dr. Renu Kumawat

Type of project: External

July 2023

Department of Computer and Communication Engineering


School of Computer and Communication Engineering
Manipal University Jaipur
VPO. Dehmi Kalan, Jaipur, Rajasthan, India – 303007
Department of Computer and Communication Engineering
School of Computer and Communication Engineering, Manipal University Jaipur,
Dehmi Kalan, Jaipur, Rajasthan, India- 303007

STUDENT DECLARATION

I hereby declare that this project Managed Service Level Optimization is my own
work and that, to the best of my knowledge and belief, it contains no material previously
published or written by another person nor material which has been accepted for the award
of any other degree or diploma of the University or other Institute, except where due
acknowledgements have been made in the text.

Place: Noida Shivang Sharma


Date: 02/07/23 (199303196)
B. Tech (CCE) 8th Semester

i
Department of Computer and Communication Engineering
School of Computer and Communication Engineering, Manipal University Jaipur,
Dehmi Kalan, Jaipur, Rajasthan, India- 303007

Date:08/07/2023

CERTIFICATE FROM SUPERVISOR

This is to certify that the work entitled “Managed Service Level Optimization”
submitted by Shivang Sharma (199303196) to Manipal University Jaipur for the award of
the degree of Bachelor of Technology in Computer and Communication Engineering is a
bonafide record of the work carried out by him/ her under my supervision and guidance from
1/01/23 to 1/06/23.

Dr. Renu Kumawat


Department of Computer and Communication Engineering
Manipal University Jaipur

ii
Project Completion Certificate
(For external students only - On Company letterhead)


CERTIFICATE

This is to certify that the project entitled, “…………………………..” was carried out
by Student Name (Reg. No.) under my supervision from (Start date) to (End date) at
Company Name (Full address).

Date: Supervisor Name


Place: Designation
Organization Name, City

iii
ACKNOWLEDGEMENT

I would like to express my sincere gratitude and appreciation to all those who have supported
me in the successful completion of my major project report on Managed Service Level
Optimization. This project has been a significant endeavor for me during my final year in
college, and it would not have been possible without the help and guidance of several
individuals and organizations.

First and foremost, I extend my heartfelt thanks to my college faculty and mentors for their
valuable guidance and encouragement throughout the project. Their expertise and insightful
suggestions have played a pivotal role in shaping the project and enhancing its overall quality.
I am truly grateful for their constant support and motivation.

I would like to express my deep appreciation to the management of Ericsson India Global
Services for providing me with the opportunity to work on this project. Their trust in my
abilities and their willingness to share their domain knowledge and resources have been
instrumental in the successful execution of this endeavor. I am grateful for their cooperation
and the freedom they granted me to explore and implement innovative ideas.

I would like to extend my gratitude to my friends and family for their unwavering support and
encouragement. Their belief in my capabilities and their constant motivation have been a
source of strength throughout this challenging journey.

iv
ABSTRACT

The role of a data analyst at Ericsson encompasses the critical task of harnessing the power of
data to drive informed decision-making and support strategic initiatives. This abstract
explores the key responsibilities, skills, and significance of data analysts within Ericsson's
context.

Data analysts at Ericsson are responsible for collecting, organizing, and analyzing data from
various sources to uncover insights and patterns. They employ statistical techniques, data
mining, and machine learning algorithms to extract meaningful information and provide
valuable recommendations to stakeholders.

The abstract highlights the core competencies required for success in this role, including
proficiency in data manipulation, data visualization, and statistical analysis. Data analysts
must possess a strong understanding of data modelling, database management, and
programming languages such as Python or R.

Furthermore, the abstract emphasizes the importance of effective communication skills for
data analysts. They must be able to present complex findings in a clear and concise manner,
enabling non-technical stakeholders to comprehend and utilize the insights derived from the
data.

Data analysts play a pivotal role in supporting Ericsson's strategic goals, whether it involves
optimizing network performance, improving customer experience, or identifying market
trends. By analyzing vast amounts of data, data analysts help uncover opportunities, enhance
operational efficiencies, and contribute to the company's overall growth and competitiveness.

In conclusion, data analysts at Ericsson hold a critical position in driving data-driven
decision-making and enabling the company to leverage the potential of data. With their
expertise in data analysis, statistical modelling, and communication, they play an integral role
in supporting Ericsson's strategic objectives and facilitating continuous improvement in the
telecommunications industry.

v
TABLE OF CONTENTS

Student declaration i
Certificate from Supervisor ii
Certificate from Company (For external students only) iii
Acknowledgement iv
Abstract v
List of figures vi
List of tables vii

Chapter 1 Introduction
1.1 About the organization 1
1.2 Role Description 2
1.3 Importance of DA 4

Chapter 2 Background Overview


2.1 Concept used 6
2.2 Tools and Software used 7

Chapter 3 Project Analysis


3.1 Project requirement and Customers demand 20

Chapter 4 Methodology
4.1 Methodology for Data Analysis and Dashboard Creation 21

Chapter 5 Implementation and work done


5.1 Phase 1 24
5.2 Phase 2 32
5.3 Phase 3 35

Chapter 6 Conclusion

References

vi
LIST OF FIGURES

Figure No.  Figure Title  Page No.
1 Process involved in Data Analysis 5
2 Visual Basic for Application 10
3 Python 12
4 Tableau 14
5 SQL 16
6 Power BI 19
7 Raw dump of Sites 26
8 KPI data of downtime 26
9 FN8 Site 26
10 KPI values 31
11 Raw data from FFtool 33
12 Data after applying automation 33
13 VBA code 34
14 Macros Interface 34
15 Raw data for making dashboard 38
16 Dashboard on Power BI 38

vii
LIST OF TABLES

viii
INTRODUCTION

1.1 About the organization

Ericsson is a global leader in communications technology and services, providing innovative
solutions to transform industries and society at large. With a rich history spanning over 140
years, Ericsson has become synonymous with cutting-edge technology, expertise, and a
commitment to driving digital transformation worldwide.

Founded in 1876 by Lars Magnus Ericsson in Stockholm, Sweden, the company has evolved
from a small telegraph repair workshop to a multinational corporation at the forefront of the
telecommunications industry. Today, Ericsson operates in over 180 countries, collaborating
with customers, partners, and governments to build a more connected and sustainable future.

Ericsson's comprehensive portfolio encompasses a wide range of products, services, and
solutions in areas such as 5G, Internet of Things (IoT), cloud computing, network
infrastructure, software platforms, and managed services. The company's offerings enable
seamless connectivity, enhance user experiences, and empower businesses across various
sectors, including telecommunications, industry automation, transportation, healthcare, and
more.

As a technology leader, Ericsson has played a pivotal role in driving the evolution of
telecommunications networks. Its advancements in mobile communications, including the
development of GSM and LTE standards, have shaped the way people communicate and
transformed the global telecommunications landscape.

Beyond its technological expertise, Ericsson is committed to sustainability and social
responsibility. The company actively works towards reducing its environmental footprint,
promoting diversity and inclusion, and leveraging technology to address social challenges and
bridge the digital divide.

1
1.2 Role Description

In Ericsson, Data Analysis (DA) plays a vital role in driving innovation, enabling data-driven
decision-making, and supporting the company's mission to connect the world. Here are some
key roles of Data Analysis in Ericsson:

1. Business Intelligence and Insights:


 Data-driven Decision Making: Data analysis helps Ericsson make informed
business decisions by analysing and interpreting data from various sources.
 Trend Analysis: Data analysis allows Ericsson to identify patterns, trends, and
insights from vast amounts of data, enabling proactive actions and strategic
planning.

 Performance Monitoring: Data analysis helps monitor key performance
indicators (KPIs) and assess the effectiveness of business strategies and
operations.
 Market Analysis: Data analysis supports market research and competitive
analysis, enabling Ericsson to understand customer needs, market trends, and
emerging opportunities.

2. Network Optimization and Planning:


 Network Performance Analysis: Data analysis helps optimize the performance
of Ericsson's telecommunications networks by analysing network data,
identifying bottlenecks, and optimizing network configurations.
 Capacity Planning: Data analysis assists in forecasting network capacity
requirements, anticipating network growth, and optimizing resource allocation
to meet customer demands.
 Quality of Service (QoS) Analysis: Data analysis enables the assessment of
network quality and customer experience, allowing Ericsson to improve
service delivery and customer satisfaction.

3. Product Development and Innovation:


 Data-Driven Product Design: Data analysis provides insights into customer
preferences, usage patterns, and market demands, guiding the development of
new products and services.
 User Experience Enhancement: Data analysis helps understand user behaviour,
interactions, and feedback to improve the user experience of Ericsson's
products and solutions.
 Innovation and Research: Data analysis supports research and development
activities, exploring emerging technologies and trends to drive innovation and
stay at the forefront of the telecommunications industry.

4. Predictive Analytics and Maintenance:


 Predictive Maintenance: Data analysis enables predictive maintenance by
analysing data from network equipment, identifying potential failures, and
scheduling proactive maintenance activities to minimize downtime.
 Service Level Agreement (SLA) Management: Data analysis helps monitor
SLAs, analyse service performance, and identify areas for improvement to
meet customer expectations and contractual commitments.
2
5. Customer Insights and Support:
 Customer Segmentation and Targeting: Data analysis assists in segmenting
customers based on their behaviour, needs, and preferences, enabling Ericsson
to tailor products and services to specific customer segments.
 Customer Support and Issue Resolution: Data analysis supports customer
support by analysing customer data, identifying patterns in issues or
complaints, and providing insights for effective troubleshooting and issue
resolution.

Data Analysis in Ericsson involves leveraging advanced analytics, machine learning, and
artificial intelligence techniques to extract actionable insights, optimize operations, enhance
customer experiences, and drive innovation. It is an essential function that empowers Ericsson
to harness the power of data and transform it into valuable knowledge for the benefit of its
customers and the industry.

3
1.3 Importance of Data Analysis

Data Analysis (DA) holds immense importance in today's data-driven world.


Here are some key reasons why DA is crucial:

1. Data-Driven Decision Making: DA enables organizations to make informed
decisions based on data insights rather than relying solely on intuition or
guesswork. It helps uncover trends, patterns, and correlations in data,
providing a solid foundation for decision-making processes.

2. Business Efficiency and Optimization: DA helps organizations identify
inefficiencies, bottlenecks, and areas for improvement across various
operations. By analysing data, organizations can streamline processes,
optimize resource allocation, reduce costs, and enhance overall efficiency.

3. Enhanced Business Performance: Through DA, organizations can monitor key
performance indicators (KPIs) and track progress towards goals and targets.
This enables proactive management and timely intervention to improve
performance, meet objectives, and drive business growth.

4. Customer Understanding and Personalization: DA allows organizations to gain
deep insights into customer behaviour, preferences, and needs. This knowledge
helps in delivering personalized experiences, targeted marketing campaigns,
and tailored products or services that meet specific customer demands.

5. Identifying Market Trends and Opportunities: DA helps organizations uncover
market trends, emerging opportunities, and potential gaps in the market. By
analysing data, organizations can identify new market segments, explore
untapped areas, and make strategic decisions to gain a competitive edge.

6. Risk Assessment and Mitigation: DA enables organizations to identify and
assess potential risks by analysing historical data, patterns, and anomalies. This
supports proactive risk management, fraud detection, compliance monitoring,
and the development of risk mitigation strategies.

7. Innovation and Product Development: DA fuels innovation by identifying
customer needs, market gaps, and areas for improvement. It helps
organizations in developing new products, improving existing offerings, and
staying ahead of the competition by leveraging data insights.

8. Predictive Analytics and Forecasting: DA facilitates predictive modelling and
forecasting by analysing historical data and identifying patterns and trends.
This empowers organizations to make accurate predictions, anticipate future
outcomes, and take proactive measures to mitigate risks or exploit
opportunities.

9. Data Governance and Compliance: DA plays a crucial role in ensuring data
governance and compliance with relevant regulations, standards, and privacy
policies. It helps organizations maintain data integrity, security, and privacy
while handling and analysing data.
4
10. Continuous Improvement and Adaptation: DA fosters a culture of
continuous improvement by providing feedback and measurement metrics.
It enables organizations to monitor performance, identify areas for
optimization, adapt strategies, and drive ongoing improvement across
processes and operations.

In summary, Data Analysis is vital for organizations to leverage the power of
data and gain actionable insights. It drives informed decision-making,
enhances efficiency, identifies market trends, personalizes customer
experiences, manages risks, and supports innovation and growth in today's
data-centric business landscape.

Fig 1: Process involved in Data Analysis

5
2. Background Overview

2.1 Concept used

Data analysis involves various concepts and techniques that help extract insights and draw
meaningful conclusions from data. Here are some key concepts used in data analysis:

1. Data Cleaning and Pre-processing: Data cleaning involves identifying and handling
missing values, outliers, inconsistent data, and errors to ensure data quality. Pre-
processing includes data normalization, standardization, and transformation to prepare
the data for analysis.

2. Descriptive Statistics: Descriptive statistics provide summary measures and
descriptive characteristics of the data, such as measures of central tendency (mean,
median, mode), measures of dispersion (variance, standard deviation), and data
distributions.

3. Exploratory Data Analysis (EDA): EDA involves visualizing and exploring the data to
gain insights, identify patterns, relationships, and trends. Techniques used in EDA
include scatter plots, histograms, box plots, and correlation analysis.

4. Inferential Statistics: Inferential statistics involves drawing conclusions and making
inferences about a population based on sample data. Techniques like hypothesis
testing, confidence intervals, and regression analysis are used to make statistical
inferences.

5. Data Visualization: Data visualization techniques help represent data visually to
facilitate understanding and interpretation. Charts, graphs, heatmaps, and interactive
visualizations are used to present data in a meaningful and intuitive way.

6. Statistical Modelling: Statistical modelling involves building mathematical models that
capture relationships between variables in the data. Techniques like linear regression,
logistic regression, time series analysis, and machine learning algorithms are used for
predictive modeling and data analysis.

7. Data Mining: Data mining involves extracting patterns, insights, and knowledge from
large datasets using techniques like clustering, classification, association rules, and
anomaly detection. It aims to discover hidden patterns and relationships in the data.

8. Time Series Analysis: Time series analysis focuses on analyzing and forecasting data
that is collected over time. Techniques like trend analysis, seasonality analysis, and
forecasting models (ARIMA, exponential smoothing) are used to understand and
predict time-based patterns.

9. Dimensionality Reduction: Dimensionality reduction techniques aim to reduce the
number of variables in the data while preserving important information. Techniques
like principal component analysis (PCA) and feature selection help reduce complexity
and improve efficiency in data analysis.
10. Data Integration and Fusion: Data integration involves combining data from multiple
sources to create a unified dataset for analysis. Data fusion techniques help integrate
and merge heterogeneous data from different sources, such as databases, APIs, or
external files.

11. Data Mining Ethics and Privacy: Data analysis requires adherence to ethical guidelines
and privacy regulations to ensure responsible and lawful handling of data. Privacy-
preserving techniques like anonymization and data de-identification are used to protect
sensitive information.

These concepts form the foundation of data analysis and are employed using various tools,
programming languages (such as Python, R, SQL), statistical software, and data analysis
platforms to extract valuable insights, support decision-making, and drive business outcomes.
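
To make the cleaning, descriptive-statistics, and exploratory-analysis concepts above more concrete, the short Python/pandas sketch below applies them to a small made-up dataset. The column names and values are illustrative only and are not drawn from the project data.

import pandas as pd

# Illustrative data only: a few downtime readings (minutes) per site.
raw = pd.DataFrame({
    "site_id": ["S1", "S1", "S2", "S2", "S3", "S3"],
    "downtime_min": [12.0, None, 45.0, 45.0, 300.0, 7.0],
})

# 1. Data cleaning and pre-processing: drop exact duplicates, fill missing values.
clean = raw.drop_duplicates().copy()
clean["downtime_min"] = clean["downtime_min"].fillna(clean["downtime_min"].median())

# 2. Descriptive statistics: count, mean, standard deviation, quartiles, min/max.
print(clean["downtime_min"].describe())

# 3. Simple exploratory view: total downtime per site, largest first.
print(clean.groupby("site_id")["downtime_min"].sum().sort_values(ascending=False))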

7
2.2 Tools and software used

2.2.1 Excel

Excel is widely used for data analysis due to its rich functionality and user-friendly interface.
It offers several features that make it a valuable tool for data analysts. Here's why Excel is
commonly used for data analysis:

1. Data Import and Preparation:

 Data Import: Excel allows users to import data from various sources, including
CSV files, databases, and other spreadsheet formats. It provides seamless
integration to bring data into the application for analysis.

 Data Cleaning and Transformation: Excel offers tools for data cleaning and
transformation, enabling users to handle missing values, remove duplicates,
and format data for analysis. Functions like FIND, REPLACE, and TRIM help
in data cleansing tasks.

2. Formulas and Functions:

 Powerful Functions: Excel provides a wide range of built-in functions for data
analysis, such as SUM, AVERAGE, COUNT, MAX, MIN, and more. These
functions allow users to perform calculations, aggregations, and statistical
operations on their data.

 Custom Formulas: Users can create custom formulas using Excel's formula
language. By combining functions, operators, and cell references, analysts can
perform complex calculations and derive new insights from the data.

3. PivotTables and Pivot Charts:

 PivotTables: Excel's PivotTables are powerful tools for summarizing and
analysing large datasets. They allow users to dynamically rearrange and
reorganize data, perform aggregations, apply filters, and create custom
calculations.

 Pivot Charts: Pivot Charts complement PivotTables by providing visual
representations of data. Users can create bar charts, line charts, pie charts, and
other chart types based on the summarized data in PivotTables, helping to
identify trends and patterns.

4. Data Visualization:

 Charting Capabilities: Excel offers a wide range of chart types, including
column charts, line graphs, scatter plots, and more. These visualizations help
analysts present data in a visually appealing and easily understandable format.

8
 Conditional Formatting: Excel's conditional formatting feature allows users to
highlight cells or ranges based on specific conditions. It helps in visually
identifying outliers, trends, or patterns within the data.

5. What-If Analysis:

 Data Analysis Tools: Excel provides tools for performing What-If analysis.
Data tables, goal seek, and scenarios allow users to explore different scenarios,
assess the impact of changing variables, and make informed decisions based on
the results.

6. Data Exploration and Filtering:

 Sorting and Filtering: Excel enables users to sort data based on specific criteria
and apply filters to focus on subsets of data. This capability helps in data
exploration and isolating specific data points of interest.

7. Automation and Macros:

 Macros: Excel's macro recording feature allows users to automate repetitive
tasks, apply calculations or transformations to multiple datasets, and save time
in data analysis workflows. Macros can be created using Visual Basic for
Applications (VBA) to automate complex processes.

8. Collaboration and Sharing:

 Sharing and Collaboration: Excel supports sharing and collaboration, enabling
multiple users to work on the same workbook simultaneously. Users can track
changes, leave comments, and ensure data integrity in a collaborative
environment.
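
The PivotTable-style summarization described in point 3 above can also be reproduced programmatically. The hedged sketch below uses Python's pandas to read a workbook and build an equivalent pivot summary; the file name, sheet name, and column names are assumptions for illustration, and reading .xlsx files requires the openpyxl package to be installed.

import pandas as pd

# Hypothetical workbook: one row per outage, with region, priority and duration columns.
df = pd.read_excel("site_outages.xlsx", sheet_name="raw_dump")  # needs openpyxl installed

# PivotTable-style summary: total and average outage minutes per region and priority level.
pivot = pd.pivot_table(
    df,
    index="region",            # rows of the pivot
    columns="priority",        # columns of the pivot (e.g. P1..P4)
    values="outage_minutes",   # field being aggregated
    aggfunc=["sum", "mean"],   # aggregations to apply
    fill_value=0,
)
print(pivot)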

9
2.2.2 Visual Basic for Applications (VBA)

VBA (Visual Basic for Applications) is a programming language integrated within Microsoft
Excel that allows users to automate tasks, build custom functions, and create macros to
enhance data analysis capabilities. Here's how VBA can be utilized for data analysis:

1. Automation of Data Cleaning and Transformation:

 VBA enables users to write code to automate repetitive data cleaning tasks,
such as removing duplicates, handling missing values, or standardizing data
formats.

 By creating custom functions or subroutines in VBA, users can streamline the
data preparation process, saving time and ensuring consistency in data analysis
workflows.

2. Advanced Calculations and Data Manipulation:

 VBA allows users to create custom formulas and perform complex calculations
that may not be feasible using Excel's built-in functions.

 With VBA, users can manipulate data, apply conditional logic, perform
iterative calculations, and create dynamic ranges for more advanced data
analysis tasks.

3. Interactive User Interfaces and Dashboards:

 VBA enables the creation of interactive user interfaces (UI) within Excel,
allowing users to input parameters, select options, and interact with data
analysis tools or models.

 Users can develop custom dashboards using VBA to provide interactive data
visualizations, dynamic charting, and user-friendly controls for data
exploration.

4. Integration with External Data Sources and APIs:

 VBA can be used to connect Excel with external data sources, such as
databases or web APIs, to fetch real-time data for analysis.

 By leveraging VBA, users can automate data retrieval, perform data
transformations, and update analysis results with the latest data from external
sources.

5. Custom Data Analysis Tools and Algorithms:

 VBA allows users to implement custom algorithms and statistical methods that
may not be available in Excel's built-in functions or data analysis tools.

 Users can develop their own data analysis tools or models using VBA to solve
specific analytical problems or perform specialized analyses.
10
6. Report Generation and Automation:

 VBA can be used to automate report generation processes, allowing users to
extract analysis results, format them, and create customized reports or
presentations.

 Users can utilize VBA to generate reports based on specific criteria, automate
the inclusion of charts, tables, or summary statistics, and export reports in
different formats.

7. Error Handling and Validation:

 VBA provides error handling mechanisms, allowing users to handle errors
gracefully and provide informative error messages.

 Users can implement data validation checks in VBA to ensure data integrity
and validate user input within data analysis tools or models.

VBA empowers Excel users to extend the functionality of the software and customize
data analysis workflows according to specific needs. It enables automation, advanced
calculations, custom data manipulation, integration with external sources, and the
development of interactive dashboards and tools. By leveraging VBA, users can
enhance their data analysis capabilities and streamline repetitive tasks, making data
analysis more efficient, accurate, and flexible within the Excel environment.

Fig 2: Visual Basic for Applications
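
In this project, the report-generation automation described in point 6 was implemented with VBA macros (see the VBA code and macro interface figures later in the report). Purely as an illustration of the same idea, the sketch below shows a comparable automation in Python, deriving a per-site summary from a raw dump and writing it back into an Excel report; the file names and column names are hypothetical.

import pandas as pd

# Hypothetical input: a raw dump with one row per outage.
raw = pd.read_excel("raw_dump.xlsx")

# Derive the summary the manual report would normally contain.
summary = (
    raw.groupby("site_id")
       .agg(outages=("ticket_id", "count"),
            total_downtime_min=("outage_minutes", "sum"))
       .reset_index()
)

# Write the raw data and the summary into a single report workbook.
with pd.ExcelWriter("daily_report.xlsx") as writer:
    raw.to_excel(writer, sheet_name="Raw", index=False)
    summary.to_excel(writer, sheet_name="Summary", index=False)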

11
2.2.3 Python

Python has emerged as one of the most popular programming languages for data analysis due
to its simplicity, versatility, and extensive libraries and tools specifically designed for data
manipulation and analysis. With its user-friendly syntax and a rich ecosystem of packages,
Python offers a powerful and efficient environment for conducting data analysis tasks.

1. Data Manipulation and Cleaning:

Python provides several libraries, such as pandas, NumPy, and SciPy, that facilitate data
manipulation and cleaning. Pandas offers high-performance data structures and data
analysis tools, making it easy to load, manipulate, and transform data. It enables tasks like
filtering, sorting, merging, reshaping, and summarizing data, ensuring data is in a suitable
format for analysis.

2. Data Visualization:

Python offers robust data visualization capabilities through libraries like Matplotlib,
Seaborn, and Plotly. These libraries provide a wide range of visualization options,
including line plots, scatter plots, bar charts, histograms, heatmaps, and interactive
visualizations. Python's visualization tools enable data analysts to explore patterns,
relationships, and trends in the data effectively.

3. Statistical Analysis:

Python's statistical analysis capabilities are enhanced by libraries like SciPy, statsmodels,
and scikit-learn. SciPy offers a comprehensive set of statistical functions, probability
distributions, hypothesis testing, and descriptive statistics. Statsmodels provides advanced
statistical models and methods for regression analysis, time series analysis, and multivariate
analysis. Scikit-learn offers machine learning algorithms for classification, regression,
clustering, and dimensionality reduction.

4. Machine Learning:

Python has become a dominant language for machine learning due to its extensive
machine learning libraries, including Scikit-learn, TensorFlow, and Keras. These
libraries provide a wide range of algorithms and tools for tasks like supervised
learning, unsupervised learning, deep learning, and natural language processing.
Python's simplicity and readability make it easy to implement and experiment with
machine learning models.

5. Integration and Scalability:

Python's versatility allows seamless integration with other programming languages
and tools, making it suitable for data analysis workflows. It can be integrated with
databases, cloud platforms, big data frameworks like Apache Spark, and other data
analysis tools. Python's scalability is enhanced by parallel processing libraries like
Dask which enable efficient processing of large datasets.

12
6. Reproducibility and Collaboration:

Python's emphasis on code readability and its support for Jupyter Notebooks make it
an ideal choice for reproducible research and collaborative data analysis. Jupyter
Notebooks allow data analysts to create interactive documents that combine code,
visualizations, and explanatory text, facilitating easy sharing, collaboration, and
documentation of analysis workflows.

In conclusion, Python has established itself as a go-to language for data analysis due to its
simplicity, extensive libraries, and vibrant community. Whether it's data manipulation,
visualization, statistical analysis, or machine learning, Python offers a versatile and efficient
environment for performing a wide range of data analysis tasks. Its popularity continues to
grow, making it an indispensable tool for data professionals across industries.

Fig 3: Python
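
As a small, self-contained illustration of the manipulation, visualization, and modelling capabilities described above, the sketch below fits a simple linear regression on synthetic data. The data is generated inside the script and merely stands in for real KPI measurements.

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression

# Synthetic stand-in for KPI data: traffic load vs. observed downtime.
rng = np.random.default_rng(0)
load = rng.uniform(10, 100, size=50)
downtime = 0.3 * load + rng.normal(0, 3, size=50)
df = pd.DataFrame({"load": load, "downtime": downtime})

# Fit a simple linear model (scikit-learn expects a 2-D feature array).
model = LinearRegression().fit(df[["load"]], df["downtime"])
print("slope:", model.coef_[0], "intercept:", model.intercept_)

# Visualize the raw points and the fitted trend line.
plt.scatter(df["load"], df["downtime"], label="observations")
plt.plot(df["load"], model.predict(df[["load"]]), color="red", label="fitted trend")
plt.xlabel("Traffic load")
plt.ylabel("Downtime (minutes)")
plt.legend()
plt.show()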

13
2.2.4 Tableau

Tableau is a powerful and widely used data visualization and business intelligence tool that
enables organizations to transform raw data into actionable insights. With its intuitive
interface and interactive visualizations, Tableau empowers users to explore, analyse, and
communicate complex data effectively.

1. Data Connectivity and Integration:

Tableau allows seamless connectivity to a wide variety of data sources, including
databases, spreadsheets, cloud-based platforms, and big data frameworks. It provides
native connectors to popular databases like SQL Server, Oracle, and MySQL, as well
as direct integration with cloud platforms such as Google BigQuery and Amazon
Redshift. This flexibility in data connectivity enables users to work with diverse
datasets for analysis.

2. Interactive Visualizations:

Tableau offers a wide range of visualization options, including bar charts, line graphs,
scatter plots, heatmaps, treemaps, and geographic maps. These visualizations can be
easily created and customized using a drag-and-drop interface, allowing users to
explore data and uncover insights visually. The interactivity of Tableau visualizations
enables users to drill down into the data, filter, sort, and dynamically change
parameters for deeper analysis.

3. Data Blending and Data Prep:

Tableau's data blending capabilities allow users to combine data from multiple
sources, even if they have different structures or granularities. This enables
comprehensive analysis by blending data at a granular level to provide a holistic view.
Tableau also offers data preparation tools that allow users to clean, transform, and
reshape data without the need for complex coding. This makes it easier to prepare data
for analysis and ensures data accuracy and consistency.

4. Advanced Analytics and Statistical Tools:

Tableau provides built-in advanced analytics functions, such as forecasting, clustering,
and trend analysis, allowing users to perform complex calculations and gain deeper
insights from their data. It also integrates with statistical programming languages like
R and Python, enabling users to leverage their existing code and algorithms within
Tableau for more advanced analytics.

5. Collaboration and Sharing:

Tableau facilitates collaboration and sharing of insights through its online platform,
Tableau Server or Tableau Online. Users can publish their visualizations and
dashboards to a centralized server, where authorized users can access and interact with
the shared content. This enables real-time collaboration, data storytelling, and
decision-making based on up-to-date information.

14
6. Scalability and Performance:

Tableau is designed to handle large datasets and complex visualizations without
compromising performance. It leverages technologies like data indexing, in-memory
processing, and query optimization to deliver fast and interactive visualizations, even
with massive datasets. Tableau's scalability allows organizations to scale their
analytics deployments to accommodate growing data volumes and user demands.

7. Mobile Accessibility:

Tableau provides mobile apps for iOS and Android devices, allowing users to access
and interact with visualizations on-the-go. The mobile app provides a responsive and
optimized experience, ensuring that insights and reports are accessible anytime,
anywhere.

In conclusion, Tableau is a robust data analysis tool that empowers users to visualize, analyze,
and gain insights from their data. Its intuitive interface, diverse visualization options, data
blending capabilities, and advanced analytics features make it an asset for organizations
across industries. Whether it's exploratory data analysis, dashboards for executive reporting,
or interactive data storytelling, Tableau enables users to unlock the full potential of their data
and make data-driven decisions with confidence.

Fig 4: Tableau

2.2.5 SQL
15
Structured Query Language (SQL) is a powerful tool for data analysis that enables
organizations to extract meaningful insights from their relational databases. SQL provides a
standardized language for interacting with databases, allowing analysts to retrieve,
manipulate, and analyse data efficiently. This section explores the role of SQL in data analysis
and its key features that enable analysts to unlock valuable insights.

1. Data Retrieval:

SQL's SELECT statement is fundamental to data analysis as it allows analysts to
retrieve specific data from relational databases. Analysts can specify columns, apply
filters, sort data, and join multiple tables to combine information. This flexibility in
retrieving data enables targeted analysis of relevant subsets of data.

2. Data Manipulation:

SQL offers data manipulation statements such as INSERT, UPDATE, and DELETE,
which allow analysts to modify and manage data within the database. Analysts can
insert new records, update existing data, and remove unnecessary information. Data
manipulation is crucial for data cleaning, transformation, and maintaining data quality
for accurate analysis.

3. Aggregation and Grouping:

SQL supports aggregation functions like SUM, AVG, COUNT, MAX, and MIN,
which enable analysts to calculate summary statistics and derive insights from data.
By combining these functions with the GROUP BY clause, analysts can group data
based on specific attributes and perform calculations at different levels of granularity.

4. Filtering and Conditional Logic:

SQL's WHERE clause allows analysts to filter data based on specific conditions.
Analysts can apply logical operators, comparison operators, and functions to select
subsets of data that meet certain criteria. This capability enables analysts to focus on
relevant data subsets and refine their analysis based on specific requirements.

5. Joining Tables:

SQL's ability to join multiple tables based on common columns is vital for data
analysis. Analysts can merge datasets from different tables, enabling comprehensive
analysis by leveraging relationships between data entities. Joining tables facilitates
data integration, allows for complex queries, and enables holistic insights.

6. Advanced Analytics:

SQL has evolved to support advanced analytics with the inclusion of window
functions. Window functions enable analysts to perform calculations and analyze data
within specific windows or partitions. This capability is particularly useful for time
series analysis, ranking, and performing cumulative calculations.
7. Integration with Business Intelligence Tools:
16
SQL seamlessly integrates with popular business intelligence tools, enabling analysts
to leverage its querying capabilities within visualization and reporting environments.
By combining SQL with visualization tools, analysts can create interactive
dashboards, reports, and data visualizations to communicate insights effectively.

8. Scalability and Performance:

SQL is highly scalable, allowing analysts to handle large volumes of data efficiently.
Relational databases optimize SQL queries through indexing, query optimization
techniques, and parallel processing, ensuring fast and responsive data analysis, even
with extensive datasets.

Conclusion: SQL is a critical tool for data analysis, providing the means to retrieve,
manipulate, and analyze data stored in relational databases. With its querying capabilities,
data manipulation functionalities, and integration with visualization tools, SQL empowers
analysts to unlock valuable insights, make data-driven decisions, and drive business success.

By leveraging SQL's features, analysts can extract targeted data subsets, perform calculations,
aggregate information, and integrate datasets from multiple sources. SQL's role in data
analysis extends beyond simple querying, making it a versatile and indispensable tool for
organizations seeking to derive insights from their relational databases.

Fig 5: SQL (Structured Query Language)
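
To ground the retrieval, aggregation, and join features discussed above, the sketch below runs a small GROUP BY/JOIN query against an in-memory SQLite database from Python. The table and column names are invented purely for the example.

import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Two tiny illustrative tables: sites and their outage records.
cur.execute("CREATE TABLE sites (site_id TEXT PRIMARY KEY, region TEXT)")
cur.execute("CREATE TABLE outages (site_id TEXT, minutes REAL)")
cur.executemany("INSERT INTO sites VALUES (?, ?)",
                [("S1", "North"), ("S2", "North"), ("S3", "South")])
cur.executemany("INSERT INTO outages VALUES (?, ?)",
                [("S1", 12), ("S2", 45), ("S2", 30), ("S3", 7)])

# Join, filter, aggregate and sort: total downtime per region above a threshold.
query = """
    SELECT s.region, COUNT(*) AS outage_count, SUM(o.minutes) AS total_minutes
    FROM outages AS o
    JOIN sites AS s ON s.site_id = o.site_id
    WHERE o.minutes > 5
    GROUP BY s.region
    ORDER BY total_minutes DESC
"""
for row in cur.execute(query):
    print(row)

conn.close()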

2.2.6 Google Cloud Platform


17
In the era of big data, organizations are constantly seeking efficient and scalable solutions for
data analysis. Google Cloud Platform (GCP) offers a comprehensive suite of cloud services
that enable organizations to leverage the power of cloud computing for their data analysis
needs. This section explores the key features and benefits of GCP in data analysis, highlighting
its capabilities for processing, storing, and analysing large volumes of data.

1. Data Storage and Management:

GCP provides a range of storage services that facilitate efficient data management for
analysis purposes. Cloud Storage offers scalable and durable object storage, allowing
organizations to store vast amounts of structured and unstructured data. Cloud
Bigtable provides a NoSQL database for high-performance, large-scale workloads.
Cloud Spanner offers a horizontally scalable, globally consistent relational database
for transactional applications. These storage services enable organizations to securely
store and access data for analysis in a flexible and cost-effective manner.

2. Data Processing and Analytics:

GCP offers several services for data processing and analytics, allowing organizations
to derive valuable insights from their data.

 BigQuery:

BigQuery is a serverless, highly scalable data warehouse that enables fast SQL-based
queries on large datasets. It supports parallel processing and can handle petabytes of
data, making it ideal for ad-hoc analysis and interactive queries (a minimal query
sketch is shown at the end of this section).

 Dataflow:

Dataflow is a fully managed service for real-time and batch data processing. It allows
organizations to transform and analyze data in a scalable and serverless environment,
supporting both streaming and batch processing workflows.

 Dataproc:

Dataproc is a managed Apache Hadoop and Apache Spark service. It enables
organizations to run big data processing and analytics workloads using popular open-
source frameworks. Dataproc provides a flexible and scalable environment for data
analysis and machine learning tasks.

 AI Platform:

AI Platform offers a range of services for machine learning and data analysis,
including training and deploying machine learning models. It allows organizations to
apply advanced analytics techniques to their data and extract valuable insights.
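
As indicated in the BigQuery bullet above, queries are typically submitted through client libraries. The sketch below uses the google-cloud-bigquery Python client with a hypothetical project, dataset, and table name, and assumes that application-default credentials have already been configured.

from google.cloud import bigquery  # pip install google-cloud-bigquery

# Assumes GCP credentials are available (e.g. via application-default login).
client = bigquery.Client(project="my-example-project")  # hypothetical project id

query = """
    SELECT region, SUM(outage_minutes) AS total_minutes
    FROM `my-example-project.telecom.site_outages`   -- hypothetical table
    GROUP BY region
    ORDER BY total_minutes DESC
"""

# Run the query and fetch the results as a pandas DataFrame
# (to_dataframe requires the pandas/db-dtypes extras to be installed).
results = client.query(query).to_dataframe()
print(results.head())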

18
2.2.7 Power BI

Power BI is a powerful business intelligence tool that enables organizations to analyze,
visualize, and gain valuable insights from their data. Closely integrated with Microsoft Excel
and the wider Microsoft ecosystem, Power BI provides enhanced capabilities for data
modelling, data integration, and interactive reporting. This section explores the features and
benefits of Power BI in data analysis, highlighting its role in empowering organizations to
make informed decisions based on data-driven insights.

1. Data Modeling and Integration:

Power BI allows users to create robust data models by importing and integrating data
from various sources. It provides connectivity to a wide range of data sources,
including databases, spreadsheets, cloud platforms, and web services. With Power BI,
users can easily transform and shape data, perform data cleansing and enrichment, and
establish relationships between different data sets. This capability enables
organizations to build comprehensive and unified data models for analysis.

2. Interactive Data Exploration and Visualization:

Power BI offers a wide array of data visualization tools, allowing users to create
interactive and visually compelling reports and dashboards. Users can leverage a
variety of chart types, including bar charts, line charts, scatter plots, and maps, to
represent their data effectively. Power BI's interactive features enable users to drill
down into the data, apply filters, and dynamically change visualizations to gain deeper
insights and explore patterns within their data.

3. Advanced Analytics and Calculations:

Power BI provides a range of advanced analytics and calculation capabilities, enabling
users to perform complex calculations and derive valuable metrics from their data. It
offers a comprehensive formula language, DAX (Data Analysis Expressions), with a
rich set of built-in functions that allow users to create powerful calculations and
measures. Users can perform calculations based on aggregations, time intelligence,
statistical functions, and more, facilitating advanced data analysis and modelling.

4. Data Refresh and Automation:

Power BI supports data refresh and automation, ensuring that reports and dashboards
are always up to date with the latest data. Users can schedule data refreshes at regular
intervals or set up automatic refreshes when the data source changes. This feature
enables users to have real-time or near-real-time insights, ensuring that decision-
makers have access to accurate and current information.

5. Collaboration and Sharing:

Power BI enables users to collaborate and share their analyses and reports with others
within the organization. Users can publish their reports and dashboards to the Power BI
service, a cloud-based platform, and share them with colleagues or embed them in
other applications. This promotes data-driven decision-making by allowing
stakeholders to access and interact with the insights generated through Power BI.
19
6. Integration with Other Tools and Services:

Power BI seamlessly integrates with other Microsoft products and services, such as
Excel, Azure, and SQL Server. Users can leverage the familiar Excel interface to
perform data analysis and visualization while benefiting from the scalability and cloud
capabilities of Azure and SQL Server. This integration enhances the flexibility and
extensibility of Power BI in meeting diverse data analysis needs.

7. Data Security and Governance:

Power BI ensures data security and governance by adhering to organizational data
policies and permissions. Administrators can control access to data models, reports,
and dashboards, ensuring that sensitive information is protected. Power BI integrates
with Active Directory, allowing organizations to manage user access and
authentication centrally.

Conclusion: Power BI is a robust business intelligence tool that empowers organizations to
analyze and gain valuable insights from their data. With its data modelling, integration,
visualization, and advanced analytics capabilities, Power BI enables users to transform raw
data into actionable insights. By providing interactive reporting, automation, collaboration,
and integration features, Power BI enhances decision-making and promotes a data-driven
culture within organizations.

Whether it's performing complex calculations, visualizing data trends, or sharing insights with
stakeholders, Power BI offers a comprehensive suite of tools to meet the diverse data analysis
needs of organizations. By leveraging Power BI's capabilities, organizations can uncover
patterns, identify opportunities, and make informed decisions that drive business success.

Fig 6: Power BI

20
Project Analysis

3.1 Project Requirements and Customer Demands

When a project is initiated, it is crucial to understand the specific requirements and demands
of the customer. As a professional data analyst, your primary responsibility is to analyze raw
data and derive meaningful insights from it. The customer provides a particular task, such as
understanding the downtime of telecom sites, including the duration of each outage and the
underlying reasons. Verbal explanations may not effectively convey the complex data, so
graphical representation becomes essential.

Upon receiving a project, the initial step is to thoroughly analyze the customer's data. This
involves examining the dataset, performing data cleaning, and preparing the data for analysis.
Ensuring accuracy, consistency, and proper structuring of the data is vital to generate reliable
insights.

Once the data is prepared, you proceed to create comprehensive reports. These reports serve
as a consolidated view of the telecom site performance, offering valuable information to the
customer. Key metrics are highlighted, including the total number of sites experiencing
downtime, the duration of each outage, and a breakdown of the reasons behind the outages.

However, the reports are just the beginning. Further analysis and exploration are conducted
using advanced visualization tools. By leveraging tools such as Power BI, you create
interactive dashboards and visually compelling representations of the data. These dashboards
enable deeper analysis and facilitate data-driven decision-making.

In the context of Managed Services level optimization, your role expands beyond data
analysis. You take on the responsibility of managing the services, ensuring the smooth
operation and maintenance of site towers. It becomes your duty to monitor the site towers and
collect relevant data related to telecom operations. This data encompasses the number of sites
experiencing downtime, the duration of each outage, and any other pertinent information.

Providing accurate and timely data to the customer is paramount. This data empowers them to
comprehend the performance of their telecom infrastructure, identify areas for improvement,
and make informed decisions to optimize their services.

To summarize, as a professional data analyst in Managed Services, you excel at analyzing raw
data to extract meaningful insights. This involves creating comprehensive reports and
conducting further analysis using visualization tools. Additionally, you take on the crucial
responsibility of managing the telecom services, ensuring the proper functioning of site
towers, and delivering relevant data to the customer. By effectively fulfilling these tasks, you
contribute to the overall optimization and improvement of the customer's telecom services.
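
A minimal sketch of the kind of raw-data analysis described in this chapter is shown below. It assumes a hypothetical CSV export of outage tickets (the file and column names are illustrative) and summarizes site downtime by count, total duration, and reason, producing outputs that could feed a report or a Power BI dashboard.

import pandas as pd

# Hypothetical export: one row per outage, with site, start/end timestamps, and reason.
tickets = pd.read_csv("outage_tickets.csv", parse_dates=["outage_start", "outage_end"])

# Duration of each outage in minutes.
tickets["duration_min"] = (
    (tickets["outage_end"] - tickets["outage_start"]).dt.total_seconds() / 60
)

# Per-site view: how many outages, how long in total, and the most frequent reason.
site_summary = tickets.groupby("site_id").agg(
    outages=("duration_min", "count"),
    total_downtime_min=("duration_min", "sum"),
    top_reason=("reason", lambda s: s.mode().iat[0]),
)

# Breakdown of total downtime by reason across all sites.
reason_summary = tickets.groupby("reason")["duration_min"].sum().sort_values(ascending=False)

site_summary.to_csv("site_summary.csv")      # consumed by the report / dashboard
reason_summary.to_csv("reason_summary.csv")
print(site_summary.head())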

21
Methodology
4.1 Methodology for Data Analysis and Dashboard Creation

1. Define Objectives:

Clearly define the objectives of the data analysis and dashboard creation. Understand
the purpose of the analysis and the key insights or metrics you aim to derive from the
data.

2. Data Collection:

Gather the raw data from reliable sources. Ensure the data is complete, accurate, and
relevant to the objectives of the analysis. Validate the data for integrity and
consistency.

3. Data Cleaning and Preparation:

Clean and preprocess the raw data to make it suitable for analysis. This involves
handling missing values, correcting errors, standardizing formats, and transforming
data if needed. Ensure data quality and integrity throughout the process.

4. Data Exploration and Analysis:

Perform exploratory data analysis (EDA) to understand the data's characteristics,
identify patterns, relationships, outliers, and potential issues. Use statistical
techniques, visualization, and descriptive analytics to gain initial insights and inform
further analysis.

5. Hypothesis Formulation (if applicable):

If there are specific research questions or hypotheses to test, formulate them based on
the initial data exploration. Define the variables, relationships, or patterns you expect
to find in the data.

6. Select Analytical Techniques:

Based on the objectives and nature of the data, select appropriate analytical
techniques. This may include statistical analysis, machine learning algorithms,
predictive modeling, or other quantitative or qualitative methods that align with the
objectives.

7. Perform Data Analysis:

Apply the chosen analytical techniques to the preprocessed data. Conduct statistical
analysis, build models, and derive insights from the data. Validate assumptions,
conduct hypothesis testing (if applicable), and evaluate the results.

8. Visualization and Dashboard Design:

22
Once the data analysis is complete, design the dashboard that will present the key
insights in a visually appealing and interactive format. Identify the metrics, charts,
graphs, or visuals that best represent the insights and enable users to understand and
explore the data easily.

9. Dashboard Development:

Use visualization tools like Power BI, Tableau, or other dashboarding platforms to
develop the interactive dashboard. Incorporate the selected metrics, visuals, filters, and
interactivity to make the dashboard user-friendly and intuitive.

10. Validation and Iteration:

Validate the dashboard and its outputs against the original objectives and
requirements. Seek feedback from stakeholders or end-users and make necessary
iterations or refinements to improve usability, clarity, and accuracy.

11. Documentation and Deployment

Document the entire process, including the methodology, data sources, data
preparation steps, analysis techniques used, and dashboard design. Provide clear
documentation for future reference and replication. Deploy the final dashboard for use
by stakeholders or end-users.

12. Monitoring and Maintenance: Continuously monitor the dashboard's performance,
data updates, and user feedback. Make regular updates to the dashboard as new data
becomes available or as business needs change.

By following this methodology, you ensure a structured and systematic approach to data
analysis and dashboard creation. It enables you to transform raw data into valuable insights
and present them in an intuitive and interactive format that facilitates data-driven decision-
making.

23
4.2 Pictorial representation of the data analysis methodology

Fig6: Process involved in Analysis

Fig6: Workflow in Data Analysis

24
Implementation and work done

5.1 Phase 1

During the initial phase of my internship at Ericsson, I received an introduction to the terms
and concepts commonly used in report generation and data management. One of the
fundamental concepts discussed was the concept of an alarm. In the telecom industry, alarms
are raised whenever there is an issue or problem at a specific site. For instance, if a site tower
experiences a crash due to a natural calamity, an alarm is triggered to notify the responsible
teams. Subsequently, a ticket is created for that particular site using an application called
ITSM (Information Technology Service Management). ITSM is a tool utilized for ticket
creation and it contains all the relevant data related to the non-functioning site.

Once an alarm is raised, a ticket is generated with each ticket assigned a priority level. If a site
is impacting numerous users and generating a significant volume of complaints, it is assigned
a higher priority level, such as P1. Similarly, other priority levels like P2, P3, and P4 are
assigned to different severity levels of issues. The site facing the problem is given a specific
timeframe for resolution, and if the issue remains unresolved, the trouble ticket is carried
forward to the next day for further action.
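
Purely to illustrate the prioritization logic described above, the small sketch below assigns a priority level from the number of users impacted. The thresholds are hypothetical and do not represent Ericsson's actual rules.

def ticket_priority(users_impacted: int) -> str:
    """Map user impact to a priority level (illustrative thresholds only)."""
    if users_impacted >= 10_000:
        return "P1"   # major outage, large complaint volume
    if users_impacted >= 1_000:
        return "P2"
    if users_impacted >= 100:
        return "P3"
    return "P4"       # minor or isolated issue

print(ticket_priority(25_000))  # -> P1
print(ticket_priority(350))     # -> P3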

Additionally, I was introduced to the term KPI (Key Performance Indicator), which holds
great significance in the telecom field. KPIs are quantifiable metrics used to measure the
performance and effectiveness of various aspects within the telecom industry. They provide
valuable insights into the overall health and efficiency of operations, services, and strategies.
KPIs are typically expressed as numeric (parametric) values that are tracked against a target mean or threshold.

During the initial two months of my internship, my primary responsibility was manually
generating reports using Microsoft Excel. I was assigned the task of creating reports based on
client requirements, whether they needed daily, hourly, weekly, or monthly data analysis. To
accomplish this, we extracted the raw data from different sources, which could be Ericsson's
own tools or client-based tools. The raw data usually encompassed information about cells,
sites, and various parameters associated with the telecom infrastructure.

In summary, at the beginning of my internship, I familiarized myself with essential concepts
such as alarms, tickets, ITSM, and KPIs in the telecom industry. I primarily focused on
manually creating reports using Excel, extracting relevant data from various sources based on
client specifications. This initial period laid the foundation for my understanding of data
analysis and reporting in the telecom domain.

25
KPIs, or Key Performance Indicators, are critical tools in the field of telecom for measuring
and evaluating the performance of different aspects within the telecommunications industry.
These metrics provide valuable insights into the effectiveness and efficiency of operations,
services, and strategies. KPIs enable telecom companies to track progress, identify trends, and
make data-driven decisions to optimize their performance and meet their objectives. At
Ericsson, we utilize several common KPIs to assess and monitor the telecom network and
services. Some examples of these KPIs include:

1) Downtime:

Downtime KPI refers to a key performance indicator that measures the duration or frequency
of system or service downtime within a given period. It quantifies the amount of time that a
system or service is unavailable or not operational, indicating the reliability and availability of
the infrastructure.

The downtime KPI is crucial in assessing the performance and effectiveness of systems,
networks, or services, particularly in the context of service-level agreements (SLAs) and
customer satisfaction. It helps organizations understand the impact of downtime on their
operations, customers, and overall business performance.

There are several aspects to consider when analyzing the downtime KPI:

1. Downtime Duration: This aspect measures the total duration of time during which a
system or service was unavailable. It provides an overall picture of the time lost due to
outages or disruptions.
2. Downtime Frequency: This aspect counts the number of occurrences or instances of
downtime within a specific timeframe. It helps identify patterns, trends, or recurring
issues that may require attention and resolution.
3. Mean Time Between Failures (MTBF): MTBF calculates the average time between
failures or incidents leading to downtime. It provides insights into the stability and
reliability of systems or equipment, helping organizations plan maintenance activities
and optimize uptime.
4. Mean Time to Repair (MTTR): MTTR measures the average time required to restore a
system or service to operational status after an incident or failure. It assesses the
efficiency and effectiveness of the incident response and recovery processes, aiming to
minimize downtime.
5. Planned vs. Unplanned Downtime: Downtime can be categorized as planned or
unplanned. Planned downtime is scheduled and occurs during maintenance activities
or system upgrades. Unplanned downtime is unexpected and typically caused by
system failures, infrastructure issues, or external factors such as natural disasters or
power outages.

Monitoring the downtime KPI enables organizations to identify the root causes of downtime,
take preventive measures, and improve overall system reliability and availability. By setting
targets and tracking downtime metrics, organizations can establish benchmarks, implement
mitigation strategies, and drive continuous improvement efforts.

26
Reducing downtime has numerous benefits, including increased productivity, improved
customer satisfaction, minimized revenue loss, and enhanced business resilience. Effective
monitoring, analysis, and management of the downtime KPI help organizations proactively
address issues, optimize system performance, and ensure uninterrupted service delivery to
customers.

Fig7: Raw dump of sites

Fig8: KPI data of downtime for a 3G site (calculated using the FFtool)

Fig9: Fn8 sites (sites to take into consideration for downtime)

2) Uplink Throughput Rate:

The uplink throughput KPI measures the speed or capacity at which data is transmitted from
user devices to the network (uplink). It quantifies the rate at which data can be uploaded from
the user's device to the network infrastructure.

The uplink throughput KPI is essential in assessing the performance of the uplink channel and
the efficiency of data transmission from users to the network. It helps evaluate the
effectiveness of the network in handling user-generated data and ensures a smooth user
experience.

Here are some key aspects to consider when analyzing the uplink throughput KPI:

1. Data Transfer Rate: Uplink throughput is typically measured in terms of data transfer
rate, such as bits per second (bps) or megabits per second (Mbps). It represents the
speed at which data can be uploaded from the user's device to the network.
2. User Capacity: Uplink throughput KPI also considers the network's capacity to handle
concurrent uplink data transmissions from multiple users. It helps determine the
maximum number of users that can upload data simultaneously without degradation in
performance.
3. Quality of Service (QoS): Uplink throughput is directly linked to the quality of service
experienced by users. A higher uplink throughput ensures faster data uploads, reduced
latency, and smoother real-time communication or interactive experiences.
4. Impact on Applications: The uplink throughput KPI has implications for various
applications and services that rely on user-generated data uploads. For instance, video
conferencing, file sharing, cloud services, or social media uploads heavily depend on
efficient uplink throughput for a seamless user experience.
5. Network Congestion: Uplink throughput can be impacted by network congestion or
bandwidth limitations. When the number of users simultaneously uploading data
increases or the available uplink bandwidth is limited, it can result in lower uplink
throughput and slower data transfers.

Monitoring and optimizing the uplink throughput KPI help telecom operators ensure efficient
utilization of network resources, enhance user experience, and meet performance targets. By
analyzing uplink throughput metrics, operators can identify areas of congestion, capacity
constraints, or network issues that affect data transmission in the uplink direction. They can
then take proactive measures to optimize network configurations, implement advanced radio
access technologies, and allocate sufficient bandwidth to handle increasing uplink traffic.
Ensuring a robust uplink throughput supports the growing demand for user-generated content
and contributes to improved overall network performance and customer satisfaction in the
telecom industry.
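To make the relationship between uploaded data volume, measurement time, and throughput concrete, the small VBA sketch below converts an assumed data-volume counter (megabytes transferred during a measurement interval) into a rate in Mbps. The counter, the interval length, and the decimal megabyte convention are illustrative assumptions, not the operator's actual counters.

' Illustrative sketch only: converts an assumed uplink volume counter into Mbps.
' The counter name and interval length are hypothetical assumptions.
Function UplinkThroughputMbps(volumeMB As Double, intervalSeconds As Double) As Double
    If intervalSeconds <= 0 Then
        UplinkThroughputMbps = 0
    Else
        ' megabytes x 8 = megabits (decimal convention); megabits / seconds = Mbps
        UplinkThroughputMbps = (volumeMB * 8) / intervalSeconds
    End If
End Function

Used as a worksheet function, =UplinkThroughputMbps(450, 900) would report 4 Mbps for 450 MB uploaded over a 15-minute interval; the same arithmetic applies to the downlink direction discussed next.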

3) Downlink Throughput Rate:
Downlink throughput rate refers to the speed or capacity at which data is transmitted from the
network to user devices (downlink). It represents the rate at which data can be downloaded or
received by the user's device from the network infrastructure.

Analyzing the downlink throughput rate KPI is crucial in evaluating the performance and
efficiency of the network in delivering data to users. It helps assess the network's ability to
provide fast and reliable data transfers, which is essential for various applications, services,
and user experiences.

Here are key aspects to consider when analyzing the downlink throughput rate KPI:

1. Data Transfer Rate: Downlink throughput rate is typically measured in terms of data
transfer rate, such as bits per second (bps) or megabits per second (Mbps). It
represents the speed at which data can be downloaded by the user's device from the
network.
2. User Experience: The downlink throughput rate directly impacts the user experience,
particularly for activities such as browsing the internet, streaming videos,
downloading files, or using data-intensive applications. Higher downlink throughput
rates result in faster data downloads and smoother user experiences.
3. Network Capacity: Downlink throughput rate is influenced by the network's capacity
to handle multiple simultaneous data transfers to users. The network's ability to
provide sufficient bandwidth and allocate resources effectively impacts the downlink
throughput rate.
4. Quality of Service (QoS): The downlink throughput rate is a critical factor in ensuring
satisfactory quality of service for users. It affects the responsiveness, speed, and
overall performance of various applications and services that rely on data downloads.
5. Network Congestion: Downlink throughput rate can be influenced by network
congestion or high traffic load. When numerous users are simultaneously accessing
data-intensive services, the available downlink bandwidth may be limited, resulting in
lower throughput rates and slower data downloads.

Monitoring and optimizing the downlink throughput rate KPI is vital for telecom operators to
provide an excellent user experience, meet performance targets, and manage network
resources effectively. By analyzing downlink throughput metrics, operators can identify areas
of congestion, optimize network configurations, allocate bandwidth efficiently, and ensure a
smooth and efficient data delivery process from the network to users.

4) Attach Failure Ratio:

The failure ratio KPI, also known as the failure rate, measures the frequency or percentage of
failures or faults occurring within a system, equipment, or process. It helps evaluate the
reliability, performance, and operational effectiveness of the components or processes being
measured. In the case of the attach failure ratio, the failures counted are unsuccessful attach
attempts, i.e. attempts by a mobile device to register with the network.

When applied in the telecom industry, failure ratio KPI refers to the ratio or percentage of
failures within the network infrastructure, equipment, or services. It indicates the frequency at
which failures occur and provides insights into the reliability and quality of the telecom
system.

Here are key aspects to consider when analyzing the failure ratio KPI:

1. Calculation: The failure ratio is typically calculated by dividing the number of failures
or faults within a specific time period by the total number of observations or units. It
can be expressed as a percentage or a decimal value.

2. Failure Types: Failures can occur in various forms, such as equipment malfunctions,
network outages, service disruptions, or performance degradation. It is important to
define and categorize the types of failures being measured to ensure consistent
analysis and comparison.

3. Impact: The failure ratio KPI helps quantify the impact of failures on the overall
system or service. It provides a metric to assess the frequency and severity of
disruptions and their potential consequences on customer experience, service quality,
and business operations.

4. Root Cause Analysis: Analyzing the failure ratio KPI often involves investigating the
root causes of failures. This helps identify trends, patterns, or specific areas where
improvements can be made to prevent future failures and enhance overall system
reliability.

5. Benchmarking: Comparing the failure ratio KPI against industry benchmarks or
internal targets allows organizations to assess their performance and identify areas for
improvement. It helps set realistic goals, track progress, and drive continuous
improvement efforts.

6. Mitigation Strategies: The failure ratio KPI guides organizations in developing and
implementing mitigation strategies to reduce failures. This may involve proactive
maintenance, system upgrades, redundancy measures, or process improvements to
enhance system reliability and minimize downtime.

Monitoring and managing the failure ratio KPI enable telecom operators to enhance system
reliability, improve service quality, and meet customer expectations. By analyzing failure
data, organizations can identify areas of improvement, allocate resources effectively, and
implement preventive measures to minimize failures and their impact on the network and
services. Reducing the failure ratio contributes to improved customer satisfaction, increased
operational efficiency, and enhanced overall performance in the telecom industry.
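As a minimal illustration of the calculation described above, the VBA sketch below expresses the failure ratio as a percentage of failed attempts (for the attach failure ratio, failed attach attempts) over total attempts. The argument names are illustrative and do not correspond to specific counters in the client reports.

' Illustrative sketch only: failure ratio expressed as a percentage of attempts.
Function FailureRatioPercent(failedCount As Double, totalCount As Double) As Double
    If totalCount <= 0 Then
        FailureRatioPercent = 0   ' no attempts observed, avoid division by zero
    Else
        FailureRatioPercent = (failedCount / totalCount) * 100
    End If
End Function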

5) Paging Failure Ratio:

Paging Failure Ratio (PFR) is a key performance indicator used in the telecommunications
industry to measure the effectiveness and efficiency of the paging process within a cellular
network. It specifically focuses on the success rate of paging attempts for mobile devices.
Paging refers to the process of locating and alerting a mobile device (such as a cell phone)
when there is an incoming call, message, or other communication directed to that device. The
paging process involves sending paging messages to all cells or specific cells where the
mobile device is expected to be located.

The Paging Failure Ratio (PFR) is calculated by dividing the number of unsuccessful or failed
paging attempts by the total number of paging attempts within a specific time period. It is
typically expressed as a percentage or a decimal value.

Here are key aspects to consider when analyzing the Paging Failure Ratio (PFR) KPI:

1. Calculation: PFR is calculated by dividing the number of failed paging attempts by the
total number of paging attempts and multiplying the result by 100 to obtain a
percentage value.

2. Failure Causes:
Paging failures can occur due to various reasons, such as network congestion, signal
coverage issues, incorrect location information, or mobile device-related factors. It is
important to identify and categorize the causes of paging failures to understand the
underlying issues and implement appropriate solutions.

3. Impact on User Experience:
High paging failure rates can negatively impact the user experience by leading to
missed calls, delayed notifications, or unsuccessful communication attempts.
Monitoring PFR helps identify areas where improvements are needed to ensure
reliable and timely communication with mobile devices.

4. Network Optimization:
Analyzing PFR data enables telecom operators to identify areas of the network with
high paging failures and implement optimization strategies. These may include
adjusting paging parameters, optimizing cell coverage, improving network capacity, or
optimizing location tracking algorithms.

5. Benchmarking:
Comparing PFR against industry benchmarks or internal targets helps assess the
network's performance and identify areas for improvement. It provides a basis for
setting goals, tracking progress, and driving continuous enhancements in the paging
process.

6. Continuous Monitoring:
PFR should be monitored regularly to detect trends, fluctuations, or sudden increases
in paging failures. By proactively identifying issues, operators can take corrective
actions promptly and improve the overall performance of the paging process.
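Building on the same ratio, the sketch below shows how PFR could be computed per cell from an assumed worksheet of paging counters and how cells above a benchmark could be flagged for follow-up, in the spirit of the benchmarking and continuous-monitoring points above. The sheet name, column layout, and 2 percent benchmark are purely illustrative assumptions.

' Illustrative sketch only: per-cell PFR with highlighting above an assumed benchmark.
' Sheet name, columns (B = failed pagings, C = total pagings, D = PFR) and the
' 2% benchmark are hypothetical, not taken from any client report.
Sub FlagHighPagingFailure()
    Dim ws As Worksheet
    Dim lastRow As Long, i As Long
    Dim pfr As Double

    Set ws = ThisWorkbook.Worksheets("PagingStats")        ' assumed sheet name
    lastRow = ws.Cells(ws.Rows.Count, "B").End(xlUp).Row

    For i = 2 To lastRow                                   ' row 1 assumed headers
        If ws.Cells(i, "C").Value > 0 Then
            pfr = ws.Cells(i, "B").Value / ws.Cells(i, "C").Value * 100
            ws.Cells(i, "D").Value = pfr
            If pfr > 2 Then                                ' assumed benchmark
                ws.Cells(i, "D").Interior.Color = vbRed    ' flag for investigation
            End If
        End If
    Next i
End Sub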

Fig9: Consolidated KPIs used

Fig10: KPI values

4.2 Phase 2:

After the initial two months of my internship, during which I focused on manual report
generation, I was assigned the task of automating the report using VBA (Visual Basic for
Applications). The report I was working on was called NNP (Nationwide Site Availability),
which contained data from four different regions: Central, Eastern, Northern, and Sabah. The
report included site IDs, site names, areas, and weekly availability data for each site. The
customer's requirement was to identify sites with availability values lower than 90.30 percent.

While we had a tool to fetch the report, it was not sufficient to present it directly to the client.
As data analysts, we needed to make the report presentable and meaningful. This involved a
significant amount of manual work, including formatting, adding columns, and applying
various formulas such as COUNTIF, VLOOKUP, HLOOKUP, CONCATENATE, and more.

Due to the time-consuming nature of the manual tasks involved, we were tasked with
automating the report using VBA. The objective was to streamline the process and reduce the
time required. Through VBA automation, we implemented the following tasks in the report:

1. Deleting Certain Columns: We excluded certain columns, such as sites not under
contract (fn8 sites), to focus on relevant data.

2. Changing Date Format: We ensured that the date format in the report was consistent
and easy to understand.

3. Applying VLOOKUP Formula: We applied the VLOOKUP formula to retrieve
specific data and populate it in ten designated cells.

4. Adding Certain Columns and Inserting Text: We added additional columns as required
and inserted relevant text into those columns to provide additional information or
context.

5. Colour Formatting: We implemented colour formatting techniques to enhance the
visual appeal and readability of the report.

6. Combining All Four Regions: We consolidated the data from the four regions into a
single Excel sheet called "Nationwide" to present a unified view of the site availability
across the entire nation.

This automation process required rigorous effort and dedication for a month. We meticulously
programmed the VBA macros to execute the desired tasks accurately and efficiently. As a
result, we now generate the Nationwide Site Availability report using VBA macros,
significantly reducing the manual effort and time previously required.

By automating the report generation process, we have streamlined the workflow and
improved efficiency in delivering timely and accurate reports to our clients. The automation
not only saves time but also reduces the chances of errors that can occur during manual
manipulation of the data.
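To give a flavour of what this automation looks like in practice, the simplified sketch below mirrors a few of the steps listed above for a single region: deleting an out-of-scope column, normalising the date format, filling a VLOOKUP, colour-coding sites below the 90.30 percent availability threshold, and appending the result to the consolidated Nationwide sheet. The sheet names, column letters, and lookup range are illustrative assumptions; the production macros (see Fig13) differ in detail.

' Simplified sketch of the kind of steps automated for the NNP report.
' Sheet names, column letters and the lookup range are illustrative assumptions.
Sub NnpReportSketch()
    Dim ws As Worksheet, nat As Worksheet
    Dim lastRow As Long, natRow As Long, i As Long

    Set ws = ThisWorkbook.Worksheets("Central")        ' one of the four regions
    Set nat = ThisWorkbook.Worksheets("Nationwide")    ' consolidated sheet

    ' 1. Delete a column that is out of scope (e.g. a non-contract / fn8 flag)
    ws.Columns("F").Delete

    ' 2. Normalise the date format in the reporting-week column
    ws.Columns("D").NumberFormat = "dd-mm-yyyy"

    lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row
    For i = 2 To lastRow                               ' row 1 assumed headers
        ' 3./4. Look up additional site information from a reference sheet
        ws.Cells(i, "E").Formula = "=VLOOKUP(A" & i & ",SiteMaster!A:B,2,FALSE)"

        ' 5. Colour-code sites whose availability is below 90.30 percent
        If ws.Cells(i, "C").Value < 90.3 Then
            ws.Cells(i, "C").Interior.Color = vbRed
        End If
    Next i

    ' 6. Append the cleaned region below the existing Nationwide data
    natRow = nat.Cells(nat.Rows.Count, "A").End(xlUp).Row + 1
    ws.Range("A2:E" & lastRow).Copy Destination:=nat.Cells(natRow, "A")
End Sub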

Fig11: Raw data from FFtool

Fig12: Data after applying automation

Fig13: VBA code

Fig14: Macros Interface

4.3 Phase 3:

After completing three months at Ericsson, I had acquired a strong understanding of the
terminologies commonly used in the industry. Moreover, I had learned the process of
transforming raw data into meaningful insights and how to create reports efficiently.
Additionally, I gained knowledge and experience in automating reports, which significantly
reduced the time required for their generation.

As I entered the fourth month of my internship, the focus shifted towards understanding data
and its significance in the analytics process. It became clear that comprehending the data is
crucial for accurate analysis and decision-making. Consequently, we were introduced to the
concept of creating data visualizations and dashboards. We learned that presenting data in the
form of interactive dashboards enhances readability and improves data comprehension
compared to static Excel reports.

To facilitate the creation of dynamic dashboards, we received industrial training on Power BI,
a popular data visualization tool. Initially, we worked on building basic workbooks to
familiarize ourselves with the software's features and functionalities. As time progressed, I
was given the opportunity to develop real-time dashboards using actual client data.

This hands-on experience in designing and developing Power BI dashboards allowed me to
apply my knowledge of data analysis and visualization in a practical setting. I learned how to
effectively present data insights, trends, and patterns using interactive visual elements such as
charts, graphs, and filters. Creating real-time dashboards enabled me to work with live data
streams and provide clients with up-to-date information and actionable insights.

Overall, this phase of my internship not only expanded my technical skills in Power BI but
also emphasized the importance of data visualization in conveying information effectively. It
highlighted the significance of presenting data in a visually appealing and user-friendly
manner to facilitate data-driven decision-making for clients and stakeholders.

4.3.1 Power BI

Power BI is a powerful business intelligence tool developed by Microsoft. It provides a range
of features and functionalities to transform data into rich visualizations, interactive reports,
and insightful dashboards. Here are some key features of Power BI:

1. Data Connectivity:
 Power Query: Power Query allows users to connect to various data sources,
such as databases, files, online services, and APIs. It offers data transformation
and cleansing capabilities to prepare data for analysis.

 Direct Query and Live Connection: Power BI can connect directly to data
sources, enabling real-time data analysis without importing or storing the data
in Power BI.

2. Data Modeling and Preparation:
 Power Pivot: Power Pivot is an in-memory data modeling engine that enables
users to create data models and relationships between multiple data tables.

 DAX (Data Analysis Expressions): DAX is a formula language used in Power
BI for creating custom calculations, aggregations, and measures.

 Data Profiling: Power BI provides data profiling capabilities to assess data
quality, identify anomalies, and understand data distributions.

3. Data Visualization:
 Interactive Visualizations: Power BI offers a wide range of interactive
visualizations, including bar charts, line graphs, maps, scatter plots, treemaps,
and more. Users can customize visual properties, apply filters, and drill down
into data for deeper insights.

 Custom Visuals: Power BI supports the integration of custom visuals
developed by the community, allowing users to extend the visualization
options and create unique visual representations.

 Themes and Formatting: Power BI provides options to apply pre-built themes
or customize the appearance of reports and dashboards, including colors, fonts,
backgrounds, and layout.

4. Report Authoring:
 Power BI Desktop: Power BI Desktop is a desktop application that allows
users to create, design, and publish reports and visualizations. It provides
advanced authoring capabilities, including data modeling, complex
calculations, and interactive visuals.

 Power BI Service: Power BI Service is a cloud-based platform for sharing,
collaboration, and publishing reports. It allows users to access and interact with
reports through web browsers or mobile devices.

5. Collaboration and Sharing:
 Power BI Workspace: Power BI Workspace enables collaboration among team
members by providing a shared workspace to create, edit, and share reports
and dashboards.

 Sharing and Collaboration: Power BI allows users to share reports and
dashboards with specific individuals or groups, control access permissions, and
collaborate in real-time by leaving comments and annotations.

6. Data Refresh and Scheduled Refresh:
 Data Refresh: Power BI supports refreshing data from various sources to
ensure that reports and dashboards display up-to-date information.

 Scheduled Refresh: Power BI allows users to schedule data refreshes at regular
intervals to keep reports and dashboards updated automatically.

7. Natural Language Query and Q&A:
 Q&A (Question and Answer): Power BI provides a natural language query
feature that allows users to ask questions about the data using everyday
language and receive visualizations and answers in real-time.

8. Mobile Reporting:
 Power BI Mobile: Power BI Mobile applications enable users to access, view,
and interact with reports and dashboards on mobile devices, providing a
seamless experience across platforms.

9. Data Security and Governance:
 Row-level Security: Power BI allows users to implement row-level security,
controlling data access based on user roles or permissions.

 Data Encryption: Power BI ensures data security by encrypting data both at
rest and in transit.

 Data Governance: Power BI offers features for data governance, including
content lifecycle management, version control, and audit trails.

10. AI Integration:
 AI-powered Insights: Power BI integrates AI capabilities to automatically detect
patterns, anomalies, and trends in data, providing valuable insights and
recommendations.

Power BI's features empower users to analyze data, create compelling visualizations,
collaborate with others, and gain actionable insights. It serves as a comprehensive tool for
data visualization, reporting, and business intelligence, catering to a wide range of data
analysis requirements in organizations of all sizes and industries.

Fig15: Raw data for making dashboard

Fig16: Dashboard on Power BI

Conclusion
Working as an intern in data analysis at Ericsson has been an invaluable experience. Ericsson,
a global leader in telecommunications, has provided me with a supportive and enriching
environment to learn and grow in the field of data analysis. Here is a summary of my
internship experience:

As an intern, I had the opportunity to work with real-world data from Ericsson's vast data
repositories. Analyzing actual data sets related to network performance, customer behavior, or
sales has given me hands-on experience in dealing with the complexities and challenges of
working with large-scale data.

During my internship, I collaborated with experienced data analysts and professionals from
diverse backgrounds. Their guidance and mentorship have been instrumental in expanding my
analytical skills and understanding of the telecommunications industry. Through teamwork
and knowledge sharing, I gained valuable insights into the best practices and techniques used
in data analysis.

At Ericsson, I had access to advanced analytical tools and software used for
data analysis. I was able to apply statistical methods, data visualization techniques, and
programming languages like Python to gain insights and present findings effectively.
Utilizing these tools has enhanced my technical skills and prepared me for future data analysis
endeavors.

Working on various projects as an intern, I was tasked with identifying and solving complex
data-related challenges. This experience honed my problem-solving and critical thinking
abilities, as I had to analyze data, recognize patterns, and propose data-driven solutions to
address specific business problems or improve operational efficiency.

Being an intern at Ericsson allowed me to gain valuable exposure to the telecommunications
industry. I learned about the unique challenges and opportunities in the field, and how data
analysis plays a crucial role in optimizing network performance, enhancing customer
experiences, and driving business growth in this dynamic sector.

Ericsson prioritizes the professional development of its interns. Throughout my internship, I
had access to various training programs, workshops, and learning resources that helped me
enhance my data analysis skills. The exposure to industry-leading practices and the
opportunity to apply them in real-world scenarios has been invaluable for my personal and
professional growth.

Overall, my internship at Ericsson in data analysis has been an exceptional learning
experience. The exposure to real-world data, collaboration with professionals, application of
analytical tools, and the opportunity to contribute to meaningful projects has laid a strong
foundation for my career in data analysis. I am grateful for the knowledge gained, the skills
developed, and the valuable experiences I had during my time at Ericsson.
