
Zomato Data Analysis

A Python with Data Analytics project report


At
DUCAT (Noida, Sector 16)
SUBMITTED IN PARTIAL FULFILLMENT OF THE REQUIREMENT FOR THE AWARD
OF THE DEGREE OF
BACHELOR OF TECHNOLOGY
IN
COMPUTER SCIENCE AND ENGINEERING

SUBMITTED BY
Name of Student: Sangh Priy Bhasker
University Roll No.: 22-B-CSE-072

SCHOOL OF ENGINEERING

RIMT UNIVERSITY, MANDI GOBINDGARH PUNJAB


JUNE-2024 to JULY-2024

TITLE AND CONTENTS
TITLE
Abstract
Acknowledgement
Candidate’s Declaration
Certificate
About company
Chapter 1 INTRODUCTION
● What is Python and data analytics?
● Uses of data analytics
● Why we use Python for data analytics

Chapter2 INTERNSHIP INTRODUCTION


● Introduction to Internship
● Key aspects
● Benefits of doing an internship
● Types of internships
● Purpose
● Technology and literature review

Chapter 3 PROJECT WORK

● Introduction
● Objective
● Overview
● Which tools were used for the project
● Role of Python libraries
● Limitations

Chapter 4 SOURCE CODE OF THE PROJECT


● Methodology
● Implementation Details
● Code structure

● Testing and validation
● Results
● Discussion

Chapter 5 OUTCOME ANALYSIS

Chapter 6 CONCLUSION
● References

ABSTRACT
In this project, we conduct a comprehensive data analysis of Zomato's online food delivery service to understand customer preferences. Our primary goals are to determine the types of restaurants that customers prefer, identify the price range they are willing to pay for their orders, and analyze traffic on the Zomato platform. Using Google Colab for our analysis, we can easily prepare the dataset, perform exploratory data analysis (EDA), and visualize key trends and patterns. By examining customer behaviour, we aim to derive meaningful insights that can inform Zomato and its restaurant partners about customer preferences and market trends. The analysis involves statistical summaries, visualizations, and interpretations of the data, improving our understanding of the online food delivery market. Our findings are documented to provide actionable recommendations and potential areas for further study.

ACKNOWLEDGEMENTS

First of all, I would like to express my sincere gratitude to the coordinators at DUCAT for their
unwavering support, encouragement and guidance throughout the preparation of this project.
Their guidance and facilities contributed to the successful completion of this project. Without
their input, this project would not have reached its full potential.

I especially thank the DUCAT faculty for their detailed teaching and patiently answering my
many questions about Python. Their in-depth understanding and insightful explanations were
invaluable in helping me overcome the challenges I faced in pursuing this project. Their
approach not only enhanced my understanding of the language but also sparked my interest in
solving problems through programming.

I also acknowledge the valuable contributions of my classmates, whose cooperative spirit and
constant encouragement inspired me. Their comments and suggestions during the development
phase were helpful in shaping this work, especially in preparing the graphics and ensuring that
the code worked well and was easy to use. This work is indeed a team effort, and I appreciate the
teamwork.

I am immensely grateful to my teacher and old friend, whose early encouragement to pursue
engineering had a lasting impact on my career path and whose belief in my abilities has been a
driving force behind my many improvements. Looking back, I'm proud of the decision I made, and
looking forward, I'm eager to go deeper into data science. One day I want to take on a
comprehensive data science challenge and showcase all that I have learned on this journey.

Finally, I would like to thank my family for their constant support and understanding.

Signature of Student:
Sangh Priy Bhasker (22-B-CSE-072)
Rimt University (Mandi Gobindgarh)

DECLARATION
This is to certify that the work in this project titled "Zomato Data Analysis", carried out under "Python with
Data Analytics" and submitted to the Department of Computer Science & Engineering, RIMT
UNIVERSITY, in order to fulfil the requirement of the Degree of Industrial Training in "Python
with Data Analytics", is my honest account of work done and written under the direct
supervision and guidance of Mr. JASDEEP SINGH (Department of Computer Science &
Engineering) and Mrs. Jasmine (HOD, Department of Computer Science & Engineering).

Signature of Student:
Sangh Priy Bhasker (22-B-CSE-072)
(Department of Computer Science & Engineering)
Rimt University (Mandi Gobindgarh)

Signature of Internal Examiner                              Signature of External Examiner

CERTIFICATE

ABOUT COMPANY

DUCAT is one of India's best IT training schools, headquartered in Noida (Sector 16), with strong
teachers in every IT field. DUCAT is a registered IT training institute. We train students from both
the local region and across India in a variety of nationally recognised professional IT courses.
With our streamlined and adaptable course delivery model, we ensure that you thoroughly grasp
the material and develop useful skills in your selected course.
When you enroll with us, you will be joining hundreds of other job seekers and IT professionals
who gained employment or promotion after completing training with us.
We offer a customized approach to training to elevate and build your IT skills, which makes you
stand out from the crowd.

VISION OF DUCAT:
❖ We are striving to establish ourselves as an education provider that focuses on giving
learners job-oriented skills.
❖ We understand what a monthly paycheck means to you and your family.

Mission of DUCAT:
Back in 2000, with the aim of training and educating youngsters, we started our journey. When we
started DUCAT The IT Training School, our passion was, and is, to train youngsters in
job-oriented subdomains of the IT industry and help them secure a career by offering the best IT
training. We are here to guide you to the pinnacle of your career. We are aware of the fact
that getting a job is the prime motive of students after course completion, so at DUCAT we
provide 100% job assistance. Our institute has 6 branches across Delhi NCR, with more than 180
courses and skill sets, and is helping youngsters get trained in job-oriented courses
and improve their careers.

CHAPTER 1
INTRODUCTION
1.1 What is data analytics with Python?
Data analysis is the technique of collecting, transforming, and organizing data to make future
predictions and informed, data-driven decisions. It also helps to find possible solutions for a
business problem. Data analysis is commonly described as a six-step process: defining the
question, collecting the data, cleaning the data, analyzing it, visualizing the results, and acting on
the insights.

1.2 Uses of data analytics


Data analytics helps us promote our businesses and contribute towards the development of our
country by focusing on major areas such as the human happiness index, hunger index, crime
index, press freedom index, and war information. The data in these fields is essential for
advancement. By conducting statistical analysis, we can drive improvements and informed
decision-making that actually matters in those areas.

1.3 Why we use Python for data analytics


Python is commonly used in data analysis due to its simplicity, flexibility, and the robust
ecosystem of libraries and tools that support data manipulation, analysis, and
visualization. Here are the main reasons why Python is preferred for data analytics:
1. Easy to Learn and Use
Readable and Simple Syntax: Python's syntax is easy to read and write, making it
accessible for beginners and efficient for experienced programmers. This straightforwardness
allows analysts to focus on solving data problems rather than dealing with syntax.
Comprehensive Documentation and Community Support: Python has an active
community that provides documentation, tutorials, and forums for problem solving and
learning.
2. Libraries and Tools
Data Manipulation:
Pandas: A library for data manipulation and analysis that offers data structures like
DataFrames, which are highly flexible and effective for handling large datasets.
Numerical Computing:
NumPy: Supports multi-dimensional arrays and matrices, along with a set of mathematical
functions to work on these arrays.
Data Visualization:
Matplotlib: A plotting library used for creating static and interactive visualizations.
Seaborn: Built on top of Matplotlib, it offers a high-level interface for generating attractive
statistical graphics.

Plotly: A library that allows you to easily generate intricate, interactive visual representations,
like graphs, charts, and dashboards.
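As a small illustration of how this ecosystem fits together (not taken from the project itself; the
tiny dataset below is invented for the example), Pandas and Matplotlib can be combined like this:

```python
# A tiny illustration of Pandas + Matplotlib working together.
# The data below is made up purely for demonstration.
import pandas as pd
import matplotlib.pyplot as plt

orders = pd.DataFrame({
    "restaurant_type": ["Dining", "Cafe", "Dining", "Buffet", "Cafe"],
    "order_value": [300, 150, 250, 400, 200],
})

# Average order value per restaurant type (Pandas groupby).
avg_value = orders.groupby("restaurant_type")["order_value"].mean()

# Simple bar chart of the result (Matplotlib).
avg_value.plot(kind="bar", title="Average order value by restaurant type")
plt.xlabel("Restaurant type")
plt.ylabel("Average order value (Rs)")
plt.tight_layout()
plt.show()
```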
1.4 How to become a data analyst
Technical Background
Computer Science (CSE) Fundamentals:

Programming: Master the programming languages commonly used in data analytics, such

as Python and R. These languages are critical for data manipulation, analysis, and
building models.
Data Structures and Algorithms: Understanding data structures (like arrays, lists,
dictionaries) and algorithms enables efficient data processing and manipulation.
Database Management: Knowledge of SQL for querying databases is crucial. Familiarize
yourself with database management systems (DBMS) like MySQL and PostgreSQL, and NoSQL
databases like MongoDB.
Data Science and Analytics:

Data Manipulation: Use libraries like Pandas in Python for data manipulation and
cleaning.
Data Visualization: Learn visualization tools and libraries such as Matplotlib,
Seaborn, and Plotly, and tools like Tableau and Power BI to create insightful visualizations.
Machine Learning: Basic knowledge of machine learning algorithms and their
applications, using libraries like Scikit-learn.
Mathematics and Statistics:
Algebra: Linear algebra is essential for understanding many data analysis methods and
machine learning algorithms.
Calculus: Essential for understanding the optimization algorithms used in machine
learning.
Probability and Statistics: These form the backbone of data analytics. Concepts like
probability distributions, hypothesis testing, regression, and more are crucial.
Tools and Software:
Excel: A strong command of Excel for simple data manipulation, analysis, and visualization.
Power BI: Proficiency in Power BI for advanced data visualization and business
intelligence.
Building Logic and Problem-Solving Skills
Programming Logic: Develop logical thinking to solve problems programmatically.
Practice coding regularly on platforms like LeetCode.

CHAPTER 2
LITERATURE SURVEY
2.1 INTRODUCTION:
Python with data analytics indeed has wide-ranging applications across many domains, from
improving business strategies and economic development to enhancing national security and
shaping policies. You'll likely gain valuable insights into how data-driven approaches can drive
decision-making and innovation across these diverse areas.
2.2 KEY ASPECTS OF AN INTERNSHIP:
1) Comprehensive Learning:
● Beginner to Advanced Levels: The course is structured to cater to college students
at different levels, ensuring a comprehensive understanding of Python and data
analytics.
● Non-Technical Friendly: Concepts are explained in a simple and straightforward
manner, making it easy for non-technical users to understand.
2) Hands-On Experience:
● Practical Problem-Solving: Emphasis on solving real-world problems helps
you gain hands-on experience.
● Project-Based Learning: Working on projects allows you to apply theoretical
knowledge in practical scenarios, improving your learning
experience.
3) Industry Relevance:
● Job Preparation: The practical knowledge and skills you gain are
immediately relevant to industry jobs, making you a strong candidate in the
job market.
● Business Opportunities: Understanding how to leverage data analytics can
also help you start and run your own business successfully.
4) Duration and Structure:
Internships can vary in length, ranging from a few weeks to several months,
depending on the company and the nature of the work. They can be full-time or
part-time and often follow a structured programme or project-based assignments.
5) Skill Development:
The summer internship programme focuses on equipping you with current
and future market-relevant skills in Python and data analytics. Designed for all
levels, it simplifies complex concepts and emphasizes practical learning. Over six
weeks, you'll master data manipulation with Pandas, numerical computing with
NumPy, and data visualization with Matplotlib and Seaborn. This hands-on
approach ensures you gain real-world experience, making you competitive in the job
market and ready for entrepreneurial ventures.

6) Supervision and Mentorship:


Interns usually work under the supervision of experienced professionals who
provide guidance, advice and feedback. These mentoring relationships are essential for
trainees to develop their skills, gain practical experience and effectively tackle workplace
challenges. They ensure a supportive environment in which interns can learn and grow,
preparing them for future career opportunities.
7) Networking Opportunities:
● Throughout your internship you'll get to participate in practice interviews and
engaging sessions. Engaging with professionals and coworkers who have
hands-on business expertise can offer you perspectives and real world know-how
that will help shape your future career.

2.3 Benefits of Doing an Internship:


Internships provide a number of benefits that can be vital for both personal and
professional development. Here are a few key advantages:
● Practical Experience: Internships provide real work experience and a market-relevant
skill set, and help you build projects.
● Skill Development: An internship helps you build a strong skill set through practical
experience.
● Networking: Internships provide an opportunity to meet and network with
professionals in your field, which can be invaluable for future job opportunities
and career advice.
● Mentorship: Internships often provide access to mentors who can offer
guidance, support, and advice, helping you navigate your career
path.
● Confidence Building: Internships help build confidence, because during an
internship you complete many projects and gain skills along the way.
● Enhanced Resume: An internship makes your resume stronger and more attractive to
recruiters.
● Understanding Workplace Culture: Learning about workplace culture will
help you build a good career in the future.
● Academic Credit: Some internships offer academic credit, contributing
to your academic requirements.

2.4 Types of Internships:
● Paid: Some internships are paid, and the amount varies from institute to institute.
Paid internships are especially helpful for those with no prior experience in tech or other fields.
● Unpaid: After graduation many students apply to companies for jobs and are taken on
as freshers; for an unpaid internship you need a strong skill set, and
sometimes the company will eventually pay you.
● Digital internship: A work-experience programme in which the intern gains practical experience
by working remotely instead of on-site at the company's location. This
kind of internship leverages digital communication tools and technology to facilitate
tasks, supervision, and collaboration.
2.5 PURPOSE:
Internships offer unique learning opportunities outside of an academic setting. It exposes you to
new tasks and helps you learn goal-specific skills to accomplish those tasks. Internships also give
you experience with technology, people, and services that can be relevant to your career goals.
Basically, a Python with data analytics internship can give you useful skills in data manipulation,
analysis, and visualization. You'll gain hands-on experience with Python libraries like Pandas,
NumPy, and Matplotlib, and learn how to use data analysis to solve real-world problems,
preparing you for a career in data science and business analytics.
2.6 TECHNOLOGY AND LITERATURE REVIEW:
I will be doing an internship in data analytics with Python, which is important in various industries.
Data analysis helps in business growth, national growth, GDP growth, war strategies, the stock
market, real estate, the happiness index, the press freedom index, the hunger index, and more.
Understanding and analyzing data enables us to solve problems, find opportunities, and develop
critical data-analysis skills to solve complex problems and drive improvement across multiple areas.
● Role of Python: Python is crucial for data analysis, providing tools like Pandas for
efficient data cleaning and manipulation. Libraries like NumPy simplify
complex calculations, while Seaborn and Matplotlib make data visualization
easy. Python's versatility and powerful libraries make it indispensable for
managing, analyzing, and visualizing data in a straightforward way.

Fig 2.1 Logos of Python libraries for data analytics

● Role of MS Excel and Power BI:
MS Excel: Extensively used for data entry, analysis and basic visualization. It provides
functionality for calculating, sorting, filtering, and transforming data, making it versatile
for small and large datasets.

Power BI: Designed for advanced data analysis and visualization. It integrates with
multiple data sources, creates interactive dashboards, and provides advanced analytics
capabilities such as predictive analytics and machine learning integration.

Fig 2.2 Logos of Power BI and MS Excel


● Role of math in data analytics:
Mathematics forms the foundation of data analytics, providing the theoretical
framework and tools needed to interpret data and derive insights from it. Key
roles include statistical analysis (probability theory and statistical methods are
used to analyze data distributions, correlations, and trends), linear algebra
(matrix operations underpin techniques like dimensionality reduction and
machine learning algorithms), and calculus (derivatives and integrals enable
the optimization algorithms used in data modelling and predictive analytics).
Mathematics equips analysts with the critical thinking and quantitative
skills needed to extract meaningful conclusions and make data-driven
decisions across various domains.
● Role of SQL:
SQL (Structured Query Language) is essential for data management, allowing
efficient retrieval, manipulation, and analysis of structured data. It supports
tasks like querying databases, performing aggregations, and maintaining data
integrity, all of which are critical for effective data-driven decision-making. A short
example is sketched after the figure below.

Fig 2.3 Logo of MySQL
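Here is the short example referred to above: a minimal sketch of SQL-style querying from
Python, using the built-in sqlite3 module and an invented orders table rather than an actual
MySQL database.

```python
# A minimal sketch of SQL-style querying from Python.
# Uses the built-in sqlite3 module and an invented in-memory table,
# not the MySQL setup shown in the figure above.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE orders (restaurant TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("Cafe A", 150.0), ("Dining B", 300.0), ("Cafe A", 200.0)],
)

# Aggregate total order value per restaurant, highest first.
cur.execute(
    "SELECT restaurant, SUM(amount) AS total "
    "FROM orders GROUP BY restaurant ORDER BY total DESC"
)
for restaurant, total in cur.fetchall():
    print(restaurant, total)

conn.close()
```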


2.7 INTERNSHIP PLANNING:
This internship programme was held for 6 weeks and gave me knowledge of Python
programming along with the Python libraries NumPy, Matplotlib, and Pandas, as well as
Power BI and MS Excel. The internship covered topics including:
● Core Python Concepts
● Collections
● Functions
● Modules
● Input-Output
● Exception Handling
● File Handling
● NumPy
● Pandas
● Power BI

2.8 CONCLUSION
Internships provide individuals with valuable opportunities to gain practical experience, refine
skills, and explore career interests in a real-world setting. They are a key link between academic
learning and professional pathways, providing participants with the experience and insights
necessary for future professional success and personal growth.

CHAPTER 3
PROJECT WORK
3.1 INTRODUCTION
In this project, we conduct a comprehensive data analysis of Zomato's online food delivery
service to understand customer preferences. Our primary goals are to determine the types of
restaurants that customers prefer, identify the price range they are willing to pay for their
orders, and analyze traffic on the Zomato platform. Using Google Colab for our analysis, we can
easily prepare the dataset, perform exploratory data analysis (EDA), and visualize key trends
and patterns. By examining customer behaviour, we aim to derive meaningful insights that can
inform Zomato and its restaurant partners about customer preferences and market trends. The
analysis involves statistical summaries, visualizations, and interpretations of the data, improving
our understanding of the online food delivery market. Our findings are documented to provide
actionable recommendations and potential areas for further study.

Fig 3.1 Logo of Zomato

3.2 OBJECTIVE:
This project aims to analyze Zomato's data to understand customer preferences for different
food types, hourly app traffic, and customer satisfaction with online food delivery. We will also
investigate how Zomato affects residential growth, identify preferred price ranges for food, and
measure the company's profit and loss. Using Google Colab, we will clean the dataset, perform
exploratory data analysis, and visualize key trends. Our findings will provide actionable insights
for Zomato and its partners, informing future strategies and identifying areas for improvement.

3.3 Overview:

Fig 3.2 This graph shows the price ranges at which people buy food
To create a food business analysis, follow these steps:
● Profit and Loss (P&L) Statement: Calculate net income by deducting costs from revenue
over a specific period.
● Price Point Analysis: Determine an optimal price based on market demand, competition,
and costs.
● Graphical Visualization: Use Pandas and Plotly to create interactive financial charts (a brief sketch follows).
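As a rough illustration of these steps, the sketch below computes a simple net-income column
with Pandas and charts it with Plotly Express; all figures and column names are invented for the
example and are not the project's actual data.

```python
# A rough sketch of a monthly profit-and-loss summary with an
# interactive Plotly chart. All numbers are invented for illustration.
import pandas as pd
import plotly.express as px

pnl = pd.DataFrame({
    "month": ["Jan", "Feb", "Mar"],
    "revenue": [120000, 135000, 150000],
    "costs": [90000, 100000, 110000],
})

# Net income = revenue - costs for each month.
pnl["net_income"] = pnl["revenue"] - pnl["costs"]
print(pnl)

# Interactive bar chart of net income per month.
fig = px.bar(pnl, x="month", y="net_income", title="Net income by month")
fig.show()
```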

3.4 WHICH TOOLS WERE USED TO CREATE THE PROJECT:


● Google colab:
Google Colab is a cloud-based, versatile platform for Python programming, widely used
for machine learning and data analysis. It eliminates the need for local environment
configuration, offering an integrated environment that can be accessed from any device
through a browser. Colab comes equipped with essential libraries such as Pandas,
NumPy, Matplotlib, and TensorFlow, which facilitate efficient data manipulation and
model training. Its interactive features enable real-time notebook sharing and collaboration
between team members. Additionally, Colab offers free GPU and TPU access,
which increases compute capacity for demanding processing workloads. It is an ideal
tool for building and deploying Python-based solutions in data-driven projects and
machine learning applications.

Fig 3.3 Google Colab logo
● Python:
Python is crucial for data analysis, providing tools like Pandas for efficient data
cleaning and manipulation. Libraries like NumPy simplify complex calculations,
while Seaborn and Matplotlib make data visualization easy. Python's versatility
and powerful libraries make it indispensable for managing, analyzing, and visualizing data
in a straightforward way.

Fig 3.4 Python logo


3.5 PYTHON LIBRARY ROLE :
NUMPY:
NumPy stands for Numerical Python. It provides support for large, multi-dimensional arrays
and matrices, along with a collection of high-level mathematical functions to operate on these
arrays.

Fig 3.5 NumPy logo

PANDAS:
Pandas is a Python library used to manipulate datasets. It provides a number of functions for
analyzing, storing, searching, and processing data. The name "Pandas" refers to both "Panel Data"
and "Python Data Analysis"; the library was developed by Wes McKinney in 2008.

Fig 3.6 Pandas logo


SEABORN:
Seaborn is a Python library based on Matplotlib. It provides a high-level interface for drawing
attractive and informative statistical graphics.

Fig 3.7 Seaborn logo


MATPLOTLIB:
Matplotlib is a Python library that helps us create many different types of plots and charts for
data visualization.

Fig 3.8 Matplotlib logo
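To show how the four libraries described above are typically used together, here is a brief,
self-contained sketch; the ratings below are randomly generated for illustration and are not the
project's actual data.

```python
# A small combined example of NumPy, Pandas, Seaborn, and Matplotlib.
# The ratings data here is synthetic, generated only for illustration.
import numpy as np
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

# NumPy: generate some synthetic restaurant ratings between 1.0 and 5.0.
rng = np.random.default_rng(42)
ratings = np.round(rng.uniform(1.0, 5.0, size=200), 1)

# Pandas: wrap the ratings in a DataFrame for easy handling.
df = pd.DataFrame({"rating": ratings})
print(df["rating"].describe())

# Seaborn + Matplotlib: plot the distribution of ratings.
sns.histplot(df["rating"], bins=10)
plt.title("Distribution of (synthetic) restaurant ratings")
plt.xlabel("Rating")
plt.ylabel("Count")
plt.show()
```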

3.6 LIMITATIONS:
● Data Quality: The accuracy of insights depends on the quality and completeness of the
dataset. Incomplete or inaccurate data can skew the results of the analysis.
● Scalability: While the approach works well on smaller datasets, a larger dataset is needed
for richer, more reliable visualizations.
● Scope: Recent trends are not captured in this data, which makes the analysis harder to
generalize.
● Visualization: Without a large dataset, the visualizations are limited in what they can show.

CHAPTER 4
SOURCE CODE OF THE PROJECT
Zomato is an Indian multinational restaurant aggregator and food delivery company. It was
founded by Deepinder Goyal and Pankaj Chaddah in 2008. Zomato provides restaurant
information, menus, and user reviews, and offers food delivery options from partner restaurants.
As of 2022-23 it serves more than 1,000 Indian cities and towns and competes with Swiggy in the
food delivery and hyperlocal space.

4.1 ZOMATO HISTORY:


Zomato was founded in 2008 as FoodieBay by Deepinder Goyal and Pankaj Chaddah, who
worked at Bain & Company. The website began as a portal for restaurant listings and
recommendations. The company's name was changed to Zomato in 2010, both because the
founders were unsure whether it would "stick to just food" and to avoid a potential naming
conflict with eBay.

With the introduction of .xxx domains in 2011, Zomato also launched zomato.xxx, a website
dedicated to food porn. Later in 2011, Zomato launched online ticketing for events.
In 2011 it expanded across India to Delhi-NCR, Mumbai, Bangalore, Chennai, Pune, Ahmedabad,
Hyderabad, and other cities. In 2012 it expanded internationally to countries including the United
Arab Emirates, Sri Lanka, Qatar, the United Kingdom, the Philippines, and South Africa, and in
2013 to New Zealand, Turkey, Brazil, and Indonesia, with websites and apps in Turkish,
Portuguese, Indonesian, and English. In April 2014 it launched in Chile and Portugal, followed by
Canada, Lebanon, and Ireland in 2015.

4.2 IMPLEMENTATION DETAILS:


To carry out our analysis, we need to answer the following questions:
● Do a greater number of restaurants provide online delivery as opposed to offline services?
● Which types of restaurants are the most favored by the general public?
● What price range is preferred by couples for their dinner at restaurants?
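Here is a hedged sketch of how these three questions might be explored with Seaborn count
plots; the file name and column names used below (Zomato data.csv, online_order,
listed_in(type), approx_cost(for two people)) are assumptions about the dataset layout, not
names confirmed by the project files.

```python
# A hedged sketch of how the three questions above might be explored.
# The file name and column names are assumptions about the dataset layout,
# not guaranteed to match the actual file.
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

df = pd.read_csv("Zomato data.csv")

# Q1: online delivery vs offline-only restaurants.
sns.countplot(x="online_order", data=df)
plt.title("Restaurants offering online orders vs offline only")
plt.show()

# Q2: which types of restaurants are most favoured.
sns.countplot(x="listed_in(type)", data=df)
plt.title("Count of restaurants by type")
plt.show()

# Q3: preferred price range for couples (approximate cost for two).
sns.countplot(x="approx_cost(for two people)", data=df)
plt.title("Approximate cost for two people")
plt.xticks(rotation=45)
plt.show()
```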
4.3 CODE STRUCTURE:
The script is described below in detail.

First, we import the important libraries: numpy as np, pandas as pd, and matplotlib.pyplot as plt.
These libraries provide essential functions for numerical operations, data manipulation, and
plotting, respectively. Next, we read the orders CSV file using pd.read_csv(), ensuring that the
file path is updated if necessary. We then convert the transaction-time column to datetime format
using pd.to_datetime(), creating a new 'Time' column in the DataFrame. From this 'Time' column,
we extract the hour of every transaction and store it in a new 'Hour' column.

To determine the most frequent hours of the day for transactions, we use the value_counts()
method on the 'Hour' column, followed by sort_index() to sort the hours in ascending order.
This gives us a Series in which the index represents the hours of the day and the values
represent the number of transactions during each hour. We then create a new
DataFrame called timemost_df, which contains two columns, 'Hour' and 'Count', derived
from the sorted index and values of the previous Series.

For output, we print a header ("Hour of the day" and "Number of orders") and then iterate
through the rows of timemost_df, printing each hour and its corresponding number of
transactions in a formatted string. Finally, we plot this information using plt.bar(), specifying
the figure size and setting suitable labels and titles for the x-axis, y-axis, and the plot
itself. The x-axis ticks are set to the hours of the day for better clarity, and we display the plot
using plt.show().

In short, the script reads the transaction data, processes it to extract hourly records, and visualizes
the number of orders placed at different hours of the day, providing valuable insight into peak
ordering hours.

CODE OF THE PROJECT: -
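A minimal sketch of the hourly-order script described in the code-structure section above; the
file name order_details.csv and the timestamp column name order_time are assumptions, not the
project's actual names.

```python
# A minimal sketch of the hourly-order analysis described above.
# The file name "order_details.csv" and the column name "order_time"
# are assumptions; adjust them to match the actual dataset.
import numpy as np  # imported per the description above; not used directly here
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("order_details.csv")

# Convert the timestamp column to datetime and extract the hour.
df["Time"] = pd.to_datetime(df["order_time"])
df["Hour"] = df["Time"].dt.hour

# Count transactions per hour, sorted by hour of the day.
hourly_counts = df["Hour"].value_counts().sort_index()
timemost_df = pd.DataFrame({
    "Hour": hourly_counts.index,
    "Count": hourly_counts.values,
})

print("Hour of the day   Number of orders")
for _, row in timemost_df.iterrows():
    print(f"{row['Hour']:>15}   {row['Count']}")

# Bar chart of orders per hour.
plt.figure(figsize=(10, 5))
plt.bar(timemost_df["Hour"], timemost_df["Count"])
plt.xlabel("Hour of the day")
plt.ylabel("Number of orders")
plt.title("Orders placed per hour")
plt.xticks(timemost_df["Hour"])
plt.show()
```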

OUTPUT:

CHAPTER 5
OUTCOME ANALYSIS
In this data analysis project, we aim to investigate several key questions using Python modules in
Google Colab. First, we examine whether a greater proportion of restaurants offer online delivery
compared to those offering offline services only. Then, we analyze which restaurants are most
popular with the general public, examining factors such as which types of cuisines or types of
food attract the most patrons. Furthermore, we examine price preferences for couples dining out
in more detail, focusing on how pricing affects dining choices and trends. Through this study,
using the power of Python in data manipulation and visualization, we aim to identify insights that
reveal food preferences and service improvements in restaurants.

5.1 OUR RESULTS:


● Which types of restaurants are the most favored by the general public?
In this figure we see that most people buy food from dining restaurants: around 74.3% prefer
dining, 15.5% buy from cafes, 4.7% from buffets, and 5.4% from other types.

Fig 5.1 The majority of the restaurants fall into the dining category.
● Do a greater number of restaurants provide online delivery as opposed to offline services?

Fig 5.2 In this figure we see how many restaurants provide online delivery as opposed to offline services

In Fig 5.2 we see that most restaurants do not accept online orders and prefer offline service; the
graph shows that many restaurants are not yet interested in moving online.

● Ratings given by most customers

Fig 5.3 Most ratings given by users fall between 3.5 and 4.5

The majority of restaurants received ratings ranging from 3.5 to 4.5, and some received ratings
between 1.5 and 3.0.

● In which price range do people order the most food?

Fig 5.4 Price ranges at which most people order food

The distribution of food orders across price ranges shows clear patterns. According to the data,
the most popular order prices for food items are Rs 300, Rs 150, and Rs 200, each of which attracts
a significant number of orders, indicating a fairly balanced consumer preference. Rs 300 appears
to be the most preferred price, followed by Rs 150 and then Rs 200. Prices above Rs 400 see a
slight but notable decline in the number of orders, although some consumers are still willing to
spend more on their food choices. In contrast, Rs 950 is the least preferred price, with relatively
few orders recorded compared to other prices. This segmentation reflects consumer behaviour
and food-ordering preferences based on perceived value and price.

● What do people prefer most: online or offline?


The data indicates a clear difference in ordering behaviour between dining restaurants and
cafes. Dining restaurants predominantly receive offline orders, indicating that a large portion of
their customers prefers to place orders in person or through traditional channels. On the other
hand, cafes show a strong inclination towards online orders, implying that a larger section of
cafe customers prefers the convenience and efficiency of ordering online. This contrast
underscores how customer preferences vary by type of establishment: dining restaurants cater
more to in-person interactions, while cafes align with the growing trend of digital convenience
in food ordering.

Fig 5.5 What people prefer most: online vs offline

CHAPTER 6
CONCLUSION
From analyzing our Zomato data, we learned a lot about what people want when it comes to
restaurants.
Most favoured restaurants:
Majority: About 74.3% of people prefer dining restaurants. These places are known for their
sit-down meals and variety of dishes. Cafes are next, with 15.5% of choices, possibly due to their
fast food and coffee options. The rest are buffets and other types, meaning there is a niche market
for those dining experiences.
How people order:
Dining restaurants still mostly accept in-person or telephone (non-online) orders, which means
their customers prefer conventional ordering methods. Cafes, however, are big on online
ordering: about 84.5% offer this service. This difference suggests that cafes are better suited for
people who want to order food online for convenience.
Preferred Price:
In terms of how much money people spend on food, Rs 300 is the most popular price point,
followed by Rs 150 and Rs 200. These prices attract the most orders, which indicates demand for
affordable food, while order volumes drop off above roughly Rs 400. Rs 950 is the least popular
price, which means people are more careful about spending at the expensive end.
What this means for restaurants:
For restaurants, focusing on a great personalized dining experience is key, but they can also
benefit from offering more online ordering options. Cafes should continue to strengthen their
online ordering systems to meet the growing demand for convenience.
Strategic Insights:
Restaurant owners can use these insights to improve their menus, improve customer service, and
better manage ordering preferences. Getting these preferences right can help restaurants stay
competitive and keep customers happy.
Overall:
The analysis shows that dining restaurants dominate customer preference, cafes lead in online
ordering, and mid-range prices of roughly Rs 150 to Rs 300 attract the most orders. These findings
can guide Zomato and its restaurant partners in refining menus, pricing, and delivery strategies.

References: -
● https://www.python.org/community/logos/
● https://www.geeksforgeeks.org/zomato-data-analysis-using-python/
● https://youtu.be/UiEBA8WHTXA?si=8NxkkJiHmw88jHkB
● https://en.wikipedia.org/wiki/Zomato
● https://www.zomato.com/
● https://openai.com/
● https://www.ducatindia.com/
