
AI PROJECT CYCLE

Introduction
Project: A project can be defined as a set of inputs and outputs required to achieve a predetermined or specific goal.
Project Cycle: The Project Cycle, also known as the Project Life Cycle, is the series of phases or stages that a project goes through from its initiation to its completion and closure. It provides a structured framework for planning, executing, monitoring, and delivering a project.
In general, the four phases of a project life cycle are initiation, planning, execution, and closure.

What is AI Project Cycle?


The AI (Artificial Intelligence) Project Cycle is a series of stages that an AI project goes through from its start to its completion. This cycle involves the application of AI technologies, algorithms, and methodologies to solve specific problems or achieve predefined objectives.
In simple terms, the AI Project Cycle refers to the sequence of phases that a project passes through from its initiation to its closure. It is the set of techniques that supports the development of high-quality AI projects.
Every AI Project Cycle comprises five main stages: Problem Scoping, Data Acquisition, Data Exploration, Modelling and Evaluation.

• Problem Scoping – It is the first step of an AI project cycle. In this step, you have to write about the nature and boundaries of a problem. In simple words, it is the process by which a problem is defined.
• Data Acquisition – Data Acquisition or Data Gathering refers to the process of identifying and gathering all the data required for an AI project.
• Data Exploration – It refers to the process of understanding the nature of the data you have to work with, in terms of quality, characteristics, etc. This step gives you a better understanding of the data, resulting in an effective end product.
• Modelling – An appropriate analysis of the collected data is an essential requirement for modelling data. In this step, you select the datasets that match your requirements and train the model using appropriate machine-learning algorithms or AI-enabled algorithms.
• Evaluation – Deployment is the last step of the AI project cycle, but before deploying any AI-enabled system, proper evaluation of the trained model is necessary, as it determines the accuracy of the model according to the project requirements.

Understanding Problem Scoping


What is Problem Scoping?
Problem Scoping in an AI project cycle involves defining and understanding the specific problem that the AI project aims to solve. It is a critical initial phase where the project's objectives, constraints, stakeholders' needs, and potential solutions are identified and delineated.
This phase typically includes:

• Problem Definition – Clearly defining the issue or challenge that needs to be addressed. This involves understanding the context, the stakeholders involved, and the specific goals the AI solution should achieve.
• Scope Definition – Outlining the boundaries and limitations of the project. This involves determining what is within the project's scope and what aspects or elements are outside of it.
• Data Identification – Identifying the data required to address the problem. This includes understanding the type, quality, and quantity of data needed for the AI system to learn and make informed decisions.

Note: Problem Scoping is the first step in the AI Project Cycle, where a specific problem is
identified and clearly defined. It involves setting goals, understanding needs, and planning
solutions to ensure the project stays focused and on track.

4 Ws Problem Canvas
The 4 Ws Problem Canvas is a problem-solving tool and framework used to define and analyse a problem or challenge systematically. It is especially helpful for individuals, teams, or organizations seeking to gain a comprehensive understanding of a problem before devising solutions or making decisions.

The detailed description of the 4 Ws is as follows:
a) Who: The "Who" block helps you analyse the people who are directly or indirectly affected by the problem. Under this, you find out who the 'stakeholders' of this problem are and what you know about them. Stakeholders are the people who face this problem and would be benefitted by the solution.
b) What: Under the "What" block, you need to look into what you have on hand. At this stage, you need to determine the nature of the problem: what is the problem, and how do you know that it is a problem?
c) Where: In this block, you need to focus on the context/situation/location of the problem. It will help you look into the situation in which the problem arises, the context of it, and the locations where it is prominent.
d) Why: In the "Why" block, think about the benefits which the stakeholders would get from the solution and how it would benefit them as well as society.

Problem Statement Template


A problem statement template is a structured format used to define and articulate a problem. It is like a blueprint for describing a problem. It helps by breaking things down into parts that make sense.
A problem statement template covers what the problem is, why it matters, and what needs fixing. The template also sets the goals for solving it, the ways to measure progress, any limits or rules to follow, and who is involved. It is like a map guiding how to understand and solve a problem step by step, making sure everyone is on the same page about what needs fixing and why.

Understanding Data Acquisition
What is Data Acquisition?
Data acquisition refers to the process of collecting and gathering information from various
sources or sensors. It is like collecting raw data from tools or devices, such as temperature
sensors or cameras. This information gets turned into digital form so computers can use it. The
data then gets stored for analysis or sent to a central system. It's essential for science,
engineering, and other fields, helping us understand things better and make decisions based on
information we collect.

What is Data?
Data is a representation of facts or instructions about an entity that can be processed or
conveyed by a human or a machine, such as numbers, text, pictures, audio clips, videos, and so
on.
There are two types of data:

• Structured Data
• Unstructured Data

Structured Data –
• It is categorised as quantitative data.
• It is the type of data most of us work with every day.
• It has predefined data types and formats, so it fits well in the columns/fields of a database.
• It is highly organised and easily analysed.
• Example: name, age, address, etc.

Unstructured Data –
• It is categorised as qualitative data.
• It cannot be processed and analysed using conventional relational database methods.
• It is difficult to deconstruct because it has no predefined model, meaning it cannot be organised in a relational database.
• Example: text, video, audio, satellite imagery, etc. (a short code illustration of the difference follows below)
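To make the distinction concrete, here is a minimal Python sketch (assuming the pandas library is available; the names and values are invented for illustration) that loads a small piece of structured data into a table and contrasts it with an unstructured text snippet.

```python
import pandas as pd

# Structured data: predefined fields that fit neatly into rows and columns
students = pd.DataFrame({
    "name": ["Asha", "Ravi", "Meera"],
    "age": [14, 15, 14],
    "city": ["Delhi", "Pune", "Kochi"],
})
print(students)                         # easy to filter, sort and analyse
print(students[students["age"] > 14])   # e.g. select all students older than 14

# Unstructured data: free-form text with no predefined model
feedback = "The science exhibition was great, but the hall was too crowded."
print(len(feedback.split()))            # even counting words needs extra processing
```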

Dataset
A dataset is a collection of data in tabular format. A dataset contains numbers or values that are related to a specific subject. For example, the test scores of the students in a class form a dataset.

The dataset is divided into two parts:

a) Training dataset – The training dataset is the large portion of the dataset that teaches a machine learning model. Machine learning algorithms are trained to make judgements or perform a task using the training dataset. The major part of the dataset comes under training data (usually 80%).

b) Test dataset – Data that has been clearly set aside for use in tests, usually of a computer program, is known as test data. About 20% of the data is used as test data (see the sketch below for an illustration of this split).
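As a rough illustration of the 80/20 split described above, here is a minimal Python sketch (assuming scikit-learn is installed; the dataset is a made-up example):

```python
from sklearn.model_selection import train_test_split

# Tiny made-up dataset: hours studied (feature) and pass/fail label
hours = [[1], [2], [3], [4], [5], [6], [7], [8], [9], [10]]
passed = [0, 0, 0, 0, 1, 1, 1, 1, 1, 1]

# Keep 80% of the rows for training and hold back 20% for testing
X_train, X_test, y_train, y_test = train_test_split(
    hours, passed, test_size=0.2, random_state=42
)
print(len(X_train), "training rows,", len(X_test), "test rows")  # 8 and 2
```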

Acquiring Data from Reliable Sources
There are six ways to collect data.

a. Surveys
A research method for gathering data from a predetermined sample of respondents in order to gain knowledge of, and insights into, a variety of issues.
b. Cameras
We can collect visual data with the help of cameras. This data is unstructured data that can be analysed via machine learning.
c. Web Scraping
Web scraping is a technique for collecting structured data from the internet, used for tasks such as news monitoring, market research, and price tracking.
d. Observation
Some information can be gathered through attentive observation and monitoring.
e. Sensors
We can also collect data with the help of sensors. A device that detects or measures a physical property is called a sensor, for example a biometric sensor.
f. Application Programming Interface (API)
An API is a software interface that enables two applications to communicate with one another (a small sketch of collecting data through an API follows below).
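To show roughly what collecting data through an API looks like, here is a minimal Python sketch using the requests library. The URL and the "temperature" field are purely hypothetical placeholders, not a real service.

```python
import requests

# Hypothetical weather API endpoint (placeholder URL, not a real service)
url = "https://api.example.com/weather"
params = {"city": "Mumbai"}

response = requests.get(url, params=params, timeout=10)
response.raise_for_status()   # stop if the request failed
record = response.json()      # the API returns structured JSON data

# "temperature" is an assumed field name used only for this illustration
print("Temperature:", record.get("temperature"))
```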

System Maps

• System Maps help us to find relationships between different elements of the problem which we have scoped.
• A system map shows the components and boundaries of a system and the components of the environment at a specific point in time.
• We use system maps to understand complex issues with multiple factors that affect each other.
• In a system, every element is interconnected.
• With the help of System Maps, one can easily define the relationships amongst the different elements which come under a system.
• It helps us in strategizing the solution for achieving the goal of our project.

Understanding Data Exploration and Data Visualisation
Data Exploration is the process of exploring data to interpret useful information from the
acquired raw facts in order to better understand the nature of data.
Data Visualisation refers to a process of representing data in a pictorial or graphical format
using various visualisation tools. In simple words, data visualisation is a quick, easy way to
convey concepts in a universal manner.

Why Data Exploration


To analyse the acquired data, we need to visualise it in some user-friendly format so that we can:

• Quickly get a sense of the trends, relationships and patterns contained within the data
• Define the strategy for which model to use at a later stage
• Communicate the same to others

Data Visualization Chart / Data Visualisation Techniques


Data visualization charts are graphical representations of data that use symbols to convey a
story and help people understand large volumes of information.

• Column Chart – A column chart is a basic visualisation chart that uses vertical columns to represent data series. Because column lengths are easy to compare, column charts are an effective way to demonstrate changes in the data.
• Bar Chart – A bar chart is a visual representation of categorical data. The data is displayed in a bar chart with multiple bars, each representing a different category.
• Pie Chart – It helps show proportions and percentages between categories by dividing a circle into proportional segments. Each arc length represents the proportion of each category.
• Histogram – It visualises the distribution of data over a continuous interval or a certain time period. Each bar in a histogram represents the tabulated frequency at each interval.
• Line Graph – A line chart is a line or multiple lines showing how a single variable, or multiple variables, develop over time. It is a great tool because we can easily highlight the magnitude of change of one or more variables over a period.
• Scatter Plot – It consists of multiple data points plotted across two axes. Each variable depicted in a scatter plot has multiple observations. If a scatter plot includes more than two variables, then different colours are used to signify this (a small plotting sketch follows below).
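The short Python sketch below (assuming matplotlib is installed; the numbers are invented for illustration) draws two of the chart types listed above: a column chart and a line graph.

```python
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr"]
sales = [120, 150, 90, 180]   # made-up values

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))

# Column chart: vertical bars make month-to-month changes easy to compare
ax1.bar(months, sales)
ax1.set_title("Monthly sales (column chart)")

# Line graph: the same values shown as a trend over time
ax2.plot(months, sales, marker="o")
ax2.set_title("Monthly sales (line graph)")

plt.tight_layout()
plt.show()
```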

AI, Machine Learning vs Deep Learning

• Scope – AI: a broad field encompassing all intelligent systems. ML: focused on training systems to learn from data. DL: a deeper specialization within ML using neural networks.
• Objective – AI: mimic human intelligence. ML: automate learning through data-driven algorithms. DL: achieve high-level feature recognition and automation.
• Algorithms – AI: includes rule-based systems, ML, and DL. ML: algorithms include regression, decision trees, etc. DL: primarily neural networks such as CNNs, RNNs, etc.
• Data Requirements – AI: can work with limited data (e.g., rule-based AI). ML: requires structured data for training. DL: needs vast amounts of data for training complex models.
• Applications – AI: chatbots, robotics, autonomous systems. ML: fraud detection, recommendation engines. DL: image recognition, NLP, autonomous driving systems.

Understanding Modelling
• AI modelling refers to developing algorithms, also called models, which can be trained to produce intelligent outputs.
• That is, writing code to make a machine artificially intelligent.

Introduction
• The graphical representation makes the data understandable for humans, as we can discover trends and patterns from it.
• But when it comes to a machine accessing and analysing data, it needs the data in the most basic form of numbers (which is binary: 0s and 1s), and when it comes to discovering patterns and trends in data, the machine works with mathematical representations of the same.
• The ability to mathematically describe the relationship between parameters is the heart of every AI model.
• Thus, whenever we talk about developing AI models, it is the mathematical approach towards analysing data that we refer to (a tiny example follows below).
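As a tiny illustration of "mathematically describing the relationship between parameters", the Python sketch below (values invented for the example; assuming NumPy is available) fits a straight line y = m*x + c to a few data points.

```python
import numpy as np

# Made-up data: hours of practice (x) and quiz score (y)
x = np.array([1, 2, 3, 4, 5])
y = np.array([52, 58, 66, 71, 79])

# Fit a straight line y = m*x + c; m and c are the model's parameters
m, c = np.polyfit(x, y, 1)
print(f"score = {m:.1f} * hours + {c:.1f} (approximately)")

# The fitted relationship can now be used to predict unseen cases
print("Predicted score for 6 hours:", m * 6 + c)
```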

Rule Based Approach


• The rule-based approach refers to AI modelling where the relationships or patterns in the data are defined by the developer.
• The machine follows the rules or instructions mentioned by the developer and performs its tasks accordingly (a simple example follows below).
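A rule-based model can be as simple as a set of if/else conditions written by the developer. The Python sketch below (thresholds chosen arbitrarily for illustration) labels a temperature reading as "hot", "pleasant", or "cold" using fixed rules rather than patterns learned from data.

```python
def classify_temperature(celsius: float) -> str:
    # The rules are written by the developer, not learned from data
    if celsius >= 30:
        return "hot"
    elif celsius >= 18:
        return "pleasant"
    else:
        return "cold"

for reading in [35.0, 22.5, 10.0]:
    print(reading, "->", classify_temperature(reading))
```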

Learning Based Approach:

• Refers to the AI modelling where the relationships or patterns in the data are not defined by the developer.
• In this approach, random data is fed to the machine, and it is left to the machine to figure out patterns and trends from it.
• Generally, this approach is followed when the data is unlabelled and too random for a human to make sense of it.
• Thus, the machine looks at the data, tries to extract similar features from it, and clusters similar data points together.
• In the end, as output, the machine tells us about the trends which it observed in the training data (see the clustering sketch below).
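Since the learning-based approach described above discovers groupings in unlabelled data on its own, a simple clustering sketch fits here. This Python example (assuming scikit-learn is installed; the points are made up) lets k-means find two clusters without being told any labels.

```python
from sklearn.cluster import KMeans

# Unlabelled 2-D points; nobody tells the machine which group is which
points = [[1, 2], [1, 4], [2, 3],      # one loose group
          [8, 8], [9, 10], [10, 9]]    # another loose group

model = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = model.fit_predict(points)

print("Cluster assigned to each point:", labels)
print("Cluster centres found:", model.cluster_centers_)
```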

Decision Tree
It is the most powerful and popular tool for classification and prediction.

A decision tree is a flowchart-like tree structure where each internal node denotes a test on an attribute, each branch represents an outcome of the test, and each leaf node (terminal node) holds a class label (see the sketch below).
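As a rough sketch of how a decision tree is used for classification (assuming scikit-learn; the data is a toy example invented here), the Python code below trains a small tree to predict whether outdoor play happens, based on temperature and rainfall.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Toy data: [temperature in degrees C, rainfall in mm] and whether play happened (1/0)
X = [[30, 0], [25, 5], [18, 20], [10, 30], [28, 2], [12, 25]]
y = [1, 1, 0, 0, 1, 0]

tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(X, y)

# Each internal node is a test on an attribute; each leaf holds a class label
print(export_text(tree, feature_names=["temperature", "rainfall"]))
print("Prediction for [22, 3]:", tree.predict([[22, 3]]))
```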

Understanding Evaluation
• Once a model has been made and trained, it needs to go through proper testing so that one can calculate the efficiency and performance of the model.
• Hence, the model is tested with the help of the Testing data (which was separated out of the acquired dataset at the Data Acquisition stage), and the efficiency of the model is calculated on the basis of the parameters mentioned below:

Evaluation is basically checking the performance of your AI model. This is done mainly with two things: "Prediction" and "Reality". Evaluation is done as follows:

• First, search for some testing data whose real outcome is known to be 100% true.

• Then feed that testing data to the AI model while keeping the correct outcome with yourself; this correct outcome is termed the "Reality".

• Then, when you get the predicted outcome from the AI model, called the "Prediction", compare it with the real outcome, that is, the "Reality".

You can do this to:

o improve the efficiency and performance of your AI model, and
o check and correct your mistakes.

Note
Remember that it is not recommended to use the data we used to build the model to evaluate it. This is because our model will simply remember the whole training set, and will therefore always predict the correct label for any point in the training set. This is known as overfitting (the sketch below keeps a separate test set for exactly this reason).
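To make the Prediction-versus-Reality comparison concrete, here is a minimal Python sketch (assuming scikit-learn is installed; the data is made up) that trains a model, gets its predictions on held-out test data, and compares them with the known reality using accuracy.

```python
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Made-up dataset: hours studied (feature) and pass/fail label
X = [[1], [2], [3], [4], [5], [6], [7], [8], [9], [10]]
y = [0, 0, 0, 0, 1, 1, 1, 1, 1, 1]

# Hold back some data for testing, as the note above recommends
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

model = DecisionTreeClassifier(random_state=1)
model.fit(X_train, y_train)

prediction = model.predict(X_test)   # what the model says ("Prediction")
reality = y_test                     # the known correct outcomes ("Reality")

print("Prediction:", list(prediction))
print("Reality   :", list(reality))
print("Accuracy  :", accuracy_score(reality, prediction))
```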



Understanding Deployment
Deployment is the final stage in the AI project cycle, where the AI model or solution is implemented in a real-world scenario.

Key Steps in the Deployment Process
The key steps involved in the deployment process are:

a. Testing and validation of the AI model

b. Integration of the model with existing systems

c. Monitoring and maintenance of the deployed model.

Examples of successful AI projects that have been deployed in various industries include self-driving cars, medical diagnosis systems, and chatbots.
