
Day 2-FDP

This document provides an overview of machine learning algorithms. It begins with a classification of machine learning methods, including supervised learning techniques like regression and classification. Regression predicts continuous values while classification predicts discrete values. Linear regression finds a linear relationship between variables, using the least squares method and gradient descent to minimize a cost function. Logistic regression can classify outcomes as yes/no using a sigmoid function. Other supervised algorithms discussed include K-nearest neighbors, which classifies new data based on its similarity to existing data. Confusion matrices and accuracy metrics are used for evaluation. Hands-on examples are provided for linear regression, logistic regression and KNN.


Machine Learning

https://miro.medium.com/max/629/1*_HoMKjrWahRiI-JmwYW6zg.png
Agenda
• Classification of Machine Learning Algorithms
• Supervised Learning
• Regression vs Classification
• Linear Regression
• Classification



Classification of Machine Learning Algorithms

https://social.technet.microsoft.com/wiki/cfs-filesystemfile.ashx/__key/communityserver-wikis-components-files/00-00-00-00-05/7002.5_5F00_1.PNG



Supervised Learning

https://cdn-images-1.medium.com/max/1600/1*Iz7bCLrPTImnBDOOEyE3LA.png
Regression vs Classification
• Regression algorithms are used to predict continuous values.
• Classification algorithms are used to predict discrete values (class labels).

https://www.javatpoint.com/regression-vs-classification-in-machine-learning
Linear Regression
Linear regression models a linear relationship between a dependent variable (y) and one or
more independent variables (x).

https://www.javatpoint.com/linear-regression-in-machine-learning
Least Square Method
1. The least squares method finds the best-fit line that represents the
   relationship between the independent and dependent variables.

2. A smaller sum of squared residuals indicates a better model, as less of the
   variation in the data is left unexplained.

https://medium.com/analytics-vidhya/ordinary-least-square-ols-method-for-linear-regression-ef8ca10aadfc
Mathematical Intuition
• Cost function: It measures how well a linear regression model is performing. It is
  the sum of the squares of the residuals, and is denoted by:

  J = Σ (y_i − ŷ_i)^2, where y_i is the actual value and ŷ_i the predicted value
  (often averaged over the n samples to give the mean squared error).

• Gradient Descent: It is a generic optimization algorithm used in many machine
  learning algorithms. It iteratively tweaks the parameters of the model in order to
  minimize the cost function.
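
As a minimal sketch (not part of the original deck), gradient descent for a one-feature
linear regression with a mean-squared-error cost might look like the following; the data,
learning rate and iteration count are illustrative assumptions.

    import numpy as np

    # Illustrative data (assumed): y is roughly 2*x + 1 plus a little noise
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
    y = np.array([3.1, 4.9, 7.2, 9.1, 10.8])

    m, c = 0.0, 0.0            # slope and intercept, initialised to zero
    lr, n_iters = 0.01, 2000   # learning rate and iteration count (assumptions)
    n = len(x)

    for _ in range(n_iters):
        y_pred = m * x + c
        # Gradients of the cost J = (1/n) * sum((y - y_pred)^2)
        dm = (-2.0 / n) * np.sum(x * (y - y_pred))
        dc = (-2.0 / n) * np.sum(y - y_pred)
        # Tweak the parameters in the direction that reduces the cost
        m -= lr * dm
        c -= lr * dc

    print(f"fitted slope = {m:.3f}, intercept = {c:.3f}")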



Linear Regression Evaluation Techniques
Common evaluation metrics include the Mean Absolute Error (MAE), the Mean Squared Error
(MSE) and the Root Mean Squared Error (RMSE), all of which compare predicted values
against actual values.

https://www.datatechnotes.com/2019/02/regression-model-accuracy-mae-mse-rmse.html
Hands-On Linear Regression
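
The hands-on notebook itself is not reproduced in this deck; a minimal scikit-learn
sketch along the same lines, using a synthetic dataset as an assumption, could be:

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_absolute_error, mean_squared_error

    # Synthetic single-feature data (assumed): y is roughly 3x + 5 plus noise
    rng = np.random.default_rng(42)
    X = rng.uniform(0, 10, size=(100, 1))
    y = 3.0 * X.ravel() + 5.0 + rng.normal(0.0, 1.0, size=100)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

    model = LinearRegression()          # ordinary least squares fit
    model.fit(X_train, y_train)
    y_pred = model.predict(X_test)

    print("coefficient:", model.coef_[0], "intercept:", model.intercept_)
    print("MAE :", mean_absolute_error(y_test, y_pred))
    mse = mean_squared_error(y_test, y_pred)
    print("MSE :", mse, " RMSE:", np.sqrt(mse))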



Classification

Disclaimer: The content is curated for educational purposes only.


Logistic Regression

https://www.analyticsvidhya.com/blog/2021/04/beginners-guide-to-logistic-regression-using-python/
Logistic Regression
• Logistic regression is one of the most popular Machine Learning algorithms, and it
  comes under the Supervised Learning technique.

• It is used for predicting a categorical dependent variable from a given set of
  independent variables.

• The outcome can be Yes or No, 0 or 1, True or False, etc., but instead of giving an
  exact value of 0 or 1, it gives probabilistic values that lie between 0 and 1.



Linear vs Logistic Regression

[Figure: fruit colour score vs. fruit, with red points labelled "Apple" and yellow points
labelled "Not Apple"]
Sigmoid Function
A sigmoid function is a mathematical function which has a characteristic S-shaped curve.
It has the property of mapping the entire number line into a small range between 0 and 1:
Y(x) = 1 / (1 + e^(-x))
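
A small NumPy sketch of the sigmoid, showing how large negative inputs map close to 0 and
large positive inputs map close to 1:

    import numpy as np

    def sigmoid(x):
        # Y(x) = 1 / (1 + e^(-x)): squashes any real number into the range (0, 1)
        return 1.0 / (1.0 + np.exp(-x))

    for v in (-10, -1, 0, 1, 10):
        print(v, "->", round(float(sigmoid(v)), 4))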



Confusion Matrix

• Accuracy = (TP + TN) / (TP + FP + FN + TN), the ratio of correctly predicted
  observations to the total observations.
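
A quick sketch applying the accuracy formula to illustrative (made-up) confusion-matrix
counts:

    # Assumed counts for illustration only
    TP, TN, FP, FN = 50, 35, 10, 5

    accuracy = (TP + TN) / (TP + FP + FN + TN)
    print(f"Accuracy = {accuracy:.2f}")   # 85 correct predictions out of 100 -> 0.85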



K Nearest Neighbours

https://www.analyticsvidhya.com/blog/2018/03/introduction-k-neighbours-algorithm-clustering/



K Nearest Neighbours
• The K Nearest Neighbor algorithm falls under the Supervised Learning category and is
  mostly used for classification.

• As the name suggests, it considers the K nearest neighbors (data points) to predict
  the class or continuous value for a new data point.

• The algorithm is often described as:
  1. Instance-based learning
  2. Lazy learning



Example with K = 5: the new data point has 3 of its nearest neighbours in Category A and
2 in Category B, so it is assigned to Category A by majority vote, as in the sketch below.
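
A minimal sketch of this K = 5 majority vote, using made-up two-dimensional points and
plain Euclidean distance:

    import numpy as np
    from collections import Counter

    # Assumed training points and their labels (illustration only)
    points = np.array([[1, 1], [1, 2], [2, 1], [3, 3], [3, 2], [7, 7], [8, 8]])
    labels = ['A', 'A', 'A', 'B', 'B', 'A', 'B']
    new_point = np.array([2, 2])
    k = 5

    # Instance-based / lazy: no training step, just distances to every stored point
    dists = np.linalg.norm(points - new_point, axis=1)
    nearest = np.argsort(dists)[:k]
    votes = Counter(labels[i] for i in nearest)

    print(votes)                                           # Counter({'A': 3, 'B': 2})
    print("Predicted class:", votes.most_common(1)[0][0])  # 'A' wins the vote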



Hands-On Logistic Regression and KNN
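
The hands-on notebooks themselves are not included here; a compact scikit-learn sketch in
the same spirit, using the bundled Iris dataset as an assumed stand-in, might be:

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.metrics import accuracy_score, confusion_matrix

    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    # Logistic regression: predicts class probabilities between 0 and 1
    log_reg = LogisticRegression(max_iter=1000)
    log_reg.fit(X_train, y_train)
    print("Logistic regression accuracy:", accuracy_score(y_test, log_reg.predict(X_test)))
    print("Probabilities for the first test sample:", log_reg.predict_proba(X_test[:1]))

    # KNN: classifies each test point by a majority vote of its 5 nearest neighbours
    knn = KNeighborsClassifier(n_neighbors=5)
    knn.fit(X_train, y_train)
    y_pred = knn.predict(X_test)
    print("KNN accuracy:", accuracy_score(y_test, y_pred))
    print("Confusion matrix:\n", confusion_matrix(y_test, y_pred))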
THANK YOU

© Edunet Foundation. All rights reserved.
