05_SVM

Support Vector Machines (SVM) are effective for classification and regression, offering good generalization and performance with limited training data. SVMs work by finding the optimal separating hyperplane that maximizes the margin between classes, and they can handle non-linear problems using kernel functions. The document also includes numerical examples and Python code for implementing SVMs.


Support Vector Machines

Strengths of SVMs

• Good generalization
  – in theory
  – in practice
• Works well with few training instances
• Finds the globally best model (the training objective is convex, so there are no local optima)
• Efficient algorithms
• SVMs work for both classification and regression
Classification

[Figures across several slides: candidate lines separating two classes, illustrating what makes a good separator between the 2 classes. The plots were not preserved in this extraction.]
Maximizing the Margin

[Figure: the margin, i.e. the gap around the separating hyperplane, which the SVM maximizes. The plot was not preserved in this extraction.]
Support Vector Machines (SVM)
• Points (instances) are treated as vectors p = (x1, x2, …, xn).
• SVM finds the closest two points from the two classes, which determine the best separating line (plane).
• SVM then draws a line connecting these two points.
• Finally, SVM takes the best separating line to be the line that bisects, and is perpendicular to, the connecting line (see the sketch below).
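A minimal NumPy sketch of this construction (illustrative, not from the slides; the two points are hypothetical): the perpendicular bisector of the segment joining p1 and p2 is the hyperplane w·x + b = 0, with w = p2 − p1 and b chosen so the midpoint lies on it.

    import numpy as np

    p1 = np.array([1.0, 0.0])   # hypothetical closest point of class -1
    p2 = np.array([3.0, 0.0])   # hypothetical closest point of class +1

    w = p2 - p1                 # normal vector: boundary is perpendicular to p1->p2
    midpoint = (p1 + p2) / 2.0
    b = -w @ midpoint           # forces w.x + b = 0 at the midpoint

    print(w, b)                     # [2. 0.] -4.0  ->  boundary: 2*x1 - 4 = 0, i.e. x1 = 2
    print(w @ p1 + b, w @ p2 + b)   # -2.0 and 2.0: the two points fall on opposite sides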
SVM as a minimization problem

Maximizing the margin 2/||w|| is equivalent to minimizing (1/2)||w||², subject to the constraints yi(w·xi + b) ≥ 1 for every training sample. Introducing Lagrange multipliers gives the primal Lagrangian:

LP = (1/2)||w||² − Σi αi [ yi(w·xi + b) − 1 ]

• αi is the Lagrange multiplier associated with the ith training sample.

We wish to find the w and b that minimize, and the α that maximizes, LP (whilst keeping αi ≥ 0).
We can do this by differentiating LP with respect to w and b and setting the derivatives to zero:

∂LP/∂w = 0  ⇒  w = Σi αi yi xi
∂LP/∂b = 0  ⇒  Σi αi yi = 0
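The remaining algebra on these slides did not survive extraction; for completeness, substituting the two conditions above back into LP yields the standard dual problem (a textbook result, stated here rather than taken from the slides):

    \max_{\alpha} \; L_D = \sum_i \alpha_i \;-\; \frac{1}{2} \sum_i \sum_j \alpha_i \alpha_j \, y_i y_j \, (x_i \cdot x_j)
    \quad \text{subject to } \alpha_i \ge 0 \;\text{ and }\; \sum_i \alpha_i y_i = 0

The αi found this way are the quantities solved for in the numerical example below.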
Example on SVM

[A worked graphical example spanning several slides; the figures were not preserved in this extraction.]
From the equation (y = w·x + b), the decision boundary is where y = 0:
x1·w1 + x2·w2 + b = 0
Substituting w1 = 1, w2 = 0, b = −3 gives x1 − 3 = 0, i.e. x1 = 3.
So the separating line will be at x1 = 3.
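As a quick numeric check (added here, not from the slides), a few lines of NumPy evaluate y = w·x + b for some hypothetical test points and read off which side of the line x1 = 3 each falls on:

    import numpy as np

    w = np.array([1.0, 0.0])    # weights from the example above
    b = -3.0                    # bias from the example above

    points = np.array([[4.0, 1.0], [2.0, -1.0], [3.0, 5.0]])  # hypothetical test points
    scores = points @ w + b     # y = w.x + b for each point
    print(scores)               # [ 1. -1.  0.]
    print(np.sign(scores))      # +1 right of x1 = 3, -1 left of it, 0 on the line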
Numerical example for Linear SVM

Positively labelled data points: (3,1), (3,−1), (6,1), (6,−1)
Negatively labelled data points: (1,0), (0,1), (0,−1), (−1,0)

Solution: for all negatively labelled points the output is −1, and for all positively labelled points the output is +1.
The support vectors are the points closest to the opposite class: s1 = (1,0) from the negative class, and s2 = (3,1), s3 = (3,−1) from the positive class. Augmenting each with a constant 1 to absorb the bias, s̃1 = (1,0,1), s̃2 = (3,1,1), s̃3 = (3,−1,1), the multipliers α1, α2, α3 must satisfy:

α1 s̃1·s̃1 + α2 s̃2·s̃1 + α3 s̃3·s̃1 = −1
α1 s̃1·s̃2 + α2 s̃2·s̃2 + α3 s̃3·s̃2 = +1
α1 s̃1·s̃3 + α2 s̃2·s̃3 + α3 s̃3·s̃3 = +1

Evaluating the dot products:

2α1 + 4α2 + 4α3 = −1
4α1 + 11α2 + 9α3 = +1
4α1 + 9α2 + 11α3 = +1

In solving these equations, we get: α1 = −3.5, α2 = α3 = 0.75.
The augmented weight vector is then

w̃ = Σi αi s̃i = −3.5(1,0,1) + 0.75(3,1,1) + 0.75(3,−1,1) = (1, 0, −2)

so w = (1, 0) and the bias is b = −2.

At the hyperplane, y = 0:
y = w·x + b
0 = x1 − 2
x1 = 2

So the separating line is x1 = 2, midway between the closest points of the two classes.
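As a cross-check (not part of the original slides), a minimal scikit-learn sketch on the same eight points recovers this hyperplane; a very large C approximates the hard-margin solution:

    import numpy as np
    from sklearn.svm import SVC

    X = np.array([[3, 1], [3, -1], [6, 1], [6, -1],    # positively labelled
                  [1, 0], [0, 1], [0, -1], [-1, 0]])   # negatively labelled
    y = np.array([1, 1, 1, 1, -1, -1, -1, -1])

    clf = SVC(kernel="linear", C=1e6)   # very large C ~ hard margin
    clf.fit(X, y)
    print(clf.coef_, clf.intercept_)    # expected: approximately [[1. 0.]] and [-2.]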
To deal with Nonlinear problems

In practice, the SVM algorithm is implemented using a kernel, through a technique called the kernel trick.

In simple words, a kernel is just a function that maps the data to a higher dimension where the data become separable.

A kernel transforms a low-dimensional input space into a higher-dimensional one: it converts a non-linearly separable problem into a linearly separable problem by adding more dimensions.

Thus, the kernel trick helps us build a more accurate classifier, which makes it useful for non-linear separation problems.
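A minimal sketch of this idea (not from the slides; it assumes scikit-learn and its make_circles toy dataset): an RBF-kernel SVM separates two concentric rings that no straight line can split.

    from sklearn.datasets import make_circles
    from sklearn.svm import SVC

    # Two concentric rings: impossible to separate with a straight line in 2-D
    X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

    # The RBF kernel implicitly maps the data to a higher-dimensional space
    clf = SVC(kernel="rbf", gamma="scale", C=1.0)
    clf.fit(X, y)
    print(clf.score(X, y))   # close to 1.0: separable after the kernel mapping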
To deal with Nonlinear problems (Kernel function)

[Figures: a kernel function mapping non-linearly separable data into a higher-dimensional, linearly separable space. The plots were not preserved in this extraction.]
Python code for SVM

[The code listings from these slides were not preserved in this extraction.]
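Since the original listing did not survive, the following is an assumed minimal sketch of typical SVM code with scikit-learn (the dataset choice and parameters are illustrative, not taken from the slides):

    from sklearn.datasets import load_iris
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC
    from sklearn.metrics import accuracy_score

    # Load a small built-in dataset
    X, y = load_iris(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=42)

    # Train a linear-kernel SVM classifier
    clf = SVC(kernel="linear", C=1.0)
    clf.fit(X_train, y_train)

    # Evaluate on the held-out split
    y_pred = clf.predict(X_test)
    print("Accuracy:", accuracy_score(y_test, y_pred))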
