05_SVM
Strengths of SVMs
• Good generalization
– in theory
– in practice
• Works well with few training instances
• Finds the globally best model (the objective is convex, so training cannot get stuck in a local optimum)
• Efficient training algorithms
• Works for both classification and regression
Classification
Separation
A good separator between two classes
Maximizing the Margin
Support Vector Machines (SVM)
• Points (instances) are treated as vectors p = (x1, x2, …, xn).
• SVM finds the closest two points from the two classes; these define the
best separating line (plane).
• SVM draws a line segment connecting these two points.
• The best separating line is then the line that perpendicularly bisects
this connecting segment.
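The steps above can be sketched directly in code. This is a literal illustration of the slide's description (closest cross-class pair, then its perpendicular bisector), not a full max-margin solver, and the point coordinates are made up for the example:

```python
import math

# Toy points for each class (illustrative values, not from the slides)
positives = [(4.0, 1.0), (5.0, -1.0)]
negatives = [(0.0, 1.0), (1.0, -1.0)]

# Step 1: find the closest pair of points, one from each class.
pairs = [(p, n) for p in positives for n in negatives]
p, n = min(pairs, key=lambda pn: math.dist(pn[0], pn[1]))

# Step 2: the separator is the perpendicular bisector of the segment
# from n to p: the line w·x + b = 0 with w = p - n and b chosen so
# the midpoint of the segment lies on the line.
w = (p[0] - n[0], p[1] - n[1])
mid = ((p[0] + n[0]) / 2, (p[1] + n[1]) / 2)
b = -(w[0] * mid[0] + w[1] * mid[1])

def decision(x):
    """Positive on the p side, negative on the n side, zero on the line."""
    return w[0] * x[0] + w[1] * x[1] + b

print(w, b)
print(decision(p) > 0, decision(n) < 0)
```

Note this heuristic reproduces the intuition on the slide; the Lagrangian formulation below is what an SVM solver actually optimizes.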
Support Vector Machines
(SVM)
SVM as a minimization problem
We wish to find the w and b which minimize, and the α which maximizes, the
Lagrangian LP (whilst keeping all αi ≥ 0).
We can do this by differentiating LP with respect to w and b and setting
the derivatives to zero.
• αi is the Lagrange multiplier associated with the ith training sample.
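For reference, the standard primal Lagrangian and the two conditions obtained by setting its derivatives to zero (this is the textbook form; the slides reference LP but do not write it out):

```latex
L_P = \frac{1}{2}\|\mathbf{w}\|^2
      - \sum_{i} \alpha_i \left[\, y_i(\mathbf{w}\cdot\mathbf{x}_i + b) - 1 \,\right]

\frac{\partial L_P}{\partial \mathbf{w}} = 0
  \;\Rightarrow\; \mathbf{w} = \sum_i \alpha_i y_i \mathbf{x}_i
\qquad
\frac{\partial L_P}{\partial b} = 0
  \;\Rightarrow\; \sum_i \alpha_i y_i = 0
```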
Example on SVM
From the equation y = w·x + b, the separating line is where y = 0:
x1w1 + x2w2 + b = 0
With w1 = 1, w2 = 0, b = -3:
x1 - 3 = 0
So the separating line is at x1 = 3.
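A quick numeric check of this example, using the weights and bias given above:

```python
# Decision function y = w·x + b with the weights from the example
w = (1.0, 0.0)
b = -3.0

def decision(x1, x2):
    return w[0] * x1 + w[1] * x2 + b

# Every point with x1 = 3 lies exactly on the separating line,
# regardless of x2, because w2 = 0.
for x2 in (-5.0, 0.0, 7.0):
    assert decision(3.0, x2) == 0.0

# Points with x1 > 3 fall on the positive side, x1 < 3 on the negative side.
print(decision(4.0, 0.0), decision(2.0, 0.0))
```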
Numerical Example for Linear SVM
Positively labelled data points: (3,1), (3,-1), (6,1), (6,-1)
Bias b = -2, and (implied by the working below) w = (1, 0)
At the hyperplane:
y = 0
y = w·x + b
0 = x1 - 2
x1 = 2
So the separating hyperplane is the vertical line x1 = 2.
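A quick check that x1 = 2 behaves like a max-margin separator for this example. Note the slide lists only the positive points; the negative points used below, (1,0), (0,1), (0,-1), are an assumption (a commonly used companion set consistent with b = -2), not taken from the slide:

```python
# Hyperplane from the example: w·x + b = 0 with w = (1, 0), b = -2
w = (1.0, 0.0)
b = -2.0

positives = [(3, 1), (3, -1), (6, 1), (6, -1)]   # from the slide
negatives = [(1, 0), (0, 1), (0, -1)]            # ASSUMED, see note above

def decision(x):
    return w[0] * x[0] + w[1] * x[1] + b

# All positives sit at decision >= +1, all negatives at decision <= -1,
# so the functional margin is 1 on both sides of x1 = 2.
assert all(decision(p) >= 1 for p in positives)
assert all(decision(n) <= -1 for n in negatives)

# The support vectors are the points that sit exactly on the margin.
support = [p for p in positives + negatives if abs(abs(decision(p)) - 1) < 1e-9]
print(support)
```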
To Deal with Nonlinear Problems
In practice, the SVM algorithm is implemented using a kernel,
via a technique called the kernel trick.
In simple words, a kernel is a function that implicitly maps the data to a higher
dimension where it becomes separable, without ever computing the mapping explicitly.
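A small illustration of the trick, using the degree-2 polynomial kernel K(p, q) = (p·q)² as an example: it equals an ordinary dot product in a 3-dimensional feature space φ(x) = (x1², √2·x1x2, x2²), yet never builds φ explicitly:

```python
import math

def phi(x):
    """Explicit feature map for the degree-2 polynomial kernel (2D -> 3D)."""
    x1, x2 = x
    return (x1 * x1, math.sqrt(2) * x1 * x2, x2 * x2)

def kernel(p, q):
    """K(p, q) = (p·q)^2, computed without ever forming phi."""
    return (p[0] * q[0] + p[1] * q[1]) ** 2

p, q = (1.0, 2.0), (3.0, 4.0)

explicit = sum(a * b for a, b in zip(phi(p), phi(q)))
implicit = kernel(p, q)
print(explicit, implicit)  # the two agree (up to floating-point error)
```

Because the SVM's optimization and decision function only ever use dot products between points, every such dot product can be replaced by a kernel evaluation, which is what lets SVMs separate nonlinear data.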