Support Vector Machines (SVMs)

This document discusses support vector machines (SVMs), including important terms like support vectors and hyperplanes. It explains that support vectors are the data points closest to the decision boundary, and that the hyperplane is the decision boundary that separates the two classes. It also discusses how SVMs find an optimal hyperplane to maximize the margin between classes in n-dimensional space.

Uploaded by Habiba Sameh

SUPPORT VECTOR MACHINES (SVMs)

Important terms
What is a support vector?

Support vectors are the data points that lie closest to the decision surface (or hyperplane).
Why are they called support vectors?

They are called "support" because they support the decision (they determine which class a point belongs to), and "vector" because each data point is treated as a vector in mathematics.
What is a hyperplane?

A hyperplane is the decision boundary that differentiates the two classes in an SVM.
Is the hyperplane chosen at random, and if not, how is it determined?

A hyperplane could be chosen at random, but a poor choice would cause misclassification.
In general, there are lots of possible solutions for a, b, c (an infinite number!). The Support Vector Machine (SVM) finds an optimal one.
What are support vector machines (SVMs)?

A support vector machine is a supervised learning algorithm used to solve classification and regression problems.
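As a concrete illustration, a linear SVM classifier can be trained in a few lines. This is a minimal sketch using scikit-learn (an assumption on my part; the slides do not name a library), with made-up toy data:

```python
# Minimal linear SVM classification sketch (assumes scikit-learn is installed;
# the data points here are made up for illustration).
from sklearn.svm import SVC

# Four 2-D points, two per class, that are linearly separable.
X = [[0, 0], [1, 1], [3, 3], [4, 4]]
y = [0, 0, 1, 1]

clf = SVC(kernel="linear")  # linear kernel: the decision surface is a hyperplane
clf.fit(X, y)

print(clf.support_vectors_)  # the points lying closest to the decision boundary
print(clf.predict([[0.5, 0.5], [3.5, 3.5]]))
```

Note that after fitting, only the support vectors matter for prediction; the other training points could be removed without changing the decision boundary.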
How do support vector machines work?

A support vector machine (SVM) is a supervised machine learning algorithm that classifies data by finding an optimal line or hyperplane that maximizes the distance between the classes in an N-dimensional space. (SVMs maximize the margin (in Winston's terminology, the "street") around the separating hyperplane.)
After using Support Vector Machines
So how do we calculate it mathematically?

The decision surface separating the classes is a hyperplane of the form:

ax + b = 0 (linear)
wᵀx + b = 0

– w is the weight vector
– x is the input vector
– b is the bias

• This allows us to write:
wᵀx + b ≥ 0 for dᵢ = +1
wᵀx + b < 0 for dᵢ = –1
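The sign rule above can be checked directly in code. This is an illustrative sketch with NumPy; the weight vector w and bias b below are made-up values, not ones fitted from data:

```python
import numpy as np

# Hypothetical weight vector and bias defining the hyperplane w.x + b = 0.
w = np.array([1.0, 1.0])
b = -3.0

def classify(x):
    """Return +1 if w.x + b >= 0, else -1 (the decision rule above)."""
    return 1 if w @ x + b >= 0 else -1

print(classify(np.array([2.0, 2.0])))  # 1*2 + 1*2 - 3 = 1 >= 0, so +1
print(classify(np.array([0.0, 0.0])))  # 0 + 0 - 3 = -3 < 0, so -1
```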

H1 and H2 are the planes:

H1: w·xᵢ + b = +1
H2: w·xᵢ + b = –1

The points on the planes H1 and H2 are the tips of the support vectors. The plane H0 is the median in between, where w·xᵢ + b = 0.

Define the hyperplanes H such that:
w·xᵢ + b ≥ +1 when yᵢ = +1
w·xᵢ + b ≤ –1 when yᵢ = –1

At H1: if w·x + b = +1, we get the (+) class hyperplane; all positive points satisfy w·x + b ≥ +1.
At H2: if w·x + b = –1, we get the (–) class hyperplane; all negative points satisfy w·x + b ≤ –1.
At H0: if w·x + b = 0, we get the decision boundary.
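The two conditions above collapse into the single constraint yᵢ(w·xᵢ + b) ≥ 1, and the width of the margin (the "street") is 2/‖w‖. A small sketch verifying this, using hypothetical values for w, b, and the data:

```python
import numpy as np

# Hypothetical separable data and a hyperplane satisfying the SVM constraints.
w = np.array([1.0, 0.0])   # assumed weight vector
b = -2.0                   # assumed bias
X = np.array([[0.0, 1.0], [1.0, 0.0], [3.0, 1.0], [4.0, 0.0]])
y = np.array([-1, -1, 1, 1])

# The H1/H2 conditions combine into one check: y_i * (w.x_i + b) >= 1.
margins = y * (X @ w + b)
print(margins)                 # every entry should be >= 1

# Width of the margin between H1 and H2: 2 / ||w||.
print(2 / np.linalg.norm(w))
```

Maximizing the margin 2/‖w‖ is equivalent to minimizing ‖w‖ subject to these constraints, which is what SVM training solves.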
What if ?!

What if the support vectors looked like this?

In the previous example, the data cannot be separated linearly, so we use a kernel function; this method is called the non-linear SVM.
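A sketch of the non-linear case, again assuming scikit-learn: concentric circles cannot be split by a straight line, but an SVM with an RBF kernel (one common kernel function; the slides do not specify which kernel) separates them well:

```python
# Illustrative sketch: linear vs. kernel (RBF) SVM on non-linearly separable data.
# Assumes scikit-learn; the dataset is synthetic concentric circles.
from sklearn.datasets import make_circles
from sklearn.svm import SVC

X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

linear = SVC(kernel="linear").fit(X, y)
rbf = SVC(kernel="rbf").fit(X, y)  # kernel implicitly maps data to a higher-dimensional space

print("linear accuracy:", linear.score(X, y))  # poor: no separating line exists
print("rbf accuracy:", rbf.score(X, y))        # high: separable after the kernel mapping
```

The kernel trick computes inner products in the higher-dimensional space without ever constructing that space explicitly, which is why non-linear SVMs remain tractable.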
Thanks
