23.0 Logistic Regression-6

LOGISTIC REGRESSION & SVM (EXAMPLE)

LOGISTIC REGRESSION
EXAMPLE:
• The dataset of pass or fail in an exam of 5 students is given in the table:

  Hours Studied   Pass(1)/Fail(0)
       29               0
       15               0
       33               1
       28               1
       29               1

• Use logistic regression as a classifier to answer the following questions:

1. Calculate the probability of passing for a student who studied 33 hours.
2. At least how many hours should a student study so that he will pass the course with a probability of more than 95%?

Assume the model suggested by the optimizer for the odds of passing the course is:

log(odds) = -64 + 2*hours
LOGISTIC REGRESSION
• We use a sigmoid function in logistic regression:

σ(z) = 1 / (1 + e^(-z))
LOGISTIC REGRESSION
1. Calculate the probability of passing for a student who studied 33 hours.

The sigmoid function can be written in the form of a probability as:

P(pass) = 1 / (1 + e^(-z)),  where  log(odds) = z = -64 + 2*hours

• Now we know the value of z:
• z = -64 + 2*33 = -64 + 66 = 2
• P(pass) = 1 / (1 + e^(-2)) = 0.88
• That is, if a student studies 33 hours, then there is an 88% chance that he passes.


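As a quick check, the probability above can be reproduced in Python (the log-odds model is the one given on the slide; the function name is illustrative):

```python
import math

def pass_probability(hours):
    """Sigmoid of the fitted log-odds from the slide: z = -64 + 2*hours."""
    z = -64 + 2 * hours
    return 1 / (1 + math.exp(-z))

print(round(pass_probability(33), 2))  # 0.88
```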
LOGISTIC REGRESSION
2. At least how many hours should a student study so that he will pass the course with a probability of more than 95%?

• Set 1 / (1 + e^(-z)) = 0.95
• 0.95 * (1 + e^(-z)) = 1
• 0.95 * e^(-z) = 1 - 0.95 = 0.05
• e^(-z) = 0.05 / 0.95 = 0.0526
• Now we have to calculate the value of z
• We know, ln(e^(-z)) = -z
• ln(e^(-z)) = ln(0.0526)
• -z = ln(0.0526) = -2.94
• z = 2.94

LOGISTIC REGRESSION
• z = 2.94
• log(odds) = z = -64 + 2*hours
• 2.94 = -64 + 2*hours
• 2*hours = 2.94 + 64
• 2*hours = 66.94
• hours = 66.94 / 2 = 33.47 hours

The student should study at least 33.47 hours, so that he will pass the exam with more than 95% probability.
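The same answer follows by inverting the sigmoid in code (a sketch; logit(0.95) is computed directly instead of via the 0.0526 intermediate value):

```python
import math

p_target = 0.95
z = math.log(p_target / (1 - p_target))  # logit(0.95) = ln(19) ≈ 2.94
hours = (z + 64) / 2                     # invert log(odds) = -64 + 2*hours
print(round(z, 2), round(hours, 2))      # 2.94 33.47
```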
SUPPORT VECTOR MACHINE-Linear

Suppose we are given the following positively labeled data points:

(3, 1), (3, -1), (6, 1), (6, -1)

and the following negatively labeled data points:

(1, 0), (0, 1), (0, -1), (-1, 0)
SUPPORT VECTOR MACHINE-Linear

By inspection, it should be obvious that there are three support vectors:

s1 = (1, 0),  s2 = (3, 1),  s3 = (3, -1)
SUPPORT VECTOR MACHINE-Linear

• Each vector is augmented with a 1 as a bias input
• So, s1 = (1, 0) becomes s̃1 = (1, 0, 1)
• Similarly, s2 = (3, 1) becomes s̃2 = (3, 1, 1), and s3 = (3, -1) becomes s̃3 = (3, -1, 1)
SUPPORT VECTOR MACHINE-Linear

• Now we need to calculate the values α1, α2, α3, which will be used to calculate the weight vector

• α1 s̃1·s̃1 + α2 s̃2·s̃1 + α3 s̃3·s̃1 = -1  (s1 is in the negatively labeled dataset)
• α1 s̃1·s̃2 + α2 s̃2·s̃2 + α3 s̃3·s̃2 = +1  (s2 is in the positively labeled dataset)
• α1 s̃1·s̃3 + α2 s̃2·s̃3 + α3 s̃3·s̃3 = +1  (s3 is in the positively labeled dataset)

• Putting the values of s̃1, s̃2, s̃3 in the above equations, we get:
SUPPORT VECTOR MACHINE-Linear

• α1 (1,0,1)·(1,0,1) + α2 (3,1,1)·(1,0,1) + α3 (3,-1,1)·(1,0,1) = -1
• α1 (1,0,1)·(3,1,1) + α2 (3,1,1)·(3,1,1) + α3 (3,-1,1)·(3,1,1) = +1
• α1 (1,0,1)·(3,-1,1) + α2 (3,1,1)·(3,-1,1) + α3 (3,-1,1)·(3,-1,1) = +1
SUPPORT VECTOR MACHINE-Linear

• Solving the dot products in the above equations, we get
• 2α1 + 4α2 + 4α3 = -1
• 4α1 + 11α2 + 9α3 = +1
• 4α1 + 9α2 + 11α3 = +1
SUPPORT VECTOR MACHINE-Linear

• 2α1 + 4α2 + 4α3 = -1 ..... (i)
• 4α1 + 11α2 + 9α3 = +1 ..... (ii)
• 4α1 + 9α2 + 11α3 = +1 ..... (iii)

• Now by simplifying equations (i), (ii) & (iii), we get the values:

α1 = -3.5,  α2 = 0.75,  α3 = 0.75
SUPPORT VECTOR MACHINE-Linear

• Now, we need to calculate the weight vector
• w̃ = Σ αi s̃i = -3.5 (1, 0, 1) + 0.75 (3, 1, 1) + 0.75 (3, -1, 1) = (1, 0, -2)
• Finally, remembering that our vectors are augmented with a bias,
• we can equate the last entry in w̃ as the hyperplane offset b and write the separating
• hyperplane equation y = w·x + b
• with w = (1, 0) and b = -2
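The three equations above form the linear system G·α = y, where G is the Gram matrix of the augmented support vectors. A minimal NumPy sketch (support-vector values are those assumed in this worked example; variable names are illustrative):

```python
import numpy as np

# Augmented support vectors (last entry is the bias input)
S = np.array([[1.0, 0.0, 1.0],    # s1, negatively labeled
              [3.0, 1.0, 1.0],    # s2, positively labeled
              [3.0, -1.0, 1.0]])  # s3, positively labeled
y = np.array([-1.0, 1.0, 1.0])    # class labels

G = S @ S.T                    # Gram matrix: G[i, j] = si . sj
alpha = np.linalg.solve(G, y)  # alpha = [-3.5, 0.75, 0.75]
w_tilde = alpha @ S            # augmented weight vector = [1, 0, -2]
print(alpha, w_tilde)
```

The last entry of `w_tilde` is the hyperplane offset b = -2, matching the slide.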
SUPPORT VECTOR MACHINE-(Non-Linear)

Suppose we are given the following positively labeled data points:

(2, 2), (2, -2), (-2, -2), (-2, 2)

and the following negatively labeled data points:

(1, 1), (1, -1), (-1, -1), (-1, 1)
SUPPORT VECTOR MACHINE-(Non-Linear)
• Our goal, again, is to discover a separating hyperplane that accurately discriminates the two classes.
• Of course, it is obvious that no such hyperplane exists in the input space.
• Therefore, we must use a non-linear SVM (that is, we need to convert the data from one feature space to another).
• Following is the equation to convert one feature space to another:

φ(x1, x2) = (4 - x2 + |x1 - x2|, 4 - x1 + |x1 - x2|)   if √(x1² + x2²) > 2
φ(x1, x2) = (x1, x2)   otherwise
SUPPORT VECTOR MACHINE-(Non-Linear)
• Now we consider the positive examples and convert them from one feature space to another:
• For that we have to check the condition √(x1² + x2²) > 2
• Now we put in the values of x1 and x2 and check whether they satisfy the condition
• If they satisfy the condition, then we use the first (remapping) equation
SUPPORT VECTOR MACHINE-(Non-Linear)
• Now we solve the equation for the positive examples (each has norm 2√2 > 2, so they are remapped):

φ(2, 2) = (2, 2),  φ(2, -2) = (10, 6),  φ(-2, -2) = (6, 6),  φ(-2, 2) = (6, 10)

• Similarly we solve the negative examples (each has norm √2 ≤ 2, so they stay unchanged):

φ(1, 1) = (1, 1),  φ(1, -1) = (1, -1),  φ(-1, -1) = (-1, -1),  φ(-1, 1) = (-1, 1)
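The feature map can be checked with a short Python function (the map and the data points are the ones assumed in this worked example):

```python
import math

def phi(x1, x2):
    """Feature map from the slide: remap points whose norm exceeds 2,
    leave the rest unchanged."""
    if math.hypot(x1, x2) > 2:
        return (4 - x2 + abs(x1 - x2), 4 - x1 + abs(x1 - x2))
    return (x1, x2)

positives = [(2, 2), (2, -2), (-2, -2), (-2, 2)]   # norm 2*sqrt(2) > 2: remapped
negatives = [(1, 1), (1, -1), (-1, -1), (-1, 1)]   # norm sqrt(2) <= 2: unchanged
print([phi(*p) for p in positives])  # [(2, 2), (10, 6), (6, 6), (6, 10)]
print([phi(*p) for p in negatives])  # [(1, 1), (1, -1), (-1, -1), (-1, 1)]
```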
SUPPORT VECTOR MACHINE-(Non-Linear)
• Now we can easily identify the support vectors: s1 = (1, 1) (negative) and s2 = (2, 2) (positive)

• Each vector is augmented with a 1 as a bias input:

s̃1 = (1, 1, 1),  s̃2 = (2, 2, 1)
SUPPORT VECTOR MACHINE-(Non-Linear)
• We need to use again the same equations we used previously:
• α1 s̃1·s̃1 + α2 s̃2·s̃1 = -1
• α1 s̃1·s̃2 + α2 s̃2·s̃2 = +1

• α1 (1,1,1)·(1,1,1) + α2 (2,2,1)·(1,1,1) = -1
• α1 (1,1,1)·(2,2,1) + α2 (2,2,1)·(2,2,1) = +1
SUPPORT VECTOR MACHINE-(Non-Linear)
• After simplifying we get:
• 3α1 + 5α2 = -1
• 5α1 + 9α2 = +1

• Solving, α1 = -7 and α2 = 4
SUPPORT VECTOR MACHINE-(Non-Linear)
• w̃ = Σ αi s̃i = -7 (1, 1, 1) + 4 (2, 2, 1) = (1, 1, -3)
• Finally, remembering that our vectors are augmented with a bias,
• we can equate the last entry in w̃ as the hyperplane offset b and write the separating
• hyperplane equation y = w·x + b
• with w = (1, 1) and b = -3
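As with the linear case, the two equations above can be solved as a small linear system in NumPy (support-vector values are those assumed in this worked example):

```python
import numpy as np

# Augmented support vectors in the mapped feature space
S = np.array([[1.0, 1.0, 1.0],   # s1 = (1, 1), negatively labeled
              [2.0, 2.0, 1.0]])  # s2 = (2, 2), positively labeled
y = np.array([-1.0, 1.0])        # class labels

G = S @ S.T                    # Gram matrix: [[3, 5], [5, 9]]
alpha = np.linalg.solve(G, y)  # alpha = [-7, 4]
w_tilde = alpha @ S            # augmented weight vector = [1, 1, -3]
print(alpha, w_tilde)
```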
