Assignment 3 2022
Deep Learning
Assignment- Week 3
TYPE OF QUESTION: MCQ/MSQ
Number of questions: 10 Total marks: 10 × 1 = 10
______________________________________________________________________________
QUESTION 1:
Find the distance of the 3D point 𝑃 = (−3, 1, 3) from the plane defined by
2𝑥 + 2𝑦 + 5𝑧 + 9 = 0.
a. 3.1
b. 3.5
c. 0
d. ∞ (infinity)
Correct Answer: b
Detailed Solution:
Distance = |2(−3) + 2(1) + 5(3) + 9| / √(2² + 2² + 5²) = |−6 + 2 + 15 + 9| / √33 = 20/√33 ≈ 3.48 ≈ 3.5
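The point-to-plane distance formula can be checked numerically; a minimal sketch:

```python
import math

def point_plane_distance(p, coeffs):
    """Distance from point p = (x, y, z) to the plane
    a*x + b*y + c*z + d = 0, with coeffs = (a, b, c, d)."""
    a, b, c, d = coeffs
    x, y, z = p
    return abs(a * x + b * y + c * z + d) / math.sqrt(a**2 + b**2 + c**2)

print(point_plane_distance((-3, 1, 3), (2, 2, 5, 9)))  # 20/sqrt(33) ~ 3.48, rounds to 3.5
```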
QUESTION 2:
What is the shape of the loss landscape during optimization of SVM?
a. Linear
b. Paraboloid
c. Ellipsoidal
d. Non-convex with multiple possible local minima
Correct Answer: b
Detailed Solution:
In SVM the objective is to find the maximum-margin hyperplane (w) by solving
minimize (1/2)‖w‖² subject to yᵢ(w ∙ xᵢ + b) ≥ 1 for all training points (xᵢ, yᵢ).
The above optimization is a quadratic optimization with a paraboloid landscape for the loss
function.
______________________________________________________________________________
QUESTION 3:
How many local minima can be encountered while solving the optimization for maximizing the
margin in SVM?
a. 1
b. 2
c. ∞ (infinite)
d. 0
Correct Answer: a
Detailed Solution:
In SVM the objective is to find the maximum-margin hyperplane (w) by solving
minimize (1/2)‖w‖² subject to yᵢ(w ∙ xᵢ + b) ≥ 1 for all training points (xᵢ, yᵢ).
The above optimization is a quadratic optimization with a paraboloid landscape for the loss
function. Since the landscape is a paraboloid (convex), there can be only 1 minimum, which is global.
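The convexity argument can be illustrated with a toy one-dimensional quadratic (a sketch, not the full SVM objective): gradient descent reaches the same unique minimum from any starting point.

```python
def descend(start, lr=0.1, steps=200):
    # Minimize f(w) = 0.5 * w**2, a convex paraboloid; its gradient is w.
    w = start
    for _ in range(steps):
        w -= lr * w
    return w

# Different initializations all converge to the single global minimum at w = 0.
print(descend(5.0), descend(-3.0))
```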
______________________________________________________________________________
QUESTION 4:
Which of the following classifiers can be replaced by a linear SVM?
a. Logistic Regression
b. Neural Networks
c. Decision Trees
d. None of the above
Correct Answer: a
Detailed Solution:
Logistic regression is a linear classifier, which means its decision boundary can segregate
classes only if they are linearly separable. A linear SVM is also capable of doing so and thus
can be used instead of a logistic regression classifier. Neural networks and decision trees are
capable of modeling non-linear decision boundaries, which a linear SVM cannot model directly.
______________________________________________________________________________
QUESTION 5:
Find the scalar projection of vector b = <−2, 3> onto vector a = <1, 2>.
a. 0
b. 4/√5
c. 2/√17
d. −2/17
Correct Answer: b
Detailed Solution:
The scalar projection of b onto a is given by the scalar value (𝒃 ∙ 𝒂)/|𝒂| = (−2·1 + 3·2)/√(1² + 2²) = 4/√5
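The computation above can be sketched in a few lines:

```python
import math

def scalar_projection(b, a):
    """Scalar projection of vector b onto vector a: (b . a) / |a|."""
    dot = sum(bi * ai for bi, ai in zip(b, a))
    return dot / math.sqrt(sum(ai * ai for ai in a))

print(scalar_projection((-2, 3), (1, 2)))  # 4/sqrt(5) ~ 1.789
```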
____________________________________________________________________________
QUESTION 6:
For a 2-class problem, what is the minimum possible number of support vectors? Assume there
are more than 4 examples from each class.
a. 4
b. 1
c. 2
d. 8
Correct Answer: c
Detailed Solution:
The maximum-margin hyperplane must be supported by at least one training example from each
of the two classes, so the minimum possible number of support vectors is 2.
____________________________________________________________________________
NPTEL Online Certification Courses
Indian Institute of Technology Kharagpur
QUESTION 7:
Which one of the following is a valid representation of hinge loss (of margin = 1) for a two-class
problem?
Correct Answer: a
Detailed Solution:
Hinge loss, max(0, 1 − y·p), yields a value of 0 if the predicted output (p) has the same sign as
the class label (y) and satisfies the margin condition y·p ≥ 1. If the margin condition is violated,
the loss increases linearly with the size of the violation. Option (a) satisfies the above criteria.
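The behavior of hinge loss can be sketched directly from its definition:

```python
def hinge_loss(y, p, margin=1.0):
    # Hinge loss for label y in {-1, +1} and raw prediction p.
    return max(0.0, margin - y * p)

print(hinge_loss(1, 2.0))   # 0.0: correct sign, margin satisfied
print(hinge_loss(1, 0.5))   # 0.5: correct sign, but inside the margin
print(hinge_loss(1, -1.0))  # 2.0: wrong sign, loss grows linearly
```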
______________________________________________________________________________
QUESTION 8:
Suppose we have one feature x ∈ R and a binary class label y. The dataset consists of 3 points: p1: (x1,
y1) = (−1, −1), p2: (x2, y2) = (1, 1), p3: (x3, y3) = (3, 1). Which of the following is true with respect
to SVM?
a. Maximum margin will increase if we remove the point p2 from the training
set.
b. Maximum margin will increase if we remove the point p3 from the training
set.
c. Maximum margin will remain same if we remove the point p2 from the
training set.
d. None of the above.
Correct Answer: a
Detailed Solution:
Here the point p2 is a support vector: the support vectors are p1 = −1 and p2 = 1, giving a
separating boundary at x = 0 with margin 1. If we remove p2, the closest points become p1 = −1
and p3 = 3, so the maximum margin will increase.
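For 1-D linearly separable data, the half-margin of the maximum-margin separator is simply half the gap between the closest points of opposite class, which makes the effect of removing p2 easy to check:

```python
def max_margin_1d(points):
    """Half-margin of the max-margin separator for 1-D, linearly separable data.
    points: list of (x, y) with y in {-1, +1}."""
    neg = max(x for x, y in points if y == -1)
    pos = min(x for x, y in points if y == +1)
    return (pos - neg) / 2.0

data = [(-1, -1), (1, 1), (3, 1)]
print(max_margin_1d(data))                            # 1.0 with all three points
print(max_margin_1d([p for p in data if p[0] != 1]))  # 2.0 after removing p2
```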
____________________________________________________________________________
QUESTION 9:
If we employ SVM to realize two-input logic gates, then which of the following will be true?
a. The weight vector for the AND gate and the OR gate will be the same.
b. The margin for the AND gate and the OR gate will be the same.
c. Both the margin and the weight vector will be the same for the AND gate
and the OR gate.
d. Neither the weight vector nor the margin will be the same for the AND gate
and the OR gate.
Correct Answer: b
Detailed Solution:
For the AND gate the maximum-margin separator is x1 + x2 = 1.5, while for the OR gate it is
x1 + x2 = 0.5. Although the weight vectors (including the bias) are not the same, the margin is
the same.
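As a sketch, the canonical hard-margin solutions for the two gates (an assumption here: labels in {−1, +1}, with w = (2, 2) and bias −3 for AND, −1 for OR) can be verified against the SVM constraints, and their margins compared:

```python
import math

def check(w, b, data):
    # Every point must satisfy y * (w . x + b) >= 1 in a hard-margin SVM.
    return all(y * (w[0] * x1 + w[1] * x2 + b) >= 1 for (x1, x2), y in data)

AND = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
OR  = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

w = (2, 2)
print(check(w, -3, AND), check(w, -1, OR))  # both constraint sets are satisfied
print(2 / math.hypot(*w))                   # margin width 2/||w||, same for both gates
```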
______________________________________________________________________________
QUESTION 10:
What will happen to the margin length of a max-margin linear SVM if one of the
non-support-vector training examples is removed?
Correct Answer: c
Detailed Solution:
In a max-margin linear SVM, the separating hyperplane is determined only by the
training examples that are support vectors. The non-support-vector training examples do
not influence the geometry of the separating plane. Thus, the margin, in our case, will be
unaltered.
____________________________________________________________________________
************END*******