2019-20-I Q4_key
Instructions:
1. This question paper contains 1 page (2 sides of paper). Please verify.
2. Write your name, roll number, and department above neatly in block letters, in ink.
3. Write your final answers neatly with a blue or black pen; pencil marks may get smudged.
4. Do not overwrite or scratch out answers, especially in the MCQs. We will entertain no requests for leniency.
5. Do not rush to fill in answers. You have enough time to solve this quiz.
Q1. Write T or F for True/False (write only in the box on the right-hand side). (8 × 2 = 16 marks)

1. The Adagrad method is a technique for choosing an appropriate batch size when training a deep network. [F]
2. The largest value the Gaussian kernel can take on any two points depends on the value of the bandwidth parameter used within the kernel. [F]
3. k-means++ initialization is one of the algorithms that cannot be kernelized easily since it involves probabilities and sampling. [F]
4. Suppose $G$ is the Gram matrix of $n$ data points $\mathbf{x}_1, \ldots, \mathbf{x}_n \in \mathbb{R}^2$ with respect to the homogeneous polynomial kernel of degree $p = 2$. Then $G$ must be positive semi-definite. [T]
5. If for some $\mathbf{w}^*$ we have $y_i = \langle \mathbf{w}^*, \mathbf{x}_i \rangle$ for $i \in [n]$, then kernel regression with $K(\mathbf{x}, \mathbf{y}) = (\langle \mathbf{x}, \mathbf{y} \rangle + 1)^2$ cannot get zero training error w.r.t. the least-squares loss on this data. [F]
6. Kernel k-means clustering with the quadratic kernel results in a larger model size than what is possible if we had done linear k-means (i.e., with the linear kernel). [T]
7. A neural network with a single hidden layer and a single output node, where all nodes except the input-layer nodes use ReLU activation, will always learn a differentiable function. [F]
8. Dropout is a technique that takes a training set and randomly drops training points to reduce the training set size so that training can be done faster. [F]
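Statements 2, 4, and 5 above can be verified numerically. The following is a minimal NumPy sketch, not part of the original key; the sample points, the bandwidth values, and the vector $\mathbf{w}^* = (1.5, -2.0)$ are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 2))  # 5 distinct points in R^2 (illustrative)

# Statement 2 (F): the Gaussian kernel attains its maximum value 1 at x = y
# for every bandwidth, since exp(-gamma * 0) = 1 regardless of gamma.
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)  # squared distances
for gamma in [0.1, 1.0, 10.0]:
    Kg = np.exp(-gamma * D2)
    assert np.isclose(Kg.max(), 1.0)  # max is always 1, independent of gamma

# Statement 4 (T): the Gram matrix of the homogeneous degree-2 polynomial
# kernel K(x, y) = <x, y>^2 is positive semi-definite.
G = (X @ X.T) ** 2
assert np.all(np.linalg.eigvalsh(G) >= -1e-8)  # non-negative up to round-off

# Statement 5 (F): the feature map of K(x, y) = (<x, y> + 1)^2 contains the
# linear monomials, so a linearly generated target y_i = <w*, x_i> can be fit
# exactly, i.e. kernel regression CAN reach zero least-squares training error.
w_star = np.array([1.5, -2.0])  # hypothetical w*, chosen for illustration
y = X @ w_star
K = (X @ X.T + 1.0) ** 2  # invertible here: points are in general position
alpha = np.linalg.solve(K, y)
assert np.allclose(K @ alpha, y)  # zero training error
```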
Q2. Suppose we have $n$ distinct data points $\mathbf{x}_1, \ldots, \mathbf{x}_n \in \mathbb{R}^2$. Consider the Gram matrix $G$ w.r.t. the Gaussian kernel $K(\mathbf{x}, \mathbf{y}) = \exp(-\gamma \cdot \|\mathbf{x} - \mathbf{y}\|_2^2)$. Answer in the boxes only. (6 marks)
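The standard properties of this Gram matrix are easy to check numerically. A minimal sketch follows; the four sample points and $\gamma = 0.5$ are assumptions made for illustration. Every diagonal entry equals 1, and for distinct points $G$ is strictly positive definite, hence full rank.

```python
import numpy as np

# Illustrative assumptions: 4 arbitrary distinct points in R^2 and gamma = 0.5.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 1.0]])
gamma = 0.5

D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)  # squared Euclidean distances
G = np.exp(-gamma * D2)                                   # Gaussian-kernel Gram matrix

assert np.allclose(np.diag(G), 1.0)        # K(x, x) = exp(0) = 1 on the diagonal
assert np.all(np.linalg.eigvalsh(G) > 0)   # strictly PD since the points are distinct
assert np.linalg.matrix_rank(G) == len(X)  # hence full rank
```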
- - - - - - - - - - - - - - - END OF QUIZ - - - - - - - - - - - - - - -