
Assignment 11

Introduction to Machine Learning


Prof. B. Ravindran
1. Given n samples $x_1, x_2, \ldots, x_n$ drawn independently from an Exponential distribution with unknown parameter $\lambda$, find the MLE of $\lambda$.
(a) $\lambda_{MLE} = \sum_{i=1}^{n} x_i$
(b) $\lambda_{MLE} = n \sum_{i=1}^{n} x_i$
(c) $\lambda_{MLE} = \frac{n}{\sum_{i=1}^{n} x_i}$
(d) $\lambda_{MLE} = \frac{\sum_{i=1}^{n} x_i}{n}$
(e) $\lambda_{MLE} = \frac{n-1}{\sum_{i=1}^{n} x_i}$
(f) $\lambda_{MLE} = \frac{\sum_{i=1}^{n} x_i}{n-1}$

Sol. (c)

\[
L(\lambda, x_1, \ldots, x_n) = \prod_{i=1}^{n} f(x_i, \lambda) = \prod_{i=1}^{n} \lambda e^{-\lambda x_i} = \lambda^n e^{-\lambda \sum_{i=1}^{n} x_i}
\]

\[
\frac{d \ln\left(L(\lambda, x_1, \ldots, x_n)\right)}{d\lambda} = \frac{d \ln\left(\lambda^n e^{-\lambda \sum_{i=1}^{n} x_i}\right)}{d\lambda} = \frac{d\left(n \ln(\lambda) - \lambda \sum_{i=1}^{n} x_i\right)}{d\lambda} = \frac{n}{\lambda} - \sum_{i=1}^{n} x_i
\]

Set the above term to zero to obtain the MLE of $\lambda$:

\[
\lambda_{MLE} = \frac{n}{\sum_{i=1}^{n} x_i}
\]
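A minimal Python sketch (assuming NumPy is available) that checks this estimator numerically against a known rate:

```python
import numpy as np

rng = np.random.default_rng(0)
true_lam = 2.5
# NumPy parameterizes the exponential by scale = 1 / lambda
x = rng.exponential(scale=1.0 / true_lam, size=100_000)

lam_mle = len(x) / x.sum()  # closed-form MLE: n / sum(x_i)
print(f"true lambda = {true_lam}, MLE = {lam_mle:.4f}")
```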

2. Given n samples $x_1, x_2, \ldots, x_n$ drawn independently from a Geometric distribution with unknown parameter $p$, given by the pmf $\Pr(X = k) = (1 - p)^{k-1} p$ for $k = 1, 2, 3, \ldots$, find the MLE of $p$.
(a) $p_{MLE} = \sum_{i=1}^{n} x_i$
(b) $p_{MLE} = n \sum_{i=1}^{n} x_i$
(c) $p_{MLE} = \frac{n}{\sum_{i=1}^{n} x_i}$
(d) $p_{MLE} = \frac{\sum_{i=1}^{n} x_i}{n}$
(e) $p_{MLE} = \frac{n-1}{\sum_{i=1}^{n} x_i}$
(f) $p_{MLE} = \frac{\sum_{i=1}^{n} x_i}{n-1}$

Sol. (c)
The log-likelihood is $n \ln p + \left(\sum_{i=1}^{n} x_i - n\right) \ln(1 - p)$; setting its derivative $\frac{n}{p} - \frac{\sum_{i=1}^{n} x_i - n}{1 - p}$ to zero gives $p_{MLE} = \frac{n}{\sum_{i=1}^{n} x_i}$.
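The same numerical check works for the geometric case; a minimal sketch (assuming NumPy, whose geometric sampler is supported on $k = 1, 2, 3, \ldots$ as in the question):

```python
import numpy as np

rng = np.random.default_rng(0)
true_p = 0.3
x = rng.geometric(p=true_p, size=100_000)  # samples in {1, 2, 3, ...}

p_mle = len(x) / x.sum()  # closed-form MLE: n / sum(x_i)
print(f"true p = {true_p}, MLE = {p_mle:.4f}")
```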

3. (2 marks) Suppose we are trying to model a $p$-dimensional Gaussian distribution. What is the actual number of independent parameters that need to be estimated for the mean and the covariance matrix, respectively?
(a) 1, 1
(b) p − 1, 1
(c) p, p
(d) p, p(p + 1)
(e) p, p(p + 1)/2
(f) p, (p + 3)/2
(g) p − 1, p(p + 1)
(h) p − 1, p(p + 1)/2 + 1
(i) p − 1, (p + 3)/2
(j) p, p(p + 1) − 1
(k) p, p(p + 1)/2 − 1
(l) p, (p + 3)/2 − 1
(m) p, p2
(n) p, p2 /2
(o) None of these
Sol. (e)
Explanation: The mean vector has $p$ parameters. The covariance matrix is symmetric ($p \times p$) and hence has $\frac{p(p+1)}{2}$ independent parameters.
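The count can be verified directly; a small sketch (assuming NumPy) that enumerates the free entries of a symmetric $p \times p$ matrix:

```python
import numpy as np

p = 4
mean_params = p
# free entries of a symmetric matrix = upper triangle including the diagonal
cov_params = len(np.triu_indices(p)[0])
print(mean_params, cov_params)           # 4 10
assert cov_params == p * (p + 1) // 2    # p(p+1)/2
```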

4. (2 marks) Given n samples $x_1, x_2, \ldots, x_n$ drawn independently from a Poisson distribution with unknown parameter $\lambda$, find the MLE of $\lambda$.

(a) $\lambda_{MLE} = \sum_{i=1}^{n} x_i$
(b) $\lambda_{MLE} = n \sum_{i=1}^{n} x_i$
(c) $\lambda_{MLE} = \frac{n}{\sum_{i=1}^{n} x_i}$
(d) $\lambda_{MLE} = \frac{\sum_{i=1}^{n} x_i}{n}$
(e) $\lambda_{MLE} = \frac{n-1}{\sum_{i=1}^{n} x_i}$
(f) $\lambda_{MLE} = \frac{\sum_{i=1}^{n} x_i}{n-1}$

Sol. (d)
Write the likelihood:
\[
l(\lambda; x) = \prod_{i} \frac{\lambda^{x_i} e^{-\lambda}}{x_i!} = \lambda^{x_1 + x_2 + \cdots + x_n} \frac{e^{-n\lambda}}{x_1! \, x_2! \cdots x_n!}
\]

Take the log, differentiate the log-likelihood with respect to $\lambda$, and set it to 0; this gives $\lambda_{MLE} = \frac{1}{n} \sum_{i=1}^{n} x_i$.
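A minimal numerical check of the Poisson MLE (assuming NumPy):

```python
import numpy as np

rng = np.random.default_rng(0)
true_lam = 4.2
x = rng.poisson(lam=true_lam, size=100_000)

lam_mle = x.mean()  # closed-form MLE: the sample mean
print(f"true lambda = {true_lam}, MLE = {lam_mle:.4f}")
```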
5. (2 marks) In Gaussian Mixture Models, $\pi_i$ are the mixing coefficients. Select the correct conditions that the mixing coefficients need to satisfy for a valid GMM model.
(a) $-1 \leq \pi_i \leq 1, \ \forall i$
(b) $0 \leq \pi_i \leq 1, \ \forall i$
(c) $\sum_i \pi_i = 1$
(d) $\sum_i \pi_i$ need not be bounded

Sol. (b), (c)
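To make conditions (b) and (c) concrete, a small illustrative sketch (assuming NumPy; the softmax construction used here is one common way to obtain valid coefficients, not the only one):

```python
import numpy as np

scores = np.array([0.5, -1.2, 2.0])         # arbitrary real-valued scores
pi = np.exp(scores) / np.exp(scores).sum()  # softmax -> valid mixing coefficients

assert np.all((0 <= pi) & (pi <= 1))  # condition (b)
assert np.isclose(pi.sum(), 1.0)      # condition (c)
print(pi)
```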

6. (2 marks) Expectation-Maximization, or the EM algorithm, consists of two steps: the E-step and the M-step. Using the following notation, select the correct set of equations used at each step of the algorithm.
Notation.
X: Known/Given variables/data
Z: Hidden/Unknown variables
θ: Total set of parameters to be learned
θk : Values of all the parameters after stage k
$Q(\cdot, \cdot)$: The Q-function as described in the lectures
(a) E-step: $E_{Z|X,\theta}\left[\log(\Pr(X, Z|\theta_m))\right]$
(b) E-step: $E_{Z|X,\theta_{m-1}}\left[\log(\Pr(X, Z|\theta))\right]$
(c) M-step: $\operatorname{argmax}_{\theta} \sum_{Z} \Pr(Z|X, \theta_{m-2}) \cdot \log(\Pr(X, Z|\theta))$
(d) M-step: $\operatorname{argmax}_{\theta} Q(\theta, \theta_{m-1})$
(e) M-step: $\operatorname{argmax}_{\theta} Q(\theta, \theta_{m-2})$
Sol. (b), (d)
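The two steps can be seen side by side in code; a minimal 1-D GMM EM sketch (assuming NumPy; the function and variable names are illustrative, not from the lectures):

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=50, seed=0):
    """Illustrative EM for a 1-D Gaussian mixture."""
    rng = np.random.default_rng(seed)
    n = len(x)
    pi = np.full(k, 1.0 / k)                   # mixing coefficients
    mu = rng.choice(x, size=k, replace=False)  # initial means
    var = np.full(k, x.var())                  # initial variances

    for _ in range(iters):
        # E-step: responsibilities r[i, j] = Pr(z_i = j | x_i, theta_{m-1}),
        # the posterior over Z used in E_{Z|X, theta_{m-1}}[log Pr(X, Z | theta)]
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = pi * dens
        r /= r.sum(axis=1, keepdims=True)

        # M-step: argmax_theta Q(theta, theta_{m-1}) has a closed form for GMMs
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 500), rng.normal(3, 0.5, 500)])
print(em_gmm_1d(x))
```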
