Deep Learning Assignment 5 Solutions
1. Consider the following matrix:

      A = \begin{pmatrix} 1 & 1 & 2 \\ 1 & 2 & 1 \\ 2 & 1 & 1 \end{pmatrix}

   Which of the following vectors is not an eigenvector of this matrix?
   A. (1, 1, 1)^T
   B. (1, -2, 1)^T
   C. (-1, 0, 1)^T
   D. (-1, 1, 0)^T
Solution: Option D is the correct answer. For each of the vectors given in the
options you can compute the product Ax. For example, consider x = (1, 1, 1)^T:

   Ax = \begin{pmatrix} 1 & 1 & 2 \\ 1 & 2 & 1 \\ 2 & 1 & 1 \end{pmatrix}
        \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}
      = \begin{pmatrix} 4 \\ 4 \\ 4 \end{pmatrix}
      = 4 \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix} = 4x

Hence, x = (1, 1, 1)^T is an eigenvector of A with eigenvalue 4. Similarly, you
can show that x = (1, -2, 1)^T and x = (-1, 0, 1)^T are also eigenvectors of this
matrix, with the corresponding eigenvalues being 1 and -1 respectively. You can
then also check that x = (-1, 1, 0)^T is not an eigenvector of this matrix:
A(-1, 1, 0)^T = (0, 1, -1)^T, which is not a scalar multiple of (-1, 1, 0)^T.
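The check above can be sketched in plain Python (no external libraries): compute Ax for each option and test whether the result is a scalar multiple of x, which is the definition of an eigenvector.

```python
# Check each option by computing A x and testing whether Ax is proportional to x.
A = [[1, 1, 2], [1, 2, 1], [2, 1, 1]]

def is_eigenvector(x):
    Ax = [sum(A[i][j] * x[j] for j in range(3)) for i in range(3)]
    # Ax and x are proportional iff all 2x2 cross terms vanish
    return all(Ax[i] * x[j] == Ax[j] * x[i] for i in range(3) for j in range(3))

for x in ([1, 1, 1], [1, -2, 1], [-1, 0, 1], [-1, 1, 0]):
    print(x, is_eigenvector(x))  # only (-1, 1, 0) prints False
```

The proportionality test avoids division, so it also works when some components of x are zero.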
2. Consider a square matrix A \in R^{3\times 3} such that A^T = A. My friend told me
   that the following three vectors are the eigenvectors of this matrix A:

      x = (-1, 1, 1)^T,  y = (1, 1, 1)^T,  z = (1, 1, -1)^T

   Is my friend telling the truth?
A. Yes
B. No
C. Can’t say without knowing all the elements of A
D. Yes, only if all the diagonal elements of A are 1
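One relevant fact can be checked directly: eigenvectors of a real symmetric matrix that correspond to distinct eigenvalues must be orthogonal. A minimal plain-Python sketch of the pairwise dot products of the three claimed eigenvectors:

```python
# Pairwise dot products of the three claimed eigenvectors.
# For a real symmetric matrix, eigenvectors belonging to distinct
# eigenvalues must be orthogonal (dot product zero).
x = [-1, 1, 1]
y = [1, 1, 1]
z = [1, 1, -1]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

print(dot(x, y), dot(y, z), dot(x, z))  # 1 1 -1: no pair is orthogonal
```

Since no pair is orthogonal, these three vectors cannot all be eigenvectors of a symmetric A belonging to three distinct eigenvalues.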
3. Consider the following matrix:

      A = \begin{pmatrix} 1 & 1 & 2 \\ 1 & 2 & 1 \\ 2 & 1 & 1 \end{pmatrix}

   What can you say about the series x, Ax, A^2 x, A^3 x, \ldots?
A. It will diverge (explode)
B. It will converge (vanish)
C. It will reach a steady state
D. Can’t say without knowing all the elements of x
Solution: Referring to the solution for question 1, we know that the dominant
eigenvalue of this matrix is 4. From slide 10 of Lecture 6 we know that if the
dominant eigenvalue \lambda_d > 1 then the series will diverge (explode) irrespective of
which x we start with. Option A is the correct answer.
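The divergence can be seen numerically: repeatedly applying A to a generic starting vector makes the iterates grow roughly by a factor of 4 (the dominant eigenvalue) per step. A plain-Python sketch (the starting vector below is an arbitrary choice):

```python
# Repeatedly apply A to a starting vector; because the dominant
# eigenvalue is 4 > 1, the iterates grow without bound.
A = [[1, 1, 2], [1, 2, 1], [2, 1, 1]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

x = [1.0, 0.0, 0.0]  # arbitrary starting vector
norms = []
for _ in range(5):
    x = matvec(A, x)
    norms.append(max(abs(c) for c in x))
print(norms)  # each step multiplies the norm by roughly 4
```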
4. Which of the following sets of vectors does not form a valid basis in R^3?

   A. (1, 1, 1)^T, (-1, 0, 1)^T, (1, -2, 1)^T
   B. (1, 2, 3)^T, (3, 2, 1)^T, (1, 4, 5)^T
   C. (1, 3, 1)^T, (4, -1, 2)^T, (6, 5, 4)^T
   D. (3, 5, 7)^T, (1, 2, 2)^T, (4, 7, 4)^T
Solution: Option C is the correct answer. A set of 3 vectors can form a basis in
R^3 if the vectors in the set are linearly independent. Now, consider the vectors
(1, 3, 1)^T, (4, -1, 2)^T, (6, 5, 4)^T. We observe that,

   2 \begin{pmatrix} 1 \\ 3 \\ 1 \end{pmatrix}
   + \begin{pmatrix} 4 \\ -1 \\ 2 \end{pmatrix}
   - \begin{pmatrix} 6 \\ 5 \\ 4 \end{pmatrix}
   = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix}

Hence, the vectors are linearly dependent and thus cannot form a basis in R^3.
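An equivalent test is the determinant of the 3x3 matrix whose columns are the three vectors: it is zero exactly when the vectors are linearly dependent. A plain-Python sketch over all four options:

```python
# 3x3 determinant of the matrix with columns c1, c2, c3:
# it is zero iff the three vectors are linearly dependent.
def det3(c1, c2, c3):
    (a, d, g), (b, e, h), (c, f, i) = c1, c2, c3
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

options = {
    "A": ([1, 1, 1], [-1, 0, 1], [1, -2, 1]),
    "B": ([1, 2, 3], [3, 2, 1], [1, 4, 5]),
    "C": ([1, 3, 1], [4, -1, 2], [6, 5, 4]),
    "D": ([3, 5, 7], [1, 2, 2], [4, 7, 4]),
}
for name, (c1, c2, c3) in options.items():
    print(name, det3(c1, c2, c3))  # only option C gives 0
```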
5. Consider the matrix A:

      A = \begin{pmatrix} 1 & 1 & 2 \\ 1 & 2 & 1 \\ 2 & 1 & 1 \end{pmatrix}

      \min_x \; x^T A x \quad \text{s.t.} \quad \|x\| = 1
Solution: From the Theorem on Slide 26 of Lecture 6 we know that the solution
to the above minimization problem is the eigenvector corresponding to the smallest
eigenvalue of A. From the solution to question 1 we know that the eigenvector
corresponding to the smallest eigenvalue of A is (-1, 0, 1)^T (the corresponding
eigenvalue is -1 and the other two eigenvalues are 1 and 4). Hence Option C is the
correct answer.
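This can be checked by evaluating the quadratic form x^T A x at each normalized eigenvector: for a unit eigenvector the value is exactly the corresponding eigenvalue, and the minimum over the unit sphere is the smallest one. A plain-Python sketch:

```python
import math

# Evaluating x^T A x at each normalized eigenvector of A recovers the
# corresponding eigenvalue; the minimum over the unit sphere is -1,
# attained at the eigenvector (-1, 0, 1)/sqrt(2).
A = [[1, 1, 2], [1, 2, 1], [2, 1, 1]]

def quad_form(v):
    n = math.sqrt(sum(c * c for c in v))
    u = [c / n for c in v]                                    # normalize to ||u|| = 1
    Au = [sum(A[i][j] * u[j] for j in range(3)) for i in range(3)]
    return sum(u[i] * Au[i] for i in range(3))                # u^T A u

for v in ([1, 1, 1], [1, -2, 1], [-1, 0, 1]):
    print(v, round(quad_form(v), 6))  # 4.0, 1.0, -1.0
```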
6. Consider a row stochastic matrix M \in R^{3\times 3}. The sum of the elements of each
   row of this matrix is 1. Is the vector x = (1, 1, 1)^T an eigenvector of this matrix?
A. Yes
B. No
C. Can’t say without knowing the elements of M
D. Yes, only if each row represents a uniform distribution
Solution: Option A is the correct answer. Any row stochastic matrix M \in R^{3\times 3}
will have the following form:

   M = \begin{pmatrix} a & b & 1-(a+b) \\ m & n & 1-(m+n) \\ p & q & 1-(p+q) \end{pmatrix}

Multiplying M by x = (1, 1, 1)^T simply sums each row, so Mx = (1, 1, 1)^T = 1 \cdot x.
Hence x is an eigenvector of M with eigenvalue 1, irrespective of the entries of M.
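A quick numerical sanity check (the particular entries below are an arbitrary illustrative choice of a row stochastic matrix):

```python
# Any row stochastic matrix maps the all-ones vector to itself:
# each entry of Mx is just that row's sum, which is 1.
# The entries below are an arbitrary illustrative choice.
M = [
    [0.25, 0.25, 0.5],
    [0.5, 0.25, 0.25],
    [0.125, 0.375, 0.5],
]
x = [1, 1, 1]
Mx = [sum(M[i][j] * x[j] for j in range(3)) for i in range(3)]
print(Mx)  # [1.0, 1.0, 1.0] == 1 * x, so x is an eigenvector with eigenvalue 1
```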
7. Consider a set of points x_1, x_2, \ldots, x_m \in R^2 represented using the standard
   basis x = (1, 0) and y = (0, 1). Let X \in R^{m\times 2} be a matrix such that
   x_1, x_2, \ldots, x_m are the rows of this matrix. Using PCA, we want to represent
   this data using a new basis. To do so, we find the eigenvectors of X^T X, which
   happen to be u_1 = (1/\sqrt{2}, 1/\sqrt{2})^T and u_2 = (-1/\sqrt{2}, 1/\sqrt{2})^T.
   Now suppose we want to represent one of the m points, say x_i = (2.1, 2.4), using
   only u_1 (i.e., we want to represent the data using fewer dimensions). What would
   be the squared error in reconstructing x_i using only u_1?
A. 0.045
B. 0.030
C. 0.015
D. 0
Solution: Consider the point x_i = (2.1, 2.4) represented using the standard basis
x = (1, 0) and y = (0, 1). We want to represent it using only
u_1 = (1/\sqrt{2}, 1/\sqrt{2})^T. To do this we compute,

   \alpha_1 = x_i^T u_1 = \frac{4.5}{\sqrt{2}}

   \alpha_2 = x_i^T u_2 = \frac{0.3}{\sqrt{2}}

We can then see that,

   x_i = \alpha_1 u_1 + \alpha_2 u_2 = (2.1, 2.4)^T

This is the full error-free reconstruction of x_i using both u_1 and u_2. However,
in the question we are asked to reconstruct x_i using only u_1. Hence, we get,

   \hat{x}_i = \alpha_1 u_1 = (2.25, 2.25)^T

We can now compute the squared error between x_i and \hat{x}_i as,

   \|x_i - \hat{x}_i\|^2 = (2.1 - 2.25)^2 + (2.4 - 2.25)^2 = 0.0225 + 0.0225 = 0.045

Hence, Option A is the correct answer.
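The computation above can be reproduced in plain Python, following the basis and the point given in the question:

```python
import math

# Reconstruct x_i = (2.1, 2.4) from its projection onto u_1 alone
# and measure the squared reconstruction error.
s = 1 / math.sqrt(2)
u1 = [s, s]
xi = [2.1, 2.4]

alpha1 = xi[0] * u1[0] + xi[1] * u1[1]      # projection coefficient, 4.5/sqrt(2)
xhat = [alpha1 * u1[0], alpha1 * u1[1]]     # reconstruction, (2.25, 2.25)
sq_err = (xi[0] - xhat[0]) ** 2 + (xi[1] - xhat[1]) ** 2
print(round(sq_err, 6))  # 0.045
```

The discarded direction u_2 carries the coefficient 0.3/\sqrt{2}, and the squared error equals that coefficient squared: (0.3/\sqrt{2})^2 = 0.09/2 = 0.045.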