Lecture 10: SVD

This lecture presents the Singular Value Decomposition (SVD): every matrix can be factorized into two orthogonal matrices and a diagonal matrix whose diagonal entries are the singular values. It proves the existence of the SVD for arbitrary matrices, applies it to least squares solutions of linear systems, introduces the pseudoinverse, and discusses convex and non-convex quadratic minimization problems and their relation to positive semidefinite matrices.


The Singular Value Decomposition

Motivation

• We just spent a lot of time looking at interesting consequences of the observation that any symmetric matrix $A$ can be factorized as $A = Q \Lambda Q^T$ for some orthogonal matrix $Q$ and some diagonal matrix $\Lambda$

• Is there a generalization of this for arbitrary matrices?

• Yes! It is based on the same eigenvector/eigenvalue construction as above, but the resulting factorization isn't symmetric
The Singular Value Decomposition
• Every matrix $A \in \mathbb{R}^{m \times n}$ can be factorized as $A = U \Sigma V^T$, where $U \in \mathbb{R}^{m \times m}$ and $V \in \mathbb{R}^{n \times n}$ are orthogonal matrices and $\Sigma \in \mathbb{R}^{m \times n}$ is a diagonal matrix
• The diagonal entries $\sigma_i = \Sigma_{ii}$ are called the singular values of the matrix
• We'll show that we can use this decomposition similarly to the eigenvalue decomposition, e.g., for finding least squares solutions of linear systems
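
A quick numerical check of the factorization (my numpy sketch, not from the slides):

import numpy as np

# A minimal sketch: verify A = U Sigma V^T for a random 5x3 matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 3))

# full_matrices=True returns square orthogonal U (5x5) and V (3x3);
# numpy returns V^T directly as Vt and the singular values as a 1-D array.
U, sigma, Vt = np.linalg.svd(A, full_matrices=True)

# Embed the singular values in a 5x3 "diagonal" matrix Sigma.
Sigma = np.zeros_like(A)
Sigma[:3, :3] = np.diag(sigma)

assert np.allclose(A, U @ Sigma @ Vt)      # A = U Sigma V^T
assert np.allclose(U.T @ U, np.eye(5))     # U is orthogonal
assert np.allclose(Vt @ Vt.T, np.eye(3))   # V is orthogonal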
Existence of the SVD
• Let $A \in \mathbb{R}^{m \times n}$ be an arbitrary matrix and consider the positive semidefinite matrix $A^T A$
• For now, let's assume that it is full rank, i.e., it has no zero
eigenvalues
• It can be written as $A^T A = \sum_i \lambda_i v^{(i)} v^{(i)T}$, where the $v^{(i)}$ are orthonormal eigenvectors in $\mathbb{R}^n$
• Define $\sigma_i = \sqrt{\lambda_i}$
• Let $u^{(i)} = A v^{(i)} / \sigma_i$. Note that $\|u^{(i)}\|_2 = 1$ and $u^{(i)T} u^{(j)} = 0$ for $i \neq j$, since $u^{(i)T} u^{(j)} = v^{(i)T} A^T A v^{(j)} / (\sigma_i \sigma_j) = \lambda_j v^{(i)T} v^{(j)} / (\sigma_i \sigma_j)$
Existence of the SVD
• Let $A$ be an arbitrary matrix and consider the positive
semidefinite matrix $A^T A$
• Let $u^{(i)} = A v^{(i)} / \sigma_i$. Note that $A v^{(i)} = \sigma_i u^{(i)}$ and $A A^T u^{(i)} = A (A^T A) v^{(i)} / \sigma_i = \lambda_i u^{(i)}$
• So the $u^{(i)}$ are orthonormal eigenvectors of $A A^T$
• Let $\Lambda$ be a diagonal matrix with $\Lambda_{ii} = \lambda_i$ and $\Sigma$ be a diagonal matrix with $\Sigma_{ii} = \sigma_i = \sqrt{\lambda_i}$
• Now, by construction, $A = U \Sigma V^T$, where $U$ and $V$ are the orthogonal matrices with columns $u^{(i)}$ and $v^{(i)}$: indeed, $U \Sigma V^T = \sum_i \sigma_i u^{(i)} v^{(i)T} = \sum_i A v^{(i)} v^{(i)T} = A$
Existence of the SVD

• Finally, if the positive semidefinite matrix $A^T A$ is not full rank, we can apply the same argument by constructing one $u^{(i)}$ for each of the non-zero eigenvalues and then extending the $u^{(i)}$ and $v^{(i)}$ to orthonormal bases
• We keep the diagonal entries of both $\Lambda$ and $\Sigma$ equal to zero for vectors in the extension

*This nice argument was taken from a blog post of Gregory Gundersen
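
The proof is constructive, so it can be run directly. A minimal numpy sketch (my illustration, for a full-column-rank A):

import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 3))   # full column rank with probability 1

# Orthonormal eigenvectors v_i and eigenvalues lambda_i of A^T A
# (eigh handles symmetric matrices, eigenvalues in ascending order).
lam, V = np.linalg.eigh(A.T @ A)

sigma = np.sqrt(lam)              # sigma_i = sqrt(lambda_i)
U = A @ V / sigma                 # u_i = A v_i / sigma_i, columnwise

# The u_i come out orthonormal, and A = U Sigma V^T by construction.
assert np.allclose(U.T @ U, np.eye(3))
assert np.allclose(A, U @ np.diag(sigma) @ V.T)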
The Singular Value Decomposition
• Every matrix $A$ can be factorized as $A = U \Sigma V^T$, where $U$ and $V$ are orthogonal matrices and $\Sigma$ is a diagonal matrix
• The diagonal entries of $\Sigma$ are called the singular values of the matrix, and they are the square roots of the eigenvalues of $A^T A$ and $A A^T$
• The columns of $V$ are orthonormal eigenvectors of $A^T A$
• The columns of $U$ are orthonormal eigenvectors of $A A^T$
Least Squares Solutions of Linear Systems

$\min_{x \in \mathbb{R}^n} \|x\|_2^2$ subject to $A x = b$, where $A = U \Sigma V^T$

Note that $A x = b \iff \Sigma y = U^T b$, where $y = V^T x$ and $\|y\|_2 = \|x\|_2$, since $V$ is orthogonal

The Moore-Penrose Pseudoinverse
• If the linear system $A x = b$ has a solution, then the solution of minimum norm is given by $x^* = A^+ b$, where $A^+ = V \Sigma^+ U^T$, $A = U \Sigma V^T$ is the singular value decomposition, and $\Sigma^+$ inverts the non-zero diagonal entries of $\Sigma$

• If $A$ is invertible, then $A^+ = A^{-1}$
• All the interesting algebraic properties of $A^+$ follow directly from the definition above, e.g., $A A^+ A = A$
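
A small numpy check of these claims (my sketch; np.linalg.pinv computes $A^+$ via the SVD):

import numpy as np

rng = np.random.default_rng(3)

# Square invertible case: the pseudoinverse coincides with the inverse.
B = rng.standard_normal((4, 4))
assert np.allclose(np.linalg.pinv(B), np.linalg.inv(B))

# Rectangular case: check the Moore-Penrose property A A^+ A = A.
A = rng.standard_normal((5, 3))
A_pinv = np.linalg.pinv(A)
assert np.allclose(A @ A_pinv @ A, A)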
Ex 1: Convex Quadratic Minimization

$\min_{x \in \mathbb{R}^n} \; \frac{1}{2} x^T Q x - b^T x$, where $Q$ is an arbitrary p.s.d. matrix

Either the linear system $Q x = b$ has a solution, and then $x^* = Q^+ b$ attains the minimum, or it has no solution, and the objective is unbounded from below
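
A numerical illustration of the solvable case (my sketch, under the assumption that the problem is $\min \frac{1}{2} x^T Q x - b^T x$ as reconstructed above):

import numpy as np

rng = np.random.default_rng(4)

# A rank-deficient p.s.d. matrix Q, and a b chosen to lie in range(Q).
M = rng.standard_normal((5, 3))
Q = M @ M.T                       # 5x5 p.s.d., rank 3
b = Q @ rng.standard_normal(5)    # b in range(Q) by construction

x_star = np.linalg.pinv(Q) @ b    # candidate minimizer x* = Q^+ b

def f(x):
    return 0.5 * x @ Q @ x - b @ x

# Stationarity Q x* = b holds, so x* is a global minimizer of the
# convex objective; random nearby points never do better.
assert np.allclose(Q @ x_star, b)
for _ in range(10):
    assert f(x_star) <= f(x_star + 0.1 * rng.standard_normal(5)) + 1e-12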
Pseudoinverse

$\min_{x \in \mathbb{R}^n} \|A x - b\|_2^2 \;=\; \min_{x \in \mathbb{R}^n} (A x - b)^T (A x - b)$
Pseudoinverse

Expanding, the objective is $x^T A^T A x - 2 b^T A x + b^T b$: this is Ex 1 with the p.s.d. matrix $A^T A$. Either $A^T A x = A^T b$ has a solution, or the problem is unbounded from below; the latter is impossible, since the objective is non-negative
Pseudoinverse

Let $x^* = A^+ b$. Then $A^T A x^* = A^T b$, so $x^*$ minimizes $\|A x - b\|_2^2$

The pseudoinverse solves the norm minimization problem even if there is no solution to $A x = b$!
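
A quick check (my numpy sketch): for an inconsistent overdetermined system, $A^+ b$ matches the least squares solution returned by np.linalg.lstsq.

import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((6, 3))   # overdetermined
b = rng.standard_normal(6)        # generically, A x = b has no solution

x_pinv = np.linalg.pinv(A) @ b
x_lstsq, *_ = np.linalg.lstsq(A, b, rcond=None)

assert np.allclose(x_pinv, x_lstsq)            # same least squares solution
assert np.allclose(A.T @ A @ x_pinv, A.T @ b)  # normal equations hold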
Ex 2: Convex Quadratic Minimization

$\min_{x \in \mathbb{R}^n} \; \frac{1}{2} x^T Q x - b^T x$ subject to $\|x\|_2^2 \le 1$, where $Q$ is an arbitrary p.s.d. matrix

Either the unconstrained minimizer from Ex 1 already satisfies the constraint, or the minimum is attained on the boundary $\|x\|_2^2 = 1$
Ex 2: Convex Quadratic Minimization

$\min_x \; \frac{1}{2} x^T Q x - b^T x$ subject to $\|x\|_2^2 = 1$

Writing $Q = \sum_i \mu_i x^{(i)} x^{(i)T}$ with orthonormal eigenvectors $x^{(i)}$ and changing variables to $y_i = x^{(i)T} x$, this becomes

$\min_y \; \sum_i \left( \tfrac{1}{2} \mu_i y_i^2 - (x^{(i)T} b) \, y_i \right)$ subject to $\|y\|_2^2 = 1$
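
The spectral decomposition $Q = \sum_i \mu_i x^{(i)} x^{(i)T}$ used above, checked numerically (a minimal sketch of mine):

import numpy as np

rng = np.random.default_rng(6)
M = rng.standard_normal((4, 4))
Q = M @ M.T                       # symmetric p.s.d. matrix

mu, X = np.linalg.eigh(Q)         # eigenpairs of Q

# Q = sum_i mu_i x_i x_i^T, a sum of rank-one outer products.
Q_rebuilt = sum(mu[i] * np.outer(X[:, i], X[:, i]) for i in range(4))
assert np.allclose(Q, Q_rebuilt)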
Non-Convex Quadratic Minimization

$\min_{x \in \mathbb{R}^n} \; \frac{1}{2} x^T Q x - b^T x$ subject to $\|x\|_2^2 \le 1$, where $Q$ is an arbitrary symmetric matrix
Non-Convex Quadratic Minimization

$\min_{x \in \mathbb{R}^n} \; \frac{1}{2} x^T Q x - b^T x$ subject to $\|x\|_2^2 \le 1$, where $Q$ is an arbitrary symmetric matrix

Without the constraint, the objective is bounded from below only if $Q$ is positive semidefinite and the linear system $Q x = b$ has a solution
Non-Convex Quadratic Minimization

$\min_x \; \frac{1}{2} x^T Q x - b^T x$ subject to $\|x\|_2^2 = 1$, where $Q$ is an arbitrary symmetric matrix

On the boundary, adding $\frac{\lambda}{2} \|x\|_2^2$ only shifts the objective by a constant, so this has the same minimizers as

$\min_x \; \frac{1}{2} x^T (Q + \lambda I) x - b^T x$ subject to $\|x\|_2^2 = 1$
Eigenvectors of $Q + \lambda I$

• If $v$ is an eigenvector of $Q$ with eigenvalue $\mu$, then $(Q + \lambda I) v = (\mu + \lambda) v$

• So, $Q + \lambda I$ is positive semidefinite if and only if $\lambda \ge -\mu_{\min}$, where $\mu_{\min}$ is the smallest eigenvalue of $Q$
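
Numerically (my one-screen sketch): shifting $Q$ by $\lambda I$ shifts every eigenvalue by $\lambda$.

import numpy as np

rng = np.random.default_rng(7)
S = rng.standard_normal((4, 4))
Q = (S + S.T) / 2                 # arbitrary symmetric matrix

lam_shift = 2.5
mu = np.linalg.eigvalsh(Q)        # eigenvalues of Q, ascending
mu_shifted = np.linalg.eigvalsh(Q + lam_shift * np.eye(4))

assert np.allclose(mu + lam_shift, mu_shifted)
# Q + lam*I is p.s.d. exactly when lam >= -mu.min().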
Non-Convex Quadratic Minimization

$\min_x \; \frac{1}{2} x^T Q x - b^T x$ subject to $\|x\|_2^2 = 1$, where $Q$ is an arbitrary symmetric matrix

For $\lambda \ge -\mu_{\min}$, the shifted objective is convex, so we are back in the setting of Ex 2:

$\min_x \; \frac{1}{2} x^T (Q + \lambda I) x - b^T x$ subject to $\|x\|_2^2 = 1$
Range of a Matrix
• $Q x = b$ has a solution for a symmetric matrix $Q$ if and only if $b$ is in the span of the eigenvectors of $Q$ with non-zero eigenvalues (note the eigenvectors of $Q + \lambda I$ are the same as the orthonormal eigenvectors of $Q$)
• In this case, $x^* = Q^+ b$ is a solution
• Otherwise, the constraint $Q x = b$ is violated for every $x$, i.e., there exists an eigenvector $v$ of $Q$ such that $Q v = 0$ and $b^T v \neq 0$
• Note that, for large enough $\lambda$, the system $(Q + \lambda I) x = b$ always has a solution; indeed, $Q + \lambda I$ is invertible for all $\lambda > -\mu_{\min}$
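
A hedged numpy sketch of this solvability test (my helper, not from the slides): $b$ is in the range of $Q$ exactly when $Q Q^+ b = b$, since $Q Q^+$ projects onto the range.

import numpy as np

rng = np.random.default_rng(8)
M = rng.standard_normal((5, 2))
Q = M @ M.T                       # symmetric, rank 2: nontrivial null space

def in_range(Q, b, tol=1e-10):
    # Q Q^+ projects onto range(Q); b is in range(Q) iff it is unchanged.
    return np.allclose(Q @ np.linalg.pinv(Q) @ b, b, atol=tol)

b_good = Q @ rng.standard_normal(5)   # in range(Q) by construction
b_bad = rng.standard_normal(5)        # generically has a null-space part

assert in_range(Q, b_good)
assert not in_range(Q, b_bad)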
Non-Convex Quadratic Minimization

$\min_x \; \frac{1}{2} x^T Q x - b^T x$ subject to $\|x\|_2^2 = 1$, where $Q$ is an arbitrary symmetric matrix

$\min_x \; \frac{1}{2} x^T (Q + \lambda I) x - b^T x$ subject to $\|x\|_2^2 = 1$,

where $\lambda \ge -\mu_{\min}$ is chosen so that the candidate solution $x^* = (Q + \lambda I)^+ b$ satisfies $\|x^*\|_2^2 = 1$

How would
you solve this
problem?
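
One natural answer (my sketch, under the reconstruction above, not necessarily the lecture's intended method): $\|(Q + \lambda I)^{-1} b\|$ decreases monotonically in $\lambda$ for $\lambda > -\mu_{\min}$, so the right $\lambda$ can be found by bisection. The function name and the easy-case assumption are mine.

import numpy as np

def solve_sphere_quadratic(Q, b, tol=1e-10):
    # Sketch: min 1/2 x^T Q x - b^T x subject to ||x||_2 = 1, assuming
    # b has a component along the minimal eigenvector (the easy case).
    n = len(b)
    mu_min = np.linalg.eigvalsh(Q)[0]
    lo = -mu_min + 1e-12          # makes Q + lam*I positive definite
    hi = lo + 1.0
    # Grow hi until the solution of (Q + hi*I) x = b is inside the sphere.
    while np.linalg.norm(np.linalg.solve(Q + hi * np.eye(n), b)) > 1:
        hi = lo + 2 * (hi - lo)
    # Bisect on lam: ||x(lam)|| is decreasing in lam on (lo, infinity).
    while hi - lo > tol:
        mid = (lo + hi) / 2
        x = np.linalg.solve(Q + mid * np.eye(n), b)
        if np.linalg.norm(x) > 1:
            lo = mid
        else:
            hi = mid
    return np.linalg.solve(Q + hi * np.eye(n), b)

rng = np.random.default_rng(9)
S = rng.standard_normal((4, 4))
Q = (S + S.T) / 2                 # indefinite symmetric matrix
b = rng.standard_normal(4)
x = solve_sphere_quadratic(Q, b)
print(np.linalg.norm(x))          # ~= 1.0, on the unit sphere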
SVD Reformulated
• Let $A \in \mathbb{R}^{m \times n}$ with SVD $A = U \Sigma V^T = \sum_i \sigma_i u^{(i)} v^{(i)T}$

• Without loss of generality, we can assume that the singular values are in decreasing order, i.e., $\sigma_1 \ge \sigma_2 \ge \cdots \ge 0$
Applications of SVD: Low Rank Approximations

Truncating the sum after the $k$ largest singular values gives a rank-$k$ approximation $A_k = \sum_{i=1}^{k} \sigma_i u^{(i)} v^{(i)T}$

Note: the $\sigma_i$ must be in decreasing order for this!
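
A minimal numpy sketch of truncated-SVD approximation (my example, not from the slides):

import numpy as np

rng = np.random.default_rng(10)
A = rng.standard_normal((8, 6))

# np.linalg.svd already returns singular values in decreasing order.
U, sigma, Vt = np.linalg.svd(A, full_matrices=False)

k = 2
A_k = U[:, :k] @ np.diag(sigma[:k]) @ Vt[:k, :]   # rank-k truncation

assert np.linalg.matrix_rank(A_k) == k
# The spectral norm of the error is the first discarded singular value.
assert np.isclose(np.linalg.norm(A - A_k, 2), sigma[k])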


Any questions?
[email protected]
