LU-factorization and Positive Definite Matrices: Tom Lyche

The document discusses LU factorization and positive definite matrices. It provides background on partitioning matrices into blocks and how block multiplication works. It then defines LU factorization and positive definite matrices, giving examples and criteria for positive definiteness. The document states that for a positive definite matrix, its LU factorization will have a lower triangular matrix with positive diagonal entries.

LU-factorization and Positive Definite Matrices
Tom Lyche

University of Oslo
Norway

LU-factorization and Positive Definite Matrices – p. 1/49


Topics Today
Block multiplication of matrices
Basics on triangular matrices
LU factorization of matrices
Positive definite matrices
  examples
  criteria for positive definiteness
LU factorization of positive definite matrices


Partitioned matrices
A rectangular matrix A can be partitioned into submatrices by drawing horizontal lines between selected rows and vertical lines between selected columns. For example, A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix} can be partitioned as

(i)  \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix} = \left(\begin{array}{c|cc} 1 & 2 & 3 \\ \hline 4 & 5 & 6 \\ 7 & 8 & 9 \end{array}\right),  (ii)  (a_{.1}, a_{.2}, a_{.3}) = \left(\begin{array}{c|c|c} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{array}\right),

(iii)  \begin{pmatrix} a_{1.}^T \\ a_{2.}^T \\ a_{3.}^T \end{pmatrix} = \left(\begin{array}{ccc} 1 & 2 & 3 \\ \hline 4 & 5 & 6 \\ \hline 7 & 8 & 9 \end{array}\right),  (iv)  (A_{11}, A_{12}) = \left(\begin{array}{c|cc} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{array}\right).

The submatrices in a partition are often referred to as blocks, and a partitioned matrix is sometimes called a block matrix.


Column partition
Suppose A ∈ Rm,p and B ∈ Rp,n.
If B = [b.1, . . . , b.n] is partitioned into columns then the partition of the product AB into columns is

AB = [Ab.1, Ab.2, . . . , Ab.n].

In particular, if I is the identity matrix of order p then

A = AI = A[e1, e2, . . . , ep] = [Ae1, Ae2, . . . , Aep]

and we see that column j of A can be written Aej for j = 1, . . . , p.
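These two identities can be checked numerically. A quick NumPy sketch (the matrices are chosen arbitrarily for illustration):

```python
import numpy as np

A = np.array([[1., 2., 3.],
              [4., 5., 6.]])        # A in R^{2,3}
B = np.array([[1., 0.],
              [2., 1.],
              [0., 3.]])            # B in R^{3,2}

# Column j of AB equals A times column j of B.
AB = A @ B
for j in range(B.shape[1]):
    assert np.allclose(AB[:, j], A @ B[:, j])

# Column j of A is A e_j.
p = A.shape[1]
I = np.eye(p)
for j in range(p):
    assert np.allclose(A[:, j], A @ I[:, j])
```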



Row partition
If A is partitioned into rows then

AB = \begin{pmatrix} a_{1.}^T \\ a_{2.}^T \\ \vdots \\ a_{m.}^T \end{pmatrix} B = \begin{pmatrix} a_{1.}^T B \\ a_{2.}^T B \\ \vdots \\ a_{m.}^T B \end{pmatrix}

and taking A = I_p it follows that row i of B can be written e_i^T B.

It is often useful to write the matrix-vector product Ax as a linear combination of the columns of A:

Ax = x1 a.1 + x2 a.2 + · · · + xp a.p.

One way to see that this is correct is to partition A into columns and x into rows.
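Both facts are easy to verify numerically; a small NumPy sketch with illustrative data:

```python
import numpy as np

A = np.array([[2., -1.],
              [-1., 2.]])
x = np.array([3., 1.])

# Ax as a linear combination of the columns of A.
combo = x[0] * A[:, 0] + x[1] * A[:, 1]
assert np.allclose(A @ x, combo)

# Row i of B equals e_i^T B (take A = I in the row partition rule).
B = np.array([[1., 2.], [3., 4.]])
e0 = np.eye(2)[0]
assert np.allclose(e0 @ B, B[0, :])
```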



Rules for 2 × 2 blocks
If B = [B1, B2], where B1 ∈ Rp,r and B2 ∈ Rp,n−r, then

A[B1, B2] = [AB1, AB2].

If A = \begin{pmatrix} A_1 \\ A_2 \end{pmatrix}, where A1 ∈ Rk,p and A2 ∈ Rm−k,p, then

\begin{pmatrix} A_1 \\ A_2 \end{pmatrix} B = \begin{pmatrix} A_1 B \\ A_2 B \end{pmatrix}.

If A = [A1, A2] and B = \begin{pmatrix} B_1 \\ B_2 \end{pmatrix}, where A1 ∈ Rm,s, A2 ∈ Rm,p−s, B1 ∈ Rs,n and B2 ∈ Rp−s,n, then

[A_1, A_2] \begin{pmatrix} B_1 \\ B_2 \end{pmatrix} = A_1 B_1 + A_2 B_2.


The general rule for 2 × 2 blocks
If A = \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix} and B = \begin{pmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{pmatrix} then

AB = \begin{pmatrix} A_{11} B_{11} + A_{12} B_{21} & A_{11} B_{12} + A_{12} B_{22} \\ A_{21} B_{11} + A_{22} B_{21} & A_{21} B_{12} + A_{22} B_{22} \end{pmatrix},

provided the vertical partition line in A matches the horizontal line in B, i.e. the number of columns in A11 and A21 equals the number of rows in B11 and B12.
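The rule can be checked on random matrices; a NumPy sketch with an arbitrary matching partition (split A after column 2, B after row 2):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 5))
B = rng.standard_normal((5, 3))

# Partition A after row 2 / column 2, and B after row 2 / column 2,
# so the vertical line in A matches the horizontal line in B.
A11, A12 = A[:2, :2], A[:2, 2:]
A21, A22 = A[2:, :2], A[2:, 2:]
B11, B12 = B[:2, :2], B[:2, 2:]
B21, B22 = B[2:, :2], B[2:, 2:]

C = np.block([[A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22],
              [A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22]])
assert np.allclose(C, A @ B)
```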



The general case
If

A = \begin{pmatrix} A_{11} & \cdots & A_{1s} \\ \vdots & & \vdots \\ A_{p1} & \cdots & A_{ps} \end{pmatrix}, \quad B = \begin{pmatrix} B_{11} & \cdots & B_{1q} \\ \vdots & & \vdots \\ B_{s1} & \cdots & B_{sq} \end{pmatrix},

and if all the matrix products in

C_{ij} = \sum_{k=1}^{s} A_{ik} B_{kj}, \quad i = 1, \ldots, p, \; j = 1, \ldots, q

are well defined then

AB = \begin{pmatrix} C_{11} & \cdots & C_{1q} \\ \vdots & & \vdots \\ C_{p1} & \cdots & C_{pq} \end{pmatrix}.


Block-Triangular Matrices
Lemma 1. Suppose

A = \begin{pmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{pmatrix}

where A, A11 and A22 are square matrices. Then A is nonsingular if and only if both A11 and A22 are nonsingular. In that case

A^{-1} = \begin{pmatrix} A_{11}^{-1} & -A_{11}^{-1} A_{12} A_{22}^{-1} \\ 0 & A_{22}^{-1} \end{pmatrix}.   (1)
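Formula (1) is easy to confirm numerically; a NumPy sketch with arbitrary invertible blocks:

```python
import numpy as np

A11 = np.array([[2., 1.], [0., 3.]])
A12 = np.array([[1., 0.], [2., 1.]])
A22 = np.array([[1., 1.], [0., 2.]])
Z = np.zeros((2, 2))

A = np.block([[A11, A12], [Z, A22]])

# Inverse assembled block-wise according to (1).
A11i = np.linalg.inv(A11)
A22i = np.linalg.inv(A22)
Ainv = np.block([[A11i, -A11i @ A12 @ A22i],
                 [Z, A22i]])
assert np.allclose(Ainv, np.linalg.inv(A))
```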



Proof ⇐
If A11 and A22 are nonsingular then

\begin{pmatrix} A_{11}^{-1} & -A_{11}^{-1} A_{12} A_{22}^{-1} \\ 0 & A_{22}^{-1} \end{pmatrix} \begin{pmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{pmatrix} = \begin{pmatrix} I & 0 \\ 0 & I \end{pmatrix} = I

and A is nonsingular with the indicated inverse.


Proof ⇒
Conversely, let B be the inverse of the nonsingular matrix A. We partition B conformally with A and have

BA = \begin{pmatrix} B_{11} & B_{12} \\ B_{21} & B_{22} \end{pmatrix} \begin{pmatrix} A_{11} & A_{12} \\ 0 & A_{22} \end{pmatrix} = \begin{pmatrix} I & 0 \\ 0 & I \end{pmatrix} = I.

Using block-multiplication we find

B11 A11 = I,  B21 A11 = 0,  B21 A12 + B22 A22 = I.

The first equation implies that A11 is invertible; this in turn implies that B21 = 0 in the second equation, and then the third equation simplifies to B22 A22 = I. We conclude that also A22 is invertible.


The inverse
Consider now a triangular matrix.
Lemma 2. An upper (lower) triangular matrix A = [aij] ∈ Rn,n is nonsingular if and only if the diagonal entries aii, i = 1, . . . , n are nonzero. In that case the inverse is upper (lower) triangular with diagonal entries a_{ii}^{-1}, i = 1, . . . , n.
Proof: We use induction on n. The result holds for n = 1: the 1-by-1 matrix A = (a11) is invertible if and only if a11 ≠ 0, and in that case A^{-1} = (a_{11}^{-1}). Suppose the result holds for n = k and let A ∈ R^{k+1,k+1} be upper triangular.


Proof
We partition A in the form

A = \begin{pmatrix} A_k & a_k \\ 0 & a_{k+1,k+1} \end{pmatrix}

and note that Ak ∈ Rk,k is upper triangular. By Lemma 1, A is nonsingular if and only if Ak and (a_{k+1,k+1}) are nonsingular, and in that case

A^{-1} = \begin{pmatrix} A_k^{-1} & -A_k^{-1} a_k a_{k+1,k+1}^{-1} \\ 0 & a_{k+1,k+1}^{-1} \end{pmatrix}.

By the induction hypothesis Ak is nonsingular if and only if the diagonal entries a11, . . . , akk of Ak are nonzero, and in that case A_k^{-1} is upper triangular with diagonal entries a_{ii}^{-1}, i = 1, . . . , k. The result for A follows.


Unit Triangular Matrices
Lemma 3. The product C = AB = (cij) of two upper (lower) triangular matrices A = (aij) and B = (bij) is upper (lower) triangular with diagonal entries cii = aii bii for all i.

Proof. Exercise.

A matrix is unit triangular if it is triangular with 1's on the diagonal.

Lemma 4. For a unit upper (lower) triangular matrix A ∈ Rn,n:
1. A is invertible and the inverse is unit upper (lower) triangular.
2. The product of two unit upper (lower) triangular matrices is unit upper (lower) triangular.

Proof. 1. follows from Lemma 2, while Lemma 3 implies 2.


LU-factorization
We say that A = LU is an LU-factorization of A ∈ Rn,n if L ∈ Rn,n is lower triangular and U ∈ Rn,n is upper triangular. In addition we will assume that L is unit triangular.
Example 1. The equation

A = \begin{pmatrix} 2 & -1 \\ -1 & 2 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ -1/2 & 1 \end{pmatrix} \begin{pmatrix} 2 & -1 \\ 0 & 3/2 \end{pmatrix}

gives an LU-factorization of the 2-by-2 matrix A.
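The factorization in Example 1 can be verified directly; a NumPy sketch:

```python
import numpy as np

A = np.array([[2., -1.], [-1., 2.]])
L = np.array([[1., 0.], [-0.5, 1.]])   # unit lower triangular
U = np.array([[2., -1.], [0., 1.5]])   # upper triangular

assert np.allclose(L @ U, A)
assert np.allclose(np.diag(L), 1.0)
```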



Example
Every nonsingular matrix has a PLU-factorization, but not necessarily an LU-factorization.
Example 2. An LU-factorization of A = \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix} must satisfy the equation

\begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ l_1 & 1 \end{pmatrix} \begin{pmatrix} u_1 & u_3 \\ 0 & u_2 \end{pmatrix} = \begin{pmatrix} u_1 & u_3 \\ l_1 u_1 & l_1 u_3 + u_2 \end{pmatrix}

for the unknowns l1 in L and u1, u2, u3 in U. Comparing (1, 1)-elements we see that u1 = 0, which makes it impossible to satisfy the condition 1 = l1 u1 for the (2, 1) element. We conclude that A has no LU-factorization.


Uniqueness
Theorem 5. The LU-factorization of a nonsingular matrix is unique whenever it exists.

Proof. Suppose A = L1 U1 = L2 U2 are two LU-factorizations of the nonsingular matrix A. The equation L1 U1 = L2 U2 can be written in the form L_2^{-1} L_1 = U_2 U_1^{-1}, where by Lemmas 2–4, L_2^{-1} L_1 is unit lower triangular and U_2 U_1^{-1} is upper triangular. But then both matrices must be diagonal with ones on the diagonal. We conclude that L_2^{-1} L_1 = I = U_2 U_1^{-1}, which means that L1 = L2 and U1 = U2.


Leading Principal Submatrices
Suppose A ∈ Cn,n. The upper left k × k corners

A_k = \begin{pmatrix} a_{11} & \cdots & a_{1k} \\ \vdots & & \vdots \\ a_{k1} & \cdots & a_{kk} \end{pmatrix} \quad \text{for } k = 1, \ldots, n

of A are called the leading principal submatrices of A.


A Lemma
The following lemma will be used for existence.
Lemma 6. Suppose A = LU is the LU-factorization of A ∈ Rn,n. For k = 1, . . . , n let Ak, Lk, Uk be the leading principal submatrices of A, L, U, respectively. Then Ak = Lk Uk is the LU-factorization of Ak for k = 1, . . . , n.
Proof: We partition A = LU as follows:

A = \begin{pmatrix} A_k & B_k \\ C_k & D_k \end{pmatrix} = \begin{pmatrix} L_k & 0 \\ M_k & N_k \end{pmatrix} \begin{pmatrix} U_k & V_k \\ 0 & W_k \end{pmatrix} = LU,   (2)

where Dk, Nk, Wk ∈ Rn−k,n−k.


Proof
Using block-multiplication we find the equations

Ak = Lk U k (3)

B k = Lk Vk (4)

C k = Mk U k (5)

D k = Mk Vk + Nk Wk (6)

Since Lk is unit lower triangular and U k is upper triangular we see that (3)
gives the LU -factorization of Ak .



Existence
Theorem 7. Suppose A ∈ Rn,n is nonsingular. Then A has an LU-factorization if and only if the leading principal submatrices Ak are nonsingular for k = 1, . . . , n − 1.
Proof: Suppose A is nonsingular with the LU-factorization A = LU. Since A is nonsingular it follows that L and U are nonsingular. By Lemma 6 we have Ak = Lk Uk. Since Lk is unit lower triangular it is nonsingular. Moreover, Uk is nonsingular since its diagonal entries are among the nonzero diagonal entries of U. But then Ak is nonsingular.


Proof continued
Conversely, suppose A = An is nonsingular and Ak is nonsingular for k = 1, . . . , n − 1. We use induction on n to show that A has an LU-factorization. The result is clearly true for n = 1, since the LU-factorization of a 1-by-1 matrix is (a11) = (1)(a11). Suppose that the nonsingularity of A1, . . . , An−1 implies that An−1 has an LU-factorization, and suppose that A1, . . . , An are nonsingular. To show that A = An has an LU-factorization we consider (3)-(6) with k = n − 1. In this case Ck and Mk are row vectors, Bk and Vk are column vectors, and Dk = (ann), Nk = (1), and Wk = (unn) are 1-by-1 matrices, i.e. scalars. The LU-factorization of An−1 is given by (3), and since An−1 is nonsingular we see that Ln−1 and Un−1 are nonsingular. But then (4) has a unique solution Vn−1, (5) has a unique solution Mn−1, and setting Nn−1 = (1) in (6) we obtain unn = ann − Mn−1 Vn−1. Thus we have constructed an LU-factorization of A.
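This inductive construction (a "bordering" approach) can be turned into code: at each step the new column of U and row of L are found by triangular solves from (4) and (5), and the new diagonal entry of U from (6). A NumPy sketch, assuming all leading principal submatrices are nonsingular; `lu_bordering` is a name chosen here:

```python
import numpy as np

def lu_bordering(A):
    """LU-factorization (L unit lower, U upper triangular), built one
    bordering step at a time, mirroring equations (3)-(6)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    L = np.zeros((n, n))
    U = np.zeros((n, n))
    L[0, 0] = 1.0
    U[0, 0] = A[0, 0]
    for k in range(1, n):
        v = np.linalg.solve(L[:k, :k], A[:k, k])    # (4): new column of U
        m = np.linalg.solve(U[:k, :k].T, A[k, :k])  # (5): new row of L
        U[:k, k] = v
        L[k, :k] = m
        L[k, k] = 1.0
        U[k, k] = A[k, k] - m @ v                   # (6) with N = (1)
    return L, U

A = np.array([[2., -1., 0.],
              [-1., 2., -1.],
              [0., -1., 2.]])
L, U = lu_bordering(A)
assert np.allclose(L @ U, A)
assert np.allclose(np.diag(L), 1.0)
```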



LDL^T-factorization
For a symmetric matrix the LU-factorization can be written in a special form.
Definition 8. Suppose A ∈ Rn,n. A factorization A = LDL^T, where L is unit lower triangular and D is diagonal, is called an LDL^T-factorization.
Theorem 9. Suppose A ∈ Rn,n is nonsingular. Then A has an LDL^T-factorization if and only if A^T = A and Ak is nonsingular for k = 1, . . . , n − 1.


Proof
Proof: If A1, . . . , An−1 are nonsingular then Theorem 7 implies that A has an LU-factorization A = LU. Since A is nonsingular it follows that U is nonsingular, and since U is triangular the diagonal matrix D = diag(u11, . . . , unn) is nonsingular (cf. Lemma 2). We can then factor A further as A = LDM^T, where M^T = D^{-1}U. It is easy to see that M^T is unit upper triangular, and since A^T = A we find A^T = (LDM^T)^T = MDL^T = LU = A. Now M(DL^T) and LU are two LU-factorizations of A, and by the uniqueness of the LU-factorization we must have M = L. Thus A = LDM^T = LDL^T is an LDL^T-factorization of A.

Conversely, if A = LDL^T is an LDL^T-factorization of A then A is symmetric since LDL^T is symmetric, and A has an LU-factorization with U = DL^T. By Theorem 7 we conclude that A1, . . . , An−1 are nonsingular.
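The step A = LU → A = LDL^T can be illustrated numerically. A hypothetical sketch: compute L and U by Gaussian elimination without pivoting (one way to obtain an LU-factorization, not the only one), then check that M^T = D^{-1}U coincides with L^T when A is symmetric:

```python
import numpy as np

A = np.array([[4., 2., 2.],
              [2., 5., 3.],
              [2., 3., 6.]])  # symmetric, nonsingular leading submatrices

# Gaussian elimination without pivoting: A = L U.
n = A.shape[0]
U = A.copy()
L = np.eye(n)
for k in range(n - 1):
    for i in range(k + 1, n):
        L[i, k] = U[i, k] / U[k, k]
        U[i, k:] -= L[i, k] * U[k, k:]

d = np.diag(U).copy()
MT = U / d[:, None]          # M^T = D^{-1} U, unit upper triangular
assert np.allclose(MT, L.T)  # uniqueness forces M = L for symmetric A
assert np.allclose(L @ np.diag(d) @ L.T, A)
```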



Quadratic Forms
Suppose A ∈ Rn,n is a square matrix. The function f : Rn → R given by

f(x) = x^T A x = \sum_{i=1}^{n} \sum_{j=1}^{n} a_{ij} x_i x_j

is called a quadratic form.

We say that A is
(i) positive definite if A^T = A and x^T A x > 0 for all nonzero x ∈ Rn.
(ii) positive semidefinite if A^T = A and x^T A x ≥ 0 for all x ∈ Rn.
(iii) negative (semi-)definite if −A is positive (semi-)definite.


Some Observations
A matrix is positive definite if it is positive semidefinite and in addition

x^T A x = 0 ⇒ x = 0.   (7)

The zero-matrix is positive semidefinite.
A positive definite matrix must be nonsingular. Indeed, if Ax = 0 for some x ∈ Rn then x^T A x = 0, which by (7) implies that x = 0.


Example T
Example 3. The 3-by-3 tridiagonal matrix

T = T_3 = \begin{pmatrix} 2 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 2 \end{pmatrix} = \text{tridiag}_3(-1, 2, -1)   (8)

is positive definite. Clearly T is symmetric. Now it can be shown that

x^T T x = x_1^2 + x_3^2 + (x_2 - x_1)^2 + (x_3 - x_2)^2.

Thus x^T T x ≥ 0, and if x^T T x = 0 then x1 = x3 = 0, x2 = x1 and x3 = x2, which implies that also x2 = 0. Hence T is positive definite.
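The sum-of-squares identity and the positive definiteness of T can both be checked numerically; a NumPy sketch:

```python
import numpy as np

T = np.array([[2., -1., 0.],
              [-1., 2., -1.],
              [0., -1., 2.]])

# Check the sum-of-squares identity on random vectors.
rng = np.random.default_rng(1)
for _ in range(5):
    x = rng.standard_normal(3)
    lhs = x @ T @ x
    rhs = x[0]**2 + x[2]**2 + (x[1] - x[0])**2 + (x[2] - x[1])**2
    assert np.isclose(lhs, rhs)

# All eigenvalues are positive, consistent with positive definiteness.
assert np.all(np.linalg.eigvalsh(T) > 0)
```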



Example B
Example 4. Let A = B^T B, where B ∈ Rm,n and m, n are positive integers. (Note that B can be a rectangular matrix.) Since A^T = (B^T B)^T = B^T B we see that A is symmetric. Moreover, for any x ∈ Rn

x^T A x = x^T B^T B x = (Bx)^T (Bx) = ‖Bx‖_2^2.   (9)

Since the Euclidean norm ‖·‖_2 of a vector is nonnegative, this shows that A is positive semidefinite, and that A is positive definite if and only if B has linearly independent columns.

Note that A and B have the same null-space and the same rank.


The Hessian Matrix
Example 5. Suppose F(t) = F(t1, . . . , tn) is a real valued function of n variables which has continuous first and second order partial derivatives for t in some domain Ω. For each t ∈ Ω the gradient and Hessian of F are given by

∇F(t) = \begin{pmatrix} \frac{\partial F(t)}{\partial t_1} \\ \vdots \\ \frac{\partial F(t)}{\partial t_n} \end{pmatrix} ∈ R^n,  \quad  H(t) = \begin{pmatrix} \frac{\partial^2 F(t)}{\partial t_1 \partial t_1} & \cdots & \frac{\partial^2 F(t)}{\partial t_1 \partial t_n} \\ \vdots & & \vdots \\ \frac{\partial^2 F(t)}{\partial t_n \partial t_1} & \cdots & \frac{\partial^2 F(t)}{\partial t_n \partial t_n} \end{pmatrix} ∈ R^{n,n}.

It is shown in advanced calculus texts that under suitable conditions on the domain Ω the matrix H(t) is symmetric for each t ∈ Ω. Moreover, if ∇F(t∗) = 0 and H(t∗) is positive definite then t∗ is a local minimum for F. This can be shown using a second-order Taylor approximation of F. Moreover, t∗ is a local maximum if ∇F(t∗) = 0 and H(t∗) is negative definite.


When is a Matrix Positive Definite?
Not all symmetric matrices are positive definite, and sometimes we can tell just by glancing at the matrix that it cannot be positive definite. Examples:

A_1 = \begin{pmatrix} 0 & 1 \\ 1 & 1 \end{pmatrix},  \quad  A_2 = \begin{pmatrix} 1 & 2 \\ 2 & 2 \end{pmatrix},  \quad  A_3 = \begin{pmatrix} -2 & 1 \\ 1 & 2 \end{pmatrix}.

A1: If a diagonal entry aii ≤ 0 for some i then e_i^T A e_i = aii ≤ 0 and A is not positive definite. This rules out A1 and A3.
A2: If the largest entry in absolute value of A is not (only) on the diagonal then A is not positive definite.
To show this, suppose aij ≥ aii and aij ≥ ajj for some i ≠ j. Since A is symmetric we obtain (ei − ej)^T A (ei − ej) = aii + ajj − 2aij ≤ 0, which implies that x^T A x ≤ 0 for some x ≠ 0.
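Both quick tests can be evaluated directly on the example matrices; a NumPy sketch:

```python
import numpy as np

A1 = np.array([[0., 1.], [1., 1.]])
A2 = np.array([[1., 2.], [2., 2.]])
A3 = np.array([[-2., 1.], [1., 2.]])

e0 = np.array([1., 0.])
e1 = np.array([0., 1.])

# Diagonal test: e_i^T A e_i = a_ii <= 0 disqualifies A1 and A3.
assert e0 @ A1 @ e0 <= 0    # a11 = 0
assert e0 @ A3 @ e0 <= 0    # a11 = -2

# Off-diagonal test: largest entry off the diagonal disqualifies A2.
d = e0 - e1
assert d @ A2 @ d <= 0      # a11 + a22 - 2 a12 = 1 + 2 - 4 = -1
```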



Submatrices
To give criteria for positive definiteness we start with two lemmas.
Lemma 10. The leading principal submatrices of a positive definite matrix are positive definite and hence nonsingular.
Proof. Consider a leading principal submatrix Ak of the positive definite matrix A ∈ Rn,n. Clearly Ak is symmetric. Let x ∈ Rk be nonzero, set y = \begin{pmatrix} x \\ 0 \end{pmatrix} ∈ Rn, and partition A conformally with y as A = \begin{pmatrix} A_k & B_k \\ C_k & D_k \end{pmatrix}, where Dk ∈ Rn−k,n−k. Then

0 < y^T A y = \begin{pmatrix} x^T & 0^T \end{pmatrix} \begin{pmatrix} A_k & B_k \\ C_k & D_k \end{pmatrix} \begin{pmatrix} x \\ 0 \end{pmatrix} = x^T A_k x.


LU-factorization
Lemma 11. A matrix is positive definite if and only if it has an LDL^T-factorization with positive diagonal entries in D.
Proof: Suppose A is positive definite. It follows from Lemmas 9 and 10 that A has an LDL^T-factorization A = LDL^T. We need to show that the diagonal entries dii of D are positive. With ei the ith unit vector we find

d_{ii} = e_i^T D e_i = e_i^T L^{-1} A L^{-T} e_i = x_i^T A x_i,

where x_i = L^{-T} e_i is nonzero since L^{-T} is nonsingular. Since A is positive definite we see that d_{ii} = x_i^T A x_i > 0 for i = 1, . . . , n.


Proof of the converse
Conversely, if A has an LDL^T-factorization with positive diagonal entries in D then we can write

A = R^T R,  \quad  R^T := L D^{1/2},   (10)

where D^{1/2} = diag(d_{11}^{1/2}, . . . , d_{nn}^{1/2}). Since L and D^{1/2} are nonsingular it follows that R is nonsingular, and A is positive definite by Example 4. 


Cholesky Factorization
The factorization A = R^T R in (10) is quite useful and has a special name.
Definition 12. A factorization A = R^T R where R is upper triangular with positive diagonal entries is called a Cholesky-factorization.
Example 6. The matrix A = \begin{pmatrix} 2 & -1 \\ -1 & 2 \end{pmatrix} has an LDL^T- and Cholesky-factorization given by

\begin{pmatrix} 2 & -1 \\ -1 & 2 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ -1/2 & 1 \end{pmatrix} \begin{pmatrix} 2 & 0 \\ 0 & 3/2 \end{pmatrix} \begin{pmatrix} 1 & -1/2 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} \sqrt{2} & 0 \\ -1/\sqrt{2} & \sqrt{3/2} \end{pmatrix} \begin{pmatrix} \sqrt{2} & -1/\sqrt{2} \\ 0 & \sqrt{3/2} \end{pmatrix}.
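NumPy computes this factor directly; note that `np.linalg.cholesky` returns the lower triangular factor, so the upper triangular R of Definition 12 is its transpose:

```python
import numpy as np

A = np.array([[2., -1.], [-1., 2.]])

R = np.linalg.cholesky(A).T    # A = R^T R with R upper triangular
assert np.allclose(R.T @ R, A)
assert np.allclose(R[0, 0], np.sqrt(2.0))
assert np.allclose(R[1, 1], np.sqrt(1.5))
assert np.all(np.diag(R) > 0)
```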



Positive Eigenvalues
Lemma 13. A matrix is positive definite if and only if it is symmetric and has positive eigenvalues.

Proof. If A is positive definite then by definition A is symmetric. Suppose Ax = λx with x ≠ 0. Multiplying both sides by x^T and solving for λ we find

λ = \frac{x^T A x}{x^T x} > 0.

Suppose conversely that A ∈ Rn,n is symmetric with positive eigenvalues λ1, . . . , λn. From the spectral theorem we have U^T A U = D, where U^T U = U U^T = I and D = diag(λ1, . . . , λn). Let x ∈ Rn be nonzero and define c := U^T x = [c1, . . . , cn]^T. Then c^T c = x^T U U^T x = x^T x, so c is nonzero. Since x = Uc we find

x^T A x = (Uc)^T A U c = c^T U^T A U c = c^T D c = \sum_{j=1}^{n} λ_j c_j^2 > 0,

and it follows that A is positive definite.


Necessary and Sufficient Conditions
Theorem 14. The following are equivalent for a symmetric matrix A ∈ Rn,n:
1. A is positive definite.
2. A has only positive eigenvalues.
3. \begin{vmatrix} a_{11} & \cdots & a_{1k} \\ \vdots & & \vdots \\ a_{k1} & \cdots & a_{kk} \end{vmatrix} > 0 for k = 1, . . . , n.
4. A = B^T B for a nonsingular B ∈ Rn,n.

Proof. By Lemma 13 we know that 1 ⇔ 2. We show that 1 ⇒ 3 ⇒ 4 ⇒ 1.
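The equivalent conditions are easy to test on a concrete matrix; a NumPy sketch using T3 from Example 3:

```python
import numpy as np

A = np.array([[2., -1., 0.],
              [-1., 2., -1.],
              [0., -1., 2.]])

# Condition 3: all leading principal minors are positive.
minors = [np.linalg.det(A[:k, :k]) for k in range(1, 4)]
assert all(m > 0 for m in minors)       # 2, 3, 4

# Condition 2: all eigenvalues are positive.
assert np.all(np.linalg.eigvalsh(A) > 0)
```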



Proof
1 ⇒ 3: By Lemma 10 the leading principal submatrix Ak of A is positive definite, and hence has positive eigenvalues by Lemma 13. Since the determinant of a matrix equals the product of its eigenvalues we conclude that det(Ak) > 0 for k = 1, . . . , n.
3 ⇒ 4: The condition det(Ak) > 0 implies that Ak is nonsingular for k = 1, . . . , n. By Theorem 9, A has an LDL^T-factorization A = LDL^T. Let Lk and Dk be the leading principal submatrices of order k of L and D, respectively. By partitioning A, L and D similarly to the proof of Lemma 6 we see that Ak = Lk Dk Lk^T is the LDL^T-factorization of Ak for k = 1, . . . , n. Using properties of determinants we find det(Ak) = det(Lk) det(Dk) det(Lk^T) = det(Dk) = d11 · · · dkk > 0. Since this holds for k = 1, . . . , n we conclude that D has positive diagonal entries, and we have A = B^T B with B := R as in (10).
4 ⇒ 1: This follows from the discussion in Example 4.


Useful facts
Suppose A ∈ Rn,n is positive definite and let ⟨·, ·⟩ be the usual inner product on Rn.
1. A has a set of eigenvectors u1, . . . , un that form an orthonormal basis for Rn.
2. For any x ∈ Rn we have x = \sum_{j=1}^{n} c_j u_j, where c_j = ⟨x, u_j⟩ := x^T u_j.
3. For any x ∈ Rn we have Ax = \sum_{j=1}^{n} λ_j c_j u_j, where λ1, . . . , λn are the eigenvalues of A.
4. Furthermore ⟨Ax, x⟩ = \sum_{j=1}^{n} λ_j c_j^2.
5. cond2(A) := ‖A‖_2 ‖A^{-1}‖_2 = λmax/λmin, where λmax and λmin are the largest and smallest eigenvalue of A.


Proof
1. By the spectral theorem A has a set of n orthonormal eigenvectors {u1, . . . , un} that form an orthonormal basis for Rn.
2. Since {u1, . . . , un} form a basis for Rn we have x = \sum_{j=1}^{n} c_j u_j for some c1, . . . , cn. Taking the inner product with ui we find by orthonormality ⟨x, u_i⟩ = \sum_{j=1}^{n} c_j ⟨u_j, u_i⟩ = c_i for i = 1, . . . , n.
3. We apply A to the equation x = \sum_{j=1}^{n} c_j u_j.
4. ⟨Ax, x⟩ = ⟨\sum_{i=1}^{n} λ_i c_i u_i, \sum_{j=1}^{n} c_j u_j⟩ = \sum_{j=1}^{n} λ_j c_j^2.
5. The eigenvalues of A are positive, and since A is symmetric it is normal, and hence cond2(A) = |λmax|/|λmin| = λmax/λmin.


Example
For a positive integer m and real numbers a, b we consider the m-by-m tridiagonal matrix given by

C := \text{tridiag}_m(a, b, a) = \begin{pmatrix} b & a & 0 & \cdots & 0 \\ a & b & a & \cdots & 0 \\ & \ddots & \ddots & \ddots & \\ 0 & \cdots & a & b & a \\ 0 & \cdots & 0 & a & b \end{pmatrix}.   (11)

Note that C is symmetric, C^T = C. We obtain the second derivative matrix when a = −1 and b = 2.


C is positive definite if b > 0 and b ≥ 2|a|
Since C is symmetric it is enough to show that the smallest eigenvalue λmin is positive.
The eigenvalues are λj = b + 2a cos(jπh) for j = 1, . . . , m, where h = 1/(m + 1).
For C to be positive definite it is necessary that the diagonal entry b > 0.
If b > 0 and b ≥ 2|a| then λmin = b − 2|a| cos(πh) > b − 2|a| ≥ 0.
Thus C is positive definite.
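The eigenvalue formula can be checked against a numerical eigensolver; a NumPy sketch with the second derivative matrix (a = −1, b = 2, so b = 2|a|):

```python
import numpy as np

m, a, b = 6, -1.0, 2.0
C = (np.diag(np.full(m, b))
     + np.diag(np.full(m - 1, a), 1)
     + np.diag(np.full(m - 1, a), -1))

h = 1.0 / (m + 1)
j = np.arange(1, m + 1)
lam = b + 2 * a * np.cos(j * np.pi * h)   # λ_j = b + 2a cos(jπh)

assert np.allclose(np.sort(lam), np.linalg.eigvalsh(C))
assert np.all(np.linalg.eigvalsh(C) > 0)  # b = 2|a| still gives λ_min > 0
```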



Finding the Cholesky factor R
To solve a linear system Ax = b where A is positive definite we first compute the Cholesky-factorization A = R^T R of A and then solve two triangular systems R^T y = b and Rx = y by forward and backward substitution.
Consider finding the Cholesky-factorization of A. Since A = R^T R and R is upper triangular we find

a_{kj} = \sum_{i=1}^{n} r_{ik} r_{ij} = \sum_{i=1}^{\min(j,k)} r_{ik} r_{ij},  \quad  j, k = 1, \ldots, n.   (12)
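The solve workflow described above, sketched in NumPy with a small illustrative system (the substitution loops are spelled out; in practice a library triangular solver would be used):

```python
import numpy as np

A = np.array([[4., 2.], [2., 3.]])
b = np.array([2., 3.])
n = len(b)

R = np.linalg.cholesky(A).T          # A = R^T R, R upper triangular

# Forward substitution for R^T y = b (R^T is lower triangular).
y = np.zeros(n)
for k in range(n):
    y[k] = (b[k] - R[:k, k] @ y[:k]) / R[k, k]

# Backward substitution for R x = y.
x = np.zeros(n)
for k in range(n - 1, -1, -1):
    x[k] = (y[k] - R[k, k+1:] @ x[k+1:]) / R[k, k]

assert np.allclose(A @ x, b)
```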



Cholesky Factorization Algorithm
Solving for rkj we find

r_{kk} = \Big(a_{kk} - \sum_{i=1}^{k-1} r_{ik}^2\Big)^{1/2},
r_{kj} = \Big(a_{kj} - \sum_{i=1}^{k-1} r_{ik} r_{ij}\Big) / r_{kk},  \quad  j = k + 1, \ldots, n.   (13)

for k = 1, 2, . . . , n
    s = R(1:k−1, k); R(k, k) = (A(k, k) − s^T ∗ s)^{1/2};
    R(k, k+1:n) = (A(k, k+1:n) − s^T ∗ R(1:k−1, k+1:n)) / R(k, k);
end
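The pseudocode translates almost line for line into Python; a NumPy sketch (`cholesky_factor` is a name chosen here), checked against NumPy's built-in factorization:

```python
import numpy as np

def cholesky_factor(A):
    """Upper triangular R with A = R^T R, following (13).
    Assumes A is symmetric positive definite."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    R = np.zeros((n, n))
    for k in range(n):
        s = R[:k, k]                                   # s = R(1:k-1, k)
        R[k, k] = np.sqrt(A[k, k] - s @ s)
        R[k, k+1:] = (A[k, k+1:] - s @ R[:k, k+1:]) / R[k, k]
    return R

A = np.array([[2., -1., 0.],
              [-1., 2., -1.],
              [0., -1., 2.]])
R = cholesky_factor(A)
assert np.allclose(R.T @ R, A)
assert np.allclose(R, np.linalg.cholesky(A).T)
```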



#flops
The number of flops needed for the Cholesky-factorization is given by

\sum_{k=1}^{n} \big(2k - 2 + (2k - 1)(n - k)\big) ≈ \sum_{k=0}^{n} 2k(n - k) ≈ \int_0^n 2x(n - x)\,dx = n^3/3.

In addition we need to take n square roots.

This is half the number of flops needed for Gaussian elimination of an arbitrary matrix. We obtain this reduction since the Cholesky-factorization takes advantage of the symmetry of A.


Band Matrix
In many applications the matrix A has a banded structure, and the number of flops can be reduced.
We say that A has lower bandwidth p if aij = 0 whenever i > j + p, and upper bandwidth q if aij = 0 whenever j > i + q.
A diagonal matrix has upper and lower bandwidth zero; a matrix with upper and lower bandwidth one is tridiagonal.
If A is symmetric then p = q.
It is easy to extend the algorithm to band matrices.


A Lemma
We first show that if A = R^T R then R has the same upper bandwidth as A.
Lemma 15. Suppose A is positive definite with Cholesky-factorization A = R^T R. If akj = 0 for j > k + d, then also rkj = 0 for j > k + d.

Proof. We show that if R has upper bandwidth d in its first k − 1 rows then row k also has upper bandwidth d. The proof then follows by induction on k. Now, if j > k + d, then akj = 0, and if R has upper bandwidth d in its first k − 1 rows then for i > k + d we have rji = 0 for j = 1, . . . , k − 1. From (13) it follows that rki = 0 for i > k + d.


Banded Cholesky Algorithm
Full version:

for k = 1, 2, . . . , n
    s = R(1:k−1, k); R(k, k) = (A(k, k) − s^T ∗ s)^{1/2};
    R(k, k+1:n) = (A(k, k+1:n) − s^T ∗ R(1:k−1, k+1:n)) / R(k, k);
end

Banded version:

for k = 1, 2, . . . , n
    km = max(1, k − d); kp = min(n, k + d);
    s = R(km:k−1, k); R(k, k) = sqrt(A(k, k) − s^T ∗ s);
    R(k, k+1:kp) = (A(k, k+1:kp) − s^T ∗ R(km:k−1, k+1:kp)) / R(k, k);
end
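A Python rendering of the banded version (0-based indexing, so the MATLAB-style kp becomes an exclusive slice limit; `banded_cholesky` is a name chosen here), tested on a tridiagonal matrix:

```python
import numpy as np

def banded_cholesky(A, d):
    """Cholesky factor of a positive definite A with bandwidth d;
    only entries within the band are touched."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    R = np.zeros((n, n))
    for k in range(n):
        km = max(0, k - d)
        kp = min(n, k + d + 1)           # exclusive upper slice limit
        s = R[km:k, k]
        R[k, k] = np.sqrt(A[k, k] - s @ s)
        R[k, k+1:kp] = (A[k, k+1:kp] - s @ R[km:k, k+1:kp]) / R[k, k]
    return R

T = (np.diag(np.full(5, 2.0))
     + np.diag(np.full(4, -1.0), 1)
     + np.diag(np.full(4, -1.0), -1))
R = banded_cholesky(T, 1)
assert np.allclose(R.T @ R, T)
assert np.allclose(np.triu(R, 2), 0)     # R keeps upper bandwidth 1 (Lemma 15)
```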



R^T x = b with R upper bandwidth d
To solve Ax = b where A ∈ Rn,n is positive definite with bandwidth d we can use the banded Cholesky Algorithm followed by a simple modification of the forward and backward substitution algorithms. In the forward substitution we use L = R^T, but do the calculations using the entries in R:
Algorithm 16.
for k = 1:n
    km = max(1, k − d);
    x(k) = (b(k) − x(km:k−1)^T ∗ R(km:k−1, k)) / R(k, k);
end


Rx = y with R upper bandwidth d
Algorithm 17.
for k = n : −1 : 1
    kp = min(n, k + d);
    x(k) = (y(k) − R(k, k+1:kp) ∗ x(k+1:kp)) / R(k, k);
end
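Algorithms 16 and 17 combine into a banded solver; a NumPy sketch (`solve_banded_spd` is a name chosen here) that takes the Cholesky factor R and performs both substitutions using only entries within the band:

```python
import numpy as np

def solve_banded_spd(R, b, d):
    """Solve A x = b given the Cholesky factor R (A = R^T R) with
    upper bandwidth d: forward substitution on R^T, then backward on R."""
    n = len(b)
    y = np.zeros(n)
    for k in range(n):                       # Algorithm 16: R^T y = b
        km = max(0, k - d)
        y[k] = (b[k] - y[km:k] @ R[km:k, k]) / R[k, k]
    x = np.zeros(n)
    for k in range(n - 1, -1, -1):           # Algorithm 17: R x = y
        kp = min(n, k + d + 1)
        x[k] = (y[k] - R[k, k+1:kp] @ x[k+1:kp]) / R[k, k]
    return x

T = (np.diag(np.full(4, 2.0))
     + np.diag(np.full(3, -1.0), 1)
     + np.diag(np.full(3, -1.0), -1))
R = np.linalg.cholesky(T).T
b = np.array([1., 0., 0., 1.])
x = solve_banded_spd(R, b, 1)
assert np.allclose(T @ x, b)
```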

The flop counts for these algorithms are:
O(2nd^2) for the banded Cholesky-factorization,
O(4nd) for the forward and backward substitution.
When d is small compared to n, these numbers are considerably smaller than the O(n^3/3) and O(2n^2) counts for the factorization of a full matrix.

