Lecture4_D3

This lecture discusses the properties of invertible matrices in linear algebra, stating that a square matrix is invertible if it has a unique inverse and that the homogeneous system Ax = 0 has only the zero solution. It also covers the relationship between the invertibility of matrices A and B, and provides propositions regarding the conditions under which matrix products are invertible. Additionally, the lecture introduces the concept of Row Canonical Form (RCF) and its uniqueness, emphasizing that a matrix can be transformed to the identity matrix through elementary row operations.


MA 110:

Lecture 04

Saurav Bhaumik
Department of Mathematics
IIT Bombay

Spring 2025

Saurav Bhaumik, IIT Bombay Linear Algebra: Lecture 04


Recall: A square matrix A ∈ Rn×n is said to be invertible if
there is B ∈ Rn×n such that

AB = I = BA,

and in this case, B is called an inverse of A.


We have seen examples of square matrices that are invertible
and also those that are not invertible. Further, we noted that:
• If a square matrix A ∈ Rn×n is invertible, then it has a
unique inverse, denoted by A−1 .
• If a square matrix A ∈ Rn×n is invertible, then so is its
transpose AT , and in this case,

(AT )−1 = (A−1 )T .



We now relate the invertibility of a square matrix A to the
solutions of the homogeneous system Ax = 0.

Proposition
Let A ∈ Rn×n . Then A is invertible if and only if the linear
system Ax = 0 has only the zero solution.

Proof. Suppose A is invertible. Then by definition, there is


B ∈ Rn×n such that BA = I. If x ∈ Rn×1 satisfies Ax = 0,
then x = BAx = B(Ax) = B0 = 0. Thus the linear system
Ax = 0 has only the zero solution.
Conversely, suppose the linear system Ax = 0 has only the
zero solution. Let y = [y1 · · · yn ]T ∈ Rn×1 . We transform
the augmented matrix [A|y] to a matrix [A′ |y′ ], where A′ is in
REF. By our previous result, A′ has n nonzero rows, and so
back substitution gives a unique x = [x1 · · · xn ]T ∈ Rn×1
such that A′ x = y′ . Hence Ax = y.
Further, the process of back substitution shows that the
entries x1 , . . . , xn of x are given as follows:

xn = cnn′ yn′
xn−1 = c(n−1)(n−1)′ yn−1′ + c(n−1)n′ yn′
..
.
x2 = c22′ y2′ + · · · + c2n′ yn′
x1 = c11′ y1′ + c12′ y2′ + · · · + c1n′ yn′ ,

where y′ = [y1′ · · · yn′ ]T and cjk′ ∈ R for j, k = 1, . . . , n.


Also, since y′ is obtained from y by performing EROs (which
are of the type Ri ←→ Rj , Ri + αRj and αRj with α ≠ 0) on [A|y], we see
that each y1′ , . . . , yn′ is a linear combination of the entries
y1 , . . . , yn of y. As a result, each x1 , . . . , xn is a linear
combination of y1 , . . . , yn .
Thus there are cjk ∈ R for j, k = 1, . . . , n (not depending on
y1 , . . . , yn ) such that

x1 = c11 y1 + c12 y2 + · · · + c1n yn


x2 = c21 y1 + c22 y2 + · · · + c2n yn
.. .. .. .. ..
. . . . .
xn = cn1 y1 + cn2 y2 + · · · + cnn yn .
Define C := [cjk ] ∈ Rn×n . Then x = Cy, and so
ACy = A(Cy) = Ax = y. Letting y := ek ∈ Rn×1 , we see
that (AC)ek = ek for k = 1, . . . , n. Hence AC = I. We still
need to show that CA = I. For this, consider the linear system
Cx = 0. Note that Cx = 0 ⇒ x = ACx = A(Cx) = A0 = 0.
Thus the linear system Cx = 0 has only the zero solution.
Hence by what we have proved above, there is D ∈ Rn×n such
that CD = I. Now, D = ID = (AC)D = A(CD) = AI = A.
Thus AC = I = CA, and so A is invertible.
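The proposition can be illustrated numerically. The following Python sketch uses NumPy with hypothetical matrices (not taken from the lecture): for an invertible A, the system Ax = 0 has only the zero solution, while a singular matrix admits a nonzero solution.

```python
import numpy as np

# Hypothetical invertible 3x3 matrix (illustrative only).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# Since A is invertible, Ax = 0 has only the zero solution.
x = np.linalg.solve(A, np.zeros(3))
print(np.allclose(x, 0))  # True

# A singular matrix (second row = 2 * first row): Sx = 0 has a nonzero
# solution, e.g. x = (2, -1), so S cannot be invertible.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.allclose(S @ np.array([2.0, -1.0]), 0))  # True: a nonzero solution
```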
Saurav Bhaumik, IIT Bombay Linear Algebra: Lecture 04
Corollary
Let A ∈ Rn×n . If there is B ∈ Rn×n such that either BA = I
or AB = I, then A is invertible, and A−1 = B.
Proof. Let B ∈ Rn×n be such that BA = I. If x ∈ Rn×1
satisfies Ax = 0, then x = BAx = B(Ax) = B0 = 0. Thus
the linear system Ax = 0 has only the zero solution. By the
previous proposition, A is invertible. Then there is C ∈ Rn×n
such that AC = I, and B = C. Hence A−1 = B.
Next, let B ∈ Rn×n be such that AB = I. Then
BT AT = IT = I. By what we have just proved, AT is
invertible, and (AT )−1 = BT . Hence A = (AT )T is invertible,
and A−1 = (BT )T = B.
Note: The above result is a definite improvement over
requiring the existence of a matrix B satisfying both BA = I
and AB = I for the invertibility of a square matrix A.
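The corollary can be sanity-checked numerically. A minimal sketch (the matrix is randomly generated and purely illustrative): a matrix B constructed so that AB = I also satisfies BA = I, as the corollary guarantees.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))   # random 4x4 matrix, almost surely invertible
B = np.linalg.inv(A)              # B satisfies AB = I by construction

# By the corollary, either one-sided identity forces the other as well.
print(np.allclose(A @ B, np.eye(4)))  # True
print(np.allclose(B @ A, np.eye(4)))  # True
```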
Proposition
Let A and B be square matrices. Then AB is invertible if and
only if A and B are invertible, and then (AB)−1 = B−1 A−1 .

Proof. Let A and B be invertible. Using the associativity of


matrix multiplication, we easily see that

(B−1 A−1 )(AB) = B−1 (A−1 A)B = B−1 B = I.

Hence AB is invertible and (AB)−1 = B−1 A−1 by the previous


corollary.
Conversely, let AB be invertible. Then there is C such that
(AB)C = I = C(AB). Since A(BC) = (AB)C = I, we see
that A is invertible, and A−1 = BC. Also, since
(CA)B = C(AB) = I, we see that B is invertible and
B−1 = CA, again by the previous corollary.
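The identity (AB)−1 = B−1 A−1 can likewise be checked numerically; this sketch uses hypothetical random matrices, shifted to keep them comfortably invertible.

```python
import numpy as np

rng = np.random.default_rng(1)
# Shifting by 3*I keeps the random matrices comfortably invertible.
A = rng.standard_normal((3, 3)) + 3 * np.eye(3)
B = rng.standard_normal((3, 3)) + 3 * np.eye(3)

# (AB)^{-1} should equal B^{-1} A^{-1} (note the reversed order).
lhs = np.linalg.inv(A @ B)
rhs = np.linalg.inv(B) @ np.linalg.inv(A)
print(np.allclose(lhs, rhs))  # True
```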



Proposition
Let A be an n × m matrix.
(i) If there is an m × n matrix B such that BA = Im×m , then
m ≤ n.
(ii) If there is an m × n matrix C such that AC = In×n , then
n ≤ m.
(iii) If there are matrices B and C such that BA = I, AC = I,
then m = n and B = C.
Proof. Indeed, if BA = I and if x ∈ Rm×1 is a vector such that
Ax = 0, then x = Ix = BAx = 0. On the other hand, if m > n,
then the system Ax = 0 has more unknowns than equations, and
hence at least one nonzero solution. This proves (i). For (ii),
note that AC = I implies I = CT AT ; as AT is of order m × n
and has the left inverse CT , (i) gives n ≤ m. For (iii), if
BA = I = AC, then by (i) and (ii) we know m = n. Again,
B = BI = B(AC) = (BA)C = IC = C. □
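A concrete rectangular illustration (hypothetical matrices, not from the lecture): a tall 3 × 2 matrix can have a left inverse, but its 3 × 3 product with that same matrix cannot be the identity.

```python
import numpy as np

# A tall 3x2 matrix (more rows than columns) with an obvious left inverse.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])

print(np.allclose(B @ A, np.eye(2)))  # True: BA is the 2x2 identity
# AB is 3x3 with a zero bottom row, so AB cannot equal I.
print(np.allclose(A @ B, np.eye(3)))  # False
```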



Row Canonical Form (RCF)
As we have seen, a matrix A may not have a unique REF.
However, a special REF of A turns out to be unique.
An m × n matrix A is said to be in a row canonical form
(RCF) or a reduced row echelon form (RREF) if
(i) it is in a row echelon form (REF),
(ii) all pivots are equal to 1 and
(iii) in each pivotal column, all entries above the pivot are
(also) equal to 0.
For example, the matrix

     ⎡0 1 ∗ 0 0 ∗⎤
     ⎢0 0 0 1 0 ∗⎥
A := ⎢0 0 0 0 1 ∗⎥
     ⎣0 0 0 0 0 0⎦

is in a RCF, where ∗ denotes any real number.
Note: If A is in REF, then in each pivotal column, all entries
below the pivot are 0. If A is in fact in RCF and has r nonzero
rows, then the r × r submatrix formed by the first r rows and
the r pivotal columns is the r × r identity matrix I.
Suppose an m × n matrix A is in RCF and has r nonzero rows.
If r = n, then it has n pivotal columns, that is, all its columns
are pivotal, and so A = I if m = n, and

    ⎡ I ⎤
A = ⎣ O ⎦  if m > n,

where I is the n × n identity matrix and O is the (m − n) × n
zero matrix.
To transform an m × n matrix to a RCF, we first transform it
to a REF by elementary row operations of type I and II. Then
we multiply a row containing a pivot p by 1/p (which is an
elementary row operation of type III), and then we add a
suitable nonzero multiple of this row to each preceding row.
Every matrix has a unique RCF. (Proof by induction on n)
Example

⎡1 3 −2  0 2  0⎤        ⎡1 3 −2  0 2  0⎤
⎢2 6 −5 −2 4 −3⎥  EROs  ⎢0 0 −1 −2 0 −3⎥
⎢0 0  5 10 0 15⎥ −−−→   ⎢0 0  0  0 0  6⎥ ,
⎣2 6  0  8 4 16⎦        ⎣0 0  0  0 0  0⎦

which is in REF,

      ⎡1 3 0 4 2 6⎤        ⎡1 3 0 4 2 0⎤
EROs  ⎢0 0 1 2 0 3⎥  EROs  ⎢0 0 1 2 0 0⎥
−−−→  ⎢0 0 0 0 0 6⎥ −−−→   ⎢0 0 0 0 0 1⎥ ,
      ⎣0 0 0 0 0 0⎦        ⎣0 0 0 0 0 0⎦

which is in RCF.
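The reduction above can be reproduced mechanically. Below is a sketch of the RCF procedure in Python (not part of the lecture), using exact rational arithmetic so that pivots come out exactly 1; it applies EROs of types I, II and III as described earlier.

```python
from fractions import Fraction

def rref(rows):
    """Transform a matrix (list of lists) to row canonical form by EROs."""
    M = [[Fraction(x) for x in row] for row in rows]
    m, n = len(M), len(M[0])
    r = 0                                      # index of the next pivot row
    for c in range(n):                         # scan columns left to right
        piv = next((i for i in range(r, m) if M[i][c] != 0), None)
        if piv is None:
            continue                           # no pivot in this column
        M[r], M[piv] = M[piv], M[r]            # type I: interchange rows
        p = M[r][c]
        M[r] = [x / p for x in M[r]]           # type III: make the pivot 1
        for i in range(m):                     # type II: clear entries above and below
            if i != r and M[i][c] != 0:
                f = M[i][c]
                M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return M

# The matrix from the example above:
R = rref([[1, 3, -2,  0, 2,  0],
          [2, 6, -5, -2, 4, -3],
          [0, 0,  5, 10, 0, 15],
          [2, 6,  0,  8, 4, 16]])
print(R == [[1, 3, 0, 4, 2, 0],
            [0, 0, 1, 2, 0, 0],
            [0, 0, 0, 0, 0, 1],
            [0, 0, 0, 0, 0, 0]])  # True: matches the RCF computed above
```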



Recall: A square matrix A is invertible if and only if the
homogeneous linear system Ax = 0 has only the zero solution.
Proposition
An n×n matrix is invertible if and only if it can be transformed
to the n×n identity matrix by EROs.

Proof. Suppose A ∈ Rn×n is invertible. Using EROs, transform


A to a matrix A′ ∈ Rn×n such that A′ is in a RCF. Since A is
invertible, the linear system Ax = 0 has only the zero solution.
Hence A′ has n nonzero rows, and so each of the n columns of
A′ is pivotal. Since A′ is square with all n of its columns
pivotal, A′ = I.
Conversely, suppose A ∈ Rn×n can be transformed to the n×n
identity matrix I by EROs. Since EROs do not change the set of
solutions of a homogeneous linear system, and Ix = 0 =⇒ x = 0
for x ∈ Rn×1 , we see that the linear system Ax = 0 has only the
zero solution. Hence A is invertible.
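This proposition suggests the familiar Gauss–Jordan method for inverting a matrix: apply EROs to the augmented matrix [A | I]; if the left block reduces to I, the same EROs turn the right block into A−1. A minimal NumPy sketch with a hypothetical 2 × 2 example, assuming floating-point arithmetic with partial pivoting:

```python
import numpy as np

def inverse_by_eros(A):
    """Reduce [A | I] to [I | A^{-1}] by elementary row operations."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])
    for c in range(n):
        piv = c + np.argmax(np.abs(M[c:, c]))    # pick the largest available pivot
        if np.isclose(M[piv, c], 0.0):
            raise ValueError("matrix is not invertible")
        M[[c, piv]] = M[[piv, c]]                # type I: interchange rows
        M[c] /= M[c, c]                          # type III: make the pivot 1
        for i in range(n):                       # type II: clear the rest of the column
            if i != c:
                M[i] -= M[i, c] * M[c]
    return M[:, n:]                              # right block is now A^{-1}

A = np.array([[2.0, 1.0],
              [5.0, 3.0]])
print(np.allclose(inverse_by_eros(A) @ A, np.eye(2)))  # True
```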
