Linear Algebra, Chapter 2
Matrices and Linear Equations
Edgar G. Goodaire
Matrices
What is a Matrix?
2.1.1 Definition
A matrix is a rectangular array of numbers enclosed in square
brackets. If there are m rows and n columns in the array, the
matrix is called m × n (read "m by n") and said to have size m × n.
• An n × 1 matrix [x1; x2; …; xn] is called a column matrix.
2.1.2 Examples
• [0 2 3 2; 1 2 0 0] is a 2 × 4 matrix.
• [1 2 3; 2 4 6] is a 2 × 3 matrix, and its transpose [1 2 3; 2 4 6]^T = [1 2; 2 4; 3 6] is 3 × 2.
• [0 2 3 4; 0 1 2 3; 9 0 3 5; 0 4 9 3] is a 4 × 4 (square) matrix.
Here [a b; c d] denotes the matrix with rows [a b] and [c d].
2.1.3 Example
Let A = [1 2; 3 4; 5 6]. Then a31 = 5, a12 = 2, and a22 = 4.
2.1.4 Example
Let A be the 2 × 3 matrix with aij = i - j.
The (1, 1) entry of A is a11 = 1 - 1 = 0, the (1, 2) entry is a12 = 1 - 2 = -1,
and so on. We obtain A = [0 -1 -2; 1 0 -1].
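The construction in Example 2.1.4 can be sketched in code. This is an illustrative helper (`build_matrix` is not a function from the text), with rows and columns numbered from 1 as the text does.

```python
def build_matrix(m, n, entry):
    """Return the m x n matrix (as a list of rows) whose (i, j) entry is entry(i, j)."""
    return [[entry(i, j) for j in range(1, n + 1)] for i in range(1, m + 1)]

# The 2 x 3 matrix with a_ij = i - j, as in Example 2.1.4.
A = build_matrix(2, 3, lambda i, j: i - j)
print(A)  # [[0, -1, -2], [1, 0, -1]]
```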
2.1.5 Example
If A = [ aij ] and B = AT, then bij = aji.
2.1.7 Matrices A and B are equal if and only if they have the same number
of rows, the same number of columns, and
corresponding entries are equal.
2.1.8 Problem
Let A = [-1 x; 2y -3] and B = [a -4; 4 a-b].
If A = B, then -1 = a, x = -4, 2y = 4, and -3 = a - b.
Thus a = -1, b = a + 3 = 2, x = -4, and y = 2.
Addition of Matrices
2.1.9 Example
Let A = [1 2 3; 4 5 6], B = [-2 3 0; 2 1 -1], and C = [1 0; 0 1].
Then A + B = [-1 5 3; 6 6 5], while A + C and B + C are not defined.
Only matrices of the same size can be added.
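As a sketch of this rule, an addition routine in plain Python can refuse mismatched sizes; `add_matrices` is an illustrative helper, not code from the text.

```python
def add_matrices(A, B):
    """Entrywise sum A + B, defined only when A and B have the same size."""
    if len(A) != len(B) or any(len(r) != len(s) for r, s in zip(A, B)):
        raise ValueError("matrices of different sizes cannot be added")
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2, 3], [4, 5, 6]]    # 2 x 3
B = [[-2, 3, 0], [2, 1, -1]]  # 2 x 3
C = [[1, 0], [0, 1]]          # 2 x 2: A + C is not defined
print(add_matrices(A, B))     # [[-1, 5, 3], [6, 6, 5]]
```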
2.1.11 Example
The 2 × 3 zero matrix is 0 = [0 0 0; 0 0 0]. For any 2 × 3 matrix
A = [a b c; d e f], we have
A + 0 = [a b c; d e f] + [0 0 0; 0 0 0] = [a b c; d e f] = A,
and similarly 0 + A = A.
Scalar Multiplication of
Matrices
2.1.13 Definition
The negative of a matrix A is the matrix ( – 1)A, which we denote
– A and call “ minus A”: – A = ( – 1)A.
2.1.14 Example
If A = [1 3; 2 4], then -A = (-1)[1 3; 2 4] = [-1 -3; -2 -4].
Theorem 2.1.15 (Properties of Matrix
Addition and Scalar Multiplication).
Let A, B, and C be matrices of the same size and let c and d be scalars.
1. (A + B) + C = A + (B + C) (associativity);
2. A + B = B + A (commutativity);
3. A + 0 = 0 + A = A;
4. A + (-A) = 0;
5. c(A + B) = cA + cB;
6. (c + d)A = cA + dA;
7. c(dA) = (cd)A;
8. 1A = A.
Multiplication of
Matrices
2.1.18 The product of a row matrix a^T and a column matrix b (of the same
length) is a 1 × 1 matrix, the dot product: a^T b = [a · b].
2.1.20 Example
As noted at the start of this section, the system
2x1 + x2 + x3 = 2
3x1 + 2x2 + 6x3 = 1
x1 + 3x3 = 1
can be written Ax = b, with
A = [2 1 1; 3 2 6; 1 0 3], x = [x1; x2; x3], and b = [2; 1; 1].
Matrix times column
If the rows of A are a1^T, a2^T, …, am^T, then
Ab = [a1 · b; a2 · b; …; am · b];
that is, the ith entry of Ab is the dot product of row i of A with b.
2.1.21 Aei = column i of A.
2.1.22 Problem
Suppose A and B are m × n matrices such that Aei = Bei for each
standard basis vector ei of R^n. Prove that A = B.
Solution :
By 2.1.21, the given condition says that each column of A
equals the corresponding column of B, so A = B.
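Fact 2.1.21, that Aei picks out column i of A, is easy to check numerically. `matvec` and `e` below are illustrative helpers, not notation from the text.

```python
def matvec(A, x):
    """Entry i of Ax is the dot product of row i of A with x."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def e(i, n):
    """Standard basis vector e_i of R^n (1 in position i, indexed from 1)."""
    return [1 if k == i else 0 for k in range(1, n + 1)]

A = [[1, 2, 3], [4, 5, 6]]
print([matvec(A, e(i, 3)) for i in range(1, 4)])
# [[1, 4], [2, 5], [3, 6]] -- the columns of A, as 2.1.21 predicts
```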
Matrix times matrix
If B = [b1 b2 … bp] has columns b1, b2, …, bp, then
A(m×n) B(n×p) = [Ab1 Ab2 … Abp];
that is, column j of AB is A times column j of B.
Equivalently, if the rows of A are a1^T, …, am^T, then the rows of AB are
a1^T B, …, am^T B.
2.1.23 Problem
Let A = [2 4 -1; 3 5 1; 0 4 -2] and B = [2 4 0 1 -1; 3 -2 0 2 4; 1 2 6 2 1]. Then
AB = [15 -2 -6 8 13; 22 4 6 15 18; 10 -12 -12 4 14].
The first column of AB is the product of A and the first column of B:
A[2; 3; 1] = [15; 22; 10],
the second column of AB is the product of A and the second column of B:
A[4; -2; 2] = [-2; 4; -12], and so on.
2.1.24 The (i, j) entry of AB is the product of row i of A and
column j of B.
2.1.26 Example
Suppose A = [1 2; 3 4] and B = [1 0 -1; 3 4 2].
Then AB = [7 8 3; 15 16 5], while BA is not defined: B is 2 × 3
but A is 2 × 2, so the inner dimensions (3 and 2) are different.
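Rule 2.1.24 translates directly into code. `matmul` is an illustrative sketch, and the sample product repeats the one in Example 2.1.26.

```python
def matmul(A, B):
    """The (i, j) entry of AB is the dot product of row i of A with column j of B."""
    assert len(A[0]) == len(B), "inner dimensions must agree"
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

A = [[1, 2], [3, 4]]
B = [[1, 0, -1], [3, 4, 2]]
print(matmul(A, B))  # [[7, 8, 3], [15, 16, 5]]
# matmul(B, A) would fail the assertion: B is 2 x 3 but A is 2 x 2.
```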
Theorem 2.1.30
(Properties of Matrix Multiplication).
Matrix multiplication is associative and distributes over addition, but it
is not commutative in general.
2.1.33 The product Ax is the linear combination of the columns of A
whose coefficients are the components of x.
Let A = [1 2 3; 4 5 6; 7 8 9] and x = [2; -1; 4]. Then
Ax = [1 2 3; 4 5 6; 7 8 9][2; -1; 4] = [12; 27; 42] = 2[1; 4; 7] - 1[2; 5; 8] + 4[3; 6; 9].
More generally, A[x1; x2; x3] = x1[1; 4; 7] + x2[2; 5; 8] + x3[3; 6; 9].
2.1.35 Example
[2 3; 4 5; 6 8][5; 2] = [16; 30; 46] = 5[2; 4; 6] + 2[3; 5; 8].
2.1.36 Problem Let A be an m × n matrix and I = Im the
m × m identity matrix. Explain why IA = A.
2.2.16 Problem
Solve the system of equations
-x + 2y = 5
-2x + 3y = 7.
Solution: The system is Ax = b, where A = [-1 2; -2 3] is the
matrix of coefficients, x = [x; y] is unknown, and b = [5; 7].
x = A^-1 b = [3 -2; 2 -1][5; 7] = [1; 3]. The solution is x = 1, y = 3.
More on Transposes
Theorem 2.2.19
Let A and B be matrices and let c be a scalar.
1. (A+B)T = AT+BT (assuming A and B have the same size);
2. (cA)T = cAT;
3. (AT)T = A;
4. If A is invertible, so is AT and (AT)-1 = (A–1)T.
5. (AB)T = BTAT (assuming AB is defined): the transpose of a
product is the product of the transposes, in reverse order.
2.2.20 Example
Suppose A = [1 -2 3; 4 0 1] and B = [0 1; -1 3; 4 2]. Then
AB = [1 -2 3; 4 0 1][0 1; -1 3; 4 2] = [14 1; 4 6].
Now A^T = [1 4; -2 0; 3 1] and B^T = [0 -1 4; 1 3 2], so
B^T A^T = [0 -1 4; 1 3 2][1 4; -2 0; 3 1] = [14 4; 1 6] = (AB)^T.
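Property 5 of Theorem 2.2.19 can be spot-checked numerically. `transpose` and `matmul` are illustrative helpers, and the matrices are small samples.

```python
def transpose(A):
    """Rows of the transpose are the columns of A."""
    return [list(col) for col in zip(*A)]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

A = [[1, -2, 3], [4, 0, 1]]
B = [[0, 1], [-1, 3], [4, 2]]
print(matmul(A, B))  # [[14, 1], [4, 6]]
# The transpose of the product is the product of the transposes, order reversed:
print(transpose(matmul(A, B)) == matmul(transpose(B), transpose(A)))  # True
```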
2-3 Systems of Linear
Equations
2.3.1 The Elementary Row Operations.
1. Interchange two rows.
2. Multiply a row by any scalar except 0.
3. Replace a row by that row minus a multiple of another.
2.3.3 To Solve a System of Linear Equations.
1. Write the system in the form Ax = b, where A is the
matrix of coefficients, x is the vector of unknowns, and
b is the vector whose components are the constants to
the right of the equals signs.
2. Add a vertical line and b to the right of A to form the
augmented matrix [ A│b ].
3. Reduce the augmented matrix to an upper triangular
matrix by Gaussian elimination.
4. Write the equations that correspond to the triangular
matrix and solve by back substitution.
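Steps 1-4 above can be sketched as a single routine. This is an illustrative implementation (using exact `Fraction` arithmetic, and a row interchange whenever a pivot is zero), not code from the text, and it assumes the system has a unique solution.

```python
from fractions import Fraction

def gauss_solve(aug):
    """Solve a system from its augmented matrix [A | b]: reduce to upper
    triangular form by Gaussian elimination, then back-substitute."""
    n = len(aug)
    M = [[Fraction(x) for x in row] for row in aug]
    for k in range(n):
        # interchange rows if necessary so the pivot is nonzero
        p = next(i for i in range(k, n) if M[i][k] != 0)
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            f = M[i][k] / M[k][k]
            M[i] = [a - f * b for a, b in zip(M[i], M[k])]
    x = [Fraction(0)] * n
    for i in range(n - 1, -1, -1):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (M[i][n] - s) / M[i][i]
    return x

# 2x + y + z = 7, x - y + z = 2, 3x - y + z = 0
print(gauss_solve([[2, 1, 1, 7], [1, -1, 1, 2], [3, -1, 1, 0]]) == [-1, 3, 6])  # True
```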
2.3.4 Problem Solve the system
x + 2y = 1
4x + 8y = 3.
Solution: This is Ax = b with A = [1 2; 4 8], x = [x; y], and b = [1; 3].
[A | b] = [1 2 | 1; 4 8 | 3] → (R2 ← R2 - 4R1) → [1 2 | 1; 0 0 | -1].
So the equations are
x + 2y = 1
0 = -1.
The last equation is not true, so there is no solution.
2.3.6 A system of linear equations may have
1. a unique solution,
2. no solution, or
3. infinitely many solutions.
2.3.7 Definition
A row echelon matrix is an upper triangular matrix with the
following properties:
1. All rows consisting entirely of 0s are at the bottom.
2. The leading nonzero entries of the nonzero rows step from
left to right as you read down the matrix.
3. All the entries in a column below a leading nonzero entry
are 0.
A row echelon form of a matrix A is a row echelon matrix U to
which A can be carried via elementary row operations.
2.3.9 Example
• The matrix U = [1 1 3 4 5; 0 0 1 3 1; 0 0 0 0 1] is a row echelon matrix.
The pivots are the leading nonzero entries of the nonzero rows. Since the
pivots are in columns one, three, and five, the pivot columns are columns
one, three, and five.
• Let A = [3 2 1 6; 6 4 8 7; 9 6 9 12]. Applying R2 ← R2 - 2R1 and
R3 ← R3 - 3R1 gives [3 2 1 6; 0 0 6 -5; 0 0 6 -6]; then R3 ← R3 - R2 gives
U = [3 2 1 6; 0 0 6 -5; 0 0 0 -1].
So the pivots of A are 3, 6, and -1, and the pivot columns of
A are one, three, and four.
2.3.12 Example Solve the system
2x + y + z = 7
x - y + z = 2
3x - y + z = 0.
Solution:
[A | b] = [2 1 1 | 7; 1 -1 1 | 2; 3 -1 1 | 0]
→ (R1 ↔ R2) → [1 -1 1 | 2; 2 1 1 | 7; 3 -1 1 | 0]
→ (R2 ← R2 - 2R1, R3 ← R3 - 3R1) → [1 -1 1 | 2; 0 3 -1 | 3; 0 2 -2 | -6]
→ (R2 ↔ R3, then R2 ← ½R2) → [1 -1 1 | 2; 0 1 -1 | -3; 0 3 -1 | 3]
→ (R3 ← R3 - 3R2, then R3 ← ½R3) → [1 -1 1 | 2; 0 1 -1 | -3; 0 0 1 | 6].
Back substitution gives z = 6, y = -3 + z = 3, and x = 2 + y - z = -1.
2.3.13 Free variables are those that correspond to columns
which are not pivot columns.
2.3.8 Definition
A pivot in a row echelon matrix U is a leading nonzero entry in
a nonzero row.
A column containing a pivot is called a pivot column of U.
If U is a row echelon form of a matrix A, then a pivot column of A
is any column that corresponds to a pivot column of U.
If U was obtained without any multiplication of rows by
nonzero scalars, then a pivot of A is defined to be any pivot of
U.
2.3.14 Example Solve the system Ax = b, where
A = [1 2 3 4; 2 4 8 10; 3 7 11 14], x = [x1; x2; x3; x4], and b = [1; 6; 7].
Then
[1 2 3 4 | 1; 2 4 8 10 | 6; 3 7 11 14 | 7] → [1 2 3 4 | 1; 0 0 2 2 | 4; 0 1 2 2 | 4]
→ [1 2 3 4 | 1; 0 1 2 2 | 4; 0 0 1 1 | 2].
x4 = t is free, x3 = 2 - x4 = 2 - t, x2 = 4 - 2x3 - 2x4 = 0, and
x1 = 1 - 2x2 - 3x3 - 4x4 = -5 - t.
The solution to Ax = b is
x = [x1; x2; x3; x4] = [-5 - t; 0; 2 - t; t] = [-5; 0; 2; 0] + t[-1; 0; -1; 1] = xp + t xh,
the sum of a particular solution xp and solutions t xh of the homogeneous
system Ax = 0.
More on Linear
Independence
2.4.3 Example
The standard basis vectors of R^n are linearly independent.
Proof: We suppose that some linear combination of these is
the zero vector, say
c1e1 + c2e2 + … + cnen = 0.
This is Ac = 0, where c = [c1; c2; …; cn] and A = [e1 e2 … en] is the n × n
identity matrix. Thus Ac = Ic = 0 certainly means c = 0, so the
vectors are linearly independent.
2.4.4 Problem
Decide whether the vectors x1 = [2; 3; 1; 4], x2 = [1; 1; 2; 3], and
x3 = [4; 0; 1; 1] are linearly independent or linearly dependent.
Solution:
Suppose c1x1 + c2x2 + c3x3 = 0. This is a homogeneous system
Ac = 0, with A = [x1 x2 x3] (a 4 × 3 matrix) and c = [c1; c2; c3].
Gaussian elimination carries A to a row echelon form with a pivot in
each of its three columns, so the system Ac = 0 has no free variables.
Thus c = 0, so the vectors are linearly independent.
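The test used in Problem 2.4.4 (count the pivots of the matrix whose columns are the vectors) can be sketched as a rank computation. `rank` and `independent` are illustrative helpers, not code from the text.

```python
from fractions import Fraction

def rank(rows):
    """Number of pivots found by row reduction."""
    M = [[Fraction(x) for x in r] for r in rows]
    r = 0
    for c in range(len(M[0]) if M else 0):
        piv = next((i for i in range(r, len(M)) if M[i][c] != 0), None)
        if piv is None:
            continue  # no pivot in this column
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, len(M)):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

def independent(vectors):
    # Vectors placed as rows; row rank equals column rank.
    return rank(vectors) == len(vectors)

print(independent([[1, 0, 1], [0, 1, 1], [1, 1, 2]]))           # False: v3 = v1 + v2
print(independent([[2, 3, 1, 4], [1, 1, 2, 3], [4, 0, 1, 1]]))  # True
```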
2-5 The LU Factorization
of a Matrix
2.5.1 Definition
An elementary matrix is a square matrix obtained from the identity
matrix by a single elementary row operation.
2.5.2 Examples
• E1 = [0 1 0; 1 0 0; 0 0 1] is an elementary matrix, obtained from the
3 × 3 identity matrix I by interchanging rows one and two.
• The matrix E3 = [1 0 0; 0 3 0; 0 0 1] is elementary; it is obtained from
I by multiplying row two by 3.
2.5.3 If A is a matrix and E is an elementary matrix with EA
defined, then EA is that matrix obtained by applying to A
the elementary operation that defined E.
2.5.4 If E is an elementary matrix, E has an inverse that is an
elementary matrix, namely, the elementary matrix that
reverses (or undoes) what E does.
2.5.5 Example
E1 = [0 1 0; 1 0 0; 0 0 1] is the elementary matrix that interchanges rows
one and two of a matrix. Interchanging twice restores the original rows,
so E1E1 = I. Alternatively, we can compute
E1E1 = [0 1 0; 1 0 0; 0 0 1][0 1 0; 1 0 0; 0 0 1] = [1 0 0; 0 1 0; 0 0 1] = I.
2.5.14 Problem
Find an LU factorization of A = [1 2 3; 4 -6 -6; -7 -7 9].
Solution: We replace row two by row two minus 4 times row one.
The elementary matrix required is
E1 = [1 0 0; -4 1 0; 0 0 1], thus E1A = [1 2 3; 0 -14 -18; -7 -7 9].
Then we replace row three by row three plus 7 times row one:
E2 = [1 0 0; 0 1 0; 7 0 1], thus E2(E1A) = [1 2 3; 0 -14 -18; 0 7 30].
Next we replace row three by row three minus -1/2 times row two:
E3 = [1 0 0; 0 1 0; 0 1/2 1], thus E3(E2E1A) = [1 2 3; 0 -14 -18; 0 0 21] = U.
Since (E3E2E1)A = U, A = (E3E2E1)^-1 U = E1^-1 E2^-1 E3^-1 U, so
L = E1^-1 E2^-1 E3^-1 = [1 0 0; 4 1 0; 0 0 1][1 0 0; 0 1 0; -7 0 1][1 0 0; 0 1 0; 0 -1/2 1]
= [1 0 0; 4 1 0; -7 -1/2 1].
Thus A = LU:
[1 2 3; 4 -6 -6; -7 -7 9] = [1 0 0; 4 1 0; -7 -1/2 1][1 2 3; 0 -14 -18; 0 0 21].
Theorem 2.5.15
If a matrix A can be reduced to row echelon form without row
interchanges, then A has an LU factorization.
L without Effort
2.5.16 To find an LU factorization of a matrix A, try to reduce A
to an upper triangular matrix U from the top down, using
only the third elementary row operation.
2.5.18 Remark Suppose A = [1 2 3; 1 1 1; 2 2 3] and we apply the
following sequence of third elementary row operations,
first R3 ← R3 - 2(R2), then R2 ← R2 - R1:
A = [1 2 3; 1 1 1; 2 2 3] → [1 2 3; 1 1 1; 0 0 1] → [1 2 3; 0 -1 -2; 0 0 1] = U.
Recording the multipliers gives L = [1 0 0; 1 1 0; 0 2 1], but A ≠ LU.
2.5.19 When trying to find an LU factorization, use only the
third elementary row operation, working from the top down.
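Recipe 2.5.16 (top down, third operation only, multipliers recorded below the diagonal of L) can be sketched as follows. This is an illustrative routine using exact `Fraction` arithmetic; it raises an error when a zero pivot would force a row interchange.

```python
from fractions import Fraction

def lu(A):
    """LU factorization with no row interchanges: reduce A to upper triangular U
    from the top down, storing each multiplier below the diagonal of L."""
    n = len(A)
    U = [[Fraction(x) for x in row] for row in A]
    L = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    for k in range(n):
        if U[k][k] == 0:
            raise ValueError("zero pivot: a PLU factorization is needed")
        for i in range(k + 1, n):
            L[i][k] = U[i][k] / U[k][k]
            U[i] = [a - L[i][k] * b for a, b in zip(U[i], U[k])]
    return L, U

L, U = lu([[1, 2, 3], [1, 1, 1], [2, 2, 3]])
print(L == [[1, 0, 0], [1, 1, 0], [2, 2, 1]])   # True
print(U == [[1, 2, 3], [0, -1, -2], [0, 0, 1]]) # True
```

Worked top down on the matrix of Remark 2.5.18, this produces factors that do satisfy A = LU, unlike the out-of-order reduction in the remark.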
Why LU?
2.5.20 Example
Suppose we want to solve Ax = b for x = [x1; x2; x3], where b = [0; -1; 0]
and A = [1 2 3; 4 -6 -6; -7 -7 9] is the matrix of Problem 2.5.14.
We found A = LU, with
L = [1 0 0; 4 1 0; -7 -1/2 1] and U = [1 2 3; 0 -14 -18; 0 0 21].
To solve Ax = b, which is L(Ux) = b, we first solve Ly = b for
y = [y1; y2; y3]. This is the system
y1 = 0
4y1 + y2 = -1
-7y1 - ½y2 + y3 = 0.
We have y1 = 0, y2 = -1 - 4y1 = -1, and y3 = 7y1 + ½y2 = -½.
Thus y = [0; -1; -½]. Now we solve Ux = y:
21x3 = -½, so x3 = -1/42;
-14x2 - 18x3 = -1, so -14x2 = -1 + 18x3 = -10/7 and x2 = 5/49; and
x1 + 2x2 + 3x3 = 0, so x1 = -2x2 - 3x3 = -13/98.
The solution is x = [-13/98; 5/49; -1/42].
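The two triangular solves can be sketched as one routine. This is an illustrative implementation, applied to the factors used in Example 2.5.20.

```python
from fractions import Fraction

def solve_lu(L, U, b):
    """Solve Ax = b given A = LU: forward-substitute Ly = b,
    then back-substitute Ux = y."""
    n = len(b)
    y = [Fraction(0)] * n
    for i in range(n):
        y[i] = (Fraction(b[i]) - sum(L[i][j] * y[j] for j in range(i))) / L[i][i]
    x = [Fraction(0)] * n
    for i in range(n - 1, -1, -1):
        x[i] = (y[i] - sum(U[i][j] * x[j] for j in range(i + 1, n))) / U[i][i]
    return x

L = [[1, 0, 0], [4, 1, 0], [-7, Fraction(-1, 2), 1]]
U = [[1, 2, 3], [0, -14, -18], [0, 0, 21]]
x = solve_lu(L, U, [0, -1, 0])
print(x == [Fraction(-13, 98), Fraction(5, 49), Fraction(-1, 42)])  # True
```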
2.5.23 Example
[0 1; 1 0], [0 1 0; 0 0 1; 1 0 0], and [0 0 1 0; 0 1 0 0; 0 0 0 1; 1 0 0 0]
are all permutation matrices.
2.5.25 Problem
Find an LU factorization of A = [0 1 1 0; 0 1 1 4; 2 3 2 1; 6 4 2 -8] if
possible; otherwise find a PLU factorization.
Solution:
Rearranging the rows of A in the order 3, 4, 1, 2 leads to the matrix
P′A = [2 3 2 1; 6 4 2 -8; 0 1 1 0; 0 1 1 4].
Then
P′A → (R2 ← R2 - 3(R1)) → [2 3 2 1; 0 -5 -4 -11; 0 1 1 0; 0 1 1 4]
→ (R3 ← R3 + (1/5)R2, R4 ← R4 + (1/5)R2) → [2 3 2 1; 0 -5 -4 -11; 0 0 1/5 -11/5; 0 0 1/5 9/5]
→ (R4 ← R4 - R3) → [2 3 2 1; 0 -5 -4 -11; 0 0 1/5 -11/5; 0 0 0 4] = U′,
with L = [1 0 0 0; 3 1 0 0; 0 -1/5 1 0; 0 -1/5 1 1]. Thus P′A = LU′ and
hence A = PLU′, with
P = (P′)^-1 = (P′)^T = [0 0 1 0; 0 0 0 1; 1 0 0 0; 0 1 0 0].
2-6 LDU Factorizations
2.6.1 Definition
An LDU factorization of a matrix A is a representation of A =
LDU as the product of a (necessarily square) lower triangular
matrix L with 1s on the diagonal, a (square) diagonal matrix D,
and an upper triangular matrix U (the same size as A) with 1s on
the diagonal.
2.6.3 Problem
Find an LDU factorization of A = [1 2 0 2; 2 -4 6 8; 3 2 -4 -1].
Solution:
A = [1 2 0 2; 2 -4 6 8; 3 2 -4 -1]
→ (R2 ← R2 - 2R1, R3 ← R3 - 3R1) → [1 2 0 2; 0 -8 6 4; 0 -4 -4 -7]
→ (R3 ← R3 - (1/2)R2) → [1 2 0 2; 0 -8 6 4; 0 0 -7 -9] = U′.
We have A = LU′ with L = [1 0 0; 2 1 0; 3 1/2 1]. Factoring the diagonal
entries out of the rows of U′ gives D = diag(1, -8, -7) and
U = [1 2 0 2; 0 1 -3/4 -1/2; 0 0 1 9/7], so that A = LDU.
Symmetric Matrices
2.6.5 Definition
A matrix A is symmetric if it equals its transpose: AT = A.
2.6.7 Problem
Find an LDU factorization of the symmetric matrix A = [1 2 -1; 2 3 0; -1 0 5].
Solution:
A → (R2 ← R2 - 2R1, R3 ← R3 + R1) → [1 2 -1; 0 -1 2; 0 2 4]
→ (R3 ← R3 + 2R2) → [1 2 -1; 0 -1 2; 0 0 8] = U′.
Factor U′ = DU = [1 0 0; 0 -1 0; 0 0 8][1 2 -1; 0 1 -2; 0 0 1],
obtaining D = [1 0 0; 0 -1 0; 0 0 8] and U = [1 2 -1; 0 1 -2; 0 0 1].
We conclude that L = U^T = [1 0 0; 2 1 0; -1 -2 1], and we can confirm
that A = LDU.
The PLDU Factorization
2.6.8 Example
Let A = [2 4 2; 1 2 3; 3 2 1]. As in Section 2.5, P′ = [1 0 0; 0 0 1; 0 1 0].
Then P′A = A′ = [2 4 2; 3 2 1; 1 2 3] and
A′ = LU′ = [1 0 0; 3/2 1 0; 1/2 0 1][2 4 2; 0 -4 -2; 0 0 2].
Factoring U′ = DU gives A′ = LDU =
[1 0 0; 3/2 1 0; 1/2 0 1][2 0 0; 0 -4 0; 0 0 2][1 2 1; 0 1 1/2; 0 0 1].
The inverse of the elementary matrix P′ is (P′)^-1 = P′, so we
have A = PLDU with P = P′.
2.6.9 Problem
Find an LDU factorization of A = [0 1 1 0; 0 1 1 4; 2 3 2 1; 6 4 2 -8] if
possible. Otherwise find a PLDU factorization.
Solution: In Problem 2.5.25 of Section 2.5, we noted that A does
not have an LU factorization, but A = PLU′ with
P = [0 0 1 0; 0 0 0 1; 1 0 0 0; 0 1 0 0], L = [1 0 0 0; 3 1 0 0; 0 -1/5 1 0; 0 -1/5 1 1],
and U′ = [2 3 2 1; 0 -5 -4 -11; 0 0 1/5 -11/5; 0 0 0 4].
Since the diagonal entries of U′ are not 0, we can factor U′ = DU with
D = diag(2, -5, 1/5, 4) and U = [1 3/2 1 1/2; 0 1 4/5 11/5; 0 0 1 -11; 0 0 0 1],
hence obtaining a factorization A = PLDU.
2-7 Finding the Inverse of
a Matrix
2.7.1 Problem
Determine whether A = [3 2; 2 1] and B = [-1 2; 2 -3] are inverses.
Solution: We compute AB = [3 2; 2 1][-1 2; 2 -3] = [1 0; 0 1] = I.
Since the matrices are square, we conclude that A and B are inverses.
2.7.2 Problem
Let A = [1 2 3; 4 5 6] and B = [-1 1; 0 -1; 2/3 1/3]. Then
AB = [1 2 3; 4 5 6][-1 1; 0 -1; 2/3 1/3] = [1 0; 0 1]. Is A invertible?
Solution: No, A is not invertible because it is not square.
Alternatively, you may check that BA ≠ I.
A Method for Finding
the Inverse
Let A be an n × n matrix. If A is invertible, there is a matrix B
with
AB = I = [1 0 … 0; 0 1 … 0; …; 0 0 … 1] = [e1 e2 … en].
Let the columns of B be x1, x2, …, xn. So
B = [x1 x2 … xn] and AB = [Ax1 Ax2 … Axn].
We want AB = I, so we wish to find vectors x1, x2, …, xn such that
Ax1 = e1, Ax2 = e2, …, Axn = en.
2.7.4 Definition
A matrix is in reduced row echelon form if it is in row echelon
form, each pivot (that is, each leading nonzero entry in a nonzero
row) is a 1, and each such pivot is the only nonzero entry in its
column.
2.7.7 Example
[1 2 5; 3 1 -5; 2 -3 4] → [1 2 5; 0 -5 -20; 0 -7 -6] → [1 2 5; 0 1 4; 0 -7 -6]
→ [1 0 -3; 0 1 4; 0 0 22] → [1 0 -3; 0 1 4; 0 0 1] → [1 0 0; 0 1 0; 0 0 1].
2.7.8 To find the inverse of A, apply Gaussian elimination to
[A | I], attempting to carry A to reduced row echelon form.
If this produces [I | B] for some B, then B = A^-1. If you cannot
carry A to I, then A is not invertible.
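Procedure 2.7.8 can be sketched directly. `inverse` is an illustrative Gauss-Jordan routine using exact arithmetic, not code from the text.

```python
from fractions import Fraction

def inverse(A):
    """Reduce [A | I] to reduced row echelon form [I | B]; then B = A^(-1).
    Raises ValueError if A cannot be carried to I (A is not invertible)."""
    n = len(A)
    M = [[Fraction(x) for x in row] + [Fraction(int(i == j)) for j in range(n)]
         for i, row in enumerate(A)]
    for k in range(n):
        p = next((i for i in range(k, n) if M[i][k] != 0), None)
        if p is None:
            raise ValueError("A is not invertible")
        M[k], M[p] = M[p], M[k]
        M[k] = [a / M[k][k] for a in M[k]]   # make the pivot a 1
        for i in range(n):
            if i != k and M[i][k] != 0:      # clear the rest of the column
                f = M[i][k]
                M[i] = [a - f * b for a, b in zip(M[i], M[k])]
    return [row[n:] for row in M]

print(inverse([[-1, 2], [-2, 3]]) == [[3, -2], [2, -1]])  # True
```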
2.7.9 Example
Suppose A = [1 2 0; 3 -1 2; 2 -3 2].
[A | I] = [1 2 0 | 1 0 0; 3 -1 2 | 0 1 0; 2 -3 2 | 0 0 1]
→ (R2 ← R2 - 3R1, R3 ← R3 - 2R1) → [1 2 0 | 1 0 0; 0 -7 2 | -3 1 0; 0 -7 2 | -2 0 1]
→ (R3 ← R3 - R2) → [1 2 0 | 1 0 0; 0 -7 2 | -3 1 0; 0 0 0 | 1 -1 1].
There is no point in continuing: the left half has a zero row, so A
cannot be carried to I, and A is not invertible.
2.7.10 Example Suppose A = [2 7 0; 1 4 1; 1 3 0]. Find A^-1.
Solution: [A | I] = [2 7 0 | 1 0 0; 1 4 1 | 0 1 0; 1 3 0 | 0 0 1]
→ (R1 ↔ R2) → [1 4 1 | 0 1 0; 2 7 0 | 1 0 0; 1 3 0 | 0 0 1]
→ (R2 ← R2 - 2R1, R3 ← R3 - R1, then R2 ← -R2)
→ [1 4 1 | 0 1 0; 0 1 2 | -1 2 0; 0 -1 -1 | 0 -1 1]
→ (R1 ← R1 - 4R2, R3 ← R3 + R2) → [1 0 -7 | 4 -7 0; 0 1 2 | -1 2 0; 0 0 1 | -1 1 1]
→ (R1 ← R1 + 7R3, R2 ← R2 - 2R3)
→ [1 0 0 | -3 0 7; 0 1 0 | 1 0 -2; 0 0 1 | -1 1 1] = [I | B].
One can check that AB = I, so B = A^-1.
2.7.12 Problem Express the vector b = [-2; 4; 5] as a linear
combination of the columns of the matrix A in 2.7.11.
Solution:
We just saw that Ax = b with x = [41; -12; 11]. Since Ax is a linear
combination of the columns of A with coefficients the
components of x (please recall, and never forget, 2.1.33), we have
b = 41[2; 1; 1] - 12[7; 4; 3] + 11[0; 1; 0].
If A has an inverse, then Ax = b has the unique solution x = A–1b.
Theorem 2.7.14
A matrix is invertible if and only if it can be written as a
product of elementary matrices.
2.7.15 Problem
Express A = [2 3; 1 1] as a product of elementary matrices.
Solution:
A = [2 3; 1 1] → [1 1; 2 3] = E1A → [1 1; 0 1] = E2E1A → [1 0; 0 1] = E3E2E1A = I,
with E1 = [0 1; 1 0], E2 = [1 0; -2 1], and E3 = [1 -1; 0 1].
So E3E2E1A = I and A = (E3E2E1)^-1 = E1^-1 E2^-1 E3^-1:
[2 3; 1 1] = [0 1; 1 0][1 0; 2 1][1 1; 0 1].
2.7.16 Problem
Express A = [1 1 3; 2 2 7; 4 3 12] as a product of elementary
matrices.
Theorem 2.7.18
A square matrix is invertible if and only if its columns are linearly
independent.
68