Linear Algebra: Lecture 5
Andrei Antonenko
February 10, 2003
1 Matrices
An m × n matrix A is a rectangular table of numbers with m rows and n columns. We write A = (a_{ij}), where a_{ij} is the element in the i-th row and the j-th column:

A = \begin{pmatrix}
a_{11} & a_{12} & \dots & a_{1n} \\
a_{21} & a_{22} & \dots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{m1} & a_{m2} & \dots & a_{mn}
\end{pmatrix}
A matrix is called a square matrix if the number of its rows is equal to the number of its columns. For every square matrix we will define its main diagonal, or simply diagonal, as the diagonal from the top left corner to the bottom right corner, i.e., the diagonal consisting of the elements a_{11}, a_{22}, \dots, a_{nn}. The other diagonal is called the secondary diagonal; it is used very rarely.
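For instance, in the square matrix

\begin{pmatrix}
1 & 2 & 3 \\
4 & 5 & 6 \\
7 & 8 & 9
\end{pmatrix}

the main diagonal consists of the elements 1, 5, 9, and the secondary diagonal consists of 3, 5, 7.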
So, we introduced an object. But now we should introduce operations, otherwise the object
is not interesting!
2 Matrix Operations

2.1 Addition
Definition 2.1. Let A and B be m × n matrices. Then their sum C = A + B is an m × n matrix such that c_{ij} = a_{ij} + b_{ij}, i.e., the elements of this matrix are the sums of the corresponding elements of the matrices A and B.
Example 2.2.

\begin{pmatrix}
1 & 2 & 3 \\
3 & 0 & 1
\end{pmatrix}
+
\begin{pmatrix}
-2 & 0 & 3 \\
1 & 1 & -3
\end{pmatrix}
=
\begin{pmatrix}
-1 & 2 & 6 \\
4 & 1 & -2
\end{pmatrix}
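The definition is easy to turn into code; here is a minimal Python sketch of entrywise addition (the function name mat_add and the list-of-lists representation of matrices are just illustrative choices):

def mat_add(A, B):
    # C[i][j] = A[i][j] + B[i][j] for two matrices of the same size
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))] for i in range(len(A))]

# The matrices of Example 2.2
A = [[1, 2, 3], [3, 0, 1]]
B = [[-2, 0, 3], [1, 1, -3]]
print(mat_add(A, B))  # [[-1, 2, 6], [4, 1, -2]]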
Addition of matrices has the following properties:

(A1) Commutativity: A + B = B + A.

(A2) Associativity: A + (B + C) = (A + B) + C.

(A3) Existence of the zero matrix: the matrix

0 = \begin{pmatrix}
0 & 0 & \dots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \dots & 0
\end{pmatrix}

all of whose elements are equal to 0, satisfies A + 0 = 0 + A = A for any matrix A.

(A4) Existence of the additive inverse: for every matrix A there is a matrix −A = (−a_{ij}) such that A + (−A) = 0.
Example 2.3. The additive inverse of a matrix is obtained by changing the sign of every element: for example, the additive inverse of \begin{pmatrix} 1 & 3 \end{pmatrix} is \begin{pmatrix} -1 & -3 \end{pmatrix}.
2.2 Multiplication by a number
For any matrix A and for any number c ∈ ℝ we can define the matrix B = cA such that b_{ij} = c a_{ij}, i.e., we multiply all elements of the matrix A by the same number c. This operation has the following obvious properties:

(c_1 c_2)A = c_1(c_2 A);

(c_1 + c_2)A = c_1 A + c_2 A;

c(A + B) = cA + cB.
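For instance, multiplying a matrix by the number 2 doubles every element:

2 \begin{pmatrix}
1 & -3 \\
0 & 4
\end{pmatrix}
=
\begin{pmatrix}
2 & -6 \\
0 & 8
\end{pmatrix}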
2.3 Multiplication
The definition of multiplication is much more complicated than the definition of the previous
operations.
Definition 2.4. Let A be an m × p matrix and B be a p × n matrix. Then their product is an m × n matrix C such that

c_{ij} = \sum_{k=1}^{p} a_{ik} b_{kj}
So we see that in order to be able to multiply matrices, the number of columns of the first matrix should be equal to the number of rows of the second one.
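To make the formula concrete, here is a minimal Python sketch of this definition (the name mat_mul and the list-of-lists representation of matrices are illustrative choices):

def mat_mul(A, B):
    # A is m x p, B is p x n; the (i, j) entry of C is the sum over k of A[i][k] * B[k][j]
    m, p, n = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(p)) for j in range(n)] for i in range(m)]

# The matrices of Example 2.5 below
A = [[2, 1, -2], [3, 0, 1]]
B = [[3, 0, 1], [2, 1, 3], [1, 1, 0]]
print(mat_mul(A, B))  # [[6, -1, 5], [10, 1, 3]]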
Example 2.5.

\begin{pmatrix}
2 & 1 & -2 \\
3 & 0 & 1
\end{pmatrix}
\begin{pmatrix}
3 & 0 & 1 \\
2 & 1 & 3 \\
1 & 1 & 0
\end{pmatrix}
=
\begin{pmatrix}
2 \cdot 3 + 1 \cdot 2 + (-2) \cdot 1 & 2 \cdot 0 + 1 \cdot 1 + (-2) \cdot 1 & 2 \cdot 1 + 1 \cdot 3 + (-2) \cdot 0 \\
3 \cdot 3 + 0 \cdot 2 + 1 \cdot 1 & 3 \cdot 0 + 0 \cdot 1 + 1 \cdot 1 & 3 \cdot 1 + 0 \cdot 3 + 1 \cdot 0
\end{pmatrix}
=
\begin{pmatrix}
6 & -1 & 5 \\
10 & 1 & 3
\end{pmatrix}

We can see that we cannot multiply these two matrices in a different order, i.e., we cannot compute

\begin{pmatrix}
3 & 0 & 1 \\
2 & 1 & 3 \\
1 & 1 & 0
\end{pmatrix}
\begin{pmatrix}
2 & 1 & -2 \\
3 & 0 & 1
\end{pmatrix}

since the first matrix has 3 columns but the second one has only 2 rows.
Example 2.6 (Cute example of matrix multiplication). Let α and β be real numbers, and let's compute the following product:

\begin{pmatrix}
\cos\alpha & -\sin\alpha \\
\sin\alpha & \cos\alpha
\end{pmatrix}
\begin{pmatrix}
\cos\beta & -\sin\beta \\
\sin\beta & \cos\beta
\end{pmatrix}

It is equal to:

\begin{pmatrix}
\cos\alpha \cos\beta - \sin\alpha \sin\beta & -\cos\alpha \sin\beta - \sin\alpha \cos\beta \\
\sin\alpha \cos\beta + \cos\alpha \sin\beta & -\sin\alpha \sin\beta + \cos\alpha \cos\beta
\end{pmatrix}
=
\begin{pmatrix}
\cos(\alpha+\beta) & -\sin(\alpha+\beta) \\
\sin(\alpha+\beta) & \cos(\alpha+\beta)
\end{pmatrix}

So, we can see that we get a matrix of the same type, but with α + β instead of α and β.
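This identity is easy to check numerically; a short Python sketch (the particular angles chosen are arbitrary):

import math

def rot(t):
    # the matrix from Example 2.6 for the angle t
    return [[math.cos(t), -math.sin(t)], [math.sin(t), math.cos(t)]]

a, b = 0.7, 1.9
prod = [[sum(rot(a)[i][k] * rot(b)[k][j] for k in range(2)) for j in range(2)] for i in range(2)]
print(all(abs(prod[i][j] - rot(a + b)[i][j]) < 1e-12 for i in range(2) for j in range(2)))  # True

Geometrically, such a matrix rotates the plane by the corresponding angle, so the identity says that rotating by β and then by α is the same as rotating by α + β.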
Now let's consider the properties of multiplication.
(M1) Commutativity. Unfortunately, commutativity does not hold for matrix multiplication. Moreover, for some matrices A and B we can compute AB but cannot compute BA. E.g., if A is a 2 × 3 matrix and B is a 3 × 3 matrix, then AB is defined, and BA is not. But we can also give a counterexample in which both products are defined:

\begin{pmatrix}
1 & 0 \\
0 & 0
\end{pmatrix}
\begin{pmatrix}
0 & 1 \\
0 & 0
\end{pmatrix}
=
\begin{pmatrix}
0 & 1 \\
0 & 0
\end{pmatrix},

but

\begin{pmatrix}
0 & 1 \\
0 & 0
\end{pmatrix}
\begin{pmatrix}
1 & 0 \\
0 & 0
\end{pmatrix}
=
\begin{pmatrix}
0 & 0 \\
0 & 0
\end{pmatrix}.
Moreover, from this example we see that the product of two nonzero matrices can be a
zero matrix.
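Such checks are quick to do in code; a short Python sketch verifying this counterexample (mat_mul is the same illustrative helper as above, repeated here so the snippet is self-contained):

def mat_mul(A, B):
    # the (i, j) entry of the product is the sum over k of A[i][k] * B[k][j]
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) for j in range(len(B[0]))] for i in range(len(A))]

X = [[1, 0], [0, 0]]
Y = [[0, 1], [0, 0]]
print(mat_mul(X, Y))  # [[0, 1], [0, 0]]
print(mat_mul(Y, X))  # [[0, 0], [0, 0]] -- the product of two nonzero matrices is the zero matrix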
(M2) Associativity. Associativity holds for matrix multiplication, i.e., for any three matrices such that all needed products (i.e., AB and BC) can be defined, we have that (AB)C = A(BC).
Proof. Let A = (a_{ij}), B = (b_{ij}), C = (c_{ij}). Then (AB)_{ik} = \sum_l a_{il} b_{lk}, so

((AB)C)_{ij} = \sum_k (AB)_{ik} c_{kj} = \sum_k \sum_l a_{il} b_{lk} c_{kj},

(A(BC))_{ij} = \sum_l a_{il} (BC)_{lj} = \sum_l a_{il} \sum_k b_{lk} c_{kj} = \sum_l \sum_k a_{il} b_{lk} c_{kj}.

Now we can change the order of summation and see that these expressions are equal.
Here, unlike in the case of addition, we cannot choose any order, since commutativity does not hold for multiplication. For example,

(AB)C ≠ (CA)B,

etc.
(M3) Identity matrix. Consider the following n × n matrix, called the identity matrix:

I = \begin{pmatrix}
1 & 0 & \dots & 0 \\
0 & 1 & \dots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \dots & 1
\end{pmatrix}

This n × n matrix has 1s on its main diagonal and 0s everywhere else. For any m × n matrix A we have that AI = IA = A.

Proof. Can be done directly from the definition of multiplication. For example, one can check that

AI = \begin{pmatrix}
a_{11} & a_{12} & \dots & a_{1n} \\
a_{21} & a_{22} & \dots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{m1} & a_{m2} & \dots & a_{mn}
\end{pmatrix}
\begin{pmatrix}
1 & 0 & \dots & 0 \\
0 & 1 & \dots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \dots & 1
\end{pmatrix}
=
\begin{pmatrix}
a_{11} & a_{12} & \dots & a_{1n} \\
a_{21} & a_{22} & \dots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{m1} & a_{m2} & \dots & a_{mn}
\end{pmatrix}
= A.
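The identity property is also easy to verify numerically; a short Python sketch (the 2 × 3 matrix A is arbitrary, and mat_mul is the same illustrative helper as above):

def identity(n):
    # n x n matrix with 1s on the main diagonal and 0s elsewhere
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B))) for j in range(len(B[0]))] for i in range(len(A))]

A = [[1, 2, 3], [4, 5, 6]]           # a 2 x 3 matrix
print(mat_mul(A, identity(3)) == A)  # True: AI = A
print(mat_mul(identity(2), A) == A)  # True: IA = A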