4. Linear Algebra
Like real analysis, linear algebra is essentially a study of R^n. But
rather than viewing elements of R^n as "points in space," as real
analysis does, linear algebra views them as "arrows" (vectors).
▶ Vector addition: u + v := (u1 + v1, ..., un + vn)
▶ Scalar multiplication: a · v := (a·v1, ..., a·vn)
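The two operations above can be checked numerically. A minimal sketch using NumPy (not part of the original notes; the particular vectors are arbitrary):

```python
import numpy as np

# Two vectors in R^3 (arbitrary example values)
u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

# Vector addition: entrywise sum
print(u + v)      # [5. 7. 9.]

# Scalar multiplication: scale every entry
print(2.0 * v)    # [ 8. 10. 12.]
```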
▶ A set of vectors {u1, ..., un} is a basis of a vector space (V, +, ·)
  if every v ∈ V can be written uniquely as a linear combination
  v = a1 · u1 + ... + an · un.
▶ A vector space can have many bases. The following are all bases of
  (R^2, +, ·):
  {(0, 1), (1, 0)}   {(0, 2), (3, 0)}   {(1, 1), (1, −1)}
  (Proof omitted)
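Since the proof is omitted, here is a quick numerical sanity check (an addition of mine, not from the notes): two vectors form a basis of R^2 exactly when the 2×2 matrix they fill is invertible, i.e. its determinant is nonzero.

```python
import numpy as np

# The three candidate bases of R^2 listed above, vectors as rows
candidates = [
    [(0, 1), (1, 0)],
    [(0, 2), (3, 0)],
    [(1, 1), (1, -1)],
]

# Nonzero determinant <=> the two rows are linearly independent
# <=> they form a basis of R^2
dets = [np.linalg.det(np.array(vs, dtype=float)) for vs in candidates]
print([d != 0 for d in dets])  # [True, True, True]
```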
▶ All bases of a vector space have the same size; this common size is
  called the dimension of the vector space.
▶ That is why we call (R^n, +, ·) "n-dimensional."
▶ Any linearly independent set of n vectors in R^n is a basis of R^n.
▶ Canonical basis of R^n: {e1, ..., en}, where ei = (0, ..., 0, 1, 0, ..., 0)
  has a 1 in the ith entry and 0's elsewhere.
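The defining property of the canonical basis is that every v ∈ R^n is the linear combination of the e_i with its own entries as coefficients. A small illustration (my own, with arbitrary values):

```python
import numpy as np

n = 4
E = np.eye(n)                       # rows of the identity are e_1, ..., e_n
v = np.array([3.0, -1.0, 0.5, 2.0])

# v = v_1 * e_1 + ... + v_n * e_n
recombined = sum(v[i] * E[i] for i in range(n))
print(np.allclose(recombined, v))   # True
```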
Proposition 4.7. If (V , +, ·) has dimension n, then:
1. If V = span(S) for some S ⊂ V , then |S| ≥ n.
2. If vectors in S ⊂ V are linearly independent and |S| = n,
then S is a basis of (V , +, ·).
3. If vectors in S ⊂ V are linearly independent and |S| < n,
then there exists a basis S0 where S ⊂ S0 .
4. If (W , +, ·) is a subspace of (V , +, ·), then its dimension
is no greater than n.
▶ Example of a linear transformation: T : R^2 → R^2, T(x1, x2) = (x1, 0).
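This projection map can be verified to be linear, i.e. to satisfy T(a·u + b·v) = a·T(u) + b·T(v). A quick check (the specific vectors and scalars are arbitrary choices of mine):

```python
import numpy as np

def T(x):
    """Projection onto the first coordinate: T(x1, x2) = (x1, 0)."""
    return np.array([x[0], 0.0])

u = np.array([1.0, 2.0])
v = np.array([3.0, -4.0])
a, b = 2.0, -1.0

# Linearity: T(a*u + b*v) = a*T(u) + b*T(v)
print(np.allclose(T(a * u + b * v), a * T(u) + b * T(v)))  # True
```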
Proof of Propositions 4.9 and 4.10. Pick any v = (v1, ..., vn) and let
u = (u1, ..., um) := T(v). Note that v = Σ_{i=1}^n vi · ei. Thus
u = T(v) = T(Σ_{i=1}^n vi · ei) = Σ_{i=1}^n vi · T(ei). This proves
Proposition 4.9. Now we wish to show that uk is the kth entry of A_T v for
every k = 1, ..., m. We have uk = Σ_{i=1}^n vi Tk(ei), where Tk(ei) denotes
the kth entry of T(ei). Given the definition of A_T, whose ith column is
T(ei), this sum is exactly the kth entry of A_T v, proving Proposition 4.10.
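The proof above suggests a recipe: the ith column of A_T is T(ei). A sketch of that construction, assuming a hypothetical linear map of my own choosing:

```python
import numpy as np

def T(x):
    """A hypothetical linear map R^3 -> R^2 (example only)."""
    return np.array([x[0] + 2 * x[1], 3 * x[2]])

n = 3
# Column i of A_T is T(e_i), exactly as in the proof above
A_T = np.column_stack([T(np.eye(n)[i]) for i in range(n)])

v = np.array([1.0, -2.0, 0.5])
# The matrix reproduces the transformation: A_T v = T(v)
print(np.allclose(A_T @ v, T(v)))  # True
```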
▶ Composition corresponds to matrix multiplication: A_S A_T v = S(T(v)) ∀v ∈ R^n.
(Recall: I_n is the n × n identity matrix, with 1's on the diagonal and
0's elsewhere, which represents the identity transformation T(v) = v.)
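Both facts, composition as matrix multiplication and I_n as the identity transformation, can be checked numerically. A sketch with randomly generated matrices (my own example):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
A_S = rng.standard_normal((n, n))   # matrix of some linear map S
A_T = rng.standard_normal((n, n))   # matrix of some linear map T
v = rng.standard_normal(n)

# The product A_S A_T represents the composition S ∘ T:
lhs = (A_S @ A_T) @ v
rhs = A_S @ (A_T @ v)               # i.e. S(T(v))
print(np.allclose(lhs, rhs))        # True

# And I_n represents the identity transformation:
print(np.allclose(np.eye(n) @ v, v))  # True
```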
Proof. (1) → (2): Suppose T is invertible but not one-to-one; then T⁻¹(T(v))
would contain more than one element for some v, a contradiction. Hence T is
one-to-one, and therefore a bijection by Proposition 4.13. (2) → (3): Since T
is a bijection, it has an inverse mapping U : R^n → R^n such that
U(T(v)) = v ∀v ∈ R^n. Let A_T⁻¹ := A_U; then from U(T(v)) = v ∀v we have
A_T⁻¹ A_T = I_n, because I_n is the only matrix such that I_n v = v ∀v.
(3) → (1): Let U be the linear transformation that A_T⁻¹ represents. Then
A_T⁻¹ A_T v = I_n v = v ∀v, which implies U(T(v)) = v ∀v.
The Rank of a Matrix
The rank of a matrix A is the rank of the linear transformation TA it
represents.
((1) and (2) are immediate implications of Proposition 4.9. The proof of (3)
is omitted. (4) follows immediately from (3).)
▶ Example:
         ⎡ 1 2 3 ⎤
  rank   ⎢ 1 2 4 ⎥ = 2
         ⎣ 2 4 7 ⎦
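The rank in the example can be seen directly: the third row is the sum of the first two, and the first two rows are linearly independent. A NumPy check (an addition of mine):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [1, 2, 4],
              [2, 4, 7]], dtype=float)

# Row 3 = Row 1 + Row 2, and rows 1 and 2 are independent, so rank = 2
print(np.linalg.matrix_rank(A))  # 2
```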
Matrix Invertibility
(Proof omitted)
▶ Invertibility of A ↔ TA(e1), ..., TA(en) constitute a basis of R^n.
▶ Invertibility of A can be verified by checking whether all columns or
  all rows are linearly independent.
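The column/row criterion amounts to checking that A has full rank. A minimal sketch, using a hypothetical 2 × 2 matrix of my own choosing:

```python
import numpy as np

# A hypothetical matrix with linearly independent columns
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Invertible <-> full rank <-> columns (equivalently rows) independent
print(np.linalg.matrix_rank(A) == A.shape[0])  # True

# The inverse exists and satisfies A^{-1} A = I_2
A_inv = np.linalg.inv(A)
print(np.allclose(A_inv @ A, np.eye(2)))  # True
```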
Linear Equations
A system of linear equations has the form
  A x = b,
where A is an m × n matrix, x is an n × 1 vector of unknowns, and b is an
m × 1 vector.
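When A is square and invertible, the system has the unique solution x = A⁻¹b. A sketch with hypothetical numbers of my own:

```python
import numpy as np

# A square system with a unique solution (example values)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

# Solve A x = b without forming the inverse explicitly
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))  # True
```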
▶ A nonzero vector v ∈ R^n is an eigenvector of A if Av = λ_v · v for some
  λ_v ∈ R; the scalar λ_v is the corresponding eigenvalue.
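The eigenvector condition Av = λ_v · v can be verified numerically. A sketch with a diagonal matrix of my own choosing, where the canonical basis vectors are eigenvectors:

```python
import numpy as np

# A diagonal example matrix: e_1 and e_2 are its eigenvectors
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

eigvals, eigvecs = np.linalg.eig(A)
v = eigvecs[:, 0]    # an eigenvector of A (a column of eigvecs)
lam = eigvals[0]     # its eigenvalue

# Check the defining property: A v = lambda_v * v
print(np.allclose(A @ v, lam * v))  # True
```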