Linear Algebra Week 7
Abstract vector spaces
1 ADDITION: + : V × V → V, (a, b) ↦ a + b
and
2 SCALAR MULTIPLICATION: · : R × V → V, (λ, a) ↦ λa.
Axioms contd.
∀ a, b, c ∈ V and λ ∈ R (or C)
Definition (continuation)
A1 a + b = b + a. [Commutativity]
A2 (a + b) + c = a + (b + c). [Associativity]
A3 ∃! 0 ∈ V s.t. a + 0 = a. [Additive identity]
A4 ∃! − a s.t. a + (−a) = 0. [Additive inverse]
Some natural consequences
Examples
Non-examples
etc..
Linear combinations
A linear combination of vectors v1, v2, ..., vk is an expression
c1 v1 + c2 v2 + · · · + ck vk.
The vectors are linearly independent if the only scalars c1, ..., ck satisfying
c1 v1 + c2 v2 + · · · + ck vk = 0
are c1 = c2 = · · · = ck = 0; otherwise they are linearly dependent.
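In coordinates, independence can be tested mechanically: place the vectors as columns of a matrix and check that its rank equals the number of vectors. A minimal sketch in Python (exact rational arithmetic; the helper names are illustrative, not from the lecture):

```python
from fractions import Fraction

def rank(rows):
    """Rank of a matrix (list of rows) via Gaussian elimination over Q."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0
    for c in range(len(m[0]) if m else 0):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def independent(vectors):
    """c1 v1 + ... + ck vk = 0 forces all ci = 0  iff  rank equals k."""
    return rank([list(col) for col in zip(*vectors)]) == len(vectors)

# {1, x, x^2, x^5} in coordinates w.r.t. {1, x, x^2, x^3, x^4, x^5}:
indep = [(1, 0, 0, 0, 0, 0), (0, 1, 0, 0, 0, 0),
         (0, 0, 1, 0, 0, 0), (0, 0, 0, 0, 0, 1)]
# {1, x, 1 + x^2, 1 - x^2} in coordinates w.r.t. {1, x, x^2}: dependent
dep = [(1, 0, 0), (0, 1, 0), (1, 0, 1), (1, 0, -1)]
print(independent(indep), independent(dep))   # True False
```

The two sample sets are exactly the two polynomial examples on the next slide, written in coordinates.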
Example
Example
1 In R[x] the set {1, x, x², x⁵} is a linearly independent set.
2 On the other hand, the set {1, x, 1 + x², 1 − x²} ⊂ R[x] is a
linearly dependent set, since (1 + x²) + (1 − x²) − 2 · 1 = 0.
Wronskian
Definition
(Wronskian) Let f1, f2, ..., fn be functions on some interval (a, b).
Their Wronskian is another function on (a, b), defined by a
determinant involving the given functions and their derivatives
up to order n − 1:

                         | f1         f2         ···  fn         |
                     def | f1′        f2′        ···  fn′        |
W_{f1,f2,...,fn}(x)  =   | ⋮          ⋮               ⋮          |
                         | f1^(n−1)   f2^(n−1)   ···  fn^(n−1)   |
Wronskian
Example
Prove that if c1 f1 + c2 f2 + · · · + cn fn = 0 holds over the interval
(a, b) for some constants c1, c2, ..., cn and W_{f1,f2,...,fn}(x0) ≠ 0 at
some x0, then c1 = c2 = · · · = cn = 0. In other words,
nonvanishing of W_{f1,f2,...,fn} at a single point establishes linear
independence of f1, f2, ..., fn on (a, b).
Solution
Differentiating c1 f1 + c2 f2 + · · · + cn fn = 0 (n − 1) times, we
get a system of equations

| f1         f2         ···  fn         | | c1 |   | 0 |
| f1′        f2′        ···  fn′        | | c2 |   | 0 |
| ⋮          ⋮               ⋮          | | ⋮  | = | ⋮ | ,   x ∈ (a, b).
| f1^(n−1)   f2^(n−1)   ···  fn^(n−1)   | | cn |   | 0 |

If W_{f1,f2,...,fn}(x0) ≠ 0 for some x0 ∈ (a, b), the coefficient
matrix is invertible at x0, so

| c1 |   | 0 |
| ⋮  | = | ⋮ |
| cn |   | 0 |

as required.
Example
Example
Show that the set {1, x, 1 + x², 1 − x²} ⊂ R[x] is a linearly
dependent set.
Solution:
The Wronskian is

| 1   x   1 + x²   1 − x² |
| 0   1   2x       −2x    |
| 0   0   2        −2     |  = 0,
| 0   0   0        0      |

since the last row vanishes. (A vanishing Wronskian is consistent
with dependence but does not by itself prove it; here dependence
follows directly from the relation (1 + x²) + (1 − x²) = 2 · 1.)
Example
Example
Show that the set {1, x, x², x⁵} ⊂ R[x] is a linearly
independent set.
Solution:
The Wronskian is

| 1   x   x²   x⁵    |
| 0   1   2x   5x⁴   |
| 0   0   2    20x³  |  = 120x² ≢ 0,
| 0   0   0    60x²  |

e.g. W(1) = 120 ≠ 0, so the set is linearly independent.
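For polynomial entries the Wronskian can be evaluated exactly. A sketch that rebuilds the 4 × 4 determinant for {1, x, x², x⁵} from coefficient lists and evaluates it at a point (helper names are illustrative):

```python
from fractions import Fraction
from itertools import permutations

def deriv(p):
    """Derivative of a polynomial given as [c0, c1, ...]."""
    return [Fraction(k) * c for k, c in enumerate(p)][1:] or [Fraction(0)]

def evalp(p, x):
    return sum(c * Fraction(x) ** k for k, c in enumerate(p))

def det(m):
    """Leibniz expansion; fine for small matrices."""
    n, total = len(m), Fraction(0)
    for perm in permutations(range(n)):
        sign = 1
        for i in range(n):                 # count inversions -> sign
            for j in range(i + 1, n):
                if perm[i] > perm[j]:
                    sign = -sign
        prod = Fraction(1)
        for i in range(n):
            prod *= m[i][perm[i]]
        total += sign * prod
    return total

def wronskian_at(polys, x):
    rows, row = [], [list(map(Fraction, p)) for p in polys]
    for _ in polys:
        rows.append([evalp(p, x) for p in row])   # current derivatives at x
        row = [deriv(p) for p in row]             # differentiate once more
    return det(rows)

# {1, x, x^2, x^5}: W(x) = 120 x^2, so W(1) = 120
polys = [[1], [0, 1], [0, 0, 1], [0, 0, 0, 0, 0, 1]]
print(wronskian_at(polys, 1))   # 120
```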
Linear span, basis
Definition (Basis)
If a set of vectors S in a vector space V is such that
S is linearly independent, and
every vector in V is a (finite) linear combination of vectors
from S,
then the set S is called a basis of V.
Definition (Dimension)
The cardinality of any basis of V is called the dimension of V .
Examples
Example
{i, j, k} is the standard basis of R³.
A basis of R[x] is {1, x, x², x³, ...} (infinite).
A basis of R[x]5 is
S = {1, x, x², x³, x⁴, x⁵}.
Example
{Ejk : 1 ≤ j ≤ m, 1 ≤ k ≤ n} ⊂ Mm×n(R) is a basis of
Mm×n(R).
B = {1 − x, 1 − x², 1 − x³} is a basis of the subspace
{p ∈ R[x]3 : p(1) = 0}, which therefore has dimension 3.
Compare with {[a, b, c, d] ∈ R⁴ : a + b + c + d = 0}.
Inner product spaces
Definition (Length/Norm)
For a vector v ∈ (V, ⟨·,·⟩), the non-negative square root √⟨v, v⟩ is
called the length or norm of v. It is usually denoted ‖v‖.
Cauchy-Schwarz inequality
Theorem (Cauchy-Schwarz inequality)
For v, w ∈ V , we have |⟨v, w⟩| ≤ ‖v‖‖w‖.
Equality holds if and only if {v, w} is a linearly dependent set.
An immediate consequence is the triangle inequality:

‖v ± w‖² = ⟨v ± w, v ± w⟩
         = ‖v‖² + ‖w‖² ± [⟨v, w⟩ + ⟨w, v⟩]          (bilinearity)
         ≤ ‖v‖² + ‖w‖² + |⟨v, w⟩| + |⟨w, v⟩|
         ≤ ‖v‖² + ‖w‖² + 2‖v‖‖w‖ = (‖v‖ + ‖w‖)²,    (by C-S)

so ‖v ± w‖ ≤ ‖v‖ + ‖w‖.
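Both the C-S inequality and the triangle inequality it implies are easy to sanity-check numerically for the standard inner product on Rⁿ; a small sketch:

```python
import math
import random

def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def norm(v):
    return math.sqrt(dot(v, v))

random.seed(0)
for _ in range(1000):
    v = [random.uniform(-1, 1) for _ in range(5)]
    w = [random.uniform(-1, 1) for _ in range(5)]
    # Cauchy-Schwarz: |<v,w>| <= ||v|| ||w||
    assert abs(dot(v, w)) <= norm(v) * norm(w) + 1e-12
    # Triangle inequality: ||v + w|| <= ||v|| + ||w||
    assert norm([a + b for a, b in zip(v, w)]) <= norm(v) + norm(w) + 1e-12
print("ok")
```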
Proof of C-S inequality, real case
Proof of C-S inequality, complex case
In the unitary case, working as above and using
⟨v, w⟩ + ⟨w, v⟩ = 2 Re⟨v, w⟩,
we will get
Re⟨v, w⟩ ≤ ‖v‖‖w‖.
Let
⟨v, w⟩ = |⟨v, w⟩| e^{iθ}   (polar form in C)
and replace w by w′ = e^{iθ} w. This yields
Re⟨v, w′⟩ ≤ ‖v‖‖w′‖ = ‖v‖‖w‖.
But
Re⟨v, w′⟩ = Re( e^{−iθ}⟨v, w⟩ ) = |⟨v, w⟩| ≥ 0,
so |⟨v, w⟩| ≤ ‖v‖‖w‖.
Angle between two vectors and orthogonality
The angle between two nonzero vectors v, w is
θ = cos⁻¹( ⟨v, w⟩ / (‖v‖‖w‖) ).
In the unitary case the angle is θ = cos⁻¹( Re⟨v, w⟩ / (‖v‖‖w‖) ).
In either case θ ∈ [0, π].
Definition (Orthogonality)
Two vectors v, w ∈ V are said to be orthogonal if ⟨v, w⟩ = 0.
Theorem (Pythagoras)
If v ⊥ w are any two mutually orthogonal vectors in an i.p. space
V , then
‖v + w‖² = ‖v‖² + ‖w‖².
Proofs:
Directly from the axioms and "R-bilinearity".
Examples
Example (1)
V = Rⁿ, treated as the set of n × 1 columns. Then
⟨v, w⟩ := vᵀw = Σⱼ vⱼ wⱼ.
j
Example (2)
Let P be a symmetric n × n matrix with all eigenvalues positive. In Rⁿ,
define
⟨x, y⟩_P = xᵀ P y.
Then ⟨·,·⟩_P is the most general inner product in Rⁿ.
Examples
Example (3)
In the space Mn(R), define ⟨X, Y⟩ = tr(XᵀY). This defines an
inner product in the vector space of n × n matrices. It is analogous
to the standard inner product in Rⁿ.
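The analogy is literal: tr(XᵀY) equals the entrywise dot product of X and Y read as long vectors, since (XᵀY)_kk = Σⱼ X_jk Y_jk. A quick check (the matrices are arbitrary illustrations):

```python
def trace_inner(X, Y):
    """<X, Y> = tr(X^T Y) for matrices given as lists of rows."""
    # (X^T Y)_{kk} = sum_j X_{jk} Y_{jk}, then sum over k
    return sum(X[j][k] * Y[j][k]
               for j in range(len(X)) for k in range(len(X[0])))

X = [[1, 2], [3, 4]]
Y = [[5, 6], [7, 8]]
# the same number, computed as a dot product of flattened matrices
flat = sum(a * b for xr, yr in zip(X, Y) for a, b in zip(xr, yr))
print(trace_inner(X, Y), flat)   # 70 70
```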
Function spaces
Example (4)
Let V = C[a, b] = {f : [a, b] → R | f is continuous} and define
⟨f, g⟩ = ∫ₐᵇ f(t)g(t) dt.
Orthogonal and orthonormal sets
Definition (Orthogonal/orthonormal sets)
A subset S of an i.p. space V is said to be an orthogonal set if
⟨v, w⟩ = 0 ∀ v ≠ w ∈ S.
It is orthonormal if, in addition, ‖v‖ = 1 for every v ∈ S.
Theorem
If S is an orthogonal set of non-zero vectors in V , then S is
linearly independent.
Proof:
Exercise.
Example
{i, j, k} is an orthonormal set in R³.
{i, j, β⁻¹k} is an orthonormal set in R³ w.r.t. ⟨·,·⟩_β.
Orthogonal sets, examples
Example (2)
Let V = C[−π, π] and define as before ⟨f, g⟩ = ∫_{−π}^{π} f(t)g(t) dt.
Then the infinite set
S = { 1/√(2π), cos x/√π, sin x/√π, cos 2x/√π, sin 2x/√π, ... ad inf }
is an orthonormal set in V.
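Orthonormality of the first few members of S can be checked by numerical integration; a sketch using the midpoint rule (truncating S to four functions and choosing the step count are decisions made here, not in the lecture):

```python
import math

def inner(f, g, a=-math.pi, b=math.pi, n=4000):
    """<f,g> = integral_a^b f(t) g(t) dt, composite midpoint rule."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) * g(a + (i + 0.5) * h)
                   for i in range(n))

S = [lambda x: 1 / math.sqrt(2 * math.pi),
     lambda x: math.cos(x) / math.sqrt(math.pi),
     lambda x: math.sin(x) / math.sqrt(math.pi),
     lambda x: math.cos(2 * x) / math.sqrt(math.pi)]

for i, f in enumerate(S):
    for j, g in enumerate(S):
        expected = 1.0 if i == j else 0.0   # orthonormal: <f,g> = delta_ij
        assert abs(inner(f, g) - expected) < 1e-6
print("orthonormal")
```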
Orthonormal bases
Gram-Schmidt process
Q. How to get orthonormal sets and bases?
A. By the Gram-Schmidt process.
Start with any spanning set {v1, v2, v3, ...}, finite or infinite.
Recursively, produce an orthogonal set as follows (same as before
in Rn or Cn ):
w1 = v1
w2 = v2 − (⟨v2, w1⟩/‖w1‖²) w1                            (= v2 if w1 = 0)
w3 = v3 − (⟨v3, w1⟩/‖w1‖²) w1 − (⟨v3, w2⟩/‖w2‖²) w2      (similar remark)
⋮
wk = vk − Σ_{j=1, wj≠0}^{k−1} (⟨vk, wj⟩/‖wj‖²) wj ,  etc.
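The recursion translates directly into code. A sketch over Qⁿ with exact arithmetic (the projection onto wⱼ is skipped when wⱼ = 0, matching the remark above; the sample vectors are arbitrary):

```python
from fractions import Fraction

def dot(v, w):
    return sum(a * b for a, b in zip(v, w))

def gram_schmidt(vectors):
    """Orthogonalize in order, skipping projections onto zero vectors."""
    ws = []
    for v in vectors:
        w = [Fraction(x) for x in v]
        for u in ws:
            d = dot(u, u)
            if d != 0:                       # the "= vk if wj = 0" remark
                c = dot(w, u) / d
                w = [a - c * b for a, b in zip(w, u)]
        ws.append(w)
    return ws

ws = gram_schmidt([(1, 1, 0), (1, 0, 1), (0, 1, 1)])
for i in range(3):
    for j in range(i + 1, 3):
        assert dot(ws[i], ws[j]) == 0        # pairwise orthogonal
print(ws[2])   # [Fraction(-2, 3), Fraction(2, 3), Fraction(2, 3)]
```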
G-S process
An example
Example
Consider the linearly independent set {1, x, x², ... ad inf} ⊂ V = C[−1, 1].
The (real) space V has the inner product defined by
⟨f, g⟩ = ∫_{−1}^{1} f(t)g(t) dt.

w0 = v0 = 1
w1 = x − (⟨x, 1⟩/‖1‖²) · 1 = x
w2 = x² − (⟨x², 1⟩/‖1‖²) · 1 − (⟨x², x⟩/‖x‖²) · x = x² − 1/3
w3 = x³ − 0 − (⟨x³, x⟩/‖x‖²) · x − 0 = x³ − (3/5)x
w4 = x⁴ − (6/7)x² + 3/35 , etc.
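The same computation can be reproduced exactly: represent polynomials by coefficient lists and evaluate ⟨f, g⟩ = ∫₋₁¹ f g dt term by term, using ∫₋₁¹ tᵏ dt = 2/(k+1) for even k and 0 for odd k. A sketch (helper names are made up here):

```python
from fractions import Fraction

def polymul(p, q):
    out = [Fraction(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def inner(p, q):
    """<p, q> = integral over [-1, 1] of p(t) q(t) dt, done exactly."""
    prod = polymul(p, q)
    # integral of t^k over [-1, 1]: 0 for odd k, 2/(k+1) for even k
    return sum(c * Fraction(2, k + 1) for k, c in enumerate(prod) if k % 2 == 0)

def gram_schmidt(polys):
    ws = []
    for p in polys:
        w = [Fraction(c) for c in p]
        for u in ws:
            c = inner(w, u) / inner(u, u)
            pad = u + [Fraction(0)] * (len(w) - len(u))
            w = [a - c * b for a, b in zip(w, pad)]
        ws.append(w)
    return ws

# orthogonalize {1, x, x^2, x^3} as coefficient lists [c0, c1, ...]
ws = gram_schmidt([[1], [0, 1], [0, 0, 1], [0, 0, 0, 1]])
print(ws[2])   # x^2 - 1/3
print(ws[3])   # x^3 - (3/5) x
```

The results agree with the hand computation above: w2 = x² − 1/3 and w3 = x³ − (3/5)x.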
Bessel’s inequality
Theorem (Bessel's inequality)
Let {u1, ..., uk} be an orthonormal set in an i.p. space V. Then for
every v ∈ V,
Σ_{j=1}^{k} |⟨v, uj⟩|² ≤ ‖v‖²,
with equality if and only if v ∈ LS{u1, ..., uk}.
Proof:
Define
v0 = Σ_{j=1}^{k} ⟨v, uj⟩ uj.
Bessel’s inequality
Observe that v − v0 is orthogonal to each uj. Now, by Pythagoras,
‖v‖² = ‖v0‖² + ‖v − v0‖² = Σ_{j=1}^{k} |⟨v, uj⟩|² + ‖v − v0‖²
     ≥ Σ_{j=1}^{k} |⟨v, uj⟩|².
Equality holds
⟺ ‖v − v0‖ = 0 ⟺ v = v0 ⟺
v ∈ LS{u1, ..., uk}.
G-S process; an example in C [0, π]
Example
Orthogonalize the ordered set {cos 2x, sin 2x, cos x, sin x} in C[0, π]:
Solution:
w1 = v1 = cos 2x
w2 = sin 2x − (0/(π/2)) cos 2x = sin 2x
w3 = cos x − (0/(π/2)) cos 2x − ((4/3)/(π/2)) sin 2x
   = cos x − (8/(3π)) sin 2x
G-S process; example in C [0, π] contd.
w4 = sin x − ((−2/3)/(π/2)) cos 2x − (0/(π/2)) sin 2x − (0/‖w3‖²) w3
   = sin x + (4/(3π)) cos 2x.
The orthogonalized set is
{ cos 2x, sin 2x, cos x − (8/(3π)) sin 2x, sin x + (4/(3π)) cos 2x }.
The squares of the lengths are π/2, π/2, π/2 − 32/(9π), and
π/2 − 8/(9π), respectively.
Linear maps
Example
A ∈ Mm×n(R) can be viewed as a linear map A : Rⁿ → Rᵐ
via matrix multiplication, v ↦ Av. Here v is treated as an
n × 1 column, which maps to the m × 1 column Av.
More examples
(a) d²/dx² + µ² : R[x] → R[x]
(b) τc : R[x]m → R[x]m where
τc p(x) = p(x − c), c ∈ R.
Significance of linear maps
Matrix of a linear transformation (or map)
Once a basis B of V is chosen, the values {T v ∈ W}_{v∈B}
determine T on ALL of V.
Matrix of a linear transformation (or map)
T : V → W
Matrix of a linear transformation contd.
Example
Write down the matrix of the linear map T : R[x]3 → R[x]3,
T p(x) = x p′(x − 1) + p″(x).
Example contd.
With respect to the basis {1, x, x², x³}, the matrix is

       | 0   0   2    0 |
[T] =  | 0   1  −2    9 |
       | 0   0   2   −6 |
       | 0   0   0    3 |
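The columns of [T] can also be generated programmatically: apply T to each basis polynomial of {1, x, x², x³} and record its coordinates. A sketch with polynomials as coefficient lists (helper names are made up here):

```python
from fractions import Fraction
from math import comb

def deriv(p):
    return [Fraction(k) * c for k, c in enumerate(p)][1:] or [Fraction(0)]

def shift(p):
    """p(x) -> p(x - 1), expanded via the binomial theorem."""
    out = [Fraction(0)] * len(p)
    for k, c in enumerate(p):
        for j in range(k + 1):
            out[j] += c * comb(k, j) * Fraction(-1) ** (k - j)
    return out

def mul_x(p):
    return [Fraction(0)] + list(p)

def T(p):
    """T p(x) = x p'(x - 1) + p''(x), as a degree-3 coefficient list."""
    out = [Fraction(0)] * 4
    for src in (mul_x(shift(deriv(p))), deriv(deriv(p))):
        for k, c in enumerate(src):
            out[k] += c
    return out

# columns of [T] w.r.t. {1, x, x^2, x^3}
basis = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]
cols = [T([Fraction(c) for c in p]) for p in basis]
matrix = [[cols[j][i] for j in range(4)] for i in range(4)]
for row in matrix:
    print([int(c) for c in row])
```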
The change of basis
Theorem
Let T : V → W be a linear map. Let B, B′ be two bases of V
and C, C′ be those of W. Let
1 [T] be the matrix of T relative to B, C and [T]′ be the matrix
of T relative to B′, C′.
2 P be the matrix which gives B′ in terms of B and Q be the
matrix which gives C′ in terms of C.
Then
[T] P = Q [T]′.
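The relation can be verified on a concrete case. Take V = W = R² with the matrices below written row by row; T, P, Q are arbitrary illustrations (not from the lecture), and [T]′ is computed as Q⁻¹[T]P:

```python
from fractions import Fraction

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def inv2(M):
    """Inverse of a 2x2 matrix by the adjugate formula."""
    (a, b), (c, d) = M
    det = Fraction(a * d - b * c)
    return [[d / det, -b / det], [-c / det, a / det]]

T = [[1, 2], [3, 4]]            # matrix of T w.r.t. B, C (illustrative)
P = [[1, 1], [0, 1]]            # columns of B' expressed in B
Q = [[2, 0], [1, 1]]            # columns of C' expressed in C

Tp = matmul(inv2(Q), matmul(T, P))   # [T]' = Q^{-1} [T] P

lhs = matmul(T, P)
rhs = matmul(Q, Tp)
assert lhs == rhs               # verifies [T] P = Q [T]'
print(lhs)
```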
The change of basis, proof
Special case of interest
The null space and the range
Rank-nullity theorem
Theorem
Let [T] be a matrix of T w.r.t. any choices of bases in V and W.
Then rank([T]) = rank(T) and null([T]) = null(T).
Solvability of T x = b
If p ∈ V is any particular solution, i.e. T p = b, then the set
p + N(T) = {p + v | v ∈ N(T)}
is the full set of solutions of T x = b.
This is identical to the description of the solution set of linear
systems.
Eigenvalue problem
We restrict to real vector spaces for simplicity.
Let V be a real vector space and let T : V → V be a linear map.
(It is important that W = V.)
Eigenvalue Problem (EVP):
Example
Let T : R[x]3 → R[x]3 be defined by
T p(x) = x p′(x − 1) + p″(x).
Its matrix w.r.t. {1, x, x², x³} is upper triangular,

| 0   0   2    0 |
| 0   1  −2    9 |
| 0   0   2   −6 |
| 0   0   0    3 |

so the eigenvalues are 0, 1, 2, 3.
Example contd.
Similarly, an eigenvector for λ = 3 is p3 = −4 + 10.5x − 6x² + x³. The
matrix of eigenvectors is

     | 1   0    1    −4   |          | 1   0   −1   −2  |
P =  | 0   1   −2   10.5  | ,  P⁻¹ = | 0   1    2   1.5 |
     | 0   0    1    −6   |          | 0   0    1    6  |
     | 0   0    0     1   |          | 0   0    0    1  |
Exercise: Show that D = d/dx : R[x]3 → R[x]3 is not
diagonalizable.
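One route for the exercise: in the basis {1, x, x², x³} the matrix of D is strictly upper triangular, so D⁴ = 0; a nonzero nilpotent map is never diagonalizable, since its only eigenvalue is 0 and diagonalizability would force D = 0. A quick confirmation of the nilpotency:

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# d/dx on R[x]_3 in the basis {1, x, x^2, x^3}: x^k -> k x^{k-1}
D = [[0, 1, 0, 0],
     [0, 0, 2, 0],
     [0, 0, 0, 3],
     [0, 0, 0, 0]]

D2 = matmul(D, D)
D4 = matmul(D2, D2)
assert any(any(row) for row in D) and not any(any(row) for row in D4)
print("D is nonzero but D^4 = 0: nilpotent, hence not diagonalizable")
```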
Symmetric maps and spectral theorem
A linear map T : V → V on an i.p. space is called symmetric
(self-adjoint) if
⟨T v, w⟩ = ⟨v, T w⟩ ∀ v, w ∈ V.
Legendre polynomials again
Example
Consider the polynomials of degree n defined by
pn(x) = Dⁿ(x² − 1)ⁿ, n = 0, 1, 2, ...,  where D ≡ d/dx.
Show using integration by parts n times that
∫_{−1}^{1} pn(x) pm(x) dx = 0 if m < n.   (Orthogonality)
Prove that
∫_{−1}^{1} Lf(x) g(x) dx = ∫_{−1}^{1} f(x) Lg(x) dx.   (L is self-adjoint)
Legendre polynomials again
Example
Finally prove that