Two.II Linear Independence
http://joshua.smcvt.edu/linearalgebra
Definition and examples
Linear independence
1.4 Definition In any vector space, a set of vectors is linearly independent if
none of its elements is a linear combination of the others from the set.
Otherwise the set is linearly dependent .
If there is a relationship c0~s0 + c1~s1 + · · · + cn~sn = ~0 in which the coefficient ci is nonzero then we can solve for that vector,
~si = (−c0/ci)~s0 + · · · + (−ci−1/ci)~si−1 + (−ci+1/ci)~si+1 + · · · + (−cn/ci)~sn
When we don’t want to single out any vector we will instead say that
~s0 ,~s1 , . . . ,~sn are in a linear relationship and put all of the vectors on the
same side.
1.5 Lemma A subset S of a vector space is linearly independent if and only
if among its elements the only linear relationship c1~s1 + · · · + cn~sn = ~0 is
the trivial one, c1 = 0, . . . , cn = 0 (where ~si ≠ ~sj when i ≠ j).
Proof If S is linearly independent then no vector ~si is a linear
combination of other vectors from S so there is no linear relationship where
some of the ~s's have nonzero coefficients.
If S is not linearly independent then some ~si is a linear combination
~si = c1~s1 + · · · + ci−1~si−1 + ci+1~si+1 + · · · + cn~sn of other vectors from S.
Subtracting ~si from both sides gives a relationship involving a nonzero
coefficient, the −1 in front of ~si . QED
Example This set of vectors in the plane R2 is linearly independent.
{ (1, 0), (0, 1) }
The only solution of c1 (1, 0) + c2 (0, 1) = (0, 0) is the trivial one, c1 = 0, c2 = 0.
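As a quick cross-check of the lemma's criterion (an aside, not from the text): put the vectors as the columns of a matrix; the only relationship is trivial exactly when that matrix has full column rank. A minimal sketch, assuming NumPy is available and with illustrative variable names.

    import numpy as np

    # Vectors of the example, taken as the columns of a matrix.
    vectors = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
    A = np.column_stack(vectors)

    # Full column rank means c1*s1 + c2*s2 = 0 forces c1 = c2 = 0.
    print(np.linalg.matrix_rank(A) == len(vectors))   # True: linearly independent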
Example In the vector space of cubic polynomials
P3 = { a0 + a1 x + a2 x2 + a3 x3 | ai ∈ R } the set { 1 − x, 1 + x } is linearly
independent. Setting up the equation c0 (1 − x) + c1 (1 + x) = 0 and
considering the constant term and the linear term leads to this system
c0 + c1 = 0
−c0 + c1 = 0
The only solution is c0 = 0, c1 = 0, so the set is linearly independent.
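The same coefficient-matching can be done mechanically; here is a small sketch with SymPy (using SymPy is an assumption of this aside, not a step taken in the text).

    import sympy as sp

    x, c0, c1 = sp.symbols('x c0 c1')
    expr = sp.expand(c0*(1 - x) + c1*(1 + x))

    # The polynomial is the zero polynomial exactly when each coefficient vanishes.
    eqs = [sp.Eq(expr.coeff(x, n), 0) for n in (0, 1)]
    print(sp.solve(eqs, [c0, c1]))   # {c0: 0, c1: 0}: only the trivial solution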
Example The nonzero rows of this echelon form matrix form a linearly independent set.
( 2  0   1   −1  )
( 0  1  −3   1/2 )
( 0  0   0    5  )
( 0  0   0    0  )
Example This subset of R3 is linearly dependent.
{ (1, 1, 3), (−1, 1, 0), (1, 3, 6) }
One way to see that is to spot that the third vector is twice the first plus
the second. Another way is to solve the linear system
c1 − c2 + c3 = 0
c1 + c2 + 3c3 = 0
3c1 + 6c3 = 0
that comes from c1 (1, 1, 3) + c2 (−1, 1, 0) + c3 (1, 3, 6) = (0, 0, 0); it has
nontrivial solutions, for instance c1 = 2, c2 = 1, c3 = −1.
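A numerical cross-check of both observations, assuming NumPy (an aside; the names are illustrative):

    import numpy as np

    s1 = np.array([1.0, 1.0, 3.0])
    s2 = np.array([-1.0, 1.0, 0.0])
    s3 = np.array([1.0, 3.0, 6.0])

    A = np.column_stack([s1, s2, s3])
    print(np.linalg.matrix_rank(A))       # 2, less than 3 vectors: linearly dependent
    print(np.allclose(2*s1 + s2, s3))     # True: the third is twice the first plus the second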
Example The book has the proof that adding a vector to a set leaves the span
unchanged if and only if the vector is already in that span; here is an
illustration. The span of this set is the xy-plane.
P = { (2, 0, 0), (0, 2, 0) } ⊂ R3
If we expand the set by adding a vector ~q, getting { ~p1, ~p2, ~q }, then there
are two possibilities.
P0 = { (2, 0, 0), (0, 2, 0), (3, 2, 0) }        P1 = { (2, 0, 0), (0, 2, 0), (0, 0, −1) }
If the new vector is already in the starting span, ~q ∈ [P], then the span is
unchanged, [P0] = [P]. But if the new vector is outside the starting span,
~q ∉ [P], then the span grows, [P1] ⊋ [P].
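One way to test which case applies is to compare ranks: the added vector lies in the span exactly when appending it as a column does not raise the rank. A sketch assuming NumPy; the helper name in_span is made up for this aside.

    import numpy as np

    p1, p2 = np.array([2.0, 0.0, 0.0]), np.array([0.0, 2.0, 0.0])

    def in_span(q, vectors):
        # q is in the span exactly when appending it does not increase the rank.
        A = np.column_stack(vectors)
        return np.linalg.matrix_rank(np.column_stack([A, q])) == np.linalg.matrix_rank(A)

    print(in_span(np.array([3.0, 2.0, 0.0]), [p1, p2]))    # True:  [P0] = [P]
    print(in_span(np.array([0.0, 0.0, -1.0]), [p1, p2]))   # False: [P1] properly contains [P]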
1.3 Corollary For ~v ∈ S, omitting that vector does not shrink the span,
[S] = [S − {~v}], if and only if it is dependent on the other vectors in the
set, ~v ∈ [S − {~v}].
Example These two subsets of R3 have the same span.
{ (1, 2, 3), (4, 5, 6), (7, 8, 9) }        { (1, 2, 3), (4, 5, 6) }
That is because the third vector of the larger set is dependent on the other
two: (7, 8, 9) = −1 (1, 2, 3) + 2 (4, 5, 6).
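A quick way to verify the claim numerically (an aside assuming NumPy, not the text's argument) is to check that the third vector adds nothing to the span and to recover the coefficients that express it in terms of the first two.

    import numpy as np

    v1, v2, v3 = np.array([1.0, 2.0, 3.0]), np.array([4.0, 5.0, 6.0]), np.array([7.0, 8.0, 9.0])
    A2 = np.column_stack([v1, v2])
    A3 = np.column_stack([v1, v2, v3])

    print(np.linalg.matrix_rank(A2) == np.linalg.matrix_rank(A3))   # True: same span
    coeffs, *_ = np.linalg.lstsq(A2, v3, rcond=None)
    print(np.round(coeffs, 6))            # [-1.  2.]: v3 = -1*v1 + 2*v2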
1.15 Lemma Suppose that S is linearly independent and that ~v ∉ S. Then the
set S ∪ {~v} is linearly independent if and only if ~v ∉ [S].
Example The book has the proof; here is an illustration. Consider this
linearly independent subset of P2 .
S = { 1 − x, 1 + x }
S1 = S ∪ { 2 + 2x }        S2 = S ∪ { 2 + x2 }
Since 2 + 2x = 2 (1 + x) is in [S], the set S1 is linearly dependent. But
2 + x2 ∉ [S], so S2 is linearly independent.
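To redo this check numerically one can identify a polynomial a + bx + cx2 with the coefficient vector (a, b, c); that identification, and the use of NumPy, are assumptions of this aside rather than steps taken in the text.

    import numpy as np

    one_minus_x = np.array([1.0, -1.0, 0.0])    # 1 - x
    one_plus_x  = np.array([1.0,  1.0, 0.0])    # 1 + x
    S = np.column_stack([one_minus_x, one_plus_x])

    def in_span(q, A):
        # q is in the span exactly when appending it does not increase the rank.
        return np.linalg.matrix_rank(np.column_stack([A, q])) == np.linalg.matrix_rank(A)

    print(in_span(np.array([2.0, 2.0, 0.0]), S))   # True:  2 + 2x is in [S], so S1 is dependent
    print(in_span(np.array([2.0, 0.0, 1.0]), S))   # False: 2 + x2 is not, so S2 stays independent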
Any subset of a linearly independent set is also linearly independent, and any
superset of a linearly dependent set is also linearly dependent. The book has
a proof. Instead, consider the example on the next slide.
Example Consider this subset of R2 .
S = { ~s1, ~s2, ~s3, ~s4, ~s5 } = { (2, 2), (3, 3), (1, 4), (0, −1), (1, −1) }
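This set is a convenient sandbox for the subset and superset questions below. As an aside (assuming NumPy; the helper independent is made up for illustration), a rank check shows the whole set is dependent while some two-element subsets are independent and others are not.

    import numpy as np

    s1, s2, s3, s4, s5 = (np.array(v, dtype=float)
                          for v in [(2, 2), (3, 3), (1, 4), (0, -1), (1, -1)])

    def independent(vectors):
        # Independent exactly when the matrix of column vectors has full column rank.
        A = np.column_stack(vectors)
        return np.linalg.matrix_rank(A) == len(vectors)

    print(independent([s1, s2, s3, s4, s5]))   # False: five vectors in R2 must be dependent
    print(independent([s1, s2]))               # False: s2 is a multiple of s1
    print(independent([s1, s3]))               # True:  this two-element subset is independent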
An example of the lower left (a linearly dependent set with a linearly
independent subset) is that the set S of all vectors in the space R2 is
linearly dependent but the subset Ŝ consisting of only the unit vector on the
x-axis is independent. Interchanging Ŝ with S gives an example of the upper
right (a linearly independent set with a linearly dependent superset).