
2 Vector Spaces

2.1 Definition of Vector Space and basic properties

Definition 2.1. A vector space $V$ over a field $F$ (or $F$-vector space) is a set $V$ with an addition $+$ (internal composition law) such that $(V, +)$ is a group, and a scalar multiplication $\cdot : F \times V \to V$, $(\alpha, v) \mapsto \alpha v$, satisfying the following properties:
1. $\alpha(v + w) = \alpha v + \alpha w$, $\forall \alpha \in F$, $\forall v, w \in V$
2. $(\alpha + \beta) v = \alpha v + \beta v$, $\forall \alpha, \beta \in F$, $\forall v \in V$
3. $\alpha(\beta v) = (\alpha \beta) v$, $\forall \alpha, \beta \in F$, $\forall v \in V$
4. $1 \cdot v = v$, $\forall v \in V$
The elements of $V$ are called vectors and the elements of $F$ are called scalars. The scalar multiplication depends upon $F$. For this reason, when we need to be exact we will say that $V$ is a vector space over $F$, instead of

simply saying that $V$ is a vector space. Usually a vector space over $\mathbb{R}$ is called a real vector space and a vector space over $\mathbb{C}$ is called a complex vector space.
Remark. From the definition of a vector space $V$ over $F$ the following rules of calculation are easily deduced:
$\alpha \cdot 0_V = 0_V$, $\forall \alpha \in F$;
$0_F \cdot v = 0_V$, $\forall v \in V$;
$\alpha v = 0_V \implies \alpha = 0_F$ or $v = 0_V$.
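For instance, the second rule follows from the distributivity axioms: $0_F \cdot v = (0_F + 0_F) v = 0_F \cdot v + 0_F \cdot v$, and adding the opposite of $0_F \cdot v$ to both sides gives $0_F \cdot v = 0_V$; the first rule is proved in the same way, starting from $0_V + 0_V = 0_V$.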
Examples. We list a number of simple examples which appear frequently in practice.
$V = \mathbb{C}^n$ has a structure of $\mathbb{R}$-vector space, but it also has a structure of $\mathbb{C}$-vector space.
$V = F[X]$, the polynomials in the indeterminate $X$ with coefficients in $F$, is an $F$-vector space.
$\mathcal{M}_{m,n}(F)$, the set of $m \times n$ matrices with entries in $F$, is an $F$-vector space.
$C^0_{[a,b]}$, the set of continuous real-valued functions on $[a, b]$, is an $\mathbb{R}$-vector space.

2.2 Subspaces of a vector space

It is natural to ask about subsets of a vector space $V$ which are conveniently closed with respect to the operations in the vector space. For this reason we give the following:
Definition 2.2. Let $V$ be a vector space over $F$. A subset $U \subseteq V$ is called a subspace of $V$ over $F$ if it is stable with respect to the composition laws, that is, $v + u \in U$, $\forall v, u \in U$, and $\alpha v \in U$, $\forall \alpha \in F$, $\forall v \in U$, and the induced operations verify the properties from the definition of a vector space over $F$.


It is easy to prove the following propositions:
Proposition 2.3. Let $V$ be an $F$-vector space and $U \subseteq V$ a nonempty subset. $U$ is a vector subspace of $V$ over $F$ iff the following conditions are met:
$v - u \in U$, $\forall v, u \in U$;
$\alpha v \in U$, $\forall \alpha \in F$, $\forall v \in U$.
Proposition 2.4. Let $V$ be an $F$-vector space and $U \subseteq V$ a nonempty subset. $U$ is a vector subspace of $V$ over $F$ iff
$$\alpha v + \beta u \in U, \quad \forall \alpha, \beta \in F, \ \forall v, u \in U.$$
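As a quick illustration of Proposition 2.4, consider $V = \mathbb{R}^3$ and $U = \{(x, y, z) \in \mathbb{R}^3 : x + y + z = 0\}$. For $v = (x_1, y_1, z_1)$, $u = (x_2, y_2, z_2)$ in $U$ and $\alpha, \beta \in \mathbb{R}$, the coordinate sum of $\alpha v + \beta u$ is $\alpha(x_1 + y_1 + z_1) + \beta(x_2 + y_2 + z_2) = 0$, so $\alpha v + \beta u \in U$ and $U$ is a subspace of $\mathbb{R}^3$.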
The next proposition shows how one can operate with vector subspaces (to obtain a new vector subspace) and how one can obtain a subspace from a family of vectors.
Proposition 2.5. Let $V$ be a vector space and $U, W \subseteq V$ two vector subspaces. The sets
$$U \cap W \quad \text{and} \quad U + W = \{u + w \mid u \in U, \ w \in W\}$$
are subspaces.
The subspace $U \cap W$ is called the intersection (vector) subspace, while the subspace $U + W$ is called the sum vector subspace. Of course, these definitions can also be given for finite intersections (resp. finite sums) of vector subspaces.
Proposition 2.6. Let $V$ be a vector space over $F$ and $S \subseteq V$ nonempty. The set $\langle S \rangle = \{\sum_i \alpha_i v_i \ \text{(a finite sum)} : \alpha_i \in F, \ v_i \in S\}$ is a vector subspace of $V$ over $F$.


The above vector space is called the vector space generated by $S$, or the linear hull of the set $S$. It is the smallest subspace of $V$ which contains $S$, in the sense that for every subspace $U$ of $V$ with $S \subseteq U$ it follows that $\langle S \rangle \subseteq U$.
We specialize now the notion of sum of subspaces to the direct sum of subspaces.
Definition 2.7. Let $V$ be a vector space and $U_i \subseteq V$ subspaces, $i = 1, \ldots, n$. The sum $U_1 + \cdots + U_n$ is called a direct sum if for every $v \in U_1 + \cdots + U_n$, from $v = u_1 + \cdots + u_n = w_1 + \cdots + w_n$ with $u_i, w_i \in U_i$, $i = 1, \ldots, n$, it follows that $u_i = w_i$ for $i = 1, \ldots, n$.
The direct sum of the subspaces $U_i$, $i = 1, \ldots, n$, will be denoted by $U_1 \oplus \cdots \oplus U_n$. The next proposition characterizes the direct sum of two subspaces.
Proposition 2.8. Let $V$ be a vector space and $U, W \subseteq V$ be subspaces. The sum $U + W$ is a direct sum iff $U \cap W = \{0_V\}$.
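For example, in $V = \mathbb{R}^2$ the subspaces $U = \{(x, 0) : x \in \mathbb{R}\}$ and $W = \{(0, y) : y \in \mathbb{R}\}$ satisfy $U \cap W = \{(0, 0)\}$, so $\mathbb{R}^2 = U \oplus W$: every vector $(x, y)$ decomposes uniquely as $(x, 0) + (0, y)$.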
Let $V$ be a vector space over $F$ and $U$ be a subspace. On $V$ one can define the following binary relation $\mathcal{R}_U$: for $u, v \in V$, $u \mathcal{R}_U v$ iff $u - v \in U$.
Lemma 2.9. The relation $\mathcal{R}_U$ is an equivalence relation.
Theorem 2.10. On the (factor) set $V/U$ there is a natural structure of a vector space over $F$.
Proof.

The vector space from the previous theorem is called the factor vector
space, or the quotient vector space.
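Concretely, denoting the class of $v$ by $\hat{v} = v + U$, the operations on $V/U$ are defined by $\hat{v} + \hat{w} = \widehat{v + w}$ and $\alpha \hat{v} = \widehat{\alpha v}$. They do not depend on the chosen representatives (if $v - v' \in U$ and $w - w' \in U$, then $(v + w) - (v' + w') \in U$ and $\alpha v - \alpha v' \in U$), and the vector space axioms are inherited from those of $V$.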

2.3 Basis. Dimension.

Up to now we have tried to explain some properties of vector spaces in the large. Namely, we have talked about vector spaces, subspaces, direct sums, and factor spaces.
Proposition 2.6 naturally raises some questions related to the structure of a vector space $V$. Is there a set $S$ which generates $V$ (that is, $\langle S \rangle = V$)? If the answer is yes, how big should it be? Namely, how big should a minimal one (minimal in some sense) be? Is there a finite set which generates $V$? We will shed some light on these questions in the next part of this chapter.
Why are the answers to such questions important? The reason is quite simple: if we control (in some way) a minimal system of generators, we control the whole space.
Definition 2.11. Let $V$ be an $F$-vector space. A nonempty set $S \subseteq V$ is called a system of generators for $V$ if for every $v \in V$ there exist a finite subset $\{v_1, \ldots, v_n\} \subseteq S$ and scalars $\alpha_1, \ldots, \alpha_n \in F$ such that $v = \alpha_1 v_1 + \cdots + \alpha_n v_n$ (it is also said that $v$ is a linear combination of $v_1, \ldots, v_n$ with scalars in $F$). $V$ is called dimensionally finite (finitely generated) if it has a finite system of generators.
A nonempty set $L \subseteq V$ is called a linearly independent system of vectors if for every finite subset $\{v_1, \ldots, v_n\} \subseteq L$, $\alpha_1 v_1 + \cdots + \alpha_n v_n = 0$ implies that $\alpha_i = 0$ for all $i = 1, \ldots, n$.
A nonempty set of vectors which is not linearly independent is called linearly dependent.
A subset $B \subseteq V$ is called a basis of $V$ if it is both a system of generators and linearly independent. In this case every vector $v \in V$ can be written uniquely as a linear combination of vectors from $B$.
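For example, in $F^n$ the vectors $e_1 = (1, 0, \ldots, 0), \ldots, e_n = (0, \ldots, 0, 1)$ form a basis, called the canonical basis: they generate $F^n$, since $(a_1, \ldots, a_n) = a_1 e_1 + \cdots + a_n e_n$, and such a combination is zero only when all $a_i = 0$, so they are also linearly independent. Similarly, $\{1, X, X^2, \ldots\}$ is a basis of $F[X]$.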


We have the following theorem:
Theorem 2.12. (Existence of basis) Every vector space $V$ has a basis.
We will not prove this general theorem here; instead we will restrict ourselves to finite dimensional vector spaces.
Theorem 2.13. Let $V \neq \{0\}$ be a finitely generated vector space over $F$. From every finite system of generators one can extract a basis.
Proof. Let $S = \{v_1, \ldots, v_r\}$ be a finite system of generators. It is clear that there are nonzero vectors in $V$ (otherwise $V = \{0\}$). Let $0 \neq v_1 \in S$. The set $\{v_1\}$ is linearly independent (because $\alpha v_1 = 0$ implies $\alpha = 0$, since $v_1 \neq 0$). That means that $S$ contains linearly independent subsets. Now $\mathcal{P}(S)$ is finite ($S$ being finite), and in a finite number of steps we can extract a maximal linearly independent subset, say $B$. We can suppose that $B = \{v_1, \ldots, v_n\}$, $1 \leq n \leq r$. We prove that $B$ is a basis for $V$. It is enough to show that $B$ generates $V$, because $B$ is linearly independent by its choice. Let $v \in V$. $S$ being a system of generators, it is enough to show that every $v_k \in S$, $n < k \leq r$, is a linear combination of vectors from $B$. Suppose, by contradiction, that some $v_k$ is not a linear combination of vectors from $B$. It follows that the set $B \cup \{v_k\}$ is linearly independent, in contradiction with the maximality of $B$.
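For instance, in $\mathbb{R}^2$ take the system of generators $S = \{(1, 0), (2, 0), (1, 1)\}$. Starting from the linearly independent set $\{(1, 0)\}$, the vector $(2, 0)$ cannot be adjoined (it is a multiple of $(1, 0)$), while $(1, 1)$ can, so a maximal linearly independent subset is $B = \{(1, 0), (1, 1)\}$, which is indeed a basis of $\mathbb{R}^2$.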
Corollary 2.14. Let $V$ be an $F$-vector space and $S$ a finite system of generators for $V$. Every linearly independent set $L \subseteq S$ can be completed to a basis of $V$.
Proof. Let $L \subseteq S$ be a linearly independent set in $S$. If $L$ is maximal, by the previous theorem it follows that $L$ is a basis. If $L$ is not maximal, there exists a linearly independent set $L_1$ with $L \subset L_1 \subseteq S$. If $L_1$ is maximal it follows that $L_1$ is a basis. If it is not maximal, we repeat the

previous step. Because $S$ is a finite set, after a finite number of steps we obtain a maximal system of linearly independent vectors $B$ with $L \subseteq B \subseteq S$, so $B$ is a basis for $V$, again by the previous theorem.
Theorem 2.15. Let $V$ be a finitely generated vector space over $F$. Every linearly independent system of vectors $L$ can be completed to a basis of $V$.
Proof. Let $S$ be a finite system of generators. The union $L \cup S$ is again a system of generators and $L \subseteq L \cup S$. We apply the previous corollary and obtain that $L$ can be completed to a basis of $V$.
Theorem 2.16. (The cardinality of a basis) Let $V$ be a finitely generated $F$-vector space. Every basis of $V$ is finite and all bases have the same number of elements.
Proof. Let $B = \{e_1, \ldots, e_n\}$ be a basis of $V$, and let $B_1 = \{e'_1, \ldots, e'_m\}$ be a system of vectors with $m > n$. We show that $B_1$ cannot be a basis for $V$. Because $B$ is a basis, the vectors $e'_i$ can be uniquely written as
$$e'_i = \sum_{j=1}^{n} a_{ij} e_j, \quad 1 \leq i \leq m.$$
Since $m > n$, the homogeneous linear system $\sum_{i=1}^{m} \lambda_i a_{ij} = 0$, $j = 1, \ldots, n$, has a nontrivial solution $(\lambda_1, \ldots, \lambda_m)$, and for such a solution $\sum_{i=1}^{m} \lambda_i e'_i = \sum_{j=1}^{n} \big( \sum_{i=1}^{m} \lambda_i a_{ij} \big) e_j = 0$, so $B_1$ is linearly dependent and cannot be a basis.
Definition 2.17. Let $V \neq \{0\}$ be a finitely generated $F$-vector space. The number of elements in a basis of $V$ is called the dimension of $V$ (it does not depend on the choice of the basis, and it is denoted by $\dim_F V$). The vector space $V$ is said to be of finite dimension. For $V = \{0\}$ we set $\dim_F V = 0$.
Corollary 2.18. Let $V$ be a vector space over $F$ of finite dimension, $\dim_F V = n$.
1. Any linearly independent system of $n$ vectors is a basis. Any system of $m$ vectors, $m > n$, is linearly dependent.
2. Any system of generators of $V$ which consists of $n$ vectors is a basis. Any system of $m$ vectors, $m < n$, is not a system of generators.


Proof. 1. Consider $L = \{v_1, \ldots, v_n\}$ a linearly independent system of $n$ vectors. From the completion theorem (Theorem 2.15) it follows that $L$ can be completed to a basis of $V$. It follows from the cardinality theorem (Theorem 2.16) that there is no need to complete $L$, so $L$ is a basis.
Let $L_1$ be a system of $m$ vectors, $m > n$. If $L_1$ were linearly independent, it could be completed to a basis (Theorem 2.15), so $\dim_F V \geq m > n$, contradiction.
2. Let $S = \{v_1, \ldots, v_n\}$ be a system of generators which consists of $n$ vectors. From Theorem 2.13 it follows that a basis can be extracted from its $n$ vectors. Again from the cardinality theorem (Theorem 2.16) it follows that there is no need to extract any vector, so $S$ is a basis.
Let $S'$ be a system of generators which consists of $m$ vectors, $m < n$. From Theorem 2.13 it follows that from $S'$ one can extract a basis, so $\dim_F V \leq m < n$, contradiction.
Remark. The dimension of a finite dimensional vector space is equal to each of the following:
the number of vectors in a basis;
the minimal number of vectors in a system of generators;
the maximal number of vectors in a linearly independent system.
Theorem 2.19. Every linearly independent list of vectors in a finite dimensional vector space can be extended to a basis of the vector space.
Proof. Suppose that $V$ is finite dimensional and $\{v_1, \ldots, v_m\}$ is linearly independent. We want to extend this set to a basis of $V$. $V$ being finite dimensional, there exists a finite list of vectors $\{w_1, \ldots, w_n\}$ which spans $V$.


If $w_1$ is in the span of $\{v_1, \ldots, v_m\}$, let $B = \{v_1, \ldots, v_m\}$; if not, let $B = \{v_1, \ldots, v_m, w_1\}$.
Then, for each subsequent $j$: if $w_j$ is in the span of $B$, leave $B$ unchanged; if $w_j$ is not in the span of $B$, extend $B$ by adjoining $w_j$ to it.
After each step $B$ is still linearly independent. After at most $n$ steps, the span of $B$ includes all the $w$'s. Thus $B$ also spans $V$, and being linearly independent, it is a basis.
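To illustrate the procedure, take $V = \mathbb{R}^3$, the independent list $\{v_1 = (1, 1, 0)\}$ and the spanning list $w_1 = (1, 0, 0)$, $w_2 = (0, 1, 0)$, $w_3 = (0, 0, 1)$. Since $w_1 \notin \langle v_1 \rangle$, adjoin it: $B = \{v_1, w_1\}$. Now $w_2 = v_1 - w_1 \in \langle B \rangle$, so $B$ is unchanged, while $w_3 \notin \langle B \rangle$, so the final basis is $\{(1, 1, 0), (1, 0, 0), (0, 0, 1)\}$.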
As an application we show that every subspace of a finite dimensional vector space can be paired with another subspace to form a direct sum which is the whole space.
Theorem 2.20. Let $V$ be a finite dimensional vector space and $U$ a subspace of $V$. There exists a subspace $W$ of $V$ such that $V = U \oplus W$.
Proof. Because $V$ is finite dimensional, so is $U$. Choose $\{u_1, \ldots, u_m\}$ a basis of $U$. This basis of $U$ is a linearly independent list of vectors, so it can be extended to a basis $\{u_1, \ldots, u_m, w_1, \ldots, w_n\}$ of $V$. Let $W = \langle w_1, \ldots, w_n \rangle$.
We prove that $V = U \oplus W$. For this we will show that
$$V = U + W \quad \text{and} \quad U \cap W = \{0\}.$$
Let $v \in V$; there exists $(a_1, \ldots, a_m, b_1, \ldots, b_n)$ such that
$$v = a_1 u_1 + \cdots + a_m u_m + b_1 w_1 + \cdots + b_n w_n,$$
because $\{u_1, \ldots, u_m, w_1, \ldots, w_n\}$ generates $V$. By denoting $a_1 u_1 + \cdots + a_m u_m = u \in U$ and $b_1 w_1 + \cdots + b_n w_n = w \in W$ we have just proven that $V = U + W$.


Suppose now that $U \cap W \neq \{0\}$, and let $0 \neq v \in U \cap W$. Then there exist scalars $a_1, \ldots, a_m \in F$ and $b_1, \ldots, b_n \in F$, not all zero, with
$$v = a_1 u_1 + \cdots + a_m u_m = b_1 w_1 + \cdots + b_n w_n,$$
so
$$a_1 u_1 + \cdots + a_m u_m - b_1 w_1 - \cdots - b_n w_n = 0.$$
But this contradicts the fact that $\{u_1, \ldots, u_m, w_1, \ldots, w_n\}$ is a basis of $V$ (hence linearly independent), so $U \cap W = \{0\}$.
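For example, if $V = \mathbb{R}^3$ and $U = \{(x, y, 0) : x, y \in \mathbb{R}\}$, the construction with the basis $\{(1, 0, 0), (0, 1, 0)\}$ of $U$ extended by $(0, 0, 1)$ gives $W = \langle (0, 0, 1) \rangle$ and $\mathbb{R}^3 = U \oplus W$. Note that the complement is not unique: $W' = \langle (1, 1, 1) \rangle$ works just as well.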
The next theorem relates the dimension of the sum and of the intersection of two subspaces to the dimensions of the given subspaces:
Theorem 2.21. If $U$ and $W$ are two subspaces of a finite dimensional vector space $V$, then
$$\dim(U + W) = \dim U + \dim W - \dim(U \cap W).$$
Proof. Let $\{u_1, \ldots, u_m\}$ be a basis of $U \cap W$, so $\dim(U \cap W) = m$. This is a linearly independent set of vectors in $U$ and in $W$ respectively, so it can be extended to a basis $\{u_1, \ldots, u_m, v_1, \ldots, v_i\}$ of $U$ and to a basis $\{u_1, \ldots, u_m, w_1, \ldots, w_j\}$ of $W$, so $\dim U = m + i$ and $\dim W = m + j$. The proof will be complete if we show that $\{u_1, \ldots, u_m, v_1, \ldots, v_i, w_1, \ldots, w_j\}$ is a basis for $U + W$, because in this case
$$\dim(U + W) = m + i + j = (m + i) + (m + j) - m = \dim U + \dim W - \dim(U \cap W).$$
The subspace spanned by $\{u_1, \ldots, u_m, v_1, \ldots, v_i, w_1, \ldots, w_j\}$ contains $U$ and $W$, so it contains $U + W$. That means that to show that this set is a basis for $U + W$ it is only needed to show that it is linearly independent. Suppose that
$$a_1 u_1 + \cdots + a_m u_m + b_1 v_1 + \cdots + b_i v_i + c_1 w_1 + \cdots + c_j w_j = 0.$$


We have
$$c_1 w_1 + \cdots + c_j w_j = -a_1 u_1 - \cdots - a_m u_m - b_1 v_1 - \cdots - b_i v_i,$$
which shows that $w = c_1 w_1 + \cdots + c_j w_j \in U$. But this vector is also in $W$, so it lies in $U \cap W$. Because $\{u_1, \ldots, u_m\}$ is a basis of $U \cap W$, there exist scalars $d_1, \ldots, d_m \in F$ such that
$$c_1 w_1 + \cdots + c_j w_j = d_1 u_1 + \cdots + d_m u_m.$$
But $\{u_1, \ldots, u_m, w_1, \ldots, w_j\}$ is a basis of $W$, so it is linearly independent, and the relation $d_1 u_1 + \cdots + d_m u_m - c_1 w_1 - \cdots - c_j w_j = 0$ forces all the $c$'s (and all the $d$'s) to be zero.
The relation involving the $a$'s, $b$'s and $c$'s then becomes
$$a_1 u_1 + \cdots + a_m u_m + b_1 v_1 + \cdots + b_i v_i = 0,$$
so the $a$'s and $b$'s are zero, because the vectors $\{u_1, \ldots, u_m, v_1, \ldots, v_i\}$ form a basis of $U$. So all the $a$'s, $b$'s and $c$'s are zero, which means that $\{u_1, \ldots, u_m, v_1, \ldots, v_i, w_1, \ldots, w_j\}$ is linearly independent, and because it generates $U + W$, it forms a basis of $U + W$.
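As a concrete check, take $V = \mathbb{R}^3$ and the planes $U = \{(x, y, 0)\}$ and $W = \{(0, y, z)\}$. Then $U + W = \mathbb{R}^3$ and $U \cap W = \{(0, y, 0)\}$, and indeed $\dim(U + W) = 3 = 2 + 2 - 1 = \dim U + \dim W - \dim(U \cap W)$.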
The previous theorem shows that the dimension fits well with the direct sum of subspaces. That is, if $U \cap W = \{0\}$, the sum is a direct sum and we have
$$\dim(U \oplus W) = \dim U + \dim W.$$
This is true for the direct sum of any finite number of subspaces, as is shown in the next theorem:
Theorem 2.22. Let $V$ be a finite dimensional vector space and $U_i$ subspaces of $V$, $i = 1, \ldots, n$, such that
$$V = U_1 + \cdots + U_n$$
and
$$\dim V = \dim U_1 + \cdots + \dim U_n.$$


Then
$$V = U_1 \oplus \cdots \oplus U_n.$$
Proof. One can choose a basis for each $U_i$. By putting all these bases in one list, we obtain a list of vectors which spans $V$ (by the first property in the theorem), and it is also a basis, because, by the second property, the number of vectors in this list is $\dim V$.
Suppose now that we have $u_i \in U_i$, $i = 1, \ldots, n$, such that
$$0 = u_1 + \cdots + u_n.$$
Every $u_i$ is represented as a sum of the vectors of the basis of $U_i$, and because all these bases together form a basis of $V$, we obtain a linear combination of the vectors of a basis of $V$ which is zero. So all the scalars are zero, that is, all $u_i$ are zero, and hence the sum is direct.

We end the section with two important observations. Let $V$ be a vector space over $F$ (not necessarily finite dimensional). Consider a basis $B = (e_i)_{i \in I}$ of $V$.
We have the first representation theorem:
Theorem 2.23. Let $V$ be a vector space over $F$ (not necessarily finite dimensional) and let us consider a basis $B = (e_i)_{i \in I}$. For every $v \in V$, $v \neq 0$, there exist a unique finite subset $B' \subseteq B$, $B' = \{e_{i_1}, \ldots, e_{i_k}\}$, and unique nonzero scalars $a_{i_1}, \ldots, a_{i_k} \in F$ such that
$$v = \sum_{j=1}^{k} a_{i_j} e_{i_j} = a_{i_1} e_{i_1} + \cdots + a_{i_k} e_{i_k}.$$
Proof.

2.4 Local computations

In this section we deal with some computations related to finite dimensional vector spaces.
Let $V$ be a finite dimensional $F$-vector space, with a basis $B = \{e_1, \ldots, e_n\}$. Any vector $v \in V$ can be uniquely represented as
$$v = \sum_{i=1}^{n} a_i e_i = a_1 e_1 + \cdots + a_n e_n.$$
The scalars $(a_1, \ldots, a_n)$ are called the coordinates of the vector $v$ in the basis $B$. It is obvious that if we have another basis $B'$, the coordinates of the same vector in the new basis change. How can we measure this change? Let us start with a situation that is a bit more general.
Theorem 2.24. Let $V$ be a finite dimensional vector space over $F$ with a basis $B = \{e_1, \ldots, e_n\}$. Consider the vectors $S = \{e'_1, \ldots, e'_m\} \subseteq V$:
$$e'_1 = a_{11} e_1 + \cdots + a_{1n} e_n$$
$$\vdots$$
$$e'_m = a_{m1} e_1 + \cdots + a_{mn} e_n$$
Denote by $A = (a_{ij})_{i = 1, \ldots, m, \ j = 1, \ldots, n}$ the matrix formed by the coefficients in the above equations. The dimension of the subspace $\langle S \rangle$ is equal to the rank of the matrix $A$, i.e. $\dim \langle S \rangle = \operatorname{rank} A$.
Proof.
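As a quick illustration, in $\mathbb{R}^3$ with the canonical basis take $S = \{e'_1 = (1, 1, 0), \ e'_2 = (2, 2, 0), \ e'_3 = (1, 0, 1)\}$. The coefficient matrix is
$$A = \begin{pmatrix} 1 & 1 & 0 \\ 2 & 2 & 0 \\ 1 & 0 & 1 \end{pmatrix},$$
whose second row is twice the first, so $\operatorname{rank} A = 2$ and $\dim \langle S \rangle = 2$.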

Consider now the case $m = n$ in the above discussion. The set $S = \{e'_1, \ldots, e'_n\}$ is a basis iff $\operatorname{rank} A = n$. We have now
$$e'_1 = a_{11} e_1 + \cdots + a_{1n} e_n$$
$$e'_2 = a_{21} e_1 + \cdots + a_{2n} e_n$$
$$\vdots$$
$$e'_n = a_{n1} e_1 + \cdots + a_{nn} e_n,$$
representing the relations that change from the basis $B$ to the new basis $B' = S$. The matrix $A^t$ is denoted by
$$P^{(e,e')} = \begin{pmatrix} a_{11} & a_{21} & \ldots & a_{n1} \\ a_{12} & a_{22} & \ldots & a_{n2} \\ \vdots & \vdots & & \vdots \\ a_{1n} & a_{2n} & \ldots & a_{nn} \end{pmatrix}$$
(it has on its columns the coordinates of the vectors of the new basis $e'$ in the old basis $e$).
Remarks.
In matrix notation we have
$$\begin{pmatrix} e'_1 \\ e'_2 \\ \vdots \\ e'_n \end{pmatrix} = A \begin{pmatrix} e_1 \\ e_2 \\ \vdots \\ e_n \end{pmatrix}, \quad \text{or} \quad (e')_{1n} = (P^{(e,e')})^t \, (e)_{1n}.$$

Consider the change of basis from $B$ to $B'$ with the matrix $P^{(e,e')}$ and the change of basis from $B'$ to $B''$ with the matrix $P^{(e',e'')}$. We can consider the composition of these two changes, i.e. the change of basis from $B$ to $B''$ with the matrix $P^{(e,e'')}$. It is easy to see that one has
$$P^{(e,e')} \, P^{(e',e'')} = P^{(e,e'')}.$$


If in the above discussion we take $B'' = B$, one has
$$P^{(e,e')} \, P^{(e',e)} = I_n,$$
that is,
$$(P^{(e',e)})^{-1} = P^{(e,e')}.$$
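For example, in $\mathbb{R}^2$ with $e_1 = (1, 0)$, $e_2 = (0, 1)$ and the new basis $e'_1 = e_1 + e_2$, $e'_2 = e_2$, we get
$$P^{(e,e')} = \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix}, \qquad P^{(e',e)} = (P^{(e,e')})^{-1} = \begin{pmatrix} 1 & 0 \\ -1 & 1 \end{pmatrix},$$
the columns of $P^{(e',e)}$ being the coordinates of $e_1 = e'_1 - e'_2$ and $e_2 = e'_2$ in the new basis.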
At this step we try to answer the next question, which is important in applications. If we have two bases, a vector can be represented in both of them. What is the relation between the coordinates in the two bases?
Let us fix the setting first. Consider the vector space $V$ with two bases $B = \{e_1, \ldots, e_n\}$ and $B' = \{e'_1, \ldots, e'_n\}$, and let $P^{(e,e')}$ be the matrix of the change of basis.
Let $v \in V$. We have
$$v = a_1 e_1 + \cdots + a_n e_n = b_1 e'_1 + \cdots + b_n e'_n,$$
where $(a_1, \ldots, a_n)$ and $(b_1, \ldots, b_n)$ are the coordinates of the same vector in the two bases. We can write

$$v = \begin{pmatrix} a_1 & a_2 & \ldots & a_n \end{pmatrix} \begin{pmatrix} e_1 \\ e_2 \\ \vdots \\ e_n \end{pmatrix} = \begin{pmatrix} b_1 & b_2 & \ldots & b_n \end{pmatrix} \begin{pmatrix} e'_1 \\ e'_2 \\ \vdots \\ e'_n \end{pmatrix}.$$
Denote by
$$(v)_e = \begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_n \end{pmatrix} \quad \text{and} \quad (v)_{e'} = \begin{pmatrix} b_1 \\ b_2 \\ \vdots \\ b_n \end{pmatrix}$$
the matrices of the coordinates of $v$ in the two bases. Denote further by
$$(e)_{1n} = \begin{pmatrix} e_1 \\ e_2 \\ \vdots \\ e_n \end{pmatrix} \quad \text{and} \quad (e')_{1n} = \begin{pmatrix} e'_1 \\ e'_2 \\ \vdots \\ e'_n \end{pmatrix}$$
the column matrices of the bases $B$ and $B'$. We then have
$$v = (v)_e^t \, (e)_{1n} = (v)_{e'}^t \, (e')_{1n} = (v)_{e'}^t \, (P^{(e,e')})^t \, (e)_{1n}.$$
Because $v$ is uniquely represented in a basis, it follows that
$$(v)_e^t = (v)_{e'}^t \, (P^{(e,e')})^t,$$
or, transposing,
$$(v)_e = P^{(e,e')} \, (v)_{e'}, \qquad \text{i.e.} \qquad (v)_{e'} = (P^{(e,e')})^{-1} \, (v)_e.$$
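Continuing the $\mathbb{R}^2$ example above, the vector $v = 2 e_1 + 3 e_2$ has $(v)_e = \begin{pmatrix} 2 \\ 3 \end{pmatrix}$, and
$$(v)_{e'} = (P^{(e,e')})^{-1} (v)_e = \begin{pmatrix} 1 & 0 \\ -1 & 1 \end{pmatrix} \begin{pmatrix} 2 \\ 3 \end{pmatrix} = \begin{pmatrix} 2 \\ 1 \end{pmatrix},$$
in agreement with $v = 2 e'_1 + e'_2 = 2(e_1 + e_2) + e_2$.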

2.5 Problems
