
This document summarizes key concepts from Sections 1.4, 1.5, and 1.7 of a linear algebra textbook. It discusses solving the matrix equation Ax=b, homogeneous and non-homogeneous systems of linear equations, writing solution sets in parametric vector form, and the definitions and tests for linear dependence and independence among vectors. Key points include that Ax=b has a solution if and only if b is a linear combination of A's columns, and that vectors are linearly independent if they are not linear combinations of each other.


20F Discussion Section 3 Josh Tobin: http://www.math.ucsd.edu/~rjtobin/

Section 1.4: The Matrix Equation Ax = b


• This section is about solving the “matrix equation” Ax = b, where A is an m × n matrix and b is a column vector with m entries (both given in the question), and x is an unknown column vector with n entries (which we are trying to solve for). The first thing to know is what Ax means: it is the product of the matrix A and the vector x. How do we multiply a matrix by a vector? We use the “row times column” rule; see the bottom of page 38 for examples.
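The row-times-column rule can be sketched in a few lines of Python (the helper name mat_vec is made up for illustration, not from the text): entry i of Ax is the dot product of row i of A with x.

```python
def mat_vec(A, x):
    """Multiply matrix A (a list of rows) by vector x using the
    row-times-column rule: entry i of Ax is (row i of A) . x."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

A = [[1, 2],
     [3, 4],
     [5, 6]]   # a 3x2 matrix
x = [10, 1]    # a vector with n = 2 entries

print(mat_vec(A, x))  # [12, 34, 56] -- a vector with m = 3 entries
```

Note that Ax has as many entries as A has rows, which is why b must have m entries for Ax = b to make sense.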
• Solving Ax = b is the same as solving the system described by the augmented matrix [A|b].

• Ax = b has a solution if and only if b is a linear combination of the columns of A.

• Theorem 4 is very important: it tells us that the following statements are either all true or all false, for any m × n matrix A:
(a) For every b, the equation Ax = b has a solution.
(b) Every column vector b (with m entries) is a linear combination of the columns of A.
(c) The columns of A span R^m (this is just a restatement of (b), once you know what the word “span” means).
(d) A has a pivot in every row.
This theorem is useful because it means that if we want to know whether Ax = b has a solution for every b, we just need to check whether A has a pivot in every row. Note: if A does not have a pivot in every row, that does not mean that Ax = b has no solution for every given vector b. It just means that there are some vectors b for which Ax = b does not have a solution.
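As a sketch of how the pivot test in part (d) can be checked mechanically, here is a small row-reduction routine (the helper name pivot_columns is hypothetical; exact arithmetic via fractions avoids rounding issues). Ax = b is solvable for every b exactly when the number of pivots equals the number of rows.

```python
from fractions import Fraction

def pivot_columns(A):
    """Row reduce A (a list of rows) and return the pivot column indices."""
    M = [[Fraction(v) for v in row] for row in A]
    n_rows, n_cols = len(M), len(M[0])
    pivots, r = [], 0
    for c in range(n_cols):
        if r == n_rows:
            break
        # find a row at position r or below with a nonzero entry in column c
        pr = next((i for i in range(r, n_rows) if M[i][c] != 0), None)
        if pr is None:
            continue
        M[r], M[pr] = M[pr], M[r]
        M[r] = [v / M[r][c] for v in M[r]]          # scale so the pivot is 1
        for i in range(n_rows):                     # clear column c elsewhere
            if i != r and M[i][c] != 0:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
    return pivots

A = [[1, 0, 1],
     [0, 1, 1]]                         # a 2x3 matrix
print(pivot_columns(A))                 # [0, 1]
print(len(pivot_columns(A)) == len(A))  # True: a pivot in every row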
• Finally, it is very useful to know that multiplying a matrix by a vector has the following nice properties:

(a) A(u + v) = Au + Av, for vectors u, v

(b) A(cu) = c(Au), for vectors u and scalars c.
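These two linearity properties can be spot-checked on a small made-up example (the helper names below are invented for this sketch; any matrix and vectors would do):

```python
def mat_vec(A, x):
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

def vec_add(u, v):
    return [a + b for a, b in zip(u, v)]

def vec_scale(c, u):
    return [c * a for a in u]

A = [[2, -1],
     [0, 3]]
u, v, c = [1, 4], [-2, 5], 7

# (a) A(u + v) == Au + Av
assert mat_vec(A, vec_add(u, v)) == vec_add(mat_vec(A, u), mat_vec(A, v))
# (b) A(cu) == c(Au)
assert mat_vec(A, vec_scale(c, u)) == vec_scale(c, mat_vec(A, u))
print("both linearity properties hold for this example")
```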

Section 1.5: Solution Sets of Linear Systems


• A homogeneous system is one that can be written in the form Ax = 0. Equivalently, a homogeneous system is any system Ax = b where x = 0 is a solution (notice that this means that b = 0, so both definitions match). The solution x = 0 is called the trivial solution. A solution x is non-trivial if x ≠ 0.
• The homogeneous system Ax = 0 has a non-trivial solution if and only if the equation has at least one free variable (or equivalently, if and only if A has a column with no pivot).
• Parametric vector form: Let’s say you have found the solution set to a system, and the free variables are x3, x4, x5. Then to write the solution set in ‘parametric vector form’ means to write the solution as

x = p + x3 u + x4 v + x5 w

where p, u, v, w are vectors with numerical entries. A method for writing a solution set in this form is given on page 46.
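To make the form concrete, here is a smaller made-up instance with a single free variable x3 (the system, the vectors p and u, and the helper mat_vec are all invented for this sketch). From the reduced rows x1 + 2x3 = 4 and x2 − x3 = 1 we read off x = p + x3 u, and every choice of x3 gives a solution:

```python
def mat_vec(A, x):
    return [sum(a * xj for a, xj in zip(row, x)) for row in A]

# made-up system already in reduced form: x1 + 2*x3 = 4, x2 - x3 = 1
A = [[1, 0, 2],
     [0, 1, -1]]
b = [4, 1]

p = [4, 1, 0]    # particular solution, obtained by setting x3 = 0
u = [-2, 1, 1]   # direction vector attached to the free variable x3

# every choice of the free variable x3 gives a solution of Ax = b
for x3 in [-1, 0, 2, 5]:
    x = [pi + x3 * ui for pi, ui in zip(p, u)]
    assert mat_vec(A, x) == b
print("x = p + x3*u solves Ax = b for every x3 tried")
```

Notice that p alone solves Ax = b, while u solves the homogeneous equation Ax = 0; that split is exactly what parametric vector form displays.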


Section 1.7: Linear Independence


• Like everything else in linear algebra, the definition of linear independence can be phrased in many different equivalent ways. v1, v2, . . . , vp are linearly independent if any of the following equivalent statements are true:
(a) the vector equation x1 v1 + x2 v2 + · · · + xp vp = 0 has only the trivial solution
(b) none of the vectors v1, v2, . . . , vp is a linear combination of the others
(c) if we put the vectors together as columns of a matrix A, then the system Ax = 0 has only the trivial solution
(d) if we put the vectors together as columns of a matrix A, then A has a pivot in every column
• If vectors aren’t linearly independent, then they are linearly dependent. This means that (at least) one of the vectors is a linear combination of the rest. Note: this does not mean that all of the vectors are linear combinations of the others. See the following exercise.

• Exercise 1: Find three vectors in R^3 that are linearly dependent, but where the third vector is not a linear combination of the first two.
• Method to check linear (in)dependence: If we want to check if a set of given vectors is linearly independent, put them together as columns of a matrix, and then row reduce the matrix. If there is a pivot in every column, then they are independent. Otherwise, they are dependent.

• Exercise 2 (1.7.1): Check if the following vectors are linearly independent:

[ 5 ]   [  7 ]   [  9 ]
[ 0 ] , [  2 ] , [  4 ]
[ 0 ]   [ −6 ]   [ −8 ]
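Applying the method above to these three vectors (so this doubles as an answer check for Exercise 2): put them in as columns, row reduce, and count pivots. The row-reduction helper pivot_columns is an invented name for this sketch, with exact arithmetic via fractions.

```python
from fractions import Fraction

def pivot_columns(A):
    """Row reduce A (a list of rows) and return the pivot column indices."""
    M = [[Fraction(v) for v in row] for row in A]
    n_rows, n_cols = len(M), len(M[0])
    pivots, r = [], 0
    for c in range(n_cols):
        if r == n_rows:
            break
        pr = next((i for i in range(r, n_rows) if M[i][c] != 0), None)
        if pr is None:
            continue
        M[r], M[pr] = M[pr], M[r]
        M[r] = [v / M[r][c] for v in M[r]]          # scale so the pivot is 1
        for i in range(n_rows):                     # clear column c elsewhere
            if i != r and M[i][c] != 0:
                M[i] = [a - M[i][c] * b for a, b in zip(M[i], M[r])]
        pivots.append(c)
        r += 1
    return pivots

# the three vectors from Exercise 2, placed as the columns of A
A = [[5, 7, 9],
     [0, 2, 4],
     [0, -6, -8]]
pivots = pivot_columns(A)
print(pivots)            # [0, 1, 2]
print(len(pivots) == 3)  # True: a pivot in every column, so independent
```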

• Theorem 9: Any set containing the zero vector is linearly dependent. This follows immediately from the method above, because if one of the columns is zero, there can’t be a pivot in every column. (There are other easy ways to prove this theorem as well; see the book for examples.)
• Theorem 8: If we have p vectors, each with n entries, and p > n, then these vectors have to be linearly dependent. (This follows from the method above too, because if there are more columns than rows, there can’t be a pivot in every column.)
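To make Theorem 8 concrete, here is a tiny check with three made-up vectors in R^2 (p = 3 > n = 2, so they must be dependent). The coefficients 1, 1, −1 of the dependence relation were found by inspection, not computed:

```python
# three vectors in R^2: Theorem 8 forces a dependence among them
v1, v2, v3 = [1, 0], [0, 1], [1, 1]

# an explicit non-trivial relation: 1*v1 + 1*v2 - 1*v3 = 0
combo = [a + b - c for a, b, c in zip(v1, v2, v3)]
print(combo)  # [0, 0]
```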
• Exercise 3: Find 2 vectors in R^5 that are linearly dependent. Notice that this means that if p ≤ n in the theorem above, then the vectors might be dependent or independent.
