Algebra Summary
"And say: Work, for Allah will see your work, and [so will] His Messenger and the believers. And you will be returned to the Knower of the unseen and the witnessed, and He will inform you of what you used to do." (Quran 9:105)
1) u + v ∈ V (closure under addition)
2) ku ∈ V (closure under scalar multiplication)
Conditions 1) and 2) alone are the Conditions of a Subspace.
3) u + v = v + u
4) (u + v) + w = u + (v + w)
5) u + 0 = u, where 0 ∈ V
6) u + w = 0 ⟹ w = -u
7) 1·u = u
8) (k1k2)·u = k1·(k2·u)
9) k·(u + v) = ku + kv
10) (k1 + k2)·u = k1·u + k2·u
Note:
The set of all polynomials of degree ≤ 2 is a vector space.
If W1, W2 are two subspaces of a vector space V,
then W1 ∩ W2 is also a subspace.
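As a small sketch (not part of the original notes), the two subspace conditions above can be checked numerically for a hypothetical subset W = {(a, b, 0)} of R3:

```python
# Sketch: checking the two subspace conditions (closure under addition and
# closure under scalar multiplication) for the hypothetical set
# W = {(a, b, 0)} inside R^3.

def in_W(v):
    """Membership test for W = {(a, b, 0)}: the third component must be zero."""
    return v[2] == 0

u = (1.0, 2.0, 0.0)
v = (-3.0, 5.0, 0.0)
k = 4.0

u_plus_v = tuple(ui + vi for ui, vi in zip(u, v))
ku = tuple(k * ui for ui in u)

print(in_W(u_plus_v))  # closure under addition
print(in_W(ku))        # closure under scalar multiplication
```

Both checks pass for every choice of u, v in W, which is why W is a subspace of R3.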
Some Properties of the Additive Identity 0
1) 0·v = 0 (the scalar zero times any vector gives the zero vector)
2) α·0 = 0 (any scalar times the zero vector gives the zero vector)
3) If α·v = 0,
then α = 0 or v = 0.
Ex: S = {v1, v2, v3} (the vector entries are not shown here). Is u ∈ Span(S)?
Solution
Form the augmented matrix [v1 v2 v3 | u] and row-reduce:
∴ c1 – 2c3 = -3 ∴ c1 = -3
∴ c2 + 4c3 = 5 ∴ c2 = 5
∴ -c3 =0 ∴ c3 = 0
∴u = -3v1 + 5v2
∴u can be written as a linear combination of S.
∴u ∈ Span(S).
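The entries of v1, v2, v3 are not legible above, but the same procedure can be sketched with hypothetical vectors, chosen here so that the coefficients come out as in the notes, (c1, c2, c3) = (−3, 5, 0):

```python
import numpy as np

# Hypothetical vectors (the originals in the notes are not legible).
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
v3 = np.array([-2.0, 4.0, -1.0])
u  = np.array([-3.0, 5.0, 0.0])

# The columns of A are v1, v2, v3; solve A c = u for the coefficients.
A = np.column_stack([v1, v2, v3])
c = np.linalg.solve(A, u)

# Verify that u really is the linear combination c1*v1 + c2*v2 + c3*v3,
# i.e. that u ∈ Span(S).
assert np.allclose(c[0] * v1 + c[1] * v2 + c[2] * v3, u)
print(c)
```

If the system A c = u has no solution, u cannot be written as a linear combination of S, and u ∉ Span(S).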
In the previous problem you were given a specific vector and asked whether those numbers belong to Span(S) or not. But in the next problem the question is whether S spans R2: here we are not talking about one specific vector, we want to decide whether ANY vector in R2 can be written this way.
Solution
In this case we want to determine whether any vector in R2, in the form (a, b),
can be written as a linear combination of v1, v2.
∴ R1·(−2) + R2 → R2
∴ c1 = a + (b − 2a) = b − a
∴ S spans R2.
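A quick numerical sketch of the same test, with hypothetical v1, v2 (assumed for illustration; the originals are not shown in the notes):

```python
import numpy as np

# Hypothetical vectors: v1 = (1, 2), v2 = (1, 1).
v1 = np.array([1.0, 2.0])
v2 = np.array([1.0, 1.0])

# v1, v2 span R^2 exactly when the matrix [v1 v2] is invertible,
# i.e. when its determinant is nonzero.
A = np.column_stack([v1, v2])
print(np.linalg.det(A) != 0)

# Then ANY vector (a, b) in R^2 has coefficients c = A^{-1} (a, b).
a, b = 7.0, -4.0                       # an arbitrary target vector
c = np.linalg.solve(A, np.array([a, b]))
assert np.allclose(c[0] * v1 + c[1] * v2, [a, b])
```

The point of the determinant test is exactly the contrast drawn above: it answers the question for every (a, b) at once, not for one specific vector.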
Linearly Dependent and Independent Vectors
If we have a set of vectors = {v1, v2, v3}, we have two methods to check
whether v1, v2 and v3 are L.I or not.
Method 1
This is like the Linear Combination / Span problems; the only difference is that the Right Hand Side is all zeros (solve c1v1 + c2v2 + c3v3 = 0).
If c1 = c2 = c3 = 0 is the only solution: v1, v2, v3 are L.I.
If any c ≠ 0 is possible: v1, v2, v3 are L.D.
Method 2
We will find the determinant of [v1 v2 v3]:
|v1 v2 v3| ≠ 0 ∴ L.I.
|v1 v2 v3| = 0 ∴ L.D.
The determinant method is easier, but we can't use it:
1- if v1, v2, v3 don't form a square matrix.
2- if the relation between the vectors is required in the L.D. case (the values of c1, c2, c3).
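A sketch of Method 2 with hypothetical vectors (one of which is deliberately a sum of the other two, so the set is dependent):

```python
import numpy as np

# Method 2: vectors in R^3 are linearly independent iff det([v1 v2 v3]) != 0.
v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 3.0])   # = v1 + v2, so this set is L.D.

det = np.linalg.det(np.column_stack([v1, v2, v3]))
print(abs(det) < 1e-12)           # True -> linearly dependent

# For an L.D. set, Method 1 (solve c1 v1 + c2 v2 + c3 v3 = 0) is still
# needed to get the actual relation; here it is v3 - v1 - v2 = 0.
```

This also illustrates limitation 2 above: the determinant says only that some dependence exists, not what the coefficients are.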
Dimension of a Vector Space
The number of vectors in a basis of the vector space.
A set S = {v1, v2, v3} is a basis for R3 if the vectors span R3 and are L.I., i.e. |v1 v2 v3| ≠ 0.
Notes:
If S forms a basis for R3, then S spans R3 but the converse is not true.
Bases are not unique: the same vector space can have more than one basis.
Rank of a Matrix
The number of non-zero rows after applying ERO (Elementary Row Operations).
From the values of rank(A) and rank(A|B), we can determine whether the system
is consistent or not: the system is consistent iff rank(A) = rank(A|B).
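A sketch of the consistency test, with a hypothetical A and right-hand sides B chosen to show both outcomes:

```python
import numpy as np

# Consistency of A x = b via rank(A) vs rank(A|b).
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])            # second row = 2 * first row, rank 1
b_consistent   = np.array([3.0, 6.0]) # also doubles -> no contradiction
b_inconsistent = np.array([3.0, 7.0]) # breaks the proportionality

def is_consistent(A, b):
    """A x = b is consistent iff rank(A) equals rank of the augmented matrix."""
    aug = np.column_stack([A, b])
    return np.linalg.matrix_rank(A) == np.linalg.matrix_rank(aug)

print(is_consistent(A, b_consistent))    # True
print(is_consistent(A, b_inconsistent))  # False
```

When the augmented matrix gains a rank, the extra non-zero row corresponds to an impossible equation like 0 = 1.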
Ex: A = [ … ]
How to find a basis for and the dimension of the row space (A)?
First, we will make ERO on A.
After making ERO:
A basis of row space(A): the non-zero rows after applying ERO.
∴ The dimension of row space(A) = number of vectors in the basis of row space(A) = 3 = Rank(A).
Column Space (A)
Is a subspace that contains many vectors, but any vector in column space (A) must be
written as a linear combination of the columns of A.
How to find a basis for and the dimension of the column space (A)?
We will take the columns containing leading ones, from the original matrix.
How to find a basis for and the dimension of null space (A)?
To find a basis for N(A), you should solve the system AX = 0, find X.
∴ The dimension of null space (A) = nullity (A) = number of vectors in the
basis of N(A) = 3
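A sketch of finding a null-space basis numerically (hypothetical A; this uses the SVD instead of hand elimination, but gives a basis of the same space):

```python
import numpy as np

# A basis for N(A) = {x : A x = 0} from the SVD: the rows of Vt beyond
# rank(A) span the null space.
A = np.array([[1.0, 2.0, 0.0, -1.0],
              [2.0, 4.0, 0.0, -2.0]])   # rank 1, so nullity = 4 - 1 = 3

U, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-12))
null_basis = Vt[rank:]                  # rows spanning N(A)

print(rank, len(null_basis))            # rank(A) and nullity(A)
for x in null_basis:
    assert np.allclose(A @ x, 0)        # each basis vector solves A x = 0

# Rank-nullity check: nullity(A) + rank(A) = number of columns of A.
assert rank + len(null_basis) == A.shape[1]
```

The final assertion is exactly the rank-nullity equation stated in the note below.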
Note: For any matrix, we must have:
nullity(A) + rank(A) = number of columns of A
(here: 3 + 3 = 6)
From the above equation, we can find nullity (A) without solving AX = 0,
But you must solve AX= 0 if the basis of the null space (A) is required.
Transformation
If we have two vector spaces V, W, a transformation T: V → W is like a function:
w = T(v), just as y = f(x); w is the image and v is the preimage.
Note:
For each v, we must have a single w.
V: the domain of T.
W: the codomain of T.
Range(T): the set of all vectors in W that have a preimage in V.
Note:
We may have a vector in W that has no preimage in V. In this case, it is in
the codomain of T but outside range(T).
We may have different v1, v2 that have the same image w.
Non-standard matrix representation of T: find the matrices B, B′.
A3×2 = ( )
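A minimal sketch of a transformation given by a 3×2 matrix (the entries here are assumed for illustration, since the original matrix is not shown): T maps a preimage in R2 to an image in R3 via T(v) = A v.

```python
import numpy as np

# Hypothetical 3x2 matrix representing T: R^2 -> R^3, so T(v) = A v.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

v = np.array([2.0, 3.0])   # preimage in the domain R^2
w = A @ v                  # image in the codomain R^3
print(w)                   # [2. 3. 5.]
```

Note that range(T) here is only a plane inside R3, so some vectors of the codomain have no preimage, as the note above says.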
Orthogonal Vectors
Two vectors u, v in a vector space V are orthogonal (perpendicular) if:
(u, v) = 0
Note: Orthogonal (nonzero) vectors are L.I. vectors.
If u, v are orthogonal,
∴ u, v must be L.I.
But keep in mind that the converse is not true.
Orthonormal Vectors
1) u,v are orthogonal (u,v) = 0
2) u,v are unit vectors ∴ ||u|| = 1, ||v|| = 1
w2 = … , w3 = … (the Gram–Schmidt formulas are not shown here)
In this way, the independent u's are converted into orthogonal v's, and then into the orthonormal w's.
Important Note:
If we are not using the standard inner product, how do we get the magnitude? ||v|| = √(v, v): take the inner product of the vector with itself, using the given inner-product function; this gives the magnitude squared, then take its square root, and that is the magnitude of the vector.
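The note above can be sketched in code, using the standard dot product as the inner-product function (any valid inner product would work the same way):

```python
import numpy as np

# ||v|| = sqrt((v, v)): with the standard inner product on R^n, (v, v) is
# just the dot product of v with itself.
v = np.array([3.0, 4.0])
norm_v = np.sqrt(np.dot(v, v))
print(norm_v)                          # 5.0

# A unit vector in the direction of v (needed for orthonormal sets):
u = v / norm_v
assert np.isclose(np.dot(u, u), 1.0)

# Orthogonality test: (v, w) = 0.
w = np.array([-4.0, 3.0])
assert np.isclose(np.dot(v, w), 0.0)

def gram_schmidt(vs):
    """Sketch: turn L.I. vectors vs into an orthonormal set."""
    ws = []
    for x in vs:
        for q in ws:
            x = x - np.dot(x, q) * q   # remove the component along q
        ws.append(x / np.sqrt(np.dot(x, x)))
    return ws

ws = gram_schmidt([np.array([1.0, 1.0]), np.array([1.0, 0.0])])
assert np.isclose(np.dot(ws[0], ws[1]), 0.0)   # result is orthogonal
```

The `gram_schmidt` helper is a sketch of the u → orthogonal → orthonormal conversion described above.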
Eigenvalue Problem
AX = λX, where X is an eigenvector of A and λ is an eigenvalue of A.
Steps:
1) Form the matrix A − λI by subtracting λ from the diagonal of the n×n matrix A.
2) Solve |A − λI| = 0, the characteristic equation in λ.
Note:
The number of repetitions of λ is called the algebraic multiplicity.
The number of L.I. eigenvectors at that λ is called the geometric multiplicity.
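The steps above can be sketched numerically with a hypothetical matrix (np.linalg.eig does both steps at once):

```python
import numpy as np

# AX = lambda X for a hypothetical matrix; eig returns the eigenvalues and
# the eigenvectors as the COLUMNS of the second array.
A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(np.sort(eigenvalues))

# Each pair satisfies the defining equation A x = lambda x:
for lam, x in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ x, lam * x)
```

Since A is diagonal here, its eigenvalues are just the diagonal entries 2 and 3, which is the note stated further below for diagonal matrices.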
Eigenvalues
Distinct (e.g. λ = 1, −2, 3): each λ has one eigenvector, giving x1, x2, x3. In this case, the eigenvectors are L.I., so |x1 x2 x3| ≠ 0.
Repeated: either the repeated λ still yields enough eigenvectors (x1, x2, x3), so the eigenvectors are L.I. and |x1 x2 x3| ≠ 0; or it yields fewer (only x1, x2), so the set is L.D. and |x1 x2 x3| = 0.
Theorem: If A, B are n*n similar matrices, then they have the same eigenvalues.
Note: If B is a diagonal n*n matrix, then the eigenvalues of B are the diagonal elements.
A Diagonal Matrix has no entries except on the diagonal; all the other entries are zeros.
The Diagonalization:
A matrix A is called diagonalizable if A is similar to a diagonal matrix.
∴ A [x1 x2 x3] = [x1 x2 x3] D, where D = diag(λ1, λ2, λ3).
Important Note
∴ A is diagonalizable if P has an inverse, where P = [x1 x2 x3], x1, x2, x3 are the
eigenvectors of A.
P has an inverse if the eigenvectors are L.I. (if A is semi-simple), and this happens only if:
1- the eigenvalues are distinct, or
2- the eigenvalues are repeated and the algebraic multiplicity = geometric multiplicity.
Otherwise, P will have no inverse and A will not be diagonalizable.
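A sketch of the diagonalizability check for a hypothetical matrix with distinct eigenvalues (so case 1 above applies and P must be invertible):

```python
import numpy as np

# Test diagonalizability by checking whether P = [x1 x2] (eigenvectors as
# columns) is invertible.
A = np.array([[4.0, 1.0],
              [0.0, 2.0]])             # distinct eigenvalues 4 and 2

lams, P = np.linalg.eig(A)
assert abs(np.linalg.det(P)) > 1e-12   # eigenvectors are L.I. -> P invertible

D = np.diag(lams)
assert np.allclose(A, P @ D @ np.linalg.inv(P))   # A = P D P^{-1}
```

For a defective matrix (algebraic multiplicity > geometric multiplicity) the eigenvector columns of P would be dependent and det(P) would be 0.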
Matrix Functions
Theorem: If An*n with eigenvalues λ1, λ2,…… λn and the corresponding
eigenvectors are X1, X2,…… Xn, then f(A) will have eigenvalues f(λ1), f(λ2) ,……
f(λn) with the same eigenvectors X1, X2,…… Xn.
Note: If A is diagonalizable,
∴ f(A) = P f(D) P⁻¹, where f(D) = diag(f(λ1), …, f(λn)); the P and P⁻¹ of f(A) are the same as those of A.
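A sketch of computing f(A) = P f(D) P⁻¹ for a hypothetical diagonalizable A, with f(x) = x³ chosen so the result can be checked directly against A·A·A:

```python
import numpy as np

# f(A) = P f(D) P^{-1} with f(x) = x^3, for a diagonalizable matrix.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])            # distinct eigenvalues -> diagonalizable

lams, P = np.linalg.eig(A)            # eigenvalues and eigenvector matrix P
f_D = np.diag(lams ** 3)              # f applied to each eigenvalue
f_A = P @ f_D @ np.linalg.inv(P)

# Check against computing A^3 directly:
assert np.allclose(f_A, A @ A @ A)
```

The same pattern works for any f defined on the eigenvalues (e.g. exp or a square root), which is why this theorem is useful for matrix functions.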
Cayley-Hamilton Theorem
Every square matrix An×n satisfies its own characteristic equation.
Ex: λ² − 6λ + 7 = 0 is the characteristic equation of A.
∴ A² − 6A + 7I = 0 using the Cayley–Hamilton theorem (I is the identity matrix).
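For a 2×2 matrix the characteristic equation is λ² − tr(A)·λ + det(A) = 0, so the theorem can be sketched as follows (hypothetical A):

```python
import numpy as np

# Cayley-Hamilton for a 2x2 matrix: substituting A into its own
# characteristic polynomial (with the constant term times I) gives the
# zero matrix.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

tr, det = np.trace(A), np.linalg.det(A)     # char. eq.: l^2 - 7l + 10 = 0
residual = A @ A - tr * A + det * np.eye(2)
print(np.allclose(residual, 0))             # True
```

Note the constant term appears as det(A)·I, not as a bare scalar; that is the same I that the corrected example above makes explicit.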