
StackEdit+ https://stackedit.net/app#

Eigenbases

Better Basis?

• Consider the two linear Transformations below,

Fig.1 Fig.2

• The difference b/w the two is the choice of Basis,


• Fig.1 uses the fundamental basis to describe the linear Transformation
• Fig.2 uses two other basis vectors to describe the linear Transformation
• However in Fig.2, the transformed vectors stay parallel to their originals
• Because of this, it is now easier to describe the transformation

• Now we can describe the transformation as:

1 of 18 20/03/2024, 07:40 AM

◦ a ×2 scaling horizontally
◦ a ×3 scaling diagonally

• Hence the basis in Fig.2 is more suited / useful than the fundamental basis, as
there is no changing of direction,
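Since the figures themselves are not included in these notes, the following NumPy sketch is an assumed reconstruction of such a transformation: a matrix built so that its eigenbasis is (1, 0) and (1, 1), with ×2 and ×3 scaling along those directions.

```python
import numpy as np

# Assumed reconstruction (the original figures are missing): a matrix whose
# eigenbasis is (1, 0) and (1, 1), scaled by 2 and 3 respectively.
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])      # columns = the two basis vectors
D = np.diag([2.0, 3.0])         # x2 and x3 scaling along those directions
A = P @ D @ np.linalg.inv(P)    # the same transformation in standard coordinates

print(A)                        # [[2. 1.], [0. 3.]]

# Each basis vector is only scaled, never rotated:
print(A @ P[:, 0])              # [2. 0.] = 2 * (1, 0)
print(A @ P[:, 1])              # [3. 3.] = 3 * (1, 1)
```

The matrix that comes out, [[2, 1], [0, 3]], happens to be the same one analysed in the "Calculating Eigenvalues and Eigenvectors" section further down.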

Use case

• Look at the point above,


• The point was transformed by a matrix
• It is now in a new direction
• Since we used the eigenbasis, the transformation is now easier to describe

Eigenvalues and Eigenvectors

2 of 18 20/03/2024, 07:40 AM
StackEdit+ https://stackedit.net/app#

• The image above shows 3 vectors,


• All three of them are transformed by a matrix,
• Two of them don’t change their direction
• One of them does,
• The two, that don’t change are Eigenvectors

• Eigenvectors are the vectors that don’t change their direction after a linear
transformation is done on them by a matrix,


• Instead, they change their length/magnitude by a certain scale,
• This scale/constant is called the Eigenvalue
• Both Eigenvectors & Eigenvalues come in pairs,

A set of Eigenvectors can be called an Eigenbasis
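As a quick check of this definition, a minimal NumPy sketch, using the 2×2 matrix from the worked example later in these notes:

```python
import numpy as np

A = np.array([[9.0, 4.0],
              [4.0, 3.0]])   # the matrix from the worked example below

# np.linalg.eig returns the eigenvalues and (unit-length) eigenvectors;
# the eigenvectors are the COLUMNS of the second return value.
eigenvalues, eigenvectors = np.linalg.eig(A)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # Defining property of the pair: A v = lambda * v (direction unchanged)
    print(lam, np.allclose(A @ v, lam * v))
```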

Less Work?


• Since eigenvectors don’t require a complete matrix multiplication for
transformation, they require less work,
• Instead of a matrix multiplication we can perform a scalar multiplication

On non-Eigenvectors

• Since the Eigenvectors are also a basis for the plane, any non-Eigenvector can be
expressed through them


• Now, instead of multiplying the non-Eigenvector w/ the matrix, we can multiply the
Eigenvectors w/ the matrix
• The Eigenvector w/ matrix multiplication is already done, so substitute it in the
equation below
• Substituted; now only a simple scalar multiplication remains
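A minimal NumPy sketch of this substitution, using the eigenpairs of the matrix [[9, 4], [4, 3]] derived elsewhere in these notes (the vector w is an arbitrary choice for illustration):

```python
import numpy as np

A = np.array([[9.0, 4.0],
              [4.0, 3.0]])
# Eigenpairs of A (derived in the worked example):
v1, l1 = np.array([2.0, 1.0]), 11.0
v2, l2 = np.array([-1.0, 2.0]), 1.0

w = np.array([3.0, 4.0])     # an arbitrary non-Eigenvector

# Express w in the eigenbasis: w = a*v1 + b*v2
a, b = np.linalg.solve(np.column_stack([v1, v2]), w)

# Transform with scalars only: A w = a*l1*v1 + b*l2*v2
scalar_result = a * l1 * v1 + b * l2 * v2
print(np.allclose(A @ w, scalar_result))   # True
```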


[!NOTE]:
There is still calculation required for finding the eigenvectors of a matrix.
It can however be useful for mass linear transformations, as now not every vector
requires a matrix multiplication; once the Eigenvectors are known, only scalar
multiplications are needed

Correct! By using the eigenvalues you found λ1 = 11, λ2 = 1, you solve the system of
equations to find the eigenvectors.

$$\begin{bmatrix} 9 & 4 \\ 4 & 3 \end{bmatrix} \cdot \begin{bmatrix} x \\ y \end{bmatrix} = 11 \begin{bmatrix} x \\ y \end{bmatrix}$$

Therefore, solving the linear system, you get the relation

x = 2y

So, any (x, y) satisfying such relation is an eigenvector. In this case, x = 2, y = 1
satisfies. So, the vector $\begin{bmatrix} 2 \\ 1 \end{bmatrix}$ is an eigenvector for the matrix. For the next eigenvalue, we
have to solve the following set of equations:

$$\begin{bmatrix} 9 & 4 \\ 4 & 3 \end{bmatrix} \cdot \begin{bmatrix} x \\ y \end{bmatrix} = 1 \begin{bmatrix} x \\ y \end{bmatrix}$$


Leading to the following relationship between x and y:

2x = −y

And x = −1, y = 2 satisfies such relation, so the vector $\begin{bmatrix} -1 \\ 2 \end{bmatrix}$ is also an eigenvector
for the matrix.

Calculating Eigenvalues and Eigenvectors

Other sources: Source1-invidious, Source1-YT

Fig 2.1 Fig 2.2

Fig 2.3 Fig 2.4

• Look at the matrix $\begin{bmatrix} 2 & 1 \\ 0 & 3 \end{bmatrix}$ above,

• In Fig 2.1 and Fig 2.2, when comparing with the matrices $\begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix}$ and $\begin{bmatrix} 3 & 0 \\ 0 & 3 \end{bmatrix}$,
some vectors are transforming in the same direction in both the comparisons

• The transformations are intersecting at infinitely many points; something
singular is happening

• When looking at their differences, the difference matrices are singular

$$\begin{bmatrix} 2 & 1 \\ 0 & 3 \end{bmatrix} - \begin{bmatrix} 3 & 0 \\ 0 & 3 \end{bmatrix} = \begin{bmatrix} -1 & 1 \\ 0 & 0 \end{bmatrix}$$

$$\begin{bmatrix} 2 & 1 \\ 0 & 3 \end{bmatrix} - \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix} = \begin{bmatrix} 0 & 1 \\ 0 & 1 \end{bmatrix}$$

singular, i.e. det(M − N) = 0

• What is also common above is that the subtracted matrices are 3I, 2I respectively,
• Hence the coefficients 3, 2 are the λ “Eigenvalues” (N = λI)
• Hence we have determined:
◦ that for a Matrix M, the determinant of its difference with the eigenvalue-identity
matrix product is zero,

det(M − λI) = 0

Finding Eigenvalues

• Via the equation determined above, we can now calculate the Eigenvalues of any Matrix


• Using the Eigenvalues, we can calculate the Eigenvectors as well

Example

Suppose a Matrix $\begin{bmatrix} 9 & 4 \\ 4 & 3 \end{bmatrix}$

Calculating Eigenvalues

det(M − λI) = 0

• $\det\left(\begin{bmatrix} 9 & 4 \\ 4 & 3 \end{bmatrix} - \lambda \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}\right) = 0$

• $\det\left(\begin{bmatrix} 9-\lambda & 4 \\ 4 & 3-\lambda \end{bmatrix}\right) = 0$

• (9 − λ)(3 − λ) − 16 = 0

• λ² − 12λ + 11 = 0

λ = 1, 11 ← Eigenvalues
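The same characteristic polynomial can be checked numerically. A small NumPy sketch, using the fact that for a 2×2 matrix det(M − λI) expands to λ² − tr(M)λ + det(M):

```python
import numpy as np

A = np.array([[9.0, 4.0],
              [4.0, 3.0]])

# For a 2x2 matrix, det(M - lambda*I) expands to
# lambda^2 - trace(M)*lambda + det(M), i.e. lambda^2 - 12*lambda + 11 here.
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
print(np.roots(coeffs))          # the roots are 11 and 1

# The same Eigenvalues, computed directly:
print(np.linalg.eigvals(A))
```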

Calculating Eigenvectors

For λ = 1:


M v = λv
M v − λv = 0
(M − λ)v = 0
(M − λI)v = 0, (Identity matrix inserted, as Iv = v)

$$\begin{bmatrix} 9 & 4 \\ 4 & 3 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = 1 \begin{bmatrix} x \\ y \end{bmatrix}$$

$$\left(\begin{bmatrix} 9 & 4 \\ 4 & 3 \end{bmatrix} - 1 \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}\right) \begin{bmatrix} x \\ y \end{bmatrix} = 0$$

$$\begin{bmatrix} 8 & 4 \\ 4 & 2 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = 0$$

• Now there are infinitely many Eigenvectors solving this, as we have seen previously,
• We just want the relation b/w x & y, and to show a simple Eigenvector where x = 1

8x + 4y = 0,
4x + 2y = 0,

2x = −y (both equations give the same relation)

$$x = 1 \rightarrow \begin{bmatrix} 1 \\ -2 \end{bmatrix}$$

$$\begin{bmatrix} 1 \\ -2 \end{bmatrix} \leftarrow \text{Eigenvector}$$

For λ = 11:

$$\left(\begin{bmatrix} 9 & 4 \\ 4 & 3 \end{bmatrix} - 11 \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix}\right) \begin{bmatrix} x \\ y \end{bmatrix} = 0$$

$$\begin{bmatrix} -2 & 4 \\ 4 & -8 \end{bmatrix} \begin{bmatrix} x \\ y \end{bmatrix} = 0$$

−2x + 4y = 0,
4x − 8y = 0,
x = 2y


$$\text{if: } x = 2 \rightarrow \begin{bmatrix} 2 \\ 1 \end{bmatrix}$$

$$\begin{bmatrix} 2 \\ 1 \end{bmatrix} \leftarrow \text{Eigenvector}$$
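Both eigenpairs found above can be verified with a short NumPy check: multiplying by the matrix should only scale each Eigenvector by its Eigenvalue.

```python
import numpy as np

A = np.array([[9.0, 4.0],
              [4.0, 3.0]])

pairs = [(1.0, np.array([1.0, -2.0])),    # lambda = 1
         (11.0, np.array([2.0, 1.0]))]    # lambda = 11

for lam, v in pairs:
    # A v and lambda * v should be identical vectors
    print(A @ v, lam * v, np.allclose(A @ v, lam * v))
```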

Example w/ 3x3 Matrix


On the Number of Eigenvectors

Always distinct eigenvalues & vectors?

• Previously we saw a 3x3 Matrix with 3 distinct eigenvalues and 3 distinct
eigenvectors
• But is that always the case?


Repeated eigenvalues - Example 1

• Here the Matrix has only 2 distinct eigenvalues

1st eigenvalue 2nd eigenvalue

• However with the 2nd eigenvalue we get 2 unique eigenvectors


• This is because, in the eqn "x1 = 2x2 − 0.5x3", we first assume
◦ x2 = 1 then,
◦ x1 = 1

Hence,

(3 × 3) → 2 Eval → 3 Evec


Repeated eigenvalues - Example 2

• 2 distinct eigenvalues


1st eigenvalue 2nd eigenvalue

• However the 2nd eigenvalue now only has 1 eigenvector

Hence,

(3 × 3) → 2 Eval → 2 Evec
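The 3x3 matrix itself sits in the missing figure, so as a stand-in here is a hypothetical 2x2 case showing the same phenomenon: a repeated eigenvalue whose eigenspace is only one-dimensional, so no full eigenbasis exists.

```python
import numpy as np

# Hypothetical 2x2 stand-in (the notes' 3x3 matrix is in a missing figure):
# a shear with the repeated eigenvalue 2 but only ONE independent
# eigenvector direction, (1, 0).
A = np.array([[2.0, 1.0],
              [0.0, 2.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)
print(eigenvalues)            # [2. 2.] - one distinct eigenvalue, repeated

# The two returned eigenvector columns are (numerically) parallel, so the
# matrix of eigenvectors is singular: no full eigenbasis exists.
print(abs(np.linalg.det(eigenvectors)))
```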

Summary


• The more distinct the eigenvalues are, the more certain we can be that:
n × n ⟹ n Evec

meaning that an n×n matrix will more likely have n eigenvectors

