
Principal Component Analysis (PCA)

Gundimeda Venugopal
Dimensionality Reduction: Taking a picture of the Data

Housing Data

Dimensionality Reduction (2D to 1D example)

Mean, Variance, and Covariance
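The formulas on this slide did not survive extraction. The standard sample definitions (shown here with the n − 1 sample normalization; the slide's own convention is unknown) are:

\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \qquad
\operatorname{Var}(x) = \frac{1}{n-1}\sum_{i=1}^{n} (x_i - \bar{x})^2, \qquad
\operatorname{Cov}(x, y) = \frac{1}{n-1}\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})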

Covariance Matrix + Linear Transformations

Linear transformations are operations that map one vector to another vector.
A matrix can be seen as a representation of a linear transformation, and a vector can be seen as a representation of a point, for example a pixel location in an image.
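A small NumPy sketch of both ideas; the data set and the transformation matrix are made up for illustration:

```python
import numpy as np

# Toy 2-D data set: rows are observations, columns are variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2)) @ np.array([[2.0, 0.0], [1.2, 0.5]])

# Covariance matrix of the data (np.cov centers it internally).
C = np.cov(X, rowvar=False)          # shape (2, 2), symmetric

# A matrix acts as a linear transformation: it maps each point to a new point.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])          # 90-degree rotation
X_rotated = X @ A.T                  # apply the map to every row vector

print(C)
print(X_rotated[:3])
```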
Eigenvalues and Eigenvectors
❖ An eigenvalue is a scalar value that indicates how much a vector is stretched or shrunk by a linear transformation.

❖ An eigenvector is a vector whose direction is unchanged (up to sign) when it is transformed by a matrix; only its magnitude changes.

❖ The eigenvalues can tell you how much variance or diversity there is in the image.
❖ The eigenvectors can tell you the directions or patterns that are most prominent in the image.
❖ By using the eigenvalues and eigenvectors, you can perform various tasks such as image compression, segmentation, and recognition; see the sketch below.
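A quick NumPy check of these two defining properties (the matrix is illustrative, not from the slides):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])   # symmetric, so its eigenvalues are real

# np.linalg.eigh handles symmetric matrices: eigenvalues come back in
# ascending order, eigenvectors as orthonormal columns.
eigenvalues, eigenvectors = np.linalg.eigh(A)

for lam, v in zip(eigenvalues, eigenvectors.T):
    # A @ v == lam * v: the direction of v is unchanged, only its length scales.
    assert np.allclose(A @ v, lam * v)
    print(f"lambda = {lam:.1f}, v = {v}")
```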
Eigenvalue of a matrix
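The worked example on this slide was lost in extraction; a standard computation of the same kind, using the same 2 × 2 matrix as the NumPy check above, runs:

A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix}, \qquad
\det(A - \lambda I) = (2-\lambda)^2 - 1 = (\lambda - 1)(\lambda - 3) = 0

so \lambda_1 = 1 with eigenvector (1, -1)^T and \lambda_2 = 3 with eigenvector (1, 1)^T; indeed A (1, 1)^T = (3, 3)^T = 3 \cdot (1, 1)^T.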

Principal Component Analysis (PCA)

Principal Component Analysis – the general idea
❖ PCA finds an orthogonal basis that best represents a given data set.

[Figure: 2-D point cloud with the original axes x, y and the rotated principal axes x’, y’]

❖ Eigenvectors that correspond to large eigenvalues are the directions in which the data has strong components (= large variance).
❖ If the eigenvalues are all more or less the same, there is no preferable direction.
❖ The sum of squared distances from the x’ axis is minimized.
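As a concrete companion to these bullets, a minimal NumPy sketch of PCA by eigendecomposition of the covariance matrix (function and variable names are my own, not the slides'):

```python
import numpy as np

def pca(X):
    """Return principal directions (as columns) and variances, largest first."""
    X_centered = X - X.mean(axis=0)
    C = np.cov(X_centered, rowvar=False)
    eigenvalues, eigenvectors = np.linalg.eigh(C)   # ascending order
    order = np.argsort(eigenvalues)[::-1]           # largest variance first
    return eigenvectors[:, order], eigenvalues[order]

# Elongated toy data: one strong direction, one weak one.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 2)) @ np.array([[3.0, 0.0], [1.0, 0.4]])
axes, variances = pca(X)
print(variances)   # the first value dominates: a clear x' direction
```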

PCA – the general idea

[Figure: left, the first principal direction v1 (with v2 orthogonal to it) defines a line segment that approximates the original data set; right, the data set projected onto v1 approximates the original data set]
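Continuing the pca sketch above, projecting onto the first principal direction v1 and mapping back gives exactly this line-segment approximation (again illustrative, reusing axes and X from that sketch):

```python
# Keep only v1: the projected points are the best 1-D approximation
# of X in the sum-of-squared-distances sense.
v1 = axes[:, :1]                       # first principal direction, shape (2, 1)
mean = X.mean(axis=0)
X_projected = (X - mean) @ v1 @ v1.T + mean
residual = np.mean(np.sum((X - X_projected) ** 2, axis=1))
print(residual)                        # small when v1 captures most variance
```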
PCA – the general idea

❖ PCA finds an orthogonal basis that best represents a given data set.

[Figure: 3-D point set shown in the standard basis x, y, z]

❖ PCA finds a best approximating plane (again, in terms of squared distances).
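The same idea in 3-D, as a hedged NumPy sketch (the helper name is mine): the two leading eigenvectors span the best-fit plane, and the smallest-eigenvalue eigenvector is its normal:

```python
import numpy as np

def best_fit_plane(points):
    """PCA plane fit: minimizes the sum of squared distances to the plane."""
    centroid = points.mean(axis=0)
    cov = np.cov(points, rowvar=False)               # 3 x 3 covariance matrix
    eigenvalues, eigenvectors = np.linalg.eigh(cov)  # ascending eigenvalues
    normal = eigenvectors[:, 0]      # least-variance direction = plane normal
    return centroid, normal          # plane: dot(p - centroid, normal) = 0
```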

Principal Component Analysis (PCA)

Application: finding tight bounding box
❖ An axis-aligned bounding box agrees with the coordinate axes.

[Figure: axis-aligned bounding box defined by minX, maxX, minY, maxY]

Application: finding tight bounding box
❖ Oriented bounding box: we find better axes!

[Figure: oriented bounding box aligned with the PCA axes x’, y’]
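A minimal NumPy sketch for 2-D points (PCA axes give a tight box, though not always the provably minimal one; the function name is mine):

```python
import numpy as np

def oriented_bounding_box(points):
    """Corners of a PCA-aligned bounding box for 2-D points."""
    centroid = points.mean(axis=0)
    _, eigenvectors = np.linalg.eigh(np.cov(points, rowvar=False))
    local = (points - centroid) @ eigenvectors       # coordinates in x', y'
    lo, hi = local.min(axis=0), local.max(axis=0)    # axis-aligned box there
    corners_local = np.array([[lo[0], lo[1]], [hi[0], lo[1]],
                              [hi[0], hi[1]], [lo[0], hi[1]]])
    return corners_local @ eigenvectors.T + centroid # back to the world frame
```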

References and Credits
❖ Principal Component Analysis (PCA) by Luis Serrano
❖ StatQuest: Principal Component Analysis (PCA), Step-by-Step
❖ Principal Component Analysis (PCA) - easy and practical explanation
❖ https://www.linkedin.com/advice/0/how-can-you-use-eigenvalues-eigenvectors-improve-image-eid9f
❖ MATH 3191: Example Singular Value Decomposition for 3 x 2 Matrix
❖ https://byjus.com/maths/singular-value-decomposition/

