Principal Component Analysis Concepts
1. Main idea: seek the most accurate representation of the data in a lower-dimensional space.
2. Example in 2-D: project the data onto a 1-D subspace (a line) with minimal projection error.
3. In both pictures above, the data points (black dots) are projected onto a line, but the second line is closer to the actual points (smaller projection errors) than the first one.
4. Notice that the good line to use for projection lies in the direction of largest variance (see the code sketch below).
Ref: http://www.cs.haifa.ac.il/~rita/uml_course/add_mat/PCA.pdf
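A minimal NumPy sketch of this idea, not part of the original slides: it generates hypothetical correlated 2-D data, finds the direction of largest variance from the covariance matrix, and projects the points onto that line to measure the projection error.

import numpy as np

# Hypothetical correlated 2-D data
rng = np.random.default_rng(0)
x = rng.normal(size=500)
X = np.column_stack([x, 0.6 * x + 0.3 * rng.normal(size=500)])

# Center the data and find the direction of largest variance
X_centered = X - X.mean(axis=0)
cov = np.cov(X_centered, rowvar=False)
eig_vals, eig_vecs = np.linalg.eigh(cov)      # eigh: covariance is symmetric
w = eig_vecs[:, np.argmax(eig_vals)]          # unit vector along the best line

# 1-D coordinates along the line, and the resulting projection error
y = X_centered @ w
reconstruction = np.outer(y, w)
error = np.mean(np.sum((X_centered - reconstruction) ** 2, axis=1))
print('direction:', w, 'mean squared projection error:', error)

Projecting onto any other line yields a larger mean squared error, which is why the largest-variance direction is the "good" line in the pictures.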
PCA Pt 2
5. After the data is projected onto the best line, we need to transform the coordinate system to get the 1-D representation for the vector y.
6. Note that the new data y has the same variance as the old data x in the direction of the green line (verified in the sketch below).
Ref: http://www.cs.haifa.ac.il/~rita/uml_course/add_mat/PCA.pdf
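A small hedged check of bullet 6, using made-up data rather than the figure's: the variance of the 1-D coordinates y equals the variance of the original data x measured along the projection direction w.

import numpy as np

rng = np.random.default_rng(1)
X = rng.multivariate_normal([0, 0], [[3.0, 1.2], [1.2, 1.0]], size=1000)
X_centered = X - X.mean(axis=0)

# Direction of largest variance (unit eigenvector of the covariance matrix)
cov = np.cov(X_centered, rowvar=False)
eig_vals, eig_vecs = np.linalg.eigh(cov)
w = eig_vecs[:, -1]              # eigh returns eigenvalues in ascending order

# New 1-D coordinates along the line
y = X_centered @ w

# Variance of y matches the variance of x in the direction w
print(np.var(y, ddof=1))         # variance of the projected data
print(w @ cov @ w)               # variance of x along w; same value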
PCA Pt 3
7. In general, PCA on n dimensions results in another set of n new dimensions. The one that captures the maximum variance in the underlying data is principal component 1; principal component 2 is orthogonal to it, and so on (see the sketch below).
Ref: http://www.cs.haifa.ac.il/~rita/uml_course/add_mat/PCA.pdf
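An illustrative check, assuming scikit-learn's PCA on random 3-D data (not code from the slides): PCA on n = 3 dimensions returns 3 components, the components are mutually orthogonal, and the captured variance decreases from PC1 onward.

import numpy as np
from sklearn.decomposition import PCA

# Hypothetical correlated 3-D data
rng = np.random.default_rng(42)
X = rng.multivariate_normal(
    mean=[0, 0, 0],
    cov=[[4.0, 1.5, 0.5], [1.5, 2.0, 0.3], [0.5, 0.3, 1.0]],
    size=1000,
)

# PCA on n=3 dimensions yields 3 new dimensions (components)
pca = PCA(n_components=3).fit(X)

# Components are mutually orthogonal: the Gram matrix is the identity
gram = pca.components_ @ pca.components_.T
print(np.round(gram, 6))                 # ~identity matrix

# PC1 captures the maximum variance, PC2 the next, and so on
print(pca.explained_variance_ratio_)     # sorted in descending order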
Mechanics of Principal Component Analysis
http://setosa.io/ev/principal-component-analysis/
Principal Component Analysis Steps
1. Begin by standardizing the data: subtract each dimension's mean from the data points to shift them to the origin, i.e. the data is centered at the origin.
2. Generate the covariance matrix / correlation matrix across all the dimensions.
3. Perform eigen decomposition, that is, compute the eigenvectors, which are the principal components, and the corresponding eigenvalues, which are the magnitudes of variance captured.
4. Sort the eigen pairs in descending order of eigenvalue and select the one with the largest value. This is the first principal component, which retains the maximum information from the original data (a code sketch of all four steps follows below).
Ref: http://www.cs.haifa.ac.il/~rita/uml_course/add_mat/PCA.pdf
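A minimal NumPy sketch of the four steps, assuming a generic data matrix X with samples in rows (illustrative, not the course's solution notebook). It reuses the variable names that appear on the later slide (X_std, cov_matrix, eig_vals, eig_vecs).

import numpy as np
from sklearn.preprocessing import StandardScaler

# Hypothetical data matrix: 200 samples, 4 correlated features
rng = np.random.default_rng(7)
X = rng.normal(size=(200, 4)) @ rng.normal(size=(4, 4))

# Step 1: standardize (center at the origin, scale to unit variance)
X_std = StandardScaler().fit_transform(X)

# Step 2: covariance matrix across all dimensions
cov_matrix = np.cov(X_std, rowvar=False)

# Step 3: eigen decomposition -> eigenvectors (principal components)
#         and eigenvalues (variance captured by each component)
eig_vals, eig_vecs = np.linalg.eig(cov_matrix)
eig_vals, eig_vecs = eig_vals.real, eig_vecs.real  # symmetric matrix: eigenvalues are real

# Step 4: sort eigen pairs by descending eigenvalue;
#         the first pair is the first principal component
order = np.argsort(eig_vals)[::-1]
eig_vals, eig_vecs = eig_vals[order], eig_vecs[:, order]
print('Eigen values:', eig_vals)
print('First principal component:', eig_vecs[:, 0])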
Principal Component Analysis (Performance issues)
1. PCA's effectiveness depends on the scales of the attributes. If attributes have different scales, PCA will pick the variable with the highest raw variance rather than picking attributes based on correlation (demonstrated in the sketch below).
2. The presence of skew in the data, with a long thick tail, can reduce the effectiveness of PCA (related to point 1).
Sol: PCA-iris.ipynb
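A small hedged demonstration of point 1, assuming scikit-learn and its bundled Iris data (which the referenced notebook also appears to use): without standardization, PC1's loadings are dominated by the feature with the largest raw variance (petal length); after standardization, the loadings are more balanced.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = load_iris().data

# PCA on raw data: the largest-variance feature dominates PC1
pc1_raw = PCA(n_components=1).fit(X).components_[0]

# PCA on standardized data: loadings reflect correlations, not raw scale
pc1_std = PCA(n_components=1).fit(StandardScaler().fit_transform(X)).components_[0]

print('Raw feature variances :', np.round(X.var(axis=0), 2))
print('PC1 loadings (raw)    :', np.round(pc1_raw, 2))
print('PC1 loadings (scaled) :', np.round(pc1_std, 2))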
Principal Component Analysis (Signal to noise ratio)
[Figure: signal vs. noise in a 2-D scatter. Signal – all valid values for a variable, shown between the max and min values on the x and y axes; it represents the valid data. The 1st principal component lies along the signal direction.]

4. Multiplying the two matrices produces a matrix of total variance, also called the covariance matrix (a square and symmetric matrix).
5. The axis rotation is done such that the new dimension captures the maximum variance in the data points and also reduces the total error of representation.
8. For this we have to transform the matrix A to a new matrix B such that the covariance matrix of B is a diagonal matrix (refer to Part 2, bullet 5). A numerical check follows below.

Code fragments from this slide, consolidated and corrected:

X_std = StandardScaler().fit_transform(X)
eig_vals, eig_vecs = np.linalg.eig(cov_matrix)
print('Eigen Vectors \n%s' % eig_vecs)
print('\n Eigen Values \n%s' % eig_vals)
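Bullet 8 can be verified numerically. A minimal sketch, under the assumption that A holds centered samples in rows and B = A V, where V stacks the eigenvectors of A's covariance matrix: the covariance of B comes out diagonal, with the eigenvalues on the diagonal.

import numpy as np

# Hypothetical centered data matrix A (samples in rows)
rng = np.random.default_rng(3)
A = rng.multivariate_normal(
    [0, 0, 0],
    [[2.0, 0.8, 0.4], [0.8, 1.5, 0.2], [0.4, 0.2, 1.0]],
    size=2000,
)
A = A - A.mean(axis=0)

# Eigen decomposition of A's covariance matrix
cov_A = np.cov(A, rowvar=False)
eig_vals, eig_vecs = np.linalg.eigh(cov_A)

# Transform A to B by rotating onto the eigenvector axes
B = A @ eig_vecs

# Covariance of B is (numerically) diagonal, with eigenvalues on the diagonal
print(np.round(np.cov(B, rowvar=False), 3))
print(np.round(eig_vals, 3))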
Thanks