100% found this document useful (1 vote)
476 views

Applications of Linear Algebra in Facial Recognition

- The document discusses applications of linear algebra, specifically principal component analysis (PCA) and eigenfaces, for facial recognition. - PCA and eigenfaces are commonly used linear subspace methods that represent faces as weighted sums of eigenfaces in a lower-dimensional face space. - The mathematics of PCA and eigenfaces for facial recognition are described, including computing the mean face, eigenvectors of the covariance matrix to form the eigenfaces, and projecting faces onto this face space for recognition.

Uploaded by

RAJ JAISWAL
Copyright
© © All Rights Reserved
Available Formats
Download as PDF, TXT or read online on Scribd
100% found this document useful (1 vote)
476 views

Applications of Linear Algebra in Facial Recognition

- The document discusses applications of linear algebra, specifically principal component analysis (PCA) and eigenfaces, for facial recognition. - PCA and eigenfaces are commonly used linear subspace methods that represent faces as weighted sums of eigenfaces in a lower-dimensional face space. - The mathematics of PCA and eigenfaces for facial recognition are described, including computing the mean face, eigenvectors of the covariance matrix to form the eigenfaces, and projecting faces onto this face space for recognition.

Uploaded by

RAJ JAISWAL
Copyright
© © All Rights Reserved
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 3

Applications of Linear Algebra in Facial

Recognition
Harsh Sumit Bhimte Harshit Gupta Akash Tyagi
Dept. of IT IIIT Allahabad Dept. of IT IIIT Allahabad Dept. of IT IIIT Allahabad Dept. of IT IIIT Allahabad
MSG2022024 MSG2022025 MSG2022026 MSG2022027

Abstract—The very common fact that the facial structure of The above table shows various approaches (both linear and
a person does not match with any other person and is unique nonlinear subspaces) used in facial recognition. The linear
form the very basis of Facial recognition systems and therefore subspaces methods primarily used are eigenfaces and different
making the face-matching a lot more easier and accurate and
this started in late 1960s, which is upgraded further and is being variations of linear PCA and linear discriminant analysis while
used till date throughout the world for security purposes.For a the nonlinear subspace methods used are Nonlinear PCA and
Facial Recognition system to be accurate, the identification of kernel disciminant ananlysis.
face and its structure should be done carefully and are then Accuracy is of utmost concern for such kind of systems
converted to two-dimensional computerised data.A database of
face images as well as an efficient algorithm are a pre-requisite
which are based on facial recognition. However, this system
for a facial recognition problem. In this paper, we explore the can encounter a difficult situation where a person is not granted
applications of Linear Algebra which is used for Face recognition access because of the differences which occur in saved images
specifically Eigenfaces as well as Principal Component Analysis. of the database due to different facial expressions, lighting
In the recognition process, we form an eigenface of a given face conditions, age, shifting and scaling of image etc.
image and compare it with eigenfaces stored in our database
on the basis of Euclidian distances between them. The person A face image can be converted be to a 2-D N x N matrix
resembles the eigenfaces which have least Euclidian distance which consist of vectors or values of intensity of N2 dimen-
between them. sion. For example, an image which is using RGB channels
Index Terms—Principal Component Analysis, Eigenvalues, (8-bit or 256-bit values) can be considered as of a vector
Eigenvectors, Euclidean Distance of dimension 65,536 or a point in 65,536-dimensional space.
There are various algorithms that can be used for recognition
I. I NTRODUCTION of faces like Support vector machines, Eigenfaces, wavelet
transformation, Principal Component Analysis etc.
Facial recognition has wide areas of applications from
biometrics, to information security and access control. Fur-
thermore, face recognition is classified into Sex determina-
tion, Face identification, surveillance in crowded places, Face
classification, and Video content indexing. In areas of IOT-
based systems, facial recognition can be used in place of
classic methods such as PINs, identity cards, access cards
etc. For a person to be recognised, the system captures an
image to be compared with the images that are present in
the database for facial recognition and to be granted access.

Fig 1.2 Comparison of Algorithms in Face Recognition


From the above table, the most efficient results are from
the Principal Component Analysis and Eigenfaces for facial
recognition system in terms of accuracy.
However, the most popular methods that have shown fruitful
results include PCA and eigenfaces.For image compression
and image recognition the most used and suggested method
Fig 1.1 Various approaches used in facial recognition is Principal Component Analysis. PCA can be thought as an
example of factor analysis, a statistical method used for de- be considered as closeness to identification.Otherwise, training
scribing variability among the data. One of the main objectives of the system has to be done for that individual.
of PCA is for dimensionality reduction i.e. to reduce a large
dimension features dataset to smaller dimensions efficiently. III. M ATHEMATICS OF PCA AND E IGENFACES
Since PCA is a technique used for linear domain, applications The method of using eigenfaces for facial recognition is
having linear models are more desirable. The primary concept quite a simple and elegant technique. A two-dimensional
behind PCA is to convert the one-dimensional vector of pixels image of a face can be expressed as a simple one-dimensional
composed from the two-dimensional face in the feature space vector by combining every row or column into one large
as principal components. This is known as a projection of vector. Let us assume that we have M different images(M
eigenspace. This eigenspace is computed from the eigenvectors vectors) of size N. pj represents the intensity of the pixels.
of the covariance matrix of the vectors of the facial image.
xi = [p1 ...pN ]T , i = 1, ..., M (1)
Once these eigenvectors are calculated, various decisions can
be made on the basis of the applications. The average or mean image is then computed and mean
For recognition of a face, the principal component analysis centring is performed for the M vectors i.e finding difference
is used for the calculation of the basis of the feature space. from the mean image vector for each vector. Let us denote
The eigenvectors calculated from PCA are the basis vectors mean image by m
M
of this feature space and are in the same direction of the 1 X
training vectors known as eigenfaces. Every eigenface can be m= xi (2)
M i=1
thought of as a feature. For a face that has been projected
on the face space, its corresponding vector will explain the Let wi represent mean-centred image vector for vector i [2]
importance of these features.The eigenfaces coefficients can
w i = xi − m (3)
express a face in the face space.In this way, we can handle
any large image vector by considering the eigenfaces and The projection on the wi ’s is maximum from which eigen-
reduce the size.Reconstruction of the original image from vectors ei ’s is calculated here,which can be done by finding
image space is also posssible although with some error as orthonormal vectors ei for which we can maximize
dimension of the image space ia much larger than the face M
space. Each of the faces in the training data is converted into 1 X T
λi = (e wn )2 (4)
the face space and its corresponding components are stored. M n=1 i
These training data populate the face space.The calculation in
using the Orthogonality Constraint
the training set for the principal components of the faces is
done first. After this eigenfaces is used for the formation of eTl ek = δl k (5)
vector space in which recognition is done by projecting of the
faces. A comparison between the Euclidian distances of the The eigenvalues and eigenvectors of the covariance matrix are
eigenvectors of the eigenfaces and the eigenface of the image denoted by λi ’s and ei ’s respectively.
is made.The identification of a person is directly related to the C = WWT (6)
distance between the two, where smaller the distance can be
considered as closeness to identification.Otherwise, training of where W is the matrix created from the mean centered vectors
the system has to be done for that individual. wi ’s where each wi is a column vector. Depending on the
size of M i.e the number of images sampled and the size of
II. PCA IMPLEMENTATION N(size of each image), the size of covariance matwi ’srix can
be huge. For example, An image of size 128 x 128 leads
The applications of linear algebra is vast and very known. to a covariance matrix of 16384 x 16384. For such a large
And the most commonly used with best result is Principal dimension matrix, it may not be practical to compute the ei ’s
Component Analysis. PCA due to its simpleness is used using conventional methods. The M x M matrix W T W can
widely as a method for the extraction of useful information be used for the calculation of eigenvalues and eigenvectors in
from the huge and unfathomable data sets, in all forms of order to solve the problem caused due to its larger dimensions.
analysis - from neuroscience to computer graphics. PCA re- Let di and µi represent the eigenvalues and eigenvectors for
duces human effort by providing a roadmap for the dimension WTW.
reduction of complex data which is sometimes hidden. W T W di = µi di
For each faces in the training data set their Principal Multiply by W on both sides, we get
component analysis are calculated.Projection of the face on
W W T (W di ) = µi (W di ) (7)
the vector space formed by the eigenfaces is used for the
recognition. A comparison between the Euclidian distances W di and i gives the first M - 1 eigenvalues i and eigenvectors
of the eigenvectors of the eigenfaces and the eigenface of the ei of W W T ,normalization is done for W di to be equal to ei .
image is made. The identification of a person is directly related The rank is limited to M - 1 for the covariance matrix, since
to the distance between the two, where smaller the distance can we add M different vectors(and -1 since we subtract the mean
vector). The eigenvectors which have non-zero eigenvalues R EFERENCES
produce an orthonormal basis with which we can represent [1] Neeraj Kumar, Nirvikar, 2013, An Application of Linear Algebra for
most of the image data with less no of dimensions. The the Optimal Image Recognition, INTERNATIONAL JOURNAL OF
eigenvectors are sorted in descending order of magnitude ENGINEERING RESEARCH TECHNOLOGY (IJERT) Volume 02,
Issue 02 (February 2013),
of their respective eigenvalues. So the eigenvectors which [2] Kim, K. (1996, August). Face recognition using principle component
have near-zero eigenvalues are in the last position since they analysis. In International Conference on Computer Vision and Pattern
represent the least variance in the image. The eigenvector Recognition (Vol. 586, p. 591).
with the largest eigenvalue represents the most variance in
the image(principal components). These eigenvalues decrease
in an exponential way, which means that most of the variance
in the image can be represented by the first 10% to 15% of
the dimensions.
By using the equation given below a facial image is pro-
′ ′ ′
jected onto M (where M ≪ M )

′ T
Ω = [v1 v2 .....vM ] (8)

where vi = eTi wi . In the new space, vi is the ith coordinate


of the facial image, which is the principal component and ei ’s
are the eigenfaces.The facial image is represented with the
help eigenfaces, and their contribution is denoted by Ω. To
tell which class an input image belongs to is to find the face
class k that minimizes the euclidean distance.

ϵk = ||(Ω − Ωk )|| (9)

where vector for k th face class is denoted by Ωk . A face


is categorized into class k if the value of ϵk calculated from
above equation is smaller than a predefined threshold.
The flowchart given below explains the steps that are used
in facial recognition using eigenfaces and PCA.

Fig 1.3 Flowchart of the Eigenfaces Algorithm

IV. L INEAR D ISCRIMINANT A NALYSIS (LDA):

Uopt = argmax (10)

The number of courses is M.The number of samples Ni in


the class i and the mean of class is µi .

You might also like