Linear Discriminant Analysis
[Diagram: choosing between MANOVA and LDA. In LDA the dependent variable Y is categorical and the independent variables X1, X2 are continuous; MANOVA reverses these roles (continuous dependents, categorical independents).]
Linear Discriminant Analysis
Example: how to predict a medical condition, say throat cancer.
Linear Discriminant Analysis - Illustration
Linear Discriminant Analysis – Understanding Variance
Linear Discriminant Analysis - Understanding Covariance
Linear Discriminant Analysis
Wilks' Lambda – λ
λ = within-group variance / total variance
A low λ means the within-group variance is small relative to the total variance, i.e., the groups are well separated.
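A minimal NumPy sketch of this ratio on made-up one-dimensional data (all names and values here are illustrative, not from the slides):

```python
import numpy as np

# Two illustrative groups of 1-D observations (hypothetical data).
group_a = np.array([4.0, 2.0, 2.0, 3.0, 4.0])
group_b = np.array([9.0, 6.0, 9.0, 8.0, 10.0])

pooled = np.concatenate([group_a, group_b])

# Within-group sum of squares: deviations from each group's own mean.
ss_within = ((group_a - group_a.mean()) ** 2).sum() + \
            ((group_b - group_b.mean()) ** 2).sum()

# Total sum of squares: deviations from the grand mean.
ss_total = ((pooled - pooled.mean()) ** 2).sum()

wilks_lambda = ss_within / ss_total
print(wilks_lambda)  # close to 0 => groups are well separated
```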
Linear Discriminant Analysis
[Figure: two classes, A and B, scattered against Tenure on the original axes; S1 and S2 mark the within-class scatters and μ1, μ2 the class means.]
The difference between the class means needs to be maximized.
Linear Discriminant Analysis – Calculate the mean Values
I. Let's calculate the mean values.

A1: (4, 1), (2, 4), (2, 3), (3, 6), (4, 4)
A2: (9, 10), (6, 8), (9, 5), (8, 7), (10, 8)

μ1 = (15/5, 18/5) = (3, 3.6)
μ2 = (42/5, 38/5) = (8.4, 7.6)

Deviations from the class means:
(x - μ1): (1, -2.6), (-1, 0.4), (-1, -0.6), (0, 2.4), (1, 0.4)
(x - μ2): (0.6, 2.4), (-2.4, 0.4), (0.6, -2.6), (-0.4, -0.6), (1.6, 0.4)
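A quick NumPy check of these means and deviations (array names are mine):

```python
import numpy as np

A1 = np.array([[4, 1], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)
A2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)

mu1 = A1.mean(axis=0)   # -> [3.0, 3.6]
mu2 = A2.mean(axis=0)   # -> [8.4, 7.6]

print(A1 - mu1)  # deviations (x - mu1)
print(A2 - mu2)  # deviations (x - mu2)
```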
Linear Discriminant Analysis – Within-Class Scatter Matrix

II. Let's calculate the within-class scatter matrix. Each deviation (x - μ1) is a 2x1 column; multiplying it by its 1x2 transpose gives a 2x2 outer product, and S1 averages these over the five points of A1 (see the NumPy sketch below):

1. (1, -2.6):  [1 -2.6; -2.6 6.76]
2. (-1, 0.4):  [1 -0.4; -0.4 0.16]
3. (-1, -0.6): [1 0.6; 0.6 0.36]
4. (0, 2.4):   [0 0; 0 5.76]
5. (1, 0.4):   [1 0.4; 0.4 0.16]

Sum: [4 -2; -2 13.2], and dividing by 5:

S1 = [0.8 -0.4; -0.4 2.64]
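The same computation in NumPy (a sketch; the function name class_scatter is mine, and the arrays repeat the worked example's data so the snippet runs on its own):

```python
import numpy as np

A1 = np.array([[4, 1], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)
A2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)

def class_scatter(X):
    """Average outer product of the deviations from the class mean."""
    D = X - X.mean(axis=0)
    return D.T @ D / len(X)

S1 = class_scatter(A1)   # -> [[0.8, -0.4], [-0.4, 2.64]]
S2 = class_scatter(A2)   # -> [[1.84, -0.04], [-0.04, 2.64]]
print(S1, S2, sep="\n")
```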
Linear Discriminant Analysis – Within-Class Scatter Matrix Sw

Computing S2 from A2 and μ2 in the same way:

S1 = [0.8 -0.4; -0.4 2.64]    S2 = [1.84 -0.04; -0.04 2.64]

Sw = S1 + S2 = [2.64 -0.44; -0.44 5.28]
Linear Discriminant Analysis – Between-Class Scatter Matrix

Let's calculate the between-class scatter matrix SB = (μ1 - μ2)(μ1 - μ2)T:

μ1 - μ2 = (3 - 8.4, 3.6 - 7.6) = (-5.4, -4)

SB = [29.16 21.6; 21.6 16]
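Continuing in NumPy (again a self-contained sketch with my own variable names):

```python
import numpy as np

A1 = np.array([[4, 1], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)
A2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)
mu1, mu2 = A1.mean(axis=0), A2.mean(axis=0)

def class_scatter(X, mu):
    D = X - mu
    return D.T @ D / len(X)

# Within-class scatter: sum of the two per-class scatter matrices.
Sw = class_scatter(A1, mu1) + class_scatter(A2, mu2)
print(Sw)                 # -> [[2.64, -0.44], [-0.44, 5.28]]

# Between-class scatter: outer product of the mean difference.
d = mu1 - mu2             # -> (-5.4, -4.0)
SB = np.outer(d, d)
print(SB)                 # -> [[29.16, 21.6], [21.6, 16.0]]
```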
Linear Discriminant Analysis – Projection Vector

e = Sw⁻¹ (μ1 - μ2)

For a 2x2 matrix [a b; c d], the inverse is (1 / (ad - bc)) [d -b; -c a]. Hence:

e = [2.64 -0.44; -0.44 5.28]⁻¹ (-5.4, -4)
  = (1 / (2.64 × 5.28 - (-0.44)²)) [5.28 0.44; 0.44 2.64] (-5.4, -4)
  = (1 / 13.746) [5.28 0.44; 0.44 2.64] (-5.4, -4)
  ≈ [0.384 0.032; 0.032 0.192] (-5.4, -4)
  ≈ (-2.20, -0.94)

The projection vector is e ≈ (-2.20, -0.94).
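Numerically, the direction can be found without writing out the inverse: np.linalg.solve solves Sw e = (μ1 - μ2) directly (a sketch, repeating the setup so it runs on its own):

```python
import numpy as np

A1 = np.array([[4, 1], [2, 4], [2, 3], [3, 6], [4, 4]], dtype=float)
A2 = np.array([[9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)
mu1, mu2 = A1.mean(axis=0), A2.mean(axis=0)

# Within-class scatter (both classes have 5 points, so one division by 5).
Sw = ((A1 - mu1).T @ (A1 - mu1) + (A2 - mu2).T @ (A2 - mu2)) / len(A1)

# e = Sw^{-1} (mu1 - mu2); solving the system avoids forming the inverse.
e = np.linalg.solve(Sw, mu1 - mu2)
print(e)   # -> approximately [-2.20, -0.94]
```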
Eigen Vectors and Eigen Values

A = [3 0; 1 2]

A (1, 1) = (3, 3) = 3 · (1, 1), so e1 = (1, 1) is an eigenvector of A with eigenvalue 3.

Scaling an eigenvector does not change this relationship. Multiplying e1 by 2: A(2e1) = 2(Ae1) = 3 · (2e1). Likewise, for the second eigenvector e2 = (0, 1), with Ae2 = 2e2, multiplication by 7 gives A(7e2) = 7(Ae2) = 14e2.
In general, A x1 = λ x1: a matrix maps its eigenvectors to scaled copies of themselves.

For A = [1 2; 2 4]:

x1 = (1, 1): A x1 = (3, 6), which is not a multiple of (1, 1). The vector is rotated, so (1, 1) is not an eigenvector.

x1 = (1, 2): A x1 = (5, 10) = 5 · (1, 2). No rotation, only scaling, so (1, 2) is an eigenvector of A with eigenvalue λ = 5.
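The same two checks with NumPy's eigen-decomposition (a sketch):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

print(A @ np.array([1.0, 1.0]))   # -> [3. 6.]: rotated, so (1, 1) is not an eigenvector
print(A @ np.array([1.0, 2.0]))   # -> [5. 10.] = 5 * (1, 2): an eigenvector

# Full eigen-decomposition; this rank-1 matrix has eigenvalues 0 and 5.
vals, vecs = np.linalg.eig(A)
print(vals)   # eigenvalues (order not guaranteed)
print(vecs)   # unit-length eigenvectors as columns; one is parallel to (1, 2)
```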
LDA and PCA
[Figure: the same two-class scatter projected onto two candidate directions. Projecting onto λ1 mixes the classes (bad projection), while projecting onto λ2 keeps them apart (good projection).]
LDA and PCA
Source - https://stackoverflow.com/questions/33576963/dimensions-reduction-in-matlab-using-pca
LDA and PCA
LDA:
- Discovers the relationship between the dependent and independent variables.
- Used for variable reduction based on the strength of the relationship between the independent and dependent variables.
- Finds the direction that maximizes the difference between the two classes.

PCA:
- Discovers relationships among the independent variables.
- Used for variable reduction based on collinearity of the independent variables.
- Finds the direction that maximizes the variance in the data.
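A minimal scikit-learn sketch of this contrast, reusing the worked example's points (assumes scikit-learn is available; with two classes, LDA yields a single discriminant direction):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Same ten points as the worked example, with class labels A=0, B=1.
X = np.array([[4, 1], [2, 4], [2, 3], [3, 6], [4, 4],
              [9, 10], [6, 8], [9, 5], [8, 7], [10, 8]], dtype=float)
y = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])

# LDA uses the labels: finds the direction that separates the classes.
lda = LinearDiscriminantAnalysis(n_components=1).fit(X, y)

# PCA ignores the labels: finds the direction of maximum total variance.
pca = PCA(n_components=1).fit(X)

print(lda.transform(X).ravel())   # 1-D class-separating projection
print(pca.components_[0])         # top variance direction
```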