Chapter Nineteen: Factor Analysis
Factor Analysis
Chapter Outline
1) Overview
2) Basic Concept
3) Factor Analysis Model
4) Statistics Associated with Factor Analysis
5) Conducting Factor Analysis
   i. Problem Formulation
   ii. Construction of the Correlation Matrix
   iii. Method of Factor Analysis
   iv. Number of Factors
   v. Rotation of Factors
   vi. Interpretation of Factors
6) Applications of Common Factor Analysis
7) Internet and Computer Applications
8) Focus on Burke
9) Summary
10) Key Terms and Concepts
Factor Analysis
Factor analysis is a general name denoting a class of procedures primarily used for data reduction and summarization. Factor analysis is an interdependence technique in that an entire set of interdependent relationships is examined without making the distinction between dependent and independent variables.
Factor analysis is used in the following circumstances:
- To identify underlying dimensions, or factors, that explain the correlations among a set of variables.
- To identify a new, smaller set of uncorrelated variables to replace the original set of correlated variables in subsequent multivariate analysis (regression or discriminant analysis).
- To identify a smaller set of salient variables from a larger set for use in subsequent multivariate analysis.
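As an aside from the SPSS-based treatment in this chapter, the data-reduction idea can be sketched in a few lines of Python. The data matrix, the variable names, and the choice of two factors below are hypothetical, and scikit-learn's FactorAnalysis is used only as a stand-in for the SPSS procedure.

```python
# Minimal sketch: replace six correlated variables with two factors.
# The ratings matrix here is random placeholder data, not the chapter's example.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
ratings = rng.normal(size=(30, 6))            # 30 respondents, variables V1-V6

Z = StandardScaler().fit_transform(ratings)   # standardize the variables
fa = FactorAnalysis(n_components=2, random_state=0).fit(Z)

loadings = fa.components_.T                   # weights of V1-V6 on the two factors
scores = fa.transform(Z)                      # two factor scores per respondent
print(loadings.round(3))
print(scores[:3].round(3))
```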
X_i = A_i1 F_1 + A_i2 F_2 + A_i3 F_3 + . . . + A_im F_m + V_i U_i
where
X_i = ith standardized variable
A_ij = standardized multiple regression coefficient of variable i on common factor j
F_j = common factor
V_i = standardized regression coefficient of variable i on unique factor i
U_i = the unique factor for variable i
m = number of common factors
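To make the model concrete, the following sketch (all loadings and factor values are made up for illustration) generates data that satisfy it exactly, with m = 2 common factors and six variables.

```python
# Illustrative data generated from the common factor model:
#   X_i = A_i1 F_1 + ... + A_im F_m + V_i U_i
import numpy as np

rng = np.random.default_rng(1)
n, m, p = 500, 2, 6
A = rng.uniform(-0.7, 0.7, size=(p, m))        # loadings of variable i on factor j
V = np.sqrt(1 - (A ** 2).sum(axis=1))          # unique-factor coefficients (unit variance)

F = rng.normal(size=(n, m))                    # common factors
U = rng.normal(size=(n, p))                    # unique factors, one per variable
X = F @ A.T + U * V                            # each column is one standardized variable

print(np.corrcoef(X, rowvar=False).round(2))   # correlations implied by the common factors
```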
It is possible to select weights or factor score coefficients so that the first factor explains the largest portion of the total variance. Then a second set of weights can be selected, so that the second factor accounts for most of the residual variance, subject to being uncorrelated with the first factor. This same principle could be applied to selecting additional weights for the additional factors.
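A rough numerical sketch of this principle uses the eigenvectors of the correlation matrix of a hypothetical standardized data matrix Z as the weight vectors.

```python
# Principal components as successively chosen weight vectors: the first
# explains the most variance, later ones the most residual variance while
# remaining uncorrelated with the earlier ones. Z is placeholder data.
import numpy as np

rng = np.random.default_rng(2)
Z = rng.normal(size=(200, 6))
Z = (Z - Z.mean(axis=0)) / Z.std(axis=0)

R = np.corrcoef(Z, rowvar=False)                 # correlation matrix of the variables
eigvals, eigvecs = np.linalg.eigh(R)             # eigh returns values in ascending order
order = np.argsort(eigvals)[::-1]
eigvals, W = eigvals[order], eigvecs[:, order]   # columns of W are the weight vectors

scores = Z @ W                                   # component (factor) scores
print(eigvals.round(3))                                    # variance explained, largest first
print(np.corrcoef(scores, rowvar=False).round(3))          # identity matrix: scores are uncorrelated
```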
Bartlett's test of sphericity. Bartlett's test of sphericity is a test statistic used to examine the hypothesis that the variables are uncorrelated in the population. In other words, the population correlation matrix is an identity matrix; each variable correlates perfectly with itself (r = 1) but has no correlation with the other variables (r = 0).
Correlation matrix. A correlation matrix is a lower triangular matrix showing the simple correlations, r, between all possible pairs of variables included in the analysis. The diagonal elements, which are all 1, are usually omitted.
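A short Python sketch of Bartlett's test, using the standard chi-square approximation and assuming R is the sample correlation matrix from n observations on p variables:

```python
# Bartlett's test of sphericity: H0 says the population correlation matrix
# is an identity matrix, i.e. the variables are uncorrelated.
import numpy as np
from scipy.stats import chi2

def bartlett_sphericity(R, n):
    p = R.shape[0]
    stat = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) // 2
    return stat, df, chi2.sf(stat, df)   # test statistic, degrees of freedom, p-value

# Example call (R and n would come from the actual data):
# stat, df, pval = bartlett_sphericity(R, n=30)
```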
Communality. Communality is the amount of variance a variable shares with all the other variables being considered. This is also the proportion of variance explained by the common factors.
Eigenvalue. The eigenvalue represents the total variance explained by each factor.
Factor loadings. Factor loadings are simple correlations between the variables and the factors.
Factor loading plot. A factor loading plot is a plot of the original variables using the factor loadings as coordinates.
Factor matrix. A factor matrix contains the factor loadings of all the variables on all the factors extracted.
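For an orthogonal solution these quantities can be read directly off the factor matrix; the small sketch below uses a hypothetical loading matrix L (rows = variables, columns = factors).

```python
# Communalities are row sums of squared loadings; the variance explained by
# each factor (its eigenvalue in a components solution) is the column sum.
import numpy as np

L = np.array([[ 0.9,  0.2],      # hypothetical loadings for V1..V6
              [-0.2,  0.7],
              [ 0.9,  0.0],
              [-0.2,  0.7],
              [-0.8, -0.3],
              [-0.1,  0.8]])

communalities = (L ** 2).sum(axis=1)       # variance shared with the common factors
explained = (L ** 2).sum(axis=0)           # total variance explained by each factor
pct_of_variance = 100 * explained / L.shape[0]

print(communalities.round(3), explained.round(3), pct_of_variance.round(1))
```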
Factor scores. Factor scores are composite scores estimated for each respondent on the derived factors.
Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy. The Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy is an index used to examine the appropriateness of factor analysis. High values (between 0.5 and 1.0) indicate factor analysis is appropriate. Values below 0.5 imply that factor analysis may not be appropriate.
Percentage of variance. The percentage of the total variance attributed to each factor.
Residuals. Residuals are the differences between the observed correlations, as given in the input correlation matrix, and the reproduced correlations, as estimated from the factor matrix.
Scree plot. A scree plot is a plot of the eigenvalues against the number of factors in order of extraction.
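One common way to compute the overall KMO index is sketched below; it assumes a nonsingular sample correlation matrix R, with partial correlations taken from the inverse of R.

```python
# KMO compares the sizes of the observed correlations with the sizes of the
# partial correlations; values near 1 support the use of factor analysis.
import numpy as np

def kmo(R):
    S = np.linalg.inv(R)
    d = np.sqrt(np.outer(np.diag(S), np.diag(S)))
    partial = -S / d                          # anti-image (partial) correlations
    off = ~np.eye(R.shape[0], dtype=bool)     # off-diagonal mask
    r2 = (R[off] ** 2).sum()
    q2 = (partial[off] ** 2).sum()
    return r2 / (r2 + q2)
```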
The objectives of factor analysis should be identified. The variables to be included in the factor analysis should be specified based on past research, theory, and judgment of the researcher. It is important that the variables be appropriately measured on an interval or ratio scale. An appropriate sample size should be used. As a rough guideline, there should be at least four or five times as many observations (sample size) as there are variables.
Correlation Matrix
Table 19.2
Lower-triangle matrix of the simple correlations among variables V1-V6 (the individual correlation values were not recovered).
Factor Matrix
Loadings of variables V1-V6 on the extracted factors (values not recovered here; a factor matrix with values appears with the results later in the chapter).
The lower left triangle contains the reproduced correlation matrix; the diagonal, the communalities; the upper right triangle, the residuals between the observed correlations and the reproduced correlations.
Scree Plot (Fig 19.2)
Eigenvalues plotted against the component number, in order of extraction.
Although the initial or unrotated factor matrix indicates the relationship between the factors and individual variables, it seldom results in factors that can be interpreted, because the factors are correlated with many variables. Therefore, through rotation the factor matrix is transformed into a simpler one that is easier to interpret. In rotating the factors, we would like each factor to have nonzero, or significant, loadings or coefficients for only some of the variables. Likewise, we would like each variable to have nonzero or significant loadings with only a few factors, if possible with only one. The rotation is called orthogonal rotation if the axes are maintained at right angles.
The most commonly used method for rotation is the varimax procedure. This is an orthogonal method of rotation that minimizes the number of variables with high loadings on a factor, thereby enhancing the interpretability of the factors. Orthogonal rotation results in factors that are uncorrelated. The rotation is called oblique rotation when the axes are not maintained at right angles, and the factors are correlated. Sometimes, allowing for correlations among factors can simplify the factor pattern matrix. Oblique rotation should be used when factors in the population are likely to be strongly correlated.
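A compact sketch of the varimax idea, adapted from the standard iterative SVD formulation; the unrotated orthogonal loading matrix L is assumed to come from the extraction step.

```python
# Varimax: find an orthogonal rotation matrix T that simplifies the loading
# pattern, so each variable loads highly on as few factors as possible.
import numpy as np

def varimax(L, gamma=1.0, max_iter=100, tol=1e-6):
    p, k = L.shape
    T = np.eye(k)
    d = 0.0
    for _ in range(max_iter):
        d_old = d
        B = L @ T
        U, S, Vt = np.linalg.svd(
            L.T @ (B ** 3 - (gamma / p) * B @ np.diag((B ** 2).sum(axis=0))))
        T = U @ Vt
        d = S.sum()
        if d_old != 0 and d / d_old < 1 + tol:
            break
    return L @ T          # rotated loadings; factors remain uncorrelated
```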
A factor can then be interpreted in terms of the variables that load high on it. Another useful aid in interpretation is to plot the variables, using the factor loadings as coordinates. Variables at the end of an axis are those that have high loadings on only that factor, and hence describe the factor.
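A short matplotlib sketch of such a plot; the loading matrix L and the variable labels are hypothetical.

```python
# Plot each variable at (loading on factor 1, loading on factor 2).
import numpy as np
import matplotlib.pyplot as plt

labels = ["V1", "V2", "V3", "V4", "V5", "V6"]
L = np.array([[ 0.9,  0.2], [-0.2,  0.7], [ 0.9,  0.0],
              [-0.2,  0.7], [-0.8, -0.3], [-0.1,  0.8]])

fig, ax = plt.subplots()
ax.axhline(0, color="grey", linewidth=0.5)
ax.axvline(0, color="grey", linewidth=0.5)
ax.scatter(L[:, 0], L[:, 1])
for name, (x, y) in zip(labels, L):
    ax.annotate(name, (x, y))                 # label each point with its variable
ax.set_xlabel("Factor 1 loading")
ax.set_ylabel("Factor 2 loading")
plt.show()
```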
Factor Loading Plot
Variables V1-V6 plotted in the rotated factor space, with Component 1 and Component 2 as the axes (the plotted loading values were not recovered).
The factor scores for the ith factor may be estimated as follows:
F_i = W_i1 X_1 + W_i2 X_2 + W_i3 X_3 + . . . + W_ik X_k
where
F_i = estimate of the ith factor
W_i = weight or factor score coefficient
k = number of variables
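In matrix form this is F = Z W, where Z holds the standardized variables and W the factor score coefficients. The sketch below uses one common choice of weights (the regression method, W = R^-1 L); Z and L are assumed to come from the earlier steps.

```python
# Factor scores as weighted sums of the standardized variables: F_i = sum_j W_ij X_j.
import numpy as np

def factor_scores(Z, L):
    R = np.corrcoef(Z, rowvar=False)      # correlation matrix of the variables
    W = np.linalg.solve(R, L)             # factor score coefficients (regression method)
    return Z @ W                          # one row of factor scores per respondent
```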
By examining the factor matrix, one could select for each factor the variable with the highest loading on that factor. That variable could then be used as a surrogate variable for the associated factor. However, the choice is not as easy if two or more variables have similarly high loadings. In such a case, the choice between these variables should be based on theoretical and measurement considerations.
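Selecting surrogates mechanically amounts to taking, for each factor, the variable with the largest absolute loading; the loading values and names below are hypothetical.

```python
# One surrogate variable per factor: the variable with the highest absolute loading.
import numpy as np

names = np.array(["V1", "V2", "V3", "V4", "V5", "V6"])
L = np.array([[ 0.9,  0.2], [-0.2,  0.7], [ 0.9,  0.0],
              [-0.2,  0.7], [-0.8, -0.3], [-0.1,  0.8]])

surrogates = names[np.abs(L).argmax(axis=0)]   # one variable name per factor
print(surrogates)
```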
The correlations between the variables can be deduced or reproduced from the estimated correlations between the variables and the factors. The differences between the observed correlations (as given in the input correlation matrix) and the reproduced correlations (as estimated from the factor matrix) can be examined to determine model fit. These differences are called residuals.
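For an orthogonal solution the reproduced correlation matrix is L L', so the residuals can be checked with a few lines; R and L here are stand-ins for the matrices obtained earlier.

```python
# Reproduced correlations and residuals used to judge model fit.
import numpy as np

def residuals(R, L):
    R_hat = L @ L.T                   # correlations reproduced from the loadings
    resid = R - R_hat                 # observed minus reproduced
    np.fill_diagonal(resid, 0.0)      # ignore the diagonal (communalities)
    return R_hat, resid

# Many large residuals (say, above 0.05 in absolute value) indicate a poor fit.
```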
Bartlett's test of sphericity: approx. chi-square = 111.314, df = 15, significance = 0.00000
Kaiser-Meyer-Olkin measure of sampling adequacy = 0.660
Initial Eigenvalues

Factor   Eigenvalue   % of variance   Cumulative %
1        2.731        45.520           45.520
2        2.218        36.969           82.488
3        0.442         7.360           89.848
4        0.341         5.688           95.536
5        0.183         3.044           98.580
6        0.085         1.420          100.000
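Because each of the six standardized variables contributes one unit of variance, the eigenvalues sum to 6.0 and each factor's percentage of variance is simply its eigenvalue divided by 6: 2.731/6 = 45.52% for the first factor and 2.218/6 = 36.97% for the second, so the first two factors together account for about 82.49% of the total variance.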
Factor Matrix

Variables   Factor 1   Factor 2
V1            0.949      0.168
V2           -0.206      0.720
V3            0.914      0.038
V4           -0.246      0.734
V5           -0.850     -0.259
V6           -0.101      0.844
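As a check against the definition of communality given earlier, the communality of V1 in this two-factor solution is (0.949)^2 + (0.168)^2 = 0.901 + 0.028 ≈ 0.929, so the two common factors explain about 93% of the variance in V1.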
SPSS Windows
To select this procedure using SPSS for Windows, click: Analyze>Data Reduction>Factor