P5510 Lecture 8 - Factor Analysis Intro

Exploratory Factor Analysis
A collection of techniques for analyzing clusters of variables (items) that
are being treated as indicators of underlying factors.
Reason #1 for doing factor analysis: Identifying item clusters.
Situation: We have responses to a large number of items. We want to see if these
many, many items represent just a few clusters of items.
We try to identify clusters of items which
a) correlate highly among themselves and
b) correlate hardly at all with other variables
The clusters become psychological constructs. They may represent
important processes or structures within us.
Factor Analysis 1 - EFA - 1
Example of a collection of behaviors:
1. Am the life of the party.
2. Feel little concern for others.
3. Am always prepared.
4. Get stressed out easily.
5. Have a rich vocabulary.
6. Don't talk a lot.
7. Am interested in people.
8. Leave my belongings around.
9. Am relaxed most of the time.
10. Have difficulty understanding abstract ideas.
11. Feel comfortable around people.
12. Insult people.
13. Pay attention to details.
14. Worry about things.
15. Have a vivid imagination.
16. Keep in the background.
17. Sympathize with others' feelings.
18. Make a mess of things.
19. Seldom feel blue.
20. Am not interested in abstract ideas.
21. Start conversations.
22. Am not interested in other people's problems.
23. Get chores done right away.
24. Am easily disturbed.
25. Have excellent ideas.
26. Have little to say.
27. Have a soft heart.
28. Often forget to put things back in their proper place.
29. Get upset easily.
30. Do not have a good imagination.
31. Talk to a lot of different people at parties.
32. Am not really interested in others.
33. Like order.
34. Change my mood a lot.
35. Am quick to understand things.
36. Don't like to draw attention to myself.
37. Take time out for others.
38. Shirk my duties.
39. Have frequent mood swings.
40. Use difficult words.
41. Don't mind being the center of attention.
42. Feel others' emotions.
43. Follow a schedule.
44. Get irritated easily.
45. Spend time reflecting on things.
46. Am quiet around strangers.
47. Make people feel at ease.
48. Am exacting in my work.
49. Often feel blue.
50. Am full of ideas.
The above are 50 items from a common personality questionnaire.
Do they represent 50 different, distinct aspects of behavior? 10? 5? 1?
If I analyze each item separately, I’ll have to perform 50 separate analyses and
interpret 50 different results.
Will I be accused of “shotgunning” – doing a whole bunch of analyses in the hopes
of finding a significant relationship?
Will all those results be clearly different?
If some of the items are highly correlated with each other, then the results for those
similar items will be redundant.
Example of the “identifying clusters” use of factor analysis
The Big Five. Questionnaires with 100s of different descriptors were given to
persons who were asked to say to what extent those descriptors applied to
themselves.
Five clusters of items were “discovered.” They became known as the Big Five.
The descriptors were taken from dictionaries, not devised based on theories of
personality. For this reason, these kinds of studies – ones employing descriptors
taken from the language of the culture of the respondents – are called lexical studies.
In this conceptualization, factors are viewed as collectors or organizers of items.
This use of factor analysis is a bottom up process: Going from items at the
bottom to factors at the top.
Reason #2 for doing factor analysis: Identifying items that indicate constructs
that we believe exist for theoretical reasons.
Situation: We have a theory about the relationship(s) between two or more
psychological constructs.
Alas, psychological constructs are internal states that are not directly observable.
We can only observe the behaviors that presumably are the result of persons
having those internal states.
We want to identify as many specific behaviors that represent the constructs as
possible so that the theory about the relationships among the constructs can be
tested by computing the relationships among the behaviors.
We try to think of specific behaviors, usually responses to items on a
questionnaire, to serve as indicators of a construct. These are behaviors which
a) seem to be behaviors that arise from the presence of the construct of interest
b) correlate highly among themselves and
c) correlate less with other variables
Such behaviors are often called indicators of the psychological constructs.
A top down process, beginning with our interest in the dimensions and ending
with items that we believe represent those dimensions.
Example. Rosenberg (1965) said, “Clinical and experimental studies have
provided valuable insights into the nature of the self-image . . . , but we still know
very little about the nature and distribution of self-esteem, self-values, and
self-perceptions in the broader society.” From this belief in the importance of
self-esteem, he identified 10 behavioral descriptors which became the Rosenberg
Self-Esteem (RSE) scale.
There have been many studies factor analyzing just those ten items to codify the
existence of the self-esteem construct, e. g.,
Marsh, H. W., Scalas, L. F., & Nagengast, B. (2010). Longitudinal tests of competing
factor structures for the Rosenberg Self-Esteem Scale: Traits, ephemeral artifacts, and stable
response styles. Psychological Assessment, 22, 366-381.
Ralph Hood’s M scale. In 1972 Ralph believed there were 7 dimensions of
religious experience. He thought of items that represented each of the 7
dimensions. He then performed a factor analysis of all the items in the hope that 7
clusters would be identified in the analysis. Alas, the factor analysis identified 2
clusters, and these 2 have been the two dimensions of the scale of religious
experience he has used for the past 40+ years.
This is a case in which theory led to a factor analysis whose results led to a
change in the theory.
In this 2nd use of factor analysis, constructs are viewed as generators of items.
Note that both definitions involve two major concepts:
1. A large number of observed variables.
2. A smaller collection of unobserved factors (latent variables).
Concepts Important for Factor Analysis
Variance
More than you ever wanted to know about variance.
Variance of a single variable
Suppose N = 4.
Variance = Σ(X-M)²/(n-1) – a formula based on the differences between the scores and the mean.
Pictorially:
[Figure: four scores A, B, C, and D plotted around the mean, M, with each score’s difference from the mean marked.]
The above representation of the variance expresses it in terms of differences between scores and
the mean.
New information
The variance of a set of scores can also be computed in terms of the differences between all
possible pairs of scores.
For N = 4,
(A-B)2 (A-C)2 (A-D)2
(B-C)2 (B-D)2
(C-D)2
[Figure: the same four scores A, B, C, and D around the mean M, now with the differences between all possible pairs of scores marked.]
It can be shown that

    ΣΣ(Xi – Xj)²
    ------------     (summing over all pairs with i < j)
      n(n-1)

is also the variance.
So the variance of a set of scores equals the sum of the squared differences between all possible
pairs of scores, divided by n(n-1) – that is, half the average squared pairwise difference.
Example, from an Excel spreadsheet.
Values of X are 1, 2, 3, 4, and 5.

Deviations from the mean:

    x    m    x-m    (x-m)²
    1    3    -2     4
    2    3    -1     1
    3    3     0     0
    4    3     1     1
    5    3     2     4

    Σ(x-m)² = 10;  10 / (n-1) = 10 / 4 = 2.5 = Variance

Differences between all possible pairs:

    x1   x2   x1-x2   (x1-x2)²
    1    2    -1      1
    1    3    -2      4
    1    4    -3      9
    1    5    -4      16
    2    3    -1      1
    2    4    -2      4
    2    5    -3      9
    3    4    -1      1
    3    5    -2      4
    4    5    -1      1

    ΣΣ(x1-x2)² = 50;  50 / (n*(n-1)) = 50 / 20 = 2.5 = Variance
This result, while possibly interesting to those who are interested in things “mathematical,” is not
of much practical use if you have only one variable.
If it were, it would be better known.
But it is of value when you have two variables plotted in a scatterplot and are contemplating
variation between the points in the scatterplot.
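If you want to check the pairwise-differences identity yourself, here is a short Python sketch (Python is not used elsewhere in these notes; the numbers are the ones from the spreadsheet example above):

```python
import numpy as np
from itertools import combinations

x = np.array([1, 2, 3, 4, 5], dtype=float)
n = len(x)

# Usual definition: sum of squared deviations from the mean, divided by n-1
var_mean = ((x - x.mean()) ** 2).sum() / (n - 1)

# Pairwise definition: sum of squared differences over all pairs, divided by n(n-1)
var_pairs = sum((a - b) ** 2 for a, b in combinations(x, 2)) / (n * (n - 1))

print(var_mean, var_pairs)  # both are 2.5
```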
Variance of a scatterplot.
Suppose we have N pairs of scores on two variables.
Total Variance: The average of squared distances between all nonredundant pairs of points.
[Figure: two scatterplots of points A–D on axes Y1 and Y2 – one labeled “Scatterplot with large variance” (points spread far apart) and one labeled “Scatterplot with small variance” (points close together).]
This is a practical application of the result shown above. It shows how to get a measure of the
differences between points in space, no matter how many dimensions are represented.
Partitioning the total variance.
Using the Pythagorean theorem, it can be shown that each squared distance can be expressed as
the sum of squared differences in the two axis dimensions.
[Figure: right triangle connecting points A and B, with horizontal leg X, vertical leg Y, and hypotenuse H.]

(Recall H² = X² + Y².)  H = √(X² + Y²).  E.g., 5 = √(3² + 4²)
So, squared diagonal difference between A and B = Squared horizontal difference + squared
vertical difference.
That is, total variance = sum of individual variable variances = X variance + Y variance.
The total variance can be partitioned (separated) into as many pieces as there are dimensions.
Standardized Total Variance
If variables are standardized (Z-scores), so that the variance along each dimension = 1, then
Standardized Total Variance = standardized X variance + standardized Y variance
Since standardized X variance = standardized Y variance = 1,
Standardized Total Variance = number of dimensions.
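A quick numerical check of that claim (a Python sketch, not part of these notes’ SPSS runs; the data are arbitrary simulated scores):

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(size=(100, 2))            # 100 people measured on 2 variables

# Standardize each column (Z-scores), so each dimension has variance 1
z = (y - y.mean(axis=0)) / y.std(axis=0, ddof=1)

# Standardized total variance = sum of the per-dimension variances
total_var = z.var(axis=0, ddof=1).sum()
print(round(total_var, 6))               # 2.0 -- the number of dimensions
```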
Concentration of Variance along a line
[Figure: two scatterplots of points A–D with equal total variance – left panel, “Variance concentrated along 1 dimension” (the points hug a single line); right panel, “Variance spread between dimensions” (the points scatter in all directions).]
Contrast the two figures above.
Each has the same total variance – 2 if the variables are standardized.
But the left hand figure has that variance concentrated along a line. That line is called a factor
in the factor analysis literature.
The important point about a line of concentration is that differences along that line represent
most of the variation between the people represented by the points in the scatterplot.
Note also that if the points are clustered near a line, the correlation between the two variables on
which the persons are measured will be large.
So identifying lines of concentration allows us to identify dimensions that account for the most
variation between people – whatever that line is, it’s what distinguishes people from each other.
Simultaneously, identifying lines of concentration allows us to identify variables that are highly
correlated with each other. If a scatterplot has a line around which the points are tightly
clustered, that means the two variables are highly correlated.
This is a way of thinking about factor analysis – each factor is a line in a multidimensional
scatterplot, around which points representing people cluster, so that variation along that line
accounts for the greatest amount of variation between people.
The amount of variance along a line divided by the total variance of the points is a measure of
the proportion of total variance accounted for by the line.
Proportion of variance
Suppose we were to create a multi-dimensional scatterplot of all the variables in a collection.
Suppose also that all the points in that scatterplot were concentrated along one line, forming a
multidimensional sausage. The example below is a 3-dimensional sausage scatterplot.
This situation would be a 1-factor solution. All of the variables were correlated highly with each
other.
There are as many possible factors in a solution as there are variables.
A key quantity is the proportion of total variance along each of those lines.
This quantity is analogous to R2, except that when there are multiple factors, there will be
multiple lines of concentration, so you’ll have an R2 – a proportion of variance - for each factor.
One of the things we routinely look for in a factor analysis is the proportion of all variance
accounted for by the factors we’ve decided to work with.
If the proportion is small, then our factor analysis has not done much for us. If it’s large, then
that’s good.
Eigenvalues.
Eigenvalue: A quantity that is the result of the solution of a set of equations that arise as part of
the factor analytic procedure.
Also called the characteristic root of the equation solved as part of the factor analysis.
There will be as many eigenvalues in a factor analysis as there are variables.
Each eigenvalue is associated with a single factor in factor analysis.
The size of the eigenvalue corresponds to the amount of variance along the line represented by
the factor.
The ratio of an eigenvalue to the number of variables equals the proportion of variance along
the line corresponding to that factor (multiply by 100 for the percentage).
So eigenvalues and percentage of variance give the same information in different forms.
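These two facts – eigenvalues sum to the number of variables, and eigenvalue / number of variables is the proportion of variance – can be checked in Python using the 4-variable correlation matrix that appears in the SPSS example later in these notes:

```python
import numpy as np

# Observed correlation matrix from the SPSS example later in these notes
R = np.array([[1.000,  .607, -.130, -.240],
              [ .607, 1.000, -.116, -.167],
              [-.130, -.116, 1.000,  .588],
              [-.240, -.167,  .588, 1.000]])

eigenvalues = np.linalg.eigvalsh(R)[::-1]   # sorted largest first

# Eigenvalues always sum to the number of variables (the standardized total variance)
print(round(eigenvalues.sum(), 6))          # 4.0

# Each eigenvalue / number of variables = proportion of variance for that factor
print((eigenvalues / len(R)).round(3))
```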
Factor analysis equations.
Suppose we have identified two factors that account for responses of
four different items.
Suppose the relationships between items (Ys) and Factors (Fs) are as
follows
Fs are characteristics of persons.
Y1 = A11F1 + A12F2 + U1
Y2 = A21F1 + A22F2 + U2
Y3 = A31F1 + A32F2 + U3
Y4 = A41F1 + A42F2 + U4

There are two characteristics, F1 and F2, that cause 4 different behaviors, Y1, Y2, Y3, and Y4.
In the above, the letter, A, is the loading of an item on a factor
Loadings represent the extent to which items are related to factors.
In some analyses they are literally correlation coefficients.
In other analyses, they are partial regression coefficients.
In either case, they tell us how the item is connected to the factor.
Aitem,factor: The first subscript is the item, the 2nd the factor.
The “U”s in the above are unique sources of variation, specific to the item. This
variation is unobserved, random variation – errors of measurement, for example.
In the above, the Fs are called common factors because they are common to all 4
variables. As mentioned above, many people view them as person characteristics.
So, in the above, variation in two Fs determines variation in 4 items. There has
been a 2:1 reduction in complexity.
All that is needed to determine what values of Y1, Y2, Y3, and Y4 a person will
have (within the limits of measurement error) is knowledge of the person’s
position on 2 Fs.
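The equations above can be simulated directly. This Python sketch generates data from a 2-factor model; the loading values and error size are illustrative assumptions, not estimates from any real data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

# Hypothetical loadings A[item, factor] -- illustrative values only
A = np.array([[.8, .3],
              [.9, .2],
              [.3, .8],
              [.3, .9]])

F = rng.normal(size=(n, 2))            # each person's positions on F1 and F2
U = rng.normal(size=(n, 4)) * 0.4      # unique (error) variation, one U per item

# Yj = Aj1*F1 + Aj2*F2 + Uj, for all four items at once
Y = F @ A.T + U
print(Y.shape)                         # (1000, 4): 2 Fs determined 4 Ys
```

Items with similar loading patterns (the first two rows of A, say) end up highly correlated, which is exactly the clustering factor analysis looks for.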
Reproduced correlations.
One of the goals of factor analysis is to use the equations shown above to compute
what the correlations among the Ys would be based on those equations. The
correlations created from the equations are called reproduced correlations.
Reproduced correlation : Correlation between two variables computed from the
factor analysis equation. It could also be called a predicted correlation. Loosely,
they might be called Y-hat correlations. This is based on the idea that we might
use the equations above to generate Y-hats and then compute the correlations
between the Y-hats. Those correlations would be the reproduced correlations.
The goal of factor analysis is to be able to generate reproduced correlations that are
as close as possible to the actual, observed correlations from equations such as
those shown above.
In virtually all cases, the number of factors required to reproduce the correlations
will be smaller than the number of variables. This is the “identification of clusters”
goal of factor analysis.
The factor analysis algorithms estimate values of the loadings that yield
reproduced correlations between the Ys that are as similar as possible to the
observed correlations between them. (Thanks, mathematicians.)
Loading Tables and Loading Plots
A loading table is a table of loadings (As) of each variable onto each factor.
For example, suppose we had the following solution
Variable    Ai1: Loading on F1    Ai2: Loading on F2
Y1          .88                   .37
Y2          .94                   .24
Y3          .29                   .87
Y4          .28                   .88
A loading plot is a plot of loadings of each variable on axes representing factors.
That is for the following situation . . .
Y1 = .88*F1 + .37*F2 + U1.
Y2 = .94*F1 + .24*F2 + U2.
Y3 = .29*F1 + .87*F2 + U3.
Y4 = .28*F1 + .88*F2 + U4.
The loading plot would look like the following . . .
[Loading plot: Y1 and Y2 lie near the F1 axis; Y3 and Y4 lie near the F2 axis.]
What this particular table of loadings tells us and the loading plot shows us is that
while all the variables are influenced by both factors, F1 has a greater influence on
Y1 and Y2 while F2 has a greater influence on Y3 and Y4.
So, when we consider loading plots, we’ll look for the extent to which the
variables cluster around the axes of the plot.
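With uncorrelated factors, the reproduced correlations come straight from the loading table above: the reproduced correlation between two items is the sum of the products of their loadings. A Python check (using the table’s values):

```python
import numpy as np

# Loading table from above (factors assumed uncorrelated)
A = np.array([[.88, .37],
              [.94, .24],
              [.29, .87],
              [.28, .88]])

# Off-diagonal entries of A @ A.T are the reproduced correlations
R_hat = A @ A.T
print(round(R_hat[0, 1], 3))   # reproduced r between Y1 and Y2: .88*.94 + .37*.24
print(round(R_hat[2, 3], 4))   # reproduced r between Y3 and Y4
```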
What loading plots show.
When factors are uncorrelated, loading plots can show
1) Correlations of variables with factors and with other variables
2) Clusters of variables
3) Sense of # of factors
Correlation from a loading plot. Draw a line from origin to each variable.
[Loading plot: a vector is drawn from the origin to each variable. V1 and V4 are the lengths of the vectors for Y1 and Y4, and “Angle” is the angle between those two vectors.]

Correlation of Y1 with Y4:  r14 ≈ V1 * V4 * cos(Angle)
If the angle between two variables is 90 degrees, then the two variables are
uncorrelated, r = 0, because cos(90°) = 0.
If the angle is 0° the two variables are highly positively correlated.
If the angle is 180° the two variables are highly negatively correlated.
If V1 is small, this means that Y1 is not well-predicted by the factors – Y1’s errors
of measurement are large.
If V1 is large, this means that Y1 IS well-predicted by the factors – Y1’s errors of
measurement are small.
Simple Structure
In exploratory factor analysis, the equations that are identified are not the only
equations that can generate the reproduced correlations.
In fact, it can be shown that there are an infinite number of equations which will
yield the same reproduced correlations.
In order to pick one set of equations, factor analysts usually pick that set for which
the following criterion holds
Each Y has a large loading on only one F and nearly zero
loadings on all other Fs.
This criterion is called simple structure.
Alternatives to Simple Structure
Note: Most factor analysts look for and hope for simple structure.
If they find simple structure, this means that each response is influenced by only
one factor. “Life is simpler when there is simple structure.”
I call this type of situation a partitioned influence situation. The influence of
factors is partitioned – split up with no overlap. Each item is influenced by only
one factor.
I’ll point this out as we progress through factor analysis.
Another possibility is what I call shared-influence or parallel-influence or
joint-influence situations. In these situations, each item may be influenced jointly by
two or more factors. Simple structure is not possible.
Rotation Example
It is possible to transform one set of equations into a different set without changing
the reproduced correlations. Such a transformation is called a rotation of the
factor solution.
So, the following loadings
[Loading plot: Y1 and Y2 near the F1 axis; Y3 and Y4 near the F2 axis.]

yield the same reproduced correlation matrix as these

[Loading plot: the same four variables rotated away from the axes, so each variable loads substantially on both factors.]
Clearly, the top solution is one that makes more sense than the bottom solution.
The top solution is closer to simple structure than the bottom one.
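The fact that rotation leaves the reproduced correlations unchanged is easy to verify: multiplying the loading matrix by any orthogonal rotation matrix leaves A·Aᵀ the same. A Python sketch using the loading table from earlier (the 30° angle is an arbitrary choice):

```python
import numpy as np

# Loadings from the earlier table
A = np.array([[.88, .37],
              [.94, .24],
              [.29, .87],
              [.28, .88]])

theta = np.radians(30)                        # any rotation angle works
T = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # orthogonal rotation matrix

A_rot = A @ T

# The reproduced correlations (A @ A.T) are unchanged by the rotation
print(np.allclose(A @ A.T, A_rot @ A_rot.T))  # True
```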
Communalities of variables
Please God, let this interminable introduction of factor analysis concepts end.
Communality of a variable: Percent of variance of that variable related to the
collection of common factors, the Fs. If Fs are thought of as predictors, it’s the
R-squared of Ys predicted from Fs.
A variable with small communality isn’t predictable by the set of Fs. Probably is
its own factor. Get rid of it.
A variable with large communality is highly predictable from the set of Fs.
SPSS Factor Analysis Example 1 (finally)
A clear-cut orthogonal two-factor solution
The observed correlation matrix
        Y1       Y2       Y3       Y4
Y1      1.000    .607     -.130    -.240
Y2               1.000    -.116    -.167
Y3                        1.000     .588
Y4                                 1.000

Note that for a small correlation matrix such as this, we can identify the likely clusters “by eye”. I see Y1+Y2 and Y3+Y4.
comment data from PCAN chapter in QSTAT manual.
matrix data variables = y1 y2 y3 y4
 /format=free upper diag /n=50
 /contents = corr.
begin data.
1.000 .607 -.130 -.240
1.000 -.116 -.167
1.000 .588
1.000
end data.

(Don’t worry about the syntax. Lines 2-4 are special because the data were correlations, not raw scores.)

factor /matrix = in(cor=*)
 /print = default correlation
 /plot=eigen rotation(1,2)
 /rotation=varimax.

(This is the factor analysis syntax. Again, the “/matrix. . .” subcommand is specific to this particular example.)

Communalities

        Initial   Extraction
X1      1.000     .802
X2      1.000     .805
X3      1.000     .804
X4      1.000     .791

Extraction Method: Principal Component Analysis.

A communality is the proportion of variance in a variable which is related to the common factors.
It’s analogous to R2, if the factors are considered as predictors or causes of the variables.
A variable with small communality is a “relationship outlier” unrelated to any of the other variables.
Total Variance Explained

            Initial Eigenvalues           Extraction Sums of            Rotation Sums of
                                          Squared Loadings              Squared Loadings
Component   Total   % of Var   Cum %      Total   % of Var   Cum %      Total   % of Var   Cum %
1           1.927   48.184     48.184     1.927   48.184     48.184     1.613   40.321     40.321
2           1.275   31.879     80.063     1.275   31.879     80.063     1.590   39.742     80.063
3            .430   10.738     90.801
4            .368    9.199    100.000

Extraction Method: Principal Component Analysis.

“Initial Eigenvalues”: As many factors as there are variables.
Eigenvalues are quantities which are computed as part of the factor analysis algorithm. Each
eigenvalue represents the amount of standardized variance in all the variables which is related to
a factor. In the mathematics of factor analysis, there are always as many eigenvalues as there
are variables. But they vary in size. In the “Extraction” columns, as many as the user requested
are shown, beginning with the largest.
100 * eigenvalue / No. of variables is the percent of total variance of all the variables related to
a factor – concentrated along the line in a multidimensional scatterplot representing the factor.
The Scree plot is simply a plot of eigenvalues vs. factor number. It’s used to give us a hint at
how many factors are required for a data set.
[Scree plot: eigenvalues (y-axis, 0.0 to 2.5) plotted against component number (x-axis, 1 to 4), annotated with the landscape metaphor – the “bluff” (the large eigenvalues), the “cliff” (the steep drop), the “scree” (the small eigenvalues at the bottom), and the “water” below.]
An ideal scree plot is a horizontal line, a vertical line, then a diagonal line.
Two factor-retention rules:
1) Retain all factors with eigenvalues >= 1.
2) Retain all factors making up the “bluff” above the scree in the scree test – two in this
example.
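Retention rule 1 is trivially mechanical. A Python sketch applying it to the four eigenvalues from the SPSS output above:

```python
# Eigenvalues from the four-variable example above
eigenvalues = [1.927, 1.275, 0.430, 0.368]

# Kaiser rule: retain all factors with eigenvalues >= 1
retained = [e for e in eigenvalues if e >= 1]
print(len(retained))   # 2 -- agreeing with the scree test for these data
```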
Unrotated loadings

Component Matrix(a)

        Component
        1        2
X1      .725     .526
X2      .685     .580
X3      -.644    .624
X4      -.720    .523

Extraction Method: Principal Component Analysis.
a. 2 components extracted.

Loadings are analogous to correlations.
Unrotated loadings are computed first. They’re computed so that the first factor accounts for the
greatest percentage of variance.
The unrotated loadings will be difficult to understand if you are hoping for simple structure
(partitioned influence). This is because the first factor influences ALL of the variables.
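With uncorrelated factors, each variable’s communality is the sum of its squared loadings across the retained factors. A Python check using the Component Matrix above, which should recover the Extraction communalities reported earlier (.802, .805, .804, .791, to within rounding of the printed loadings):

```python
import numpy as np

# Unrotated loadings from the Component Matrix above
A = np.array([[ .725, .526],
              [ .685, .580],
              [-.644, .624],
              [-.720, .523]])

# Communality = row sum of squared loadings (factors uncorrelated)
communalities = (A ** 2).sum(axis=1)
print(communalities.round(3))
```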
Plot of unrotated Loadings

[Plot: the four variables’ loadings on comp1 (horizontal axis, -1.00 to 1.00) vs. comp2 (vertical axis, -1.00 to 1.00), all falling in the upper half of the plot. The dashed lines were added by me to show the rotated loading plot on the same graph as the plot of unrotated loadings.]
FALecture - 21
Printed on 2/18/2016
Rotated loadings

Factors can be “rotated” or transformed without loss of ability to
represent the correlations between the variables. A rotation is a
systematic change in the loadings such that the points representing
variables move in a circle about the origin of the loading plot.

Rotated Component Matrix(a)

        Component
        1             2
X1      .887         -.125
X2      .895         -5.837E-02
X3      -3.067E-02    .896
X4      -.155         .876

Extraction Method: Principal Component Analysis.
Rotation Method: Varimax with Kaiser Normalization.
a. Rotation converged in 3 iterations.

It is common to “rotate” the loadings until the loading pattern has a
pattern called simple structure.

Simple structure: Loadings are as close to 0 or 1 as possible, with
each variable having high loadings on only 1 factor.

Perfect simple structure means that each variable loads on (is
influenced by) only one factor.

Note – X1 and X2 are each influenced primarily by Component 1
above. X3 and X4 are each influenced primarily by Component 2.
Component Plot in Rotated Space

[Plot: x1 and x2 lie close to the Component 1 axis; x3 and x4 lie close to the Component 2 axis.]

Component Transformation Matrix

Component    1        2
1            .720    -.694
2            .694     .720

Extraction Method: Principal Component Analysis.
Rotation Method: Varimax with Kaiser Normalization.
Most analysts identify (e.g., name) the factors after rotation. A factor is named for the variables that
have the highest loadings on it. In the above case, we’d look at the names of variables X1 and X2
and give the first factor a name that “combined” the essence of the two. We’d do the same for factor
2, naming it after variables X3 and X4.
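The rotated loadings can be recovered by multiplying the unrotated loadings by the transformation matrix. A Python check; note the orientation of T below (which printed value sits in which cell) is my assumption about the SPSS output, chosen because it reproduces the rotated matrix:

```python
import numpy as np

# Unrotated loadings (Component Matrix above)
A = np.array([[ .725, .526],
              [ .685, .580],
              [-.644, .624],
              [-.720, .523]])

# Component Transformation Matrix, oriented so that rotated = A @ T
# (an assumption about the printed row/column order)
T = np.array([[ .720, -.694],
              [ .694,  .720]])

A_rot = A @ T
print(A_rot.round(3))   # close to the Rotated Component Matrix (.887, -.124, etc.)
```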
Exploratory Factor Analysis of a Big Five sample
Data are from Lyndsey Wrensen’s 2005 SIOP paper
Data are responses to 50 IPIP Big Five items under instructions to respond honestly.
GET FILE='G:\MdbR\Wrensen\WrensenDataFiles\WrensenMVsImputed070114.sav'.
factor variables =
he1 he2r he3 he4r he5 he6r he7 he8r he9 he10r
ha1r ha2 ha3r ha4 ha5r ha6 ha7r ha8 ha9 ha10
hc1, hc2r, hc3, hc4r, hc5, hc6r, hc7, hc8r, hc9, hc10
hs1r, hs2, hs3r, hs4, hs5r, hs6r, hs7r, hs8r, hs9r, hs10r
ho1, ho2r, ho3, ho4r, ho5, ho6r, ho7, ho8, ho9, ho10
/print = default
/plot=eigen
/extraction=ML
/rotation=varimax.
Factor Analysis
[DataSet3] G:\MdbR\Wrensen\WrensenDataFiles\WrensenMVsImputed070114.sav
If we did not know that the data represent a Big Five questionnaire, then this would be
an example of the first reason for doing factor analysis - bottom up processing –
having a collection of items – 50 in this case - and asking, “How many dimensions do
these 50 items represent?”
On the other hand, it can also be considered to be an example of the 2nd reason for
doing factor analysis – having a set of dimensions – 5 in this case – and attempting to
identify items that are indicators of those dimensions.
The point is that both reasons for doing factor analysis result in the same analysis.
The difference is in the intent of the investigator.
Communalities(a)

         Initial   Extraction
he1      .610      .575
he2r     .632      .626
he3      .638      .749
he4r     .686      .705
he5      .683      .698
he6r     .625      .601
he7      .678      .574
he8r     .496      .341
he9      .624      .624
he10r    .550      .463
ha1r     .392      .391
ha2      .596      .471
ha3r     .611      .502
ha4      .633      .639
ha5r     .542      .534
ha6      .548      .523
ha7r     .672      .636
ha8      .592      .485
ha9      .652      .724
ha10     .419      .335
hc1      .563      .577
hc2r     .602      .596
hc3      .631      .637
hc4r     .537      .503
hc5      .571      .516
hc6r     .579      .543
hc7      .602      .513
hc8r     .496      .333
hc9      .567      .495
hc10     .479      .344
hs1r     .642      .637
hs2      .577      .589
hs3r     .565      .551
hs4      .492      .780
hs5r     .525      .551
hs6r     .626      .660
hs7r     .771      .781
hs8r     .787      .853
hs9r     .628      .600
hs10r    .563      .553
ho1      .657      .677
ho2r     .591      .600
ho3      .505      .480
ho4r     .491      .444
ho5      .628      .845
ho6r     .564      .564
ho7      .449      .391
ho8      .648      .717
ho9      .473      .459
ho10     .702      .684

Extraction Method: Maximum Likelihood.
a. One or more communality estimates greater than 1 were encountered during
iterations. The resulting solution should be interpreted with caution.
Total Variance Explained

         Initial Eigenvalues         Extraction Sums of           Rotation Sums of
                                     Squared Loadings             Squared Loadings
Factor   Total   % of Var  Cum %     Total   % of Var  Cum %      Total   % of Var  Cum %
1        6.935   13.871    13.871    6.166   12.332    12.332     3.628   7.255      7.255
2        5.225   10.450    24.321    4.900    9.800    22.132     3.544   7.087     14.343
3        4.678    9.355    33.676    4.222    8.444    30.576     3.430   6.859     21.202
4        3.038    6.075    39.751    2.452    4.903    35.479     3.389   6.778     27.980
5        2.649    5.299    45.050    2.402    4.804    40.284     2.381   4.761     32.741
6        1.739    3.479    48.528    1.272    2.544    42.828     2.164   4.327     37.069
7        1.561    3.122    51.651    1.242    2.485    45.313     1.663   3.325     40.394
8        1.493    2.986    54.637    1.108    2.216    47.528     1.575   3.150     43.543
9        1.387    2.775    57.412    1.006    2.012    49.541     1.264   2.529     46.072
10       1.358    2.716    60.128     .919    1.838    51.379     1.257   2.514     48.587
11       1.200    2.400    62.527     .906    1.813    53.192     1.198   2.397     50.983
12       1.161    2.323    64.850     .770    1.540    54.732     1.197   2.393     53.376
13       1.074    2.148    66.998     .689    1.378    56.111     1.108   2.216     55.592
14       1.030    2.060    69.058     .610    1.220    57.331      .869   1.739     57.331
15        .905    1.809    70.867
16        .839    1.677    72.545
17        .825    1.650    74.195
18        .786    1.572    75.767
19        .734    1.468    77.235
20        .716    1.432    78.666
21        .681    1.361    80.028
22        .643    1.286    81.314
23        .595    1.191    82.505
24        .587    1.173    83.678
25        .576    1.152    84.831
26        .543    1.085    85.916
27        .507    1.015    86.930
28        .496     .993    87.923
29        .479     .958    88.881
30        .448     .897    89.777
31        .435     .870    90.647
32        .408     .815    91.463
33        .372     .743    92.206
34        .368     .737    92.943
35        .355     .711    93.654
36        .331     .663    94.317
37        .294     .589    94.905
38        .269     .537    95.443
39        .263     .525    95.968
40        .259     .518    96.486
41        .239     .478    96.964
42        .234     .469    97.432
43        .212     .423    97.855
44        .195     .391    98.246
45        .187     .375    98.621
46        .168     .336    98.957
47        .157     .315    99.272
48        .146     .292    99.564
49        .128     .256    99.820
50        .090     .180   100.000

Extraction Method: Maximum Likelihood.

The program used the default “extract all factors with eigenvalues >= 1” rule, resulting
in 14 factors extracted.

There may be 14 different sources of influence. But all but 5 of those sources are of little interest to
us now.
Inspection of the scree plot suggests retaining 5 factors.
The unusually large 1st eigenvalue suggests the presence of an overall factor, perhaps common to all
items. I’ve spent the last several years studying that factor.
Going with the 14 factor model.
The unrotated factor matrix
Factor Matrix

[Table of unrotated loadings: 50 items (he1–he10r, ha1r–ha10, hc1–hc10, hs1r–hs10r, ho1–ho10) on 14 factors; loadings omitted.]

Extraction Method: Maximum Likelihood.
a. 14 factors extracted. 20 iterations required.
Goodness-of-fit Test

Chi-Square   df    Sig.
681.869      616   .033
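For reference, the Sig. value can be recovered (approximately) from the chi-square and its df. A stdlib-Python sketch using the Wilson–Hilferty cube-root normal approximation — an illustration, not SPSS's exact computation:

```python
import math

def chi2_pvalue(x, df):
    """Approximate upper-tail p-value for a chi-square statistic using
    the Wilson-Hilferty cube-root normal approximation."""
    z = ((x / df) ** (1 / 3) - (1 - 2 / (9 * df))) / math.sqrt(2 / (9 * df))
    return 0.5 * math.erfc(z / math.sqrt(2))  # upper-tail standard-normal area

# Goodness-of-fit test of the 14-factor model: Chi-Square 681.869, df 616
p = chi2_pvalue(681.869, 616)
print(round(p, 3))  # close to the reported Sig. of .033
```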
Clearly we need some theory here.
The rotated 14-factor matrix – looking for simplicity.
Rotated Factor Matrix

[Table of Varimax-rotated loadings: the same 50 items on 14 factors; loadings omitted.]

Extraction Method: Maximum Likelihood.
Rotation Method: Varimax with Kaiser Normalization.
a. Rotation converged in 19 iterations.
Factor Transformation Matrix

[14 × 14 matrix of values involved in rotating the loadings; values omitted.]

Extraction Method: Maximum Likelihood.
Rotation Method: Varimax with Kaiser Normalization.
There are five big clusters of variables and a bunch of little clusters. The miniclusters may represent
small groups of items with similar wordings or meanings. This is typical of factor analyses of items.
Restricting the model to 5 factors (Factor analysis was redone specifying only 5 factors.)
Rotated Factor Matrix

Item       1       2       3       4       5
he1      .612   -.108   -.177   -.113    .064
he2r     .703   -.056    .057    .008   -.023
he3      .534    .117    .108   -.007    .263
he4r     .745    .095   -.089    .051   -.034
he5      .643   -.013    .277    .079    .108
he6r     .579    .118    .223    .161    .225
he7      .676   -.081    .207   -.015    .092
he8r     .454   -.056   -.107   -.141   -.023
he9      .605   -.088   -.090   -.141    .125
he10r    .599    .226    .025    .090    .082
ha1r     .026    .049    .332    .036    .021
ha2      .322    .003    .500    .033    .083
ha3r    -.254    .087    .374    .328   -.085
ha4     -.025   -.061    .782   -.005   -.037
ha5r     .003   -.011    .544    .086    .010
ha6     -.100   -.048    .663   -.056    .074
ha7r     .280    .025    .656    .083    .128
ha8     -.097   -.049    .527    .017    .277
ha9     -.005   -.221    .656    .078    .099
ha10     .286   -.034    .393    .049    .171
hc1     -.028   -.011   -.113    .506    .053
hc2r    -.124    .091   -.179    .714   -.006
hc3     -.058    .137    .095    .417    .374
hc4r     .041    .232    .072    .605    .060
hc5     -.036    .003    .130    .644    .057
hc6r     .015    .157   -.008    .709    .050
hc7      .011    .003    .353    .537    .016
hc8r     .073    .233    .091    .265    .210
hc9      .157   -.083    .256    .576   -.087
hc10    -.069    .058    .119    .318    .366
hs1r    -.005    .585   -.052    .018    .227
hs2      .089    .482   -.083    .029    .223
hs3r     .052    .461   -.216   -.150    .072
hs4     -.040    .416   -.015   -.023   -.053
hs5r    -.075    .554    .014    .089    .203
hs6r    -.023    .601   -.147    .043    .028
hs7r    -.085    .811    .086    .133    .004
hs8r    -.098    .787    .035    .157    .026
hs9r     .038    .666    .031    .117    .113
hs10r    .218    .545    .104    .141    .020
ho1      .107    .185    .051   -.134    .388
ho2r    -.009    .308   -.194    .061    .473
ho3      .150   -.045    .078    .002    .547
ho4r     .106    .124    .018   -.036    .462
ho5      .122    .018    .217    .120    .537
ho6r     .151    .080    .223    .054    .556
ho7     -.026    .062   -.035    .078    .446
ho8      .129    .085   -.123   -.193    .377
ho9     -.148   -.034    .267    .169    .434
ho10     .195    .077    .157    .123    .752
I believe there is only 1 anomaly – only one item whose loading is largest on the "wrong" factor. Way to go, IPIP!!!

Think about what this result means. Responses of the 166 persons to items he2r to he10r were highly correlated with each other and not very highly correlated with responses to the other items. Persons who were high in E tended to agree with those items. Persons low in E tended to disagree with those items. These simultaneous differences between people created the high correlations between the responses to the e items. The same argument can be made for each factor.

I would argue that the presence of Extraversion in some persons caused them to agree with the e items. The absence of Extraversion in other persons caused them to disagree with the e items. So individual differences in Extraversion caused the high correlations among the e items.
Methods for determining the number of factors . . .
1. Eigenvalues >= 1 rule.
2. Scree test.
3. Parallel Analysis. O’Connor (2000).
O'Connor, B. P. (2000). SPSS and SAS programs for determining the number of
components using parallel analysis and Velicer's MAP test. Behavior Research
Methods, Instrumentation, and Computers, 32, 396-402.
4. R2-test. Nelson (2005). Reference in . . .
Bowler, M. C., Bowler, J. L., & Phillips, B. C. (2009). The Big-5+-2? The impact of
cognitive complexity on the factor structure of the five-factor model. Personality
and Individual Differences, 47, 979-984.
5. Theory. Based on a theory, you know that there will be K factors.
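The eigenvalues-≥-1 rule (method 1) is mechanical enough to sketch in a few lines of Python. The eigenvalues below are the initial eigenvalues from the Rieth analysis later in this handout (23 variables, so they sum to 23):

```python
# Initial eigenvalues from the 23-variable Rieth analysis (see the
# Total Variance Explained table later in this handout).
eigenvalues = [8.122, 2.334, 1.751, 1.367, 1.181, 0.794, 0.736, 0.690,
               0.646, 0.614, 0.566, 0.532, 0.509, 0.451, 0.413, 0.402,
               0.371, 0.339, 0.315, 0.267, 0.230, 0.198, 0.170]

# Eigenvalues >= 1 rule: retain one factor per eigenvalue of at least 1.
n_factors = sum(1 for ev in eigenvalues if ev >= 1.0)
print(n_factors)  # → 5

# Cumulative % of variance accounted for by the retained factors.
cum_pct = 100 * sum(eigenvalues[:n_factors]) / sum(eigenvalues)
print(round(cum_pct, 1))  # → 64.2, matching the 64.152 reported by SPSS
```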
Using the Scree test – From Bowler, M. C., Bowler, J. L., & Phillips, B. C. (2009). The Big-5+-2?
The impact of cognitive complexity on the factor structure of the five-factor model. Personality and
Individual Differences, 47, 979-984.
The authors argued that the number of factors in a common Big-5 questionnaire might depend on the cognitive complexity of the respondents.

They gave a Big-5 test (Goldberg's 100 Unipolar Big Five Markers) to 700+ respondents. They also gave the Computer-Administered Rep Test (CART), a questionnaire that yields a measure of the cognitive complexity of the respondents.

The authors used the Scree test and the R2-test suggested by Nelson (2005) to determine the number of factors. They determined the number of factors for the half of the sample with the largest cognitive complexity and again for the half of the sample with the smallest cognitive complexity.

The data suggest 5 factors for the whole sample, 3 factors for the quarter of the sample lowest in cognitive complexity, and 7 factors for the quarter highest in cognitive complexity. Or not.

[Scree plots shown for: the whole sample; respondents with low cognitive complexity; respondents with high cognitive complexity.]
Terri Rieth’s Dissertation Data
Organization Effectiveness and Authority Boundaries
The data factor analyzed here are from Terri Reith’s dissertation data. Terri was interested in the
general concept of authority boundaries – the separation of levels of authority in organizations.
She wanted to know whether or not the characteristics of authority boundaries within organizations
were related to the effectiveness of the organizations. She created a questionnaire measuring
Clarity, Permeability, and Firmness of Authority boundaries and also Perceived Effectiveness of
the organization. Her intent was to regress effectiveness onto the authority boundary measures to
determine the extent to which these measures accounted for effectiveness. In addition to the
perceived effectiveness measures, she also measured turnover and number of patient visits per
employee as objective measures of effectiveness. What a wonderful job she did.
Below is the factor analysis of her data. Technical note: Her data were employees from 89 physical
therapy clinics. Since her hypotheses were about between-clinic differences in effectiveness, the
factor analysis was on the within-clinic relationships between the variables. That way, the factor
analysis wouldn’t be tainted by the between-clinic relationships she was trying to examine.
She hoped to discover 3 authority boundary factors and one effectiveness factor from this analysis.
SAVE OUTFILE='E:\MdbR\Rieth\WithinGroupsrs.sav'
/COMPRESSED.
factor
/matrix = in(cor=*)
/method = correlation
/analysis = cq1 to eq23
/print = default
/plot=eigen
/diagonal=default
/criteria = factors(5)
/extraction = ml
/rotation = varimax.
Factor Analysis – Orthogonal Factors solution
Communalities

Item     Initial  Extraction
CQ1      .665     .718
CQ2      .721     .797
CQ3      .655     .702
CQ4      .466     .458
CQ5      .496     .494
PQ6      .416     .386
PQ7      .636     .679
PQ8      .407     .378
PQ9      .690     .828
PQ10     .483     .464
PQ11     .379     .394
FQ12     .474     .581
FQ13R    .299     .355
FQ14R    .296     .317
FQ15     .391     .392
EQ16     .614     .680
EQ17     .686     .817
EQ18     .634     .693
EQ19     .424     .457
EQ20     .382     .388
EQ21     .374     .403
EQ22     .547     .636
EQ23     .550     .642

Extraction Method: Maximum Likelihood.

C items: Clarity of authority boundaries
P: Permeability of authority boundaries
F: Firmness of authority boundaries
E: Perceived Effectiveness of the organization
Total Variance Explained

         Initial Eigenvalues           Extraction Sums of Sq. Loadings    Rotation Sums of Sq. Loadings
Factor   Total   % of Var   Cum %     Total   % of Var   Cum %           Total   % of Var   Cum %
1        8.122   35.313     35.313    7.605   33.067     33.067          2.974   12.931     12.931
2        2.334   10.147     45.460    1.948    8.469     41.536          2.673   11.623     24.554
3        1.751    7.614     53.074    1.291    5.615     47.151          2.406   10.460     35.014
4        1.367    5.943     59.017     .971    4.222     51.373          2.338   10.166     45.180
5        1.181    5.135     64.152     .841    3.657     55.030          2.266    9.850     55.030
6         .794    3.454     67.606
7         .736    3.200     70.806
8         .690    3.002     73.807
9         .646    2.811     76.618
10        .614    2.670     79.288
11        .566    2.460     81.747
12        .532    2.313     84.061
13        .509    2.213     86.274
14        .451    1.963     88.237
15        .413    1.797     90.034
16        .402    1.747     91.781
17        .371    1.613     93.394
18        .339    1.475     94.869
19        .315    1.369     96.237
20        .267    1.160     97.397
21        .230    1.002     98.399
22        .198     .862     99.261
23        .170     .739    100.000
Indicates 5 factors. We were expecting 4 – Clarity, Permeability, Firmness, and Effectiveness – but 5 is close.
Extraction Method: Maximum Likelihood.
Scree Plot

[Plot of eigenvalue (y-axis, 0–10) against factor number (1–23). The elbow indicates 5 factors.]

The predominance of the 1st factor suggests to me that all the responses were influenced by a single, common factor. More on that later in the course.
Unrotated matrix of loadings.
We usually don’t pay much attention to this matrix.
Factor Matrix

Item      1       2       3       4       5
CQ1     .692   -.261    .367   -.173   -.072
CQ2     .718   -.318    .378   -.174   -.080
CQ3     .675   -.333    .334   -.152   -.030
CQ4     .624   -.121    .155    .125    .121
CQ5     .643   -.182    .207    .029   -.060
PQ6     .561   -.098   -.232    .054    .074
PQ7     .628   -.289   -.448   -.019    .002
PQ8     .438   -.259   -.314    .013   -.144
PQ9     .685   -.302   -.508   -.093   -.001
PQ10    .521   -.268   -.337    .030   -.080
PQ11    .530   -.109   -.103    .161    .252
FQ12    .561   -.054    .165    .266    .405
FQ13R   .345   -.066    .156    .224    .397
FQ14R   .363   -.033    .123    .214    .351
FQ15    .520   -.063    .013    .205    .273
EQ16    .537    .580   -.010   -.216    .087
EQ17    .630    .606   -.065   -.219    .028
EQ18    .608    .551   -.019   -.135    .046
EQ19    .542    .131    .132    .300   -.196
EQ20    .549    .237   -.064    .140   -.083
EQ21    .450    .295    .023    .258   -.215
EQ22    .625    .175    .003    .419   -.198
EQ23    .591    .274    .071    .398   -.231

Extraction Method: Maximum Likelihood.
a. 5 factors extracted. 6 iterations required.
Goodness-of-fit Test

Chi-Square   df    Sig.
378.852      148   .000
Rotated matrix of factor loadings.
C items: Clarity of authority boundaries
P: Permeability of authority boundaries
F: Firmness of authority boundaries
E: Perceived Effectiveness of the organization.
Rotated Factor Matrix

          P       C       E       E       F
Item      1       2       3       4       5
CQ1     .203    .769    .150    .138    .209
CQ2     .231    .814    .146    .105    .218
CQ3     .243    .751    .109    .075    .246
CQ4     .233    .392    .246    .123    .417
CQ5     .250    .534    .269    .098    .252
PQ6     .490    .145    .168    .174    .258
PQ7     .775    .164    .101    .102    .174
PQ8     .576    .158    .147   -.006    .021
PQ9     .860    .189    .066    .157    .153
PQ10    .633    .170    .152    .027    .107
PQ11    .360    .137    .143    .118    .459
FQ12    .133    .227    .189    .122    .679
FQ13R   .037    .132    .074    .040    .574
FQ14R   .062    .121    .106    .074    .531
FQ15    .244    .165    .184    .121    .506
EQ16    .053    .095    .203    .782    .123
EQ17    .136    .111    .271    .838    .097
EQ18    .105    .117    .297    .747    .152
EQ19    .120    .238    .576    .149    .180
EQ20    .229    .111    .425    .341    .164
EQ21    .093    .076    .561    .259    .081
EQ22    .234    .138    .691    .173    .235
EQ23    .130    .138    .718    .231    .190
Anomalies: the effectiveness items split into two groups, which accounts for the 5 factors.
Extraction Method: Maximum Likelihood.
Rotation Method: Varimax with Kaiser Normalization.
a. Rotation converged in 7 iterations.
Not a bad solution considering this was only the second analysis. The first analysis was with a small pilot sample of students – mostly MBA students – and yielded only 4 factors (effectiveness was only 1 factor in that solution).
Factor Transformation Matrix

[5 × 5 matrix; values omitted.]

The factor transformation matrix is a matrix of values that were involved in the rotation of the matrix of loadings.

Extraction Method: Maximum Likelihood.
Rotation Method: Varimax with Kaiser Normalization.
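Concretely, the rotated loadings are the unrotated loadings post-multiplied by the transformation matrix. A toy sketch with made-up loadings (not values from the output above), illustrating that an orthogonal rotation leaves each item's communality unchanged:

```python
import math

def matmul(a, b):
    """Multiply two matrices represented as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

# Made-up unrotated loadings: 3 items on 2 factors.
unrotated = [[0.7, 0.3],
             [0.6, 0.4],
             [0.2, 0.8]]

# A 45-degree orthogonal rotation (an orthonormal transformation matrix).
c = math.cos(math.pi / 4)
transform = [[c, -c],
             [c,  c]]

rotated = matmul(unrotated, transform)

# Communalities (sums of squared loadings per item) survive the rotation.
h2_before = [sum(v * v for v in row) for row in unrotated]
h2_after = [sum(v * v for v in row) for row in rotated]
print([round(b - a, 6) for a, b in zip(h2_before, h2_after)])  # each difference is essentially zero
```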
Rieth Dissertation Factor Analysis –
Oblique Solution (allowing factors to be correlated)
factor
/matrix = in(cor=*)
/method = correlation
/analysis = cq1 to eq23
/print = default
/plot=eigen
/diagonal=default
/criteria = factors(5)
/extraction = ml
/rotation = oblimin(0).
All of the output up to the Goodness-of-fit is the same as the orthogonal rotation solution.
Goodness-of-fit Test

Chi-Square   df    Sig.
378.852      148   .000

More on goodness-of-fit later on.

The pattern matrix is the matrix of standardized factor loadings. Use these to define factors. Each loading is a partial regression weight relating the variable to the factor – the relationship of the variable with the factor holding the other factors constant.
Pattern Matrix

          C       E       P       E       F
Item      1       2       3       4       5
CQ1     .850    .061    .016   -.012   -.018
CQ2     .900    .020   -.006   -.017   -.018
CQ3     .821   -.002   -.039   -.052    .039
CQ4     .317    .015   -.078    .134    .311
CQ5     .523   -.017   -.077    .174    .074
PQ6     .005    .109   -.462    .043    .172
PQ7     .008    .039   -.805   -.045    .055
PQ8     .066   -.082   -.593    .094   -.101
PQ9     .030    .110   -.901   -.118    .016
PQ10    .050   -.054   -.643    .070   -.013
PQ11   -.019    .043   -.301    .018    .433
FQ12    .080    .025    .013    .060    .683
FQ13R   .022   -.022    .065   -.020    .621
FQ14R   .007    .010    .036    .014    .562
FQ15    .021    .034   -.150    .073    .486
EQ16    .028    .836    .041   -.026    .015
EQ17    .027    .878   -.039    .035   -.045
EQ18    .026    .761    .003    .092    .025
EQ19    .146   -.028    .037    .616    .028
EQ20   -.012    .243   -.136    .369    .040
EQ21   -.030    .119    .018    .601   -.050
EQ22   -.029   -.039   -.094    .747    .086
EQ23   -.006    .028    .027    .783    .034

Extraction Method: Maximum Likelihood.
Rotation Method: Oblimin with Kaiser Normalization.
a. Rotation converged in 7 iterations.
Loadings here are partial regression coefficients. Each loading controls for all other factors.

CQ4 is now "on board" with the other CQ items, if only barely. PQ11 is still an anomaly – influenced more by F5 than by F3.

The effectiveness items still form two factors. The correlation of these two factors is .555 (see below).

Terri ended up analyzing the effectiveness items separately – as Perceived Cost effectiveness and Perceived Quality effectiveness.
Structure Matrix

Item      1       2       3       4       5
CQ1     .845    .293   -.403    .389    .431
CQ2     .892    .269   -.437    .393    .450
CQ3     .836    .231   -.432    .348    .456
CQ4     .581    .300   -.407    .454    .568
CQ5     .672    .276   -.426    .463    .446
PQ6     .372    .310   -.575    .369    .401
PQ7     .420    .244   -.821    .330    .352
PQ8     .321    .112   -.604    .273    .170
PQ9     .458    .294   -.904    .328    .351
PQ10    .373    .163   -.677    .317    .267
PQ11    .373    .263   -.472    .346    .557
FQ12    .463    .294   -.312    .407    .755
FQ13R   .296    .152   -.163    .225    .592
FQ14R   .291    .188   -.186    .257    .562
FQ15    .389    .273   -.380    .377    .596
EQ16    .248    .823   -.186    .440    .266
EQ17    .295    .901   -.279    .530    .276
EQ18    .301    .827   -.255    .536    .318
EQ19    .411    .355   -.286    .663    .353
EQ20    .312    .493   -.358    .571    .324
EQ21    .240    .423   -.222    .624    .231
EQ22    .386    .419   -.403    .788    .424
EQ23    .359    .464   -.306    .800    .376

The structure matrix contains simple correlations of items with factors. It is less often interpreted.

Extraction Method: Maximum Likelihood.
Rotation Method: Oblimin with Kaiser Normalization.
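The two matrices are linked through the factor correlations: structure = pattern × factor-correlation matrix, so they coincide only when the factors are uncorrelated. A toy sketch with made-up numbers (not values from Terri's output):

```python
def matmul(a, b):
    """Multiply two matrices represented as lists of rows."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

# Made-up pattern loadings: 2 items on 2 factors.
pattern = [[0.8, 0.0],
           [0.1, 0.6]]

# Made-up factor correlation matrix: the factors correlate .5.
phi = [[1.0, 0.5],
       [0.5, 1.0]]

# Structure loadings = pattern loadings x factor correlations.
structure = [[round(v, 3) for v in row] for row in matmul(pattern, phi)]
print(structure)  # → [[0.8, 0.4], [0.4, 0.65]]
```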
Factor Correlation Matrix

Factor      1       2       3       4       5
1         1.000    .292   -.488    .449    .519
2          .292   1.000   -.261    .555    .315
3         -.488   -.261   1.000   -.404   -.373
4          .449    .555   -.404   1.000    .442
5          .519    .315   -.373    .442   1.000

Extraction Method: Maximum Likelihood.
Rotation Method: Oblimin with Kaiser Normalization.
Intercorrelations of factors. All correlations are substantial. Note that the two effectiveness factors have the highest correlation – they're separate but related.
How do we use the results?
1. Most common use
To create scales by summing the responses to items with highest loadings on a factor.
This is a crude way of creating factor scores.
This is what Terri did. She created 3 scales representing three different aspects of authority
boundary and used these as independent variables.
She created two scales representing two aspects of effectiveness and used these as dependent
variables.
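That summing can be sketched directly. A minimal Python example with hypothetical responses (the item-to-cluster assignment follows each item's highest rotated loading; reverse-keyed items are assumed to have been recoded before summing):

```python
# Hypothetical 1-5 responses from one person to a subset of the items.
responses = {"cq1": 4, "cq2": 5, "cq3": 4,
             "pq7": 2, "pq9": 3,
             "fq12": 4, "fq15": 3}

# Item clusters defined by each item's highest rotated loading
# (only a few items per scale shown, for brevity).
scales = {"clarity":      ["cq1", "cq2", "cq3"],
          "permeability": ["pq7", "pq9"],
          "firmness":     ["fq12", "fq15"]}

# Crude factor scores: unit-weighted sums of each cluster's items.
scores = {name: sum(responses[item] for item in items)
          for name, items in scales.items()}
print(scores)  # → {'clarity': 13, 'permeability': 5, 'firmness': 7}
```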