13 - Orthogonal Linear Contrasts, Post hoc tests

Orthogonal Linear Contrasts
This is a technique for partitioning
ANOVA sum of squares into
individual degrees of freedom
Definition
Let x1, x2, ... , xp denote p numerical quantities
computed from the data.
These could be statistics or the raw observations.
A linear combination of x1, x2, ... , xp is defined to be
a quantity, L, computed in the following manner:
L = c1x1 + c2x2 + ... + cpxp
where the coefficients c1, c2, ... , cp are predetermined
numerical values.
Definition
Let m1, m2, ... , mp denote p means and c1, c2, ... , cp
denote p coefficients such that:
c1+ c2 + ... + cp = 0,
Then the linear combination
L = c1m1+ c2m2+ ... + cpmp
is called a linear contrast of the p means
m1, m2, ... , mp .
Examples

1. L = x̄ = (x1 + x2 + ... + xp)/p
     = (1/p)x1 + (1/p)x2 + ... + (1/p)xp
A linear combination

2. L = (m1 + m2 + m3)/3 - (m4 + m5)/2
     = (1/3)m1 + (1/3)m2 + (1/3)m3 + (-1/2)m4 + (-1/2)m5
A linear contrast
3. L = m1 - 4m2 + 6m3 - 4m4 + m5
     = (1)m1 + (-4)m2 + (6)m3 + (-4)m4 + (1)m5
A linear contrast
Definition
Let
A = a1m1+ a2m2+ ... + apmp
and
B = b1m1+ b2m2+ ... + bpmp
be two linear contrasts of the p means m1, m2, ... , mp.
Then A and B are called Orthogonal Linear Contrasts
if in addition to:
a1+ a2+ ... + ap = 0 and
b1+ b2+ ... + bp = 0,
it is also true that:
a1b1+ a2b2+ ... + apbp = 0.
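Both defining conditions are mechanical to verify numerically. A minimal sketch in plain Python (the helper names are ours; A and B are the coefficient vectors of Examples 2 and 3 above):

```python
def is_contrast(coefs, tol=1e-12):
    # a contrast's coefficients must sum to zero
    return abs(sum(coefs)) < tol

def are_orthogonal(a, b, tol=1e-12):
    # orthogonal contrasts: the sum of coefficient products is zero
    return abs(sum(x * y for x, y in zip(a, b))) < tol

A = [1/3, 1/3, 1/3, -1/2, -1/2]   # Example 2 above
B = [1, -4, 6, -4, 1]             # Example 3 above

print(is_contrast(A), is_contrast(B))   # True True
print(are_orthogonal(A, B))             # False
print(are_orthogonal([1, -1, 0, 0, 0], [1, 1, -2, 0, 0]))   # True
```

Note that A and B are each contrasts but are not orthogonal to one another; orthogonality is an extra condition on the pair.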
Example
Let
A = (m1 + m2 + m3)/3 - (m4 + m5)/2
  = (1/3)m1 + (1/3)m2 + (1/3)m3 + (-1/2)m4 + (-1/2)m5
and
B = (m1 + m2)/2 - m3 + m4 - m5
  = (1/2)m1 + (1/2)m2 + (-1)m3 + (1)m4 + (-1)m5
Note:
(1/3)(1/2) + (1/3)(1/2) + (1/3)(-1) + (-1/2)(1) + (-1/2)(-1)
  = 1/6 + 1/6 - 1/3 - 1/2 + 1/2 = 0
so A and B are Orthogonal Linear Contrasts.
Definition
Let
A = a1m1+ a2m2+ ... + apmp,
B= b1m1+ b2m2+ ... + bpmp ,
..., and
L= l1m1+ l2m2+ ... + lpmp
be a set of linear contrasts of the p means m1, m2, ... , mp.
Then the set is called a set of Mutually Orthogonal
Linear Contrasts if each linear contrast in the set is
orthogonal to any other linear contrast.
Theorem:
The maximum number of linear contrasts in a
set of Mutually Orthogonal Linear Contrasts
of the quantities m1, m2, ... , mp is p - 1.
p - 1 is called the degrees of freedom (d.f.)
for comparing quantities m1, m2, ... , mp .
Comments
1. Linear contrasts are making comparisons
amongst the p values m1, m2, ... , mp
2. Orthogonal Linear Contrasts are making
independent comparisons amongst the p
values m1, m2, ... , mp .
3. The number of independent comparisons
amongst the p values m1, m2, ... , mp is p – 1.
Definition
Let L = a1m1 + a2m2 + ... + apmp
denote a linear contrast of the p means
m1, m2, ... , mp.
Let
L̂ = a1x̄1 + a2x̄2 + ... + apx̄p
where each mean, x̄i, is calculated from n
observations.
Then the Sum of Squares for testing
the Linear Contrast L,
i.e.
H0: L = 0 against HA: L ≠ 0
is defined to be:
SSL = nL̂² / (a1² + a2² + ... + ap²)
Note: if M = cL̂ = ca1x̄1 + ca2x̄2 + ... + capx̄p then
SSM = nc²L̂² / (c²a1² + c²a2² + ... + c²ap²) = nL̂² / (a1² + a2² + ... + ap²) = SSL
so the Sum of Squares does not depend on how the coefficients are scaled.
the degrees of freedom (df) for testing the
Linear Contrast L is defined to be dfL = 1
the F-ratio for testing the Linear Contrast
L is defined to be:
F = (SSL / 1) / MSError
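As a concrete sketch of the last two definitions (plain Python; the three means, n = 5, and MSError = 10 are made-up illustration values, not from any example in this unit):

```python
def contrast_ss(coefs, means, n):
    # SS_L = n * Lhat^2 / sum(a_i^2); each mean based on n observations
    lhat = sum(a * m for a, m in zip(coefs, means))
    return n * lhat ** 2 / sum(a * a for a in coefs)

def contrast_f(coefs, means, n, ms_error):
    # F = (SS_L / 1) / MSError, on 1 and error degrees of freedom
    return contrast_ss(coefs, means, n) / ms_error

means = [10.0, 12.0, 17.0]          # p = 3 hypothetical means, n = 5 each
L = [1, 1, -2]                      # 1st and 2nd vs 3rd

print(contrast_ss(L, means, 5))                 # 120.0
print(contrast_ss([0.5, 0.5, -1], means, 5))    # 120.0
```

The second call illustrates the Note above: rescaling the coefficients leaves SSL unchanged.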
To test if a set of mutually orthogonal
linear contrasts are zero:
i.e.
H0: L1 = 0, L2 = 0, ... , Lk= 0
then the Sum of Squares is:
SSH0 = SSL1 + SSL2 + ... + SSLk
the degrees of freedom (df) is dfH0 = k, and
the F-ratio is:
F = (SSH0 / k) / MSError = (SSH0 / k) / s²
Theorem:
Let L1, L2, ... , Lp-1 denote p - 1 mutually
orthogonal Linear contrasts for comparing the
p means. Then the Sum of Squares for
comparing the p means based on p - 1 degrees
of freedom, SSBetween, satisfies:
SSBetween = n Σi=1..p (x̄i - x̄)² = SSL1 + SSL2 + ... + SSLp-1
Comment
Defining a set of Orthogonal Linear Contrasts for
comparing the p means x̄1, x̄2, ..., x̄p allows the
researcher to "break apart" the Sum of Squares for
comparing the p means, SSBetween, and make
individual tests of each Linear Contrast.
The Diet-Weight Gain example
x̄1 = 100.0, x̄2 = 85.9, x̄3 = 99.5,
x̄4 = 79.2, x̄5 = 83.9, x̄6 = 78.7
The sum of Squares for comparing the 6
means is given in the Anova Table:
Five mutually orthogonal contrasts are given below
(together with a description of the purpose of these
contrasts):

L1 = (x̄1 + x̄2 + x̄3)/3 - (x̄4 + x̄5 + x̄6)/3
(A comparison of the High protein diets with the Low
protein diets)

L2 = (x̄1 + x̄4)/2 - (x̄3 + x̄6)/2
(A comparison of the Beef source of protein with the
Pork source of protein)

L3 = (x̄1 + x̄3 + x̄4 + x̄6)/4 - (x̄2 + x̄5)/2
(A comparison of the Meat (Beef, Pork) sources of
protein with the Cereal source of protein)

L4 = (x̄1 + x̄6)/2 - (x̄3 + x̄4)/2
(A comparison representing interaction between Level
of protein and Source of protein for the Meat source
of protein)

L5 = (x̄1 - 2x̄2 + x̄3)/4 - (x̄4 - 2x̄5 + x̄6)/4
(A comparison representing interaction between Level
of protein and the Cereal source of protein)
Table of Coefficients

diet   L1   L2   L3   L4   L5
1       1    1    1    1    1
2       1    0   -2    0   -2
3       1   -1    1   -1    1
4      -1    1    1   -1   -1
5      -1    0   -2    0    2
6      -1   -1    1    1   -1
Note: L4 = L1 × L2 and L5 = L1 × L3
L1 is the 1 df for the Level main effect
L2 and L3 are the 2 df for the Source main effect
L4 and L5 are the 2 df for the Source-Level interaction
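These contrast sums of squares can be reproduced directly from the six diet means. A sketch in plain Python (coefficients scaled to integers, which leaves SSL unchanged):

```python
def contrast_ss(coefs, means, n):
    # SS_L = n * Lhat^2 / sum(a_i^2)
    lhat = sum(a * m for a, m in zip(coefs, means))
    return n * lhat ** 2 / sum(a * a for a in coefs)

xbar = [100.0, 85.9, 99.5, 79.2, 83.9, 78.7]   # diet means, n = 10 each

L1 = [1, 1, 1, -1, -1, -1]   # High vs Low protein
L2 = [1, 0, -1, 1, 0, -1]    # Beef vs Pork

print(round(contrast_ss(L1, xbar, 10), 3))   # 3168.267
print(round(contrast_ss(L2, xbar, 10), 3))   # 2.5
```

Both values match the Anova table that follows.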
The Anova Table for Testing these contrasts is given
below:
Source         DF   Sum Squares   Mean Square   F-test
Contrast L1     1      3168.267      3168.267   14.767
Contrast L2     1         2.500         2.500    0.012
Contrast L3     1       264.033       264.033    1.231
Contrast L4     1         0.000         0.000    0.000
Contrast L5     1      1178.133      1178.133    5.491
Error          54     11586.000       214.556
The Mutually Orthogonal contrasts that are
eventually selected should be determined prior to
observing the data and should be determined by the
objectives of the experiment.
Another five mutually orthogonal contrasts are
given below (together with a description of the
purpose of these contrasts):

L1 = (x̄1 + x̄4)/2 - (x̄3 + x̄6)/2
(A comparison of the Beef source of protein with the
Pork source of protein)

L2 = (x̄1 + x̄3 + x̄4 + x̄6)/4 - (x̄2 + x̄5)/2
(A comparison of the Meat (Beef, Pork) sources of
protein with the Cereal source of protein)

L3 = x̄1 - x̄4
(A comparison of the high and low protein diets for
the Beef source of protein)

L4 = x̄2 - x̄5
(A comparison of the high and low protein diets for
the Cereal source of protein)

L5 = x̄3 - x̄6
(A comparison of the high and low protein diets for
the Pork source of protein)
Table of Coefficients

diet   L1   L2   L3   L4   L5
1       1    1    1    0    0
2       0   -2    0    1    0
3      -1    1    0    0    1
4       1    1   -1    0    0
5       0   -2    0   -1    0
6      -1    1    0    0   -1
Note:
L1 and L2 are the 2 df for the Source main effect
L3, L4 and L5 are the 3 df comparing the Level within
the Source.
The Anova Table for Testing these contrasts is given
below:
Source                        DF   Sum Squares   Mean Square   F-test
Beef vs Pork (L1)              1         2.500         2.500    0.012
Meat vs Cereal (L2)            1       264.033       264.033    1.231
High vs Low for Beef (L3)      1      2163.200      2163.200   10.082
High vs Low for Cereal (L4)    1        20.000        20.000    0.093
High vs Low for Pork (L5)      1      2163.200      2163.200   10.082
Error                         54     11586.000       214.556
Techniques for constructing
orthogonal linear contrasts
I. Comparing the first k - 1 with the kth
Consider the p values y1, y2, y3, ... , yp
L1 = 1st vs 2nd = y1 - y2
L2 = 1st, 2nd vs 3rd = ½(y1 + y2) - y3
L3 = 1st, 2nd, 3rd vs 4th = 1/3(y1 + y2 + y3) - y4
etc.
Helmert contrasts (p = 5)

Contrast   coefficients          explanation
L1         -1   1   0   0   0    2nd versus 1st
L2         -1  -1   2   0   0    3rd versus 1st and 2nd
L3         -1  -1  -1   3   0    4th versus 1st, 2nd and 3rd
L4         -1  -1  -1  -1   4    5th versus 1st, 2nd, 3rd and 4th
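Helmert coefficients are easy to generate for any p. A minimal sketch (the function name is ours):

```python
def helmert(p):
    # contrast j (j = 2..p): the j-th value versus the mean of the
    # first j - 1, scaled to integers: (-1, ..., -1, j-1, 0, ..., 0)
    return [[-1] * (j - 1) + [j - 1] + [0] * (p - j) for j in range(2, p + 1)]

H = helmert(5)
for row in H:
    print(row)
# [-1, 1, 0, 0, 0]
# [-1, -1, 2, 0, 0]
# [-1, -1, -1, 3, 0]
# [-1, -1, -1, -1, 4]

# p - 1 = 4 contrasts, mutually orthogonal, as the theorem requires
ok = all(sum(a * b for a, b in zip(H[i], H[j])) == 0
         for i in range(len(H)) for j in range(i + 1, len(H)))
print(ok and all(sum(r) == 0 for r in H))   # True
```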
II.
Comparing between Groups then within groups
Consider the p = 10 values – y1, y2, y3, ... , y10
Suppose these 10 values are grouped
Group 1
y1, y2, y3
Group 2
y4, y5, y6 , y7
Group 3
y8, y9, y10
Comparison of Groups (2 d.f.)
L1 = Group 1 vs Group 2
= 1/3(y1 + y2 + y3) - 1/4(y4 + y5 + y6 + y7)
L2 = Group 1, Group 2 vs Group 3
= 1/7(y1 + y2 + y3 + y4 + y5 + y6 + y7) - 1/3( y8 + y9 + y10)
Comparison of within Groups
Within Group 1 (2 d.f.)
L3 = 1 vs 2 = y1 - y2
L4 = 1,2 vs 3= 1/2(y1 + y2) - y3
Within Group 2 (3 d.f.)
L5 = 4 vs 5 = y4 – y5
L6 = 4, 5 vs 6= 1/2(y4 + y5) – y6
L7 = 4, 5, 6 vs 7= 1/3(y4 + y5 + y6) –y7
Within Group 3 (2 d.f.)
L8 = 8 vs 9 = y8 – y9
L9 = 8, 9 vs 10= 1/2(y8 + y9) –y10
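The nine contrasts above (2 between-group and 2 + 3 + 2 within-group) can be checked for mutual orthogonality with exact rational arithmetic. A sketch in Python (the helper C and the variable names are ours):

```python
from fractions import Fraction as F

def C(*pairs, p=10):
    # build a length-p coefficient vector from (1-based index, value) pairs
    v = [F(0)] * p
    for i, val in pairs:
        v[i - 1] = F(val)
    return v

third, quarter, seventh, half = F(1, 3), F(1, 4), F(1, 7), F(1, 2)
contrasts = [
    [third] * 3 + [-quarter] * 4 + [F(0)] * 3,              # L1: G1 vs G2
    [seventh] * 7 + [-third] * 3,                           # L2: G1,G2 vs G3
    C((1, 1), (2, -1)),                                     # L3
    C((1, half), (2, half), (3, -1)),                       # L4
    C((4, 1), (5, -1)),                                     # L5
    C((4, half), (5, half), (6, -1)),                       # L6
    C((4, third), (5, third), (6, third), (7, -1)),         # L7
    C((8, 1), (9, -1)),                                     # L8
    C((8, half), (9, half), (10, -1)),                      # L9
]

ok = all(sum(a * b for a, b in zip(contrasts[i], contrasts[j])) == 0
         for i in range(9) for j in range(i + 1, 9))
print(ok, len(contrasts) == 10 - 1)   # True True
```

Using `fractions.Fraction` avoids any floating-point tolerance question: every dot product is exactly zero.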
III. Comparisons when Grouping is done in two
different ways
Consider the p = ab values – y11, y12, y13, ... , y1b , y21,
y22, y23, ... , y2b , ... , ya1, ya2, ya3, ... , yab
                Column Groups
                 1     2     3    ...    b
Row      1      y11   y12   y13   ...   y1b
Groups   2      y21   y22   y23   ...   y2b
         3      y31   y32   y33   ...   y3b
         ⁞       ⁞     ⁞     ⁞           ⁞
         a      ya1   ya2   ya3   ...   yab
Comparison of Row Groups (a - 1 d.f.)
R1 , R2 , R3 , ... , Ra -1
Comparison of Column Groups (b - 1 d.f.)
C1 , C2 , C3 , ... , Cb -1
Interaction contrasts (a - 1) (b - 1) d.f.
(RC)11 = R1 × C1 , (RC)12 = R1 × C2 , ... ,
(RC)a - 1,b - 1 = Ra - 1 × Cb – 1
Comment: The coefficients of (RC)ij = Ri × Cj are
found by multiplying the coefficients of Ri with the
coefficients of Cj.
Example (a = 3, b = 4)

        1     2     3     4
1      y11   y12   y13   y14
2      y21   y22   y23   y24
3      y31   y32   y33   y34
Orthogonal Contrasts

              y11  y12  y13  y14  y21  y22  y23  y24  y31  y32  y33  y34
R1             1    1    1    1   -1   -1   -1   -1    0    0    0    0
R2             1    1    1    1    1    1    1    1   -2   -2   -2   -2
C1             1   -1    0    0    1   -1    0    0    1   -1    0    0
C2             1    1   -2    0    1    1   -2    0    1    1   -2    0
C3             1    1    1   -3    1    1    1   -3    1    1    1   -3
(RC)11=R1×C1   1   -1    0    0   -1    1    0    0    0    0    0    0
(RC)12=R1×C2   1    1   -2    0   -1   -1    2    0    0    0    0    0
(RC)13=R1×C3   1    1    1   -3   -1   -1   -1    3    0    0    0    0
(RC)21=R2×C1   1   -1    0    0    1   -1    0    0   -2    2    0    0
(RC)22=R2×C2   1    1   -2    0    1    1   -2    0   -2   -2    4    0
(RC)23=R2×C3   1    1    1   -3    1    1    1   -3   -2   -2   -2    6
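The product construction in the comment above is easy to verify. A sketch in plain Python for the a = 3, b = 4 example (list names are ours):

```python
# Row contrasts (a = 3 rows) and column contrasts (b = 4 columns)
R = [[1, -1, 0], [1, 1, -2]]
C = [[1, -1, 0, 0], [1, 1, -2, 0], [1, 1, 1, -3]]

def product_contrast(r, c):
    # (RC)ij coefficient on cell (row, col) is Ri[row] * Cj[col],
    # flattened row-major over y11, y12, ..., y34
    return [ri * cj for ri in r for cj in c]

RC11 = product_contrast(R[0], C[0])
print(RC11)   # [1, -1, 0, 0, -1, 1, 0, 0, 0, 0, 0, 0]

# Expand the R's and C's to the 12 cells and collect all ab - 1 = 11 contrasts
rows  = [[ri for ri in r for _ in range(4)] for r in R]
cols  = [c * 3 for c in C]
inter = [product_contrast(r, c) for r in R for c in C]
allc  = rows + cols + inter

ok = all(sum(x * y for x, y in zip(allc[i], allc[j])) == 0
         for i in range(len(allc)) for j in range(i + 1, len(allc)))
print(ok)   # True
```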
Orthogonal Linear Contrasts for Polynomial Regression
Let m1, m2, ... , mp denote p means and consider the
first differences
Dmi = mi - mi-1
if m1 = m2 = ... = mp then
Dmi = mi - mi-1 = 0
If the points (1, m1), (2, m2) … (p, mp) lie on a
straight line with non-zero slope then
Dmi = mi - mi-1 ≠ 0
but equal.
Consider the 2nd differences
D2mi = (mi - mi-1)-(mi -1 - mi-2) = mi - 2mi-1 + mi-2
If the points (1, m1), (2, m2) … (p, mp) lie on a
straight line then
D2mi = mi - 2mi-1 + mi-2 = 0
If the points (1, m1), (2, m2) … (p, mp) lie on a
quadratic curve then
D2mi = mi - 2mi-1 + mi-2 ≠ 0
but equal.
Consider the 3rd differences
D3mi = mi - 3mi-1 + 3mi-2 - mi-3
If the points (1, m1), (2, m2) … (p, mp) lie on a
quadratic curve then
D3mi = mi - 3mi-1 + 3mi-2 - mi-3 = 0
If the points (1, m1), (2, m2) … (p, mp) lie on a cubic
curve then
D3mi = mi - 3mi-1 + 3mi-2 - mi-3 ≠ 0
but equal.
Continuing,
4th differences, D4mi will be non- zero but equal if
the points (1, m1), (2, m2) … (p, mp) lie on a quartic
curve (4th degree).
5th differences, D5mi will be non- zero but equal if
the points (1, m1), (2, m2) … (p, mp) lie on a quintic
curve (5th degree).
etc.
Let
L = a2 Dm2 + a3 Dm3 + … + ap Dmp
Q2 = b3 D2m3 + … + bp D2mp
C = c4 D3m4 + … + cp D3mp
Q4 = d5 D4m5+ … + dp D4mp
etc.
where a2, …, ap, b3, …, bp, c4, … etc. are chosen so
that L, Q2, C, Q4, … etc. are mutually orthogonal
contrasts.
If the means are equal then
L = Q2 = C = Q4 = … = 0.
If the means are linear then
L ≠ 0 but Q2 = C = Q4 = … = 0.
If the means are quadratic then
Q2 ≠ 0 but C = Q4 = … = 0.
If the means are cubic then
C ≠ 0 but Q4 = … = 0.
Orthogonal Linear Contrasts for Polynomial Regression
Example
In this example we are measuring the “Life”
of an electronic component and how it
depends on the activation temperature.
Table

Activation Temperature     0     25     50     75    100
                          53     60     67     65     58
                          50     62     70     68     62
                          47     58     73     62     60
Ti.                      150    180    210    195    180    T.. = 915
Mean                      50     60     70     65     60

Σyij² = 56545    ΣTi.²/n = 56475    T..²/(nt) = 55815
The Anova Table

L = 25.00   Q2 = -45.00   C = 0.00   Q4 = 30.00

Source      SS       df   MS       F
Treat       660.00    4   165.00   23.57
Linear      187.50    1   187.50   26.79
Quadratic   433.93    1   433.93   61.99
Cubic         0.00    1     0.00    0.00
Quartic      38.57    1    38.57    5.51
Error        70.00   10     7.00
Total       730.00   14
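The L, Q2, C, Q4 values and the partitioned sums of squares above can be reproduced from the five cell means using the orthogonal-polynomial coefficients for five equally spaced levels (the coefficients below are standard tabled values, not shown in the slides):

```python
# Standard orthogonal-polynomial coefficients, 5 equally spaced levels
linear    = [-2, -1,  0,  1, 2]
quadratic = [ 2, -1, -2, -1, 2]
cubic     = [-1,  2,  0, -2, 1]
quartic   = [ 1, -4,  6, -4, 1]

means = [50, 60, 70, 65, 60]   # cell means at temps 0, 25, 50, 75, 100
n = 3                          # observations per temperature

def lhat(coefs, m):
    return sum(c * mi for c, mi in zip(coefs, m))

def ss(coefs, m, n):
    # SS for one contrast: n * Lhat^2 / sum of squared coefficients
    return n * lhat(coefs, m) ** 2 / sum(c * c for c in coefs)

print([lhat(c, means) for c in (linear, quadratic, cubic, quartic)])
# [25, -45, 0, 30] -- the L, Q2, C, Q4 values quoted above
print([round(ss(c, means, n), 2) for c in (linear, quadratic, cubic, quartic)])
# [187.5, 433.93, 0.0, 38.57] -- these add up to SSTreat = 660
```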
The Anova Tables
for Determining degree of polynomial

Testing for effect of the factor:

Source   SS    df   MS    F
Treat    660    4   165   23.57
Error     70   10     7
Total    730   14
Testing for departure from Linear: SSH0 = SSQ2 + SSC + SSQ4 (3 df)
Testing for departure from Quadratic: SSH0 = SSC + SSQ4 (2 df)
[Plot: Life versus Activation Temperature, with the fitted
quadratic y = 49.751 + 0.61429x - 0.0051429x²]
Multiple Testing
• Fisher’s L.S.D. (Least Significant
Difference) Procedure
• Tukey’s Multiple comparison procedure
• Scheffe’s multiple comparison procedure
Multiple Testing – a Simple Example
Suppose we are interested in testing to see if
two parameters (q1 and q2) are equal to zero.
There are two approaches
1. We could test each parameter separately
a) H0: q1 = 0 against HA: q1 ≠ 0 , then
b) H0: q2 = 0 against HA: q2 ≠ 0
2. We could develop an overall test
H0: q1 = 0, q2= 0 against HA: q1 ≠ 0 or q2 ≠ 0
1. To test each parameter separately:
a) H0(1): q1 = 0 against HA(1): q1 ≠ 0, then
b) H0(2): q2 = 0 against HA(2): q2 ≠ 0
We might use the following test:
Reject H0(1) if |q̂1| ≥ K, then
Reject H0(2) if |q̂2| ≥ K
where K is chosen so that the probability of a
Type I error of each test is α.
2. To perform an overall test of
H0: q1 = 0, q2 = 0 against HA: q1 ≠ 0 or q2 ≠ 0
we might use the test:
Reject H0 if q̂1² + q̂2² ≥ K(overall)
where K(overall) is chosen so that the probability of a
Type I error is α.
[Figures: acceptance regions in the (q̂1, q̂2) plane.
The two separate tests |q̂1| ≤ K, |q̂2| ≤ K give a square
region; the overall test q̂1² + q̂2² ≤ K(overall) gives a
circular region; enlarging the square to |q̂1| ≤ K(multiple),
|q̂2| ≤ K(multiple) so that it contains the circle controls
the overall Type I error; Scheffé's procedure rejects when
|c1q̂1 + c2q̂2| ≥ K(Scheffe) for some choice of (c1, c2), so
its acceptance region is bounded by lines tangent to the circle.]
Post-hoc Tests
Multiple Comparison Tests
Suppose we have p means
x1 , x2 ,  , x p
An F-test has revealed that there are significant
differences amongst the p means
We want to perform an analysis to determine
precisely where the differences exist.
Example One –way ANOVA
The F test – for comparing k means
Situation
• We have k normal populations
• Let mi and si denote the mean and standard
deviation of population i.
• i = 1, 2, 3, … k.
• Note: we assume that the standard deviation
for each population is the same.
s1 = s2 = … = sk = s
We want to test
H0: m1 = m2 = m3 = ... = mk
against
HA: mi ≠ mj for at least one pair (i, j)
Anova Table

Source    d.f.    Sum of Squares   Mean Square   F-ratio
Between   k - 1   SSBetween        MSBetween     MSB/MSW
Within    N - k   SSWithin         MSWithin
Total     N - 1   SSTotal

SSBetween = n Σi=1..k (x̄i - x̄)²
Comments
• The F-test H0: m1 = m2 = m3 = … = mk against HA: at
least one pair of means are different
• If H0 is accepted we know that all means are equal
(not significantly different)
• If H0 is rejected we conclude that at least one pair of
means is significantly different.
• The F – test gives no information to which pairs of
means are different.
• One can now use two sample t tests to determine
which pairs of means are significantly different
Fisher's LSD (least significant difference)
procedure:
1. Test H0: m1 = m2 = m3 = … = mk against HA:
at least one pair of means are different,
using the ANOVA F-test
2. If H0 is accepted we know that all means
are equal (not significantly different). Then
stop in this case
3. If H0 is rejected we conclude that at least
one pair of means is significantly different,
then follow this by
• using two sample t tests to determine which pairs
of means are significantly different
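A sketch of the whole procedure in plain Python (the function, its argument layout, and the worked data are ours; the tabled F and t critical values must be supplied by the user):

```python
from itertools import combinations

def fisher_lsd(groups, f_crit, t_crit):
    # Step 1: overall ANOVA F-test against the tabled critical value
    k = len(groups)
    N = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / N
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum(sum((x - m) ** 2 for x in g) for g, m in zip(groups, means))
    mse = ss_within / (N - k)
    if (ss_between / (k - 1)) / mse <= f_crit:
        return []   # H0 accepted: stop, no pairs declared different
    # Step 2: pairwise two-sample t-tests using the pooled MSError
    pairs = []
    for i, j in combinations(range(k), 2):
        lsd = t_crit * (mse * (1 / len(groups[i]) + 1 / len(groups[j]))) ** 0.5
        if abs(means[i] - means[j]) > lsd:
            pairs.append((i + 1, j + 1))   # 1-based group labels
    return pairs

# Hypothetical data: three groups of three observations
print(fisher_lsd([[3, 4, 5], [3.5, 4.5, 5.5], [9, 10, 11]], 5.14, 2.447))
# [(1, 3), (2, 3)]
```

Here f_crit = 5.14 is F0.05(2, 6) and t_crit = 2.447 is the two-sided 5% t value with 6 df; groups 1 and 2 are not declared different, but both differ from group 3.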
Tukey’s Multiple Comparison
Test
Let
s/√n = √(MSError/n)
denote the standard error of each x̄i

Tukey's Critical Differences
D = q · (s/√n) = q · √(MSError/n)
Two means are declared significantly different if they
differ by more than this amount.
q = the tabled value for Tukey's studentized range
(p = no. of means, ν = df for Error)
Table: Critical values for Tukey’s
studentized Range distribution
Scheffe’s Multiple Comparison
Test
Scheffe's Critical Differences (for Linear
contrasts)

S = √((p - 1)·Fα(p - 1, ν)) · (s/√n) · √(a1² + a2² + ... + ap²)
  = √((p - 1)·Fα(p - 1, ν)) · √(MSError/n) · √(a1² + a2² + ... + ap²)

A linear contrast is declared significant if it
exceeds this amount.
Fα(p - 1, ν) = the tabled value for the F distribution
(p - 1 = df for comparing p means, ν = df for Error)
Scheffe's Critical Differences
(for comparing two means, L = x̄i - x̄j)

S = √((p - 1)·Fα(p - 1, ν)) · √(2·MSError/n)

Two means are declared significantly different if they
differ by more than this amount.
Multiple Confidence Intervals

Tukey's Multiple confidence intervals:
x̄i - x̄j ± D,   D = q·√(MSError/n)

Scheffe's Multiple confidence intervals:
x̄i - x̄j ± S,   S = √((p - 1)·Fα(p - 1, ν))·√(2·MSError/n)

One-at-a-time confidence intervals:
x̄i - x̄j ± tα/2·√(MSError·(1/n + 1/n))
Comments

Tukey's Multiple confidence intervals:
The probability that x̄i - x̄j ± D contains mi - mj for
all i and j is 1 - α.

One-at-a-time confidence intervals x̄i - x̄j ± tα/2·√(MSError·(1/n + 1/n)):
The probability that each of these intervals contains mi - mj is 1 - α.
The probability that all of these intervals contain mi - mj is
considerably lower than 1 - α.

Scheffe's Multiple confidence intervals:
These intervals can be computed not only for simple differences in
means, mi - mj, but also for any other linear contrast, c1m1 + ... + ckmk.
The probability that all of these intervals contain their linear
contrasts is 1 - α.
Example
In the following example we are comparing weight
gains resulting from the following six diets
1. Diet 1 - High Protein , Beef
2. Diet 2 - High Protein , Cereal
3. Diet 3 - High Protein , Pork
4. Diet 4 - Low protein , Beef
5. Diet 5 - Low protein , Cereal
6. Diet 6 - Low protein , Pork
Gains in weight (grams) for rats under six diets
differing in level of protein (High or Low)
and source of protein (Beef, Cereal, or Pork)
Diet           1        2        3        4        5        6
              73       98       94       90      107       49
             102       74       79       76       95       82
             118       56       96       90       97       73
             104      111       98       64       80       86
              81       95      102       86       98       81
             107       88      102       51       74       97
             100       82      108       72       74      106
              87       77       91       90       67       70
             117       86      120       95       89       61
             111       92      105       78       58       82
Mean       100.0     85.9     99.5     79.2     83.9     78.7
Std. Dev.  15.14    15.02    10.92    13.89    15.71    16.55
Σx          1000      859      995      792      839      787
Σx²       102062    75819   100075    64462    72613    64401
The Diet Example
Source    d.f.   Sum of Squares   Mean Square   F-ratio
Between      5         4612.933       922.587   4.3  (p = 0.0023)
Within      54        11586.000       214.556
Total       59        16198.933
k = 6, ν = 54 (≈ 60), M.S.E. = 214.556, n = 10
Tukey's critical difference
q0.05 = 4.163
D0.05 = q0.05·√(MSError/n) = 4.163·√(214.556/10) = 19.28
Tukey's intervals: x̄i - x̄j ± D0.05, i.e. x̄i - x̄j ± 19.28
Scheffe's critical difference
F0.05 = 2.368 (ν1 = k - 1 = 5, ν2 = 54 (≈ 60))
S0.05 = √((k - 1)·F0.05)·√(2·MSError/n) = √(5 × 2.368)·√(2 × 214.556/10) = 22.54
Scheffe's intervals: x̄i - x̄j ± S0.05, i.e. x̄i - x̄j ± 22.54
One-at-a-time critical difference
t0.025 = 2.000 (ν = 54 (≈ 60))
T0.05 = t0.025·√(2·MSError/n) = 2.000·√(2 × 214.556/10) = 13.10
One-at-a-time intervals: x̄i - x̄j ± T0.05, i.e. x̄i - x̄j ± 13.10
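The three critical differences can be recomputed directly from the tabled values. A sketch in plain Python:

```python
from math import sqrt

mse, n = 214.556, 10     # MSError and per-group sample size from the example
q_05  = 4.163            # Tukey's studentized range, p = 6 means, df = 60
f_05  = 2.368            # F(5, 60)
t_025 = 2.000            # t, df = 60

D = q_05 * sqrt(mse / n)                      # Tukey
S = sqrt((6 - 1) * f_05) * sqrt(2 * mse / n)  # Scheffe
T = t_025 * sqrt(2 * mse / n)                 # one-at-a-time

print(round(D, 2), round(S, 2), round(T, 2))   # 19.28 22.54 13.1
```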
Multiple Confidence Intervals

                        Tukey (19.28)      Scheffe (22.54)    One-at-a-time (13.10)
 i    x̄i    j    x̄j     lower    upper     lower    upper     lower    upper
 1   100.0  3   99.5   -18.78    19.78    -22.04    23.04    -12.60    13.60
 1   100.0  2   85.9    -5.18    33.38     -8.44    36.64      1.00    27.20
 1   100.0  5   83.9    -3.18    35.38     -6.44    38.64      3.00    29.20
 1   100.0  4   79.2     1.52    40.08     -1.74    43.34      7.70    33.90
 1   100.0  6   78.7     2.02    40.58     -1.24    43.84      8.20    34.40
 3    99.5  2   85.9    -5.68    32.88     -8.94    36.14      0.50    26.70
 3    99.5  5   83.9    -3.68    34.88     -6.94    38.14      2.50    28.70
 3    99.5  4   79.2     1.02    39.58     -2.24    42.84      7.20    33.40
 3    99.5  6   78.7     1.52    40.08     -1.74    43.34      7.70    33.90
 2    85.9  5   83.9   -17.28    21.28    -20.54    24.54    -11.10    15.10
 2    85.9  4   79.2   -12.58    25.98    -15.84    29.24     -6.40    19.80
 2    85.9  6   78.7   -12.08    26.48    -15.34    29.74     -5.90    20.30
 5    83.9  4   79.2   -14.58    23.98    -17.84    27.24     -8.40    17.80
 5    83.9  6   78.7   -14.08    24.48    -17.34    27.74     -7.90    18.30
 4    79.2  6   78.7   -18.78    19.78    -22.04    23.04    -12.60    13.60
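The rows of this table can be regenerated mechanically from the six means and the three critical differences. A sketch in plain Python:

```python
from itertools import combinations

D, S, T = 19.28, 22.54, 13.10    # Tukey, Scheffe, one-at-a-time
xbar = {1: 100.0, 2: 85.9, 3: 99.5, 4: 79.2, 5: 83.9, 6: 78.7}
order = sorted(xbar, key=xbar.get, reverse=True)   # diets by descending mean

rows = []
for i, j in combinations(order, 2):
    d = xbar[i] - xbar[j]
    # (i, j, Tukey lower/upper, Scheffe lower/upper, one-at-a-time lower/upper)
    rows.append((i, j) + tuple(round(v, 2)
                for w in (D, S, T) for v in (d - w, d + w)))

print(order)     # [1, 3, 2, 5, 4, 6]
print(rows[0])   # (1, 3, -18.78, 19.78, -22.04, 23.04, -12.6, 13.6)
```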
Multiple comparisons for Factorial Designs
In a balanced completely randomized design
(CRD) it is appropriate to use Tukey’s or
Scheffe’s procedure to compare the cell
means associated with the interaction
Example – Four factor experiment
Four factors are studied for their effect on Y (luster
of paint film). The four factors are:
1) Film Thickness - (1 or 2 mils)
2) Drying conditions (Regular or Special)
3) Length of wash (10,30,40 or 60 Minutes), and
4) Temperature of wash (92 ˚C or 100 ˚C)
Two observations of film luster (Y) are taken
for each treatment combination
The data is tabulated below:
[Data table: two replicates of film luster (Y) per treatment
combination. The Regular-dry, 92 ˚C cells are:
1-mil Thickness - 20 min: 3.4, 3.4; 30 min: 4.1, 4.1;
40 min: 4.9, 4.2; 60 min: 5.0, 4.9;
2-mil Thickness - 20 min: 5.5, 3.7; 30 min: 5.7, 6.1;
40 min: 5.5, 5.6; 60 min: 7.2, 6.0.
The remaining columns (Regular dry at 100 ˚C; Special dry at
92 ˚C and 100 ˚C) are not recoverable from this copy.]
• Since there is significant A(Temp) - B(drying) and
A(Temp) – D(Thickness) interactions, it is
appropriate to compare the 8
Temp×drying×Thickness cell means.
• Since the main effect for Length is significant and
is additive with the other 3 factors it is appropriate
to compare the 4 cell means associated with this
factor separately
Table 5: Critical values for the multiple range test and the F-distribution

                      q.05   q.01   F.05   F.01
Length                3.84   4.80   2.92   4.51
Temp,Thickness,Dry    4.60   5.54   2.33   3.30
Table 6: Tukey's and Scheffe's Critical Differences

                       Tukey           Scheffe
                       α=.05  α=.01    α=.05  α=.01
Length                 1.59   1.99     2.05   2.16
Temp, Thickness, Dry   3.81   4.59     4.74   5.64
Table of differences in means
(cell means: 4.25, 4.25, 4.44, 5.66, 15.45, 17.43, 28.76, 29.95)

  mean    4.25    4.44    5.66   15.45   17.43   28.76
  4.44    0.19
  5.66    1.41    1.22
 15.45   11.20   11.01    9.79
 17.43   13.18   12.99   11.77    1.98
 28.76   24.51   24.32   23.10   13.31   11.33
 29.95   25.70   25.51   24.29   14.50   12.52    1.19

Underlined groups have no significant differences.
There are many multiple (post hoc) comparison
procedures:
1. Tukey's
2. Scheffe's
3. Duncan's Multiple Range
4. Newman-Keuls
etc.
Considerable controversy:
“I have not included the multiple comparison methods of
D.B. Duncan because I have been unable to understand
their justification.” - H. Scheffé, The Analysis of Variance