Equivalence of the Full Model/Reduced Model and $H_0\colon C\beta = 0$ Sums of Squares
In a regression context, consider a full rank model matrix
$$ \underset{n\times(r+1)}{X} = \left(\mathbf{1}\,|\,x_1\,|\,x_2\,|\,\cdots\,|\,x_r\right) $$
For $p<r$, let
$$ X_p = \left(\mathbf{1}\,|\,x_1\,|\,x_2\,|\,\cdots\,|\,x_p\right) $$
and
$$ U_p = \left(x_{p+1}\,|\,x_{p+2}\,|\,\cdots\,|\,x_r\right) $$
For
$$ C = \left(\ \underset{(r-p)\times(p+1)}{0}\ \Big|\ \underset{(r-p)\times(r-p)}{I}\ \right) $$
consider
$$ SSH_0 = \left(Cb_{OLS}\right)'\left(C\left(X'X\right)^{-1}C'\right)^{-1}\left(Cb_{OLS}\right) $$
which we invented for testing $H_0\colon C\beta = 0$. The object here is to show that
$$ \begin{aligned} SSH_0 &= SSE_{\text{Reduced}} - SSE_{\text{Full}} \\ &= Y'\left(I - P_{X_p}\right)Y - Y'\left(I - P_X\right)Y \\ &= Y'\left(P_X - P_{X_p}\right)Y \end{aligned} $$
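As a concrete illustration of this claim, here is a minimal numerical sketch in Python/numpy. The simulated data, the dimensions $n$, $r$, $p$, and all variable names are illustrative assumptions and are not part of the note.

import numpy as np

# Simulated data with arbitrary (illustrative) dimensions.
rng = np.random.default_rng(0)
n, r, p = 30, 5, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, r))])   # n x (r+1) full model matrix
Y = rng.normal(size=n)

Xp = X[:, :p + 1]                                             # reduced model matrix (1|x1|...|xp)
C = np.hstack([np.zeros((r - p, p + 1)), np.eye(r - p)])      # (0 | I), of size (r-p) x (r+1)

def proj(M):
    # orthogonal projection matrix onto the column space of M
    return M @ np.linalg.solve(M.T @ M, M.T)

PX, PXp = proj(X), proj(Xp)
b = np.linalg.solve(X.T @ X, X.T @ Y)                         # full-model OLS coefficients

Cb = C @ b
SSH0 = Cb @ np.linalg.solve(C @ np.linalg.inv(X.T @ X) @ C.T, Cb)
SSE_full = Y @ (np.eye(n) - PX) @ Y
SSE_reduced = Y @ (np.eye(n) - PXp) @ Y

print(np.isclose(SSH0, SSE_reduced - SSE_full))   # True
print(np.isclose(SSH0, Y @ (PX - PXp) @ Y))       # True

The later code sketches below continue from these objects.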
To begin, note that with
$$ A = \left(U_p'\left(I - P_{X_p}\right)U_p\right)^{-1}U_p'\left(I - P_{X_p}\right) $$
we have
$$ C = AX $$
(since $\left(I - P_{X_p}\right)X_p = 0$ and $AU_p = I$, this $A$ sends $X = \left(X_p\,|\,U_p\right)$ to $\left(0\,|\,I\right) = C$).
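Continuing the numerical sketch above (still purely illustrative), this identity can be checked directly:

# Build A from the simulated X and verify C = A X.
Up = X[:, p + 1:]             # columns x_{p+1}, ..., x_r
M = np.eye(n) - PXp           # I - P_{Xp}
A = np.linalg.solve(Up.T @ M @ Up, Up.T @ M)
print(np.allclose(C, A @ X))  # True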
Then, using $Cb_{OLS} = AXb_{OLS} = A\hat{Y} = AP_XY$ and $C\left(X'X\right)^{-1}C' = AX\left(X'X\right)^{-1}X'A' = AP_XA'$,
$$ \begin{aligned} SSH_0 &= \left(A\hat{Y}\right)'\left(AP_XA'\right)^{-1}\left(A\hat{Y}\right) \\ &= Y'P_XA'\left(AP_XP_XA'\right)^{-1}AP_XY \end{aligned} $$
Then write
$$ \begin{aligned} P_XA' &= P_X\left(I - P_{X_p}\right)U_p\left(U_p'\left(I - P_{X_p}\right)U_p\right)^{-1} \\ &= \left(U_p - P_{X_p}U_p\right)\left(U_p'\left(I - P_{X_p}\right)U_p\right)^{-1} \\ &= \left(U_p - P_{X_p}U_p\right)\left(\left(U_p - P_{X_p}U_p\right)'\left(U_p - P_{X_p}U_p\right)\right)^{-1} \end{aligned} $$
(using $P_XU_p = U_p$, $P_XP_{X_p} = P_{X_p}$, and the fact that $I - P_{X_p}$ is symmetric and idempotent). Use the abbreviation
$$ U_p^* = U_p - P_{X_p}U_p $$
so that $P_XA' = U_p^*\left(U_p^{*\prime}U_p^*\right)^{-1}$,
and note that
$$ \begin{aligned} SSH_0 &= Y'U_p^*\left(U_p^{*\prime}U_p^*\right)^{-1}\left(\left(U_p^{*\prime}U_p^*\right)^{-1}U_p^{*\prime}U_p^*\left(U_p^{*\prime}U_p^*\right)^{-1}\right)^{-1}\left(U_p^{*\prime}U_p^*\right)^{-1}U_p^{*\prime}Y \\ &= Y'U_p^*\left(U_p^{*\prime}U_p^*\right)^{-1}U_p^{*\prime}Y \\ &= Y'P_{U_p^*}Y \end{aligned} $$
It then suffices to show that $P_{U_p^*} = P_X - P_{X_p}$.
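In the numerical sketch, this projection identity (and with it the whole equivalence) can be confirmed directly:

# Continuing the sketch: U*_p = (I - PXp) Up, and P_{U*_p} should equal PX - PXp.
Ustar = M @ Up                           # U*_p = U_p - PXp U_p
PUstar = proj(Ustar)
print(np.allclose(PUstar, PX - PXp))     # True
print(np.isclose(SSH0, Y @ PUstar @ Y))  # True: SSH0 = Y' P_{U*_p} Y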
Christensen's Theorem B.47 shows that $P_X - P_{X_p}$ is a perpendicular projection matrix. His Theorem B.48 says that $C\left(P_X - P_{X_p}\right)$ is the "orthogonal complement of $C\left(X_p\right)$ with respect to $C\left(X\right)$" defined on his page 395. That is,
$$ C\left(P_X - P_{X_p}\right) = C\left(X\right)\cap C\left(X_p\right)^{\perp} $$
so it suffices to show that $C\left(U_p^*\right) = C\left(X\right)\cap C\left(X_p\right)^{\perp}$.

Each column of $U_p - P_{X_p}U_p$ is the difference of a column of $U_p$ and a linear combination of the columns of $P_{X_p}$. Since $C\left(U_p\right)\subset C\left(X\right)$ and $C\left(P_{X_p}\right) = C\left(X_p\right)\subset C\left(X\right)$, each column of $U_p - P_{X_p}U_p$ is in $C\left(X\right)$, and so $C\left(U_p^*\right)\subset C\left(X\right)$.
So then note that if $v\in C\left(U_p^*\right)$, there exists a $c$ such that
$$ v = U_p^*c $$
For such a $v$,
$$ \left(I - P_{X_p}\right)v = \left(I - P_{X_p}\right)U_p^*c = \left(I - P_{X_p}\right)\left(I - P_{X_p}\right)U_pc = \left(I - P_{X_p}\right)U_pc = U_p^*c = v $$
so $P_{X_p}v = 0$ and $v\in C\left(X_p\right)^{\perp}$. Thus $C\left(U_p^*\right)\subset C\left(X\right)\cap C\left(X_p\right)^{\perp}$.
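In the sketch, these two facts correspond to two easy numerical checks (illustrative only):

# Columns of U*_p lie in C(X) and are orthogonal to C(Xp).
print(np.allclose(PX @ Ustar, Ustar))  # True: C(U*_p) is contained in C(X)
print(np.allclose(Xp.T @ Ustar, 0))    # True: C(U*_p) is orthogonal to C(Xp)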
Finally, suppose that $v\in C\left(X\right)\cap C\left(X_p\right)^{\perp}$ and, for some
$$ c = \begin{pmatrix} \underset{(p+1)\times 1}{c_1} \\ \underset{(r-p)\times 1}{c_2} \end{pmatrix} $$
write
$$ v = Xc = \left(X_p\,|\,U_p\right)c = \left(X_p\,|\,U_p^* + P_{X_p}U_p\right)c = X_pc_1 + \left(U_p^* + P_{X_p}U_p\right)c_2 $$
Then continuing,
$$ v = X_pc_1 + P_{X_p}U_pc_2 + U_p^*c_2 = P_{X_p}X_pc_1 + P_{X_p}U_pc_2 + U_p^*c_2 $$
so
$$ v = P_{X_p}\left(X_p\,|\,U_p\right)c + U_p^*c_2 = P_{X_p}v + U_p^*c_2 $$
Since $v\perp C\left(X_p\right)$, we then have $P_{X_p}v = 0$, so $v = U_p^*c_2$ and thus $v\in C\left(U_p^*\right)$. So $C\left(X\right)\cap C\left(X_p\right)^{\perp}\subset C\left(U_p^*\right)$ and we're done.