
When is the OLSE the BLUE?
© Copyright 2012
Dan Nettleton (Iowa State University)
Statistics 611
1 / 40
When is the Ordinary Least Squares Estimator (OLSE) the Best Linear
Unbiased Estimator (BLUE)?
We already know the OLSE is the BLUE under the GMM, but are there
other situations where the OLSE is the BLUE?
Consider an experiment involving 4 plants.
Two leaves are randomly selected from each plant.
One leaf from each plant is randomly selected for treatment with
treatment 1.
The other leaf from each plant receives treatment 2.
Let y_ij = the response for the treatment-i leaf from plant j
(i = 1, 2; j = 1, 2, 3, 4). Suppose
y_ij = μ_i + p_j + e_ij,
where p_1, ..., p_4, e_11, ..., e_24 are uncorrelated,
E(p_j) = E(e_ij) = 0, Var(p_j) = σ_p², Var(e_ij) = σ²
∀ i = 1, 2; j = 1, ..., 4.
Suppose σ_p²/σ² is known to be equal to 2 and
y = (y_11, y_21, y_12, y_22, y_13, y_23, y_14, y_24)′.
Show that the AM holds.
Find the OLSE of μ_1 − μ_2 and the BLUE of μ_1 − μ_2.
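As a numerical sanity check on this exercise, the following sketch builds the design above and computes both the OLSE and the GLSE of μ_1 − μ_2. The data are hypothetical (a random draw), and σ² = 1, σ_p² = 2 are assumed only to fix the known ratio σ_p²/σ² = 2; none of these values come from the slides.

```python
import numpy as np

# Hypothetical data for the 4-plant, 2-treatment leaf experiment
rng = np.random.default_rng(0)
sigma2, sigma2_p = 1.0, 2.0               # assumed scaling: sigma_p^2 / sigma^2 = 2

# y ordered as (y11, y21, y12, y22, y13, y23, y14, y24)'
X = np.tile(np.eye(2), (4, 1))            # column i indicates treatment i
Z = np.kron(np.eye(4), np.ones((2, 1)))   # column j indicates plant j
V = sigma2 * np.eye(8) + sigma2_p * Z @ Z.T   # Var(y) under the model above

y = rng.normal(size=8)                    # any y works: both estimators are linear in y
c = np.array([1.0, -1.0])                 # target: mu_1 - mu_2

olse = c @ np.linalg.lstsq(X, y, rcond=None)[0]            # c'(X'X)^- X'y
Vinv = np.linalg.inv(V)
glse = c @ np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)  # c'(X'V^-1 X)^- X'V^-1 y
print(np.isclose(olse, glse))             # the two estimates coincide here
```

In this balanced design the OLSE of μ_1 − μ_2 is just the difference of the treatment means, and the GLS weighting does not change it.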
Although we assumed that σ_p²/σ² = 2 in our example, that assumption
was not needed to find the GLSE.
We have looked at one specific example where the OLSE = GLSE =
BLUE.
What general conditions must be satisfied for this to hold?
Result 4.3:
Suppose the AM holds. The estimator t′y is the BLUE of E(t′y) ⇐⇒ t′y
is uncorrelated with every linear unbiased estimator of zero.
Proof:
(⇐=) Let h′y be an arbitrary linear unbiased estimator of E(t′y). Then
E((h − t)′y) = E(h′y − t′y) = E(h′y) − E(t′y) = 0.
Thus, (h − t)′y is a linear unbiased estimator of zero.
It follows that
Cov(t′y, (h − t)′y) = 0, so that
Var(h′y) = Var((h − t)′y + t′y) = Var((h − t)′y) + Var(t′y).
∴ Var(h′y) ≥ Var(t′y), with equality iff
Var((h − t)′y) = 0 ⇐⇒ h = t.
∴ t′y is the BLUE of E(t′y).
(=⇒) Suppose h′y is a linear unbiased estimator of zero. If h = 0, then
Cov(t′y, h′y) = Cov(t′y, 0) = 0.
Now suppose h ≠ 0. Let
c = Cov(t′y, h′y) and d = Var(h′y) > 0.
We need to show c = 0.
Now consider a′y = t′y − (c/d)h′y.
E(a′y) = E(t′y) − (c/d)E(h′y) = E(t′y).
Thus, a′y is a linear unbiased estimator of E(t′y).
Var(a′y) = Var(t′y − (c/d)h′y)
= Var(t′y) + (c²/d²)Var(h′y) − 2 Cov(t′y, (c/d)h′y)
= Var(t′y) + (c²/d²)d − 2(c/d)c
= Var(t′y) − c²/d.
Now
Var(a′y) = Var(t′y) − c²/d.
Since t′y has the lowest variance among all linear unbiased
estimators of E(t′y), Var(a′y) ≥ Var(t′y), which forces c²/d ≤ 0 and
hence c = 0. □
Corollary 4.1:
Under the AM, the estimator t′y is the BLUE of E(t′y) ⇐⇒ Vt ∈ C(X).
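Corollary 4.1's condition can be checked directly in the leaf example. This sketch (again assuming σ² = 1, σ_p² = 2 for concreteness) forms the OLSE coefficient vector t for μ_1 − μ_2 and verifies that Vt lies in C(X):

```python
import numpy as np

# Design and covariance from the leaf example (assumed scaling sigma^2 = 1, sigma_p^2 = 2)
X = np.tile(np.eye(2), (4, 1))
Z = np.kron(np.eye(4), np.ones((2, 1)))
V = np.eye(8) + 2.0 * Z @ Z.T

c = np.array([1.0, -1.0])                # target: mu_1 - mu_2
t = X @ np.linalg.pinv(X.T @ X) @ c      # t = X[(X'X)^-]'c, so t'y is the OLSE of mu_1 - mu_2

Vt = V @ t
P = X @ np.linalg.pinv(X.T @ X) @ X.T    # orthogonal projector onto C(X)
print(np.allclose(P @ Vt, Vt))           # Vt is its own projection, i.e. Vt ∈ C(X)
```

Since Vt ∈ C(X), Corollary 4.1 says t′y is the BLUE of E(t′y) = μ_1 − μ_2 in this design.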
Result 4.4:
Under the AM, the OLSE of c′β is the BLUE of c′β for all estimable
c′β ⇐⇒ ∃ a matrix Q such that VX = XQ.
Proof of Result 4.4:
(⇐=) Suppose c′β is estimable.
Let t′ = c′(X′X)⁻X′. Then
VX = XQ
⇒ VX[(X′X)⁻]′c = XQ[(X′X)⁻]′c
⇒ Vt ∈ C(X), which by Corollary 4.1
⇒ t′y is the BLUE of E(t′y)
⇒ c′β̂_OLS is the BLUE of c′β.
(=⇒) Suppose c′β̂_OLS is the BLUE of c′β for every estimable c′β.
Then, by Corollary 4.1,
VX[(X′X)⁻]′c ∈ C(X) ∀ c ∈ C(X′)
⇒ VX[(X′X)⁻]′X′a ∈ C(X) ∀ a ∈ ℝⁿ
⇒ VP_X a ∈ C(X) ∀ a ∈ ℝⁿ
⇒ ∃ q_i such that VP_X x_i = Xq_i ∀ i = 1, ..., p,
where x_i denotes the ith column of X.
⇒ VP_X [x_1, ..., x_p] = X[q_1, ..., q_p]
⇒ VP_X X = XQ, where Q = [q_1, ..., q_p]
⇒ VX = XQ for Q = [q_1, ..., q_p], since P_X X = X. □
Show that ∃ Q such that VX = XQ in our previous example.
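One way to carry out this exercise numerically (with the assumed scaling σ² = 1, σ_p² = 2, so that the ratio is 2 as in the example): solve XQ = VX by least squares and confirm the fit is exact.

```python
import numpy as np

# Design and covariance from the leaf example (assumed scaling sigma^2 = 1, sigma_p^2 = 2)
X = np.tile(np.eye(2), (4, 1))
Z = np.kron(np.eye(4), np.ones((2, 1)))
V = np.eye(8) + 2.0 * Z @ Z.T

# Solve X Q = V X column-by-column via least squares, then confirm exactness
Q, *_ = np.linalg.lstsq(X, V @ X, rcond=None)
print(Q)                             # here Q = I + 2*J_2 = [[3, 2], [2, 3]]
print(np.allclose(X @ Q, V @ X))     # VX = XQ holds exactly
```

The residual is zero because each leaf pair contributes the same 2×2 block, so VX = X(I + 2J_2), where J_2 is the 2×2 matrix of ones; by Result 4.4 the OLSE is therefore the BLUE for every estimable c′β in this example.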