CONDITIONAL PREDICTION AND UNBIASEDNESS IN
STRUCTURAL EQUATIONS
Gordon M. Kaufman
12 August 1965
138-65
MASSACHUSETTS
INSTITUTE OF TECHNOLOGY
50 MEMORIAL DRIVE
CAMBRIDGE, MASSACHUSETTS 02139
1. Introduction
Sewall [4], Waugh [6], and Srinivasan [5] discuss least squares estimators of a very particular set of two structural equations with no exogenous variables, and the net result of their discussion is that such estimators are unbiased predictors of one dependent variable given the other. Our purpose here is to generalize Sewall's main result in several directions, and to provide some ancillary facts about the structural equations defined in (2.1) below.

Throughout this note we distinguish a random variable from a value assumed by it with a tilde; e.g. the random matrix $\tilde\varepsilon$.
The symbol $\otimes$ denotes the Kronecker direct product; e.g. $A \otimes B$. And we say that the random matrix $\tilde H$ of dimension $(m \times m)$ is "Wishart with parameter set $(\Omega, n)$" if it has density
\[
f(H) \;=\; c\,|H|^{\frac{1}{2}(n-m-1)}\, e^{-\frac{1}{2}\operatorname{tr} H\Omega}, \qquad H \text{ PDS},\; n > m-1,
\]
and $f(H) = 0$ otherwise, where $c$ is a normalizing constant.
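Since this Wishart convention differs from the one most modern software uses, the correspondence is worth pinning down. The following is a minimal sketch (not from the paper; the sizes $m$, $n$ and the randomly drawn $\Omega$ are illustrative assumptions): the paper's "Wishart with parameter set $(\Omega, n)$" matches SciPy's `wishart` with `df = n` and `scale` equal to $\Omega^{-1}$, up to the normalizing constant $c$.

```python
import numpy as np
from scipy.stats import wishart

rng = np.random.default_rng(4)
m, n = 3, 10  # dimension and the paper's parameter n (illustrative values)

A = rng.normal(size=(m, m))
Omega = A @ A.T + m * np.eye(m)   # a PDS matrix playing the paper's Omega

# The paper's density is proportional to |H|^{(n-m-1)/2} exp(-tr(H Omega)/2);
# in SciPy's parameterization this is wishart(df=n, scale=Omega^{-1}).
W = wishart(df=n, scale=np.linalg.inv(Omega))
H1 = W.rvs(random_state=rng)
H2 = W.rvs(random_state=rng)

# The log-kernel from the paper, without the normalizing constant c:
kernel = lambda X: (0.5 * (n - m - 1) * np.log(np.linalg.det(X))
                    - 0.5 * np.trace(X @ Omega))

# logpdf - kernel should be the same constant (log c) at any PDS argument.
diff1 = W.logpdf(H1) - kernel(H1)
diff2 = W.logpdf(H2) - kernel(H2)
assert np.isclose(diff1, diff2)
```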
2. Generalization
Two questions immediately arise. First, "What is the analogue of Sewall's assertion about $x^t\tilde y / x^t x$ when $m > 2$, $B$ is an arbitrary $(m \times m)$ non-singular matrix, and exogenous variables are present?" And second, "What does the answer to the first question imply about the conditional expectation of any single endogenous variable given the values of all other endogenous variables and the values of the exogenous variables?"
To make these questions precise, consider the following system of stochastic equations
\[
B\,\tilde y^{(j)} \;=\; \Gamma z^{(j)} + \tilde u^{(j)}, \tag{2.1}
\]
where $B$ and $\Gamma$ are $(m \times m)$ and $(m \times r)$ coefficient matrices, fixed for all $j$, $z^{(j)}$ is an $(r \times 1)$ vector of predetermined variables, and $\tilde y^{(j)}$ and $\tilde u^{(j)}$ are $(m \times 1)$ random vectors. We assume that $\{\tilde u^{(j)},\ j = 1, 2, \ldots\}$ is a sequence of mutually independent, identically distributed Normal random vectors with mean $0$ and PDS covariance matrix $\Sigma$, and that $B$ is non-singular. One observes $(\tilde y^{(1)}, z^{(1)}), (\tilde y^{(2)}, z^{(2)}), \ldots$, but neither $B$, nor $\Gamma$, nor $\Sigma$ is known with certainty. Let
\[
\Pi = B^{-1}\Gamma, \qquad H = B^t \Sigma^{-1} B, \qquad \Omega = H^{-1},
\]
so that $\tilde y^{(j)} = \Pi z^{(j)} + B^{-1}\tilde u^{(j)}$, and $B^{-1}\tilde u^{(j)}$ is Normal with mean $0$ and covariance matrix $\Omega$.
If we partition a generic $\tilde y$ into $\tilde y_1$ of dimension $(p \times 1)$, $1 \le p \le m-1$, and $\tilde y_2$ of dimension $((m-p) \times 1)$, conformably partition $\Pi$ into $\Pi_1$ of dimension $(p \times r)$ and $\Pi_2$ of dimension $((m-p) \times r)$, and partition $H$ and $\Omega$ into
\[
H = \begin{bmatrix} H_{11} & H_{12} \\ H_{21} & H_{22} \end{bmatrix},
\qquad
\Omega = \begin{bmatrix} \Omega_{11} & \Omega_{12} \\ \Omega_{21} & \Omega_{22} \end{bmatrix},
\]
with $H_{11}$ and $\Omega_{11}$ of dimension $(p \times p)$, then
\[
E(\tilde y_1 \mid \tilde y_2, z) \;=\; \Pi_1 z - H_{11}^{-1} H_{12}\,(\tilde y_2 - \Pi_2 z) \;=\; \Pi_1 z + \Omega_{12}\Omega_{22}^{-1}\,(\tilde y_2 - \Pi_2 z) \tag{2.2}
\]
and
\[
\operatorname{Var}(\tilde y_1 \mid \tilde y_2, z) \;=\; H_{11}^{-1} \;=\; \Omega_{11} - \Omega_{12}\Omega_{22}^{-1}\Omega_{21}. \tag{2.3}
\]
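The two matrix identities behind (2.2) and (2.3) — that the conditional variance $H_{11}^{-1}$ equals the Schur complement $\Omega_{11} - \Omega_{12}\Omega_{22}^{-1}\Omega_{21}$, and that the slope $-H_{11}^{-1}H_{12}$ equals $\Omega_{12}\Omega_{22}^{-1}$ — can be checked numerically. A minimal NumPy sketch; the sizes $m$, $p$ and the randomly drawn $B$, $\Sigma$ are illustrative, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
m, p = 4, 2  # illustrative sizes (assumed, not from the paper)

# Draw a random non-singular B and a PDS Sigma.
B = rng.normal(size=(m, m))
A = rng.normal(size=(m, m))
Sigma = A @ A.T + m * np.eye(m)

H = B.T @ np.linalg.inv(Sigma) @ B   # precision of the reduced-form error
Omega = np.linalg.inv(H)             # covariance of the reduced-form error

H11, H12 = H[:p, :p], H[:p, p:]
O11, O12, O21, O22 = Omega[:p, :p], Omega[:p, p:], Omega[p:, :p], Omega[p:, p:]

# (2.3): Var(y1 | y2, z) = H11^{-1} = Omega11 - Omega12 Omega22^{-1} Omega21
lhs = np.linalg.inv(H11)
rhs = O11 - O12 @ np.linalg.inv(O22) @ O21
assert np.allclose(lhs, rhs)

# Slope identity behind (2.2) and (2.4): -H11^{-1} H12 = Omega12 Omega22^{-1}
assert np.allclose(-lhs @ H12, O12 @ np.linalg.inv(O22))
```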
Now suppose we observe a sequence $\{(\tilde y^{(j)}, z^{(j)}),\ j = 1, 2, \ldots, n\}$ of $n \ge r+m$ sample observations generated according to (2.1). Let
\[
\tilde Y = [\tilde y^{(1)}, \ldots, \tilde y^{(n)}], \qquad
Z = [z^{(1)}, \ldots, z^{(n)}] \text{ of rank } r, \qquad
V = \sum_j z^{(j)} z^{(j)t} = Z Z^t.
\]
Then it is well known ([1] p. 183) that given $\Pi$ and $H$, the statistics
\[
\tilde P = \tilde Y Z^t V^{-1} \qquad \text{and} \qquad
\tilde\varepsilon = \sum_j\, [\tilde y^{(j)} - \tilde P z^{(j)}]\,[\tilde y^{(j)} - \tilde P z^{(j)}]^t
\]
are mutually independent, that $\tilde P$ is an unbiased estimator of $\Pi$, and that $\frac{1}{n-r}\,\tilde\varepsilon$ is an unbiased estimator of $\Omega$. Partition $\tilde P$ and $\tilde\varepsilon$ as follows:
\[
\tilde P = \begin{bmatrix} \tilde P_1 \\ \tilde P_2 \end{bmatrix}, \quad \tilde P_1 \text{ of dimension } (p \times r);
\qquad
\tilde\varepsilon = \begin{bmatrix} \tilde\varepsilon_{11} & \tilde\varepsilon_{12} \\ \tilde\varepsilon_{21} & \tilde\varepsilon_{22} \end{bmatrix}, \quad \tilde\varepsilon_{11} \text{ of dimension } (p \times p).
\]
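A minimal NumPy sketch of these statistics (the sizes and the particular $B$, $\Gamma$, $\Sigma = I$ are illustrative assumptions, not the paper's): it simulates (2.1), forms $\tilde P = \tilde Y Z^t V^{-1}$ and $\tilde\varepsilon$, and compares $(n-r)^{-1}\tilde\varepsilon$ with $\Omega$.

```python
import numpy as np

rng = np.random.default_rng(1)
m, r, n = 3, 2, 200  # illustrative sizes (assumed)

B = np.eye(m) + 0.1 * rng.normal(size=(m, m))   # a non-singular B
Gamma = rng.normal(size=(m, r))
Pi = np.linalg.solve(B, Gamma)                  # reduced-form coefficients
Binv = np.linalg.inv(B)
Z = rng.normal(size=(r, n))                     # predetermined variables, rank r
U = rng.normal(size=(m, n))                     # structural disturbances, Sigma = I
Y = Pi @ Z + Binv @ U                           # y(j) = Pi z(j) + B^{-1} u(j)

V = Z @ Z.T
P = Y @ Z.T @ np.linalg.inv(V)                  # P~ = Y Z^t V^{-1}
E = (Y - P @ Z) @ (Y - P @ Z).T                 # e~ = sum of residual outer products

Omega = Binv @ Binv.T                           # here Sigma = I, so Omega = B^{-1} B^{-t}
# (n-r)^{-1} e~ estimates Omega; with n = 200 the two should be close.
print(np.abs(E / (n - r) - Omega).max())
```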
The analogue of the central question of [4], [5], and [6] may now be stated: Is it true that
\[
E(\tilde y_1^{(n+1)} \mid \tilde y_2^{(n+1)}, z^{(n+1)}) \;=\;
E(\tilde P_1)\, z^{(n+1)} + E(\tilde\varepsilon_{12}\,\tilde\varepsilon_{22}^{-1})\,[\tilde y_2^{(n+1)} - E(\tilde P_2)\, z^{(n+1)}]\,? \tag{2.4}
\]
The answer is "yes," and this is proven shortly. We go on, however, and do considerably more in the next section.
Defining $\tilde R_{1.2} = \tilde\varepsilon_{12}\,\tilde\varepsilon_{22}^{-1}$ and $\tilde\varepsilon_{11.2} = \tilde\varepsilon_{11} - \tilde\varepsilon_{12}\,\tilde\varepsilon_{22}^{-1}\,\tilde\varepsilon_{21}$, in the next section we show that the joint likelihood of $(\tilde P, \tilde R_{1.2}, \tilde\varepsilon_{11.2}, \tilde\varepsilon_{22})$ factors, and that these statistics have these properties:

(1) $\tilde P$ is Normal with mean $\Pi$ and variance-covariance matrix $\Omega \otimes V^{-1}$, and is independent of $\tilde R_{1.2}$, $\tilde\varepsilon_{11.2}$, and $\tilde\varepsilon_{22}$;

(2) the density of $\tilde R_{1.2}$ given $\tilde\varepsilon_{22}$ is Normal with mean $\Omega_{12}\Omega_{22}^{-1} = -H_{11}^{-1}H_{12}$ and variance-covariance matrix $H_{11}^{-1} \otimes \tilde\varepsilon_{22}^{-1}$, and the marginal density of $\tilde\varepsilon_{22}$ is Wishart with parameter set $(\Omega_{22}^{-1}, n+p-r)$; $\tilde R_{1.2}$ and $\tilde\varepsilon_{22}$ are jointly independent of $\tilde P$ and $\tilde\varepsilon_{11.2}$;

(3) the density of $\tilde\varepsilon_{11.2}$ is Wishart with parameter set $(H_{11}, n-r)$, and $\tilde\varepsilon_{11.2}$ is independent of $\tilde P$, $\tilde R_{1.2}$, and $\tilde\varepsilon_{22}$;

(4) the density of $\tilde R_{1.2}$, unconditional as regards $\tilde P$, $\tilde\varepsilon_{11.2}$, and $\tilde\varepsilon_{22}$, is
\[
c''\,\bigl|\,[\tilde R_{1.2} - \Omega_{12}\Omega_{22}^{-1}]\,\Omega_{22}\,[\tilde R_{1.2} - \Omega_{12}\Omega_{22}^{-1}]^t + H_{11}^{-1}\,\bigr|^{-\frac{1}{2}(n+p-r)},
\]
where $c''$ is a normalizing constant equal to
\[
\pi^{-\frac{1}{2}p(m-p)}\;|\Omega_{22}|^{\frac{1}{2}p}\;|H_{11}|^{\frac{1}{2}(m-p)-\frac{1}{2}(n+p-r)}\;
\prod_{i=1}^{m-p} \frac{\Gamma(\tfrac{1}{2}[n+p-r] - \tfrac{1}{2}(i-1))}{\Gamma(\tfrac{1}{2}[n-r] - \tfrac{1}{2}(i-1))}.
\]
This is the generalized multivariate Student density first derived by Savage [3]. $\tilde R_{1.2}$ has mean $\Omega_{12}\Omega_{22}^{-1}$ and variance-covariance matrix $\frac{1}{n+p-r-m-1}\, H_{11}^{-1} \otimes \Omega_{22}^{-1}$. (See Martin [2].) And $E(\tilde\varepsilon_{22}) = (n-r)\,\Omega_{22}$.
The above properties clearly imply that (2.4) holds, and in addition imply that $\frac{1}{n+p-r-m}\,\tilde\varepsilon_{11.2}$ is an unbiased estimate of
\[
\operatorname{Var}(\tilde y_1^{(n+1)} \mid \tilde y_2^{(n+1)}, z^{(n+1)}) \;=\; H_{11}^{-1}.
\]
Similarly, since $\operatorname{Var}(\tilde y_2 \mid z) = \Omega_{22}$, an unbiased estimate of this variance-covariance matrix is $\frac{1}{n-r}\,\tilde\varepsilon_{22}$. Setting $p = 1$ and running down the list of assertions above answers the second question posed at the outset of this section.
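The unbiasedness claims behind (2.4) — $E(\tilde P) = \Pi$ and $E(\tilde\varepsilon_{12}\tilde\varepsilon_{22}^{-1}) = \Omega_{12}\Omega_{22}^{-1}$ — can be illustrated by Monte Carlo. A rough sketch with illustrative sizes (assumed, not from the paper); the sample averages should land near the population quantities:

```python
import numpy as np

rng = np.random.default_rng(2)
m, r, n, p, reps = 3, 2, 40, 1, 4000   # illustrative sizes (assumed)

B = np.eye(m) + 0.2 * rng.normal(size=(m, m))   # a non-singular B
Gamma = rng.normal(size=(m, r))
Pi = np.linalg.solve(B, Gamma)                  # reduced-form coefficients
Omega = np.linalg.inv(B) @ np.linalg.inv(B).T   # Sigma = I here
mu_R = Omega[:p, p:] @ np.linalg.inv(Omega[p:, p:])   # Omega12 Omega22^{-1}

P_sum = np.zeros((m, r))
R_sum = np.zeros((p, m - p))
for _ in range(reps):
    Z = rng.normal(size=(r, n))
    Y = Pi @ Z + np.linalg.solve(B, rng.normal(size=(m, n)))
    P = Y @ Z.T @ np.linalg.inv(Z @ Z.T)        # P~ = Y Z^t V^{-1}
    E = (Y - P @ Z) @ (Y - P @ Z).T             # e~
    P_sum += P
    R_sum += E[:p, p:] @ np.linalg.inv(E[p:, p:])   # R~_{1.2}

print(np.abs(P_sum / reps - Pi).max())      # should be near 0
print(np.abs(R_sum / reps - mu_R).max())    # should be near 0
```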
3. Proofs
It is well known (see [1] p. 183) that the joint likelihood of $(\tilde P, \tilde\varepsilon)$ given $\Pi$ and $H$ is
\[
c\,|\tilde\varepsilon|^{\frac{1}{2}(n-r-m-1)}\;
e^{-\frac{1}{2}\operatorname{tr} H\{[\tilde P-\Pi]V[\tilde P-\Pi]^t\}}\;
e^{-\frac{1}{2}\operatorname{tr} H\tilde\varepsilon}, \tag{3.1}
\]
where $c$ is a normalizing constant.
^^^'^ ^^^ following
=1 2' Sii 2' =22^ "^
Lemma
(P,
c
Jtr H{[P-n]V[P-n]'^}
The Jacobian J(P,
:
to
e)
Proof
^ P. |n
R^ 2'
ill. 2' ^22"*
(P,
^^
2'
=11 2'
=22''
°^ ^^^ transformation from
i=£:22'^"
We split the Jacobian into the product of four transformations done
:
successively:
P
J(e22 ^ 422^ "
-^'
ill = ill.
J(£
£
->
e_
2
"^
£, £22 ^ =^2' =12
^°" §1
^ Si. 2 i22 Si.
)
=
1.
2
2-
"
"*
=12 =22'
5l
^^'^
^^° =11 ^ =11.2"
2
^
little algebra shows that
Consequently J(|^2 " Si. 2^ = '^2!^
|)
rewrite
^^
Multiplying all four Jacobians together prcves the lemma.
We can now find the joint density of $(\tilde P, \tilde R_{1.2}, \tilde\varepsilon_{11.2}, \tilde\varepsilon_{22})$ from that of $(\tilde P, \tilde\varepsilon)$ by substituting in (3.1) and multiplying by the Jacobian $|\tilde\varepsilon_{22}|^{p}$. First,
\[
\operatorname{tr} H\tilde\varepsilon
= \operatorname{tr}\{H_{11}\tilde\varepsilon_{11} + H_{12}\tilde\varepsilon_{21} + H_{21}\tilde\varepsilon_{12} + H_{22}\tilde\varepsilon_{22}\}
= \operatorname{tr} H_{11}\tilde\varepsilon_{11.2} + \operatorname{tr} \bar H_{22}\tilde\varepsilon_{22}
+ \operatorname{tr} H_{11}[\tilde R_{1.2} - \Omega_{12}\Omega_{22}^{-1}]\,\tilde\varepsilon_{22}\,[\tilde R_{1.2} - \Omega_{12}\Omega_{22}^{-1}]^t,
\]
where $\bar H_{22} = H_{22} - H_{21}H_{11}^{-1}H_{12} = \Omega_{22}^{-1}$ and $\Omega_{12}\Omega_{22}^{-1} = -H_{11}^{-1}H_{12}$; this follows by writing out $\operatorname{tr} H\tilde\varepsilon$ and substituting using the definitions of $\tilde R_{1.2}$ and $\tilde\varepsilon_{11.2}$ in terms of the $\tilde\varepsilon_{ij}$. Since $|\tilde\varepsilon| = |\tilde\varepsilon_{11.2}|\,|\tilde\varepsilon_{22}|$, the joint density (3.1) multiplied by $|\tilde\varepsilon_{22}|^{p}$ then gives that of $(\tilde P, \tilde R_{1.2}, \tilde\varepsilon_{11.2}, \tilde\varepsilon_{22})$ as
\[
c\; e^{-\frac{1}{2}\operatorname{tr} H\{[\tilde P-\Pi]V[\tilde P-\Pi]^t\}}
\cdot |\tilde\varepsilon_{11.2}|^{\frac{1}{2}(n-r-m-1)}\, e^{-\frac{1}{2}\operatorname{tr} H_{11}\tilde\varepsilon_{11.2}}
\cdot |\tilde\varepsilon_{22}|^{\frac{1}{2}(n+p-r-m-1)}\, e^{-\frac{1}{2}\operatorname{tr} \bar H_{22}\tilde\varepsilon_{22}}
\cdot |\tilde\varepsilon_{22}|^{\frac{1}{2}p}\, e^{-\frac{1}{2}\operatorname{tr} H_{11}[\tilde R_{1.2} - \Omega_{12}\Omega_{22}^{-1}]\tilde\varepsilon_{22}[\tilde R_{1.2} - \Omega_{12}\Omega_{22}^{-1}]^t}, \tag{3.2}
\]
and the assertions 1, 2, and 3 of the previous section follow directly.
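The trace decomposition above is purely algebraic and can be verified numerically for arbitrary PDS $H$ and $\varepsilon$. A minimal NumPy check (the sizes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(3)
m, p = 5, 2  # illustrative sizes (assumed)

# Random PDS H (partitioned precision) and PDS e (partitioned cross-product matrix).
A = rng.normal(size=(m, m)); H = A @ A.T + m * np.eye(m)
C = rng.normal(size=(m, m)); e = C @ C.T + m * np.eye(m)

H11, H12, H21, H22 = H[:p, :p], H[:p, p:], H[p:, :p], H[p:, p:]
e11, e12, e21, e22 = e[:p, :p], e[:p, p:], e[p:, :p], e[p:, p:]

R = e12 @ np.linalg.inv(e22)                    # R_{1.2}
e11_2 = e11 - e12 @ np.linalg.inv(e22) @ e21    # e_{11.2}
Hbar22 = H22 - H21 @ np.linalg.inv(H11) @ H12   # Hbar_22 = Omega22^{-1}
mu = -np.linalg.inv(H11) @ H12                  # = Omega12 Omega22^{-1}

# tr(H e) = tr(H11 e11.2) + tr(Hbar22 e22) + tr(H11 (R - mu) e22 (R - mu)^t)
lhs = np.trace(H @ e)
rhs = (np.trace(H11 @ e11_2) + np.trace(Hbar22 @ e22)
       + np.trace(H11 @ (R - mu) @ e22 @ (R - mu).T))
assert np.allclose(lhs, rhs)
```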
To prove assertion 4, rewrite that part of (3.2) involving $\tilde\varepsilon_{22}$ as
\[
|\tilde\varepsilon_{22}|^{\frac{1}{2}(n+2p-r-m-1)}\, e^{-\frac{1}{2}\operatorname{tr} A\,\tilde\varepsilon_{22}}, \tag{3.3}
\]
where $A = [\tilde R_{1.2} - \Omega_{12}\Omega_{22}^{-1}]^t\, H_{11}\, [\tilde R_{1.2} - \Omega_{12}\Omega_{22}^{-1}] + \bar H_{22}$, integrate (3.3) with respect to $\tilde\varepsilon_{22}$, and use the fact that the constant that normalizes (3.3) is $c^*\,|A|^{\frac{1}{2}(n+p-r)}$, with $c^*$ a function of $m$, $n$, $r$, and $p$ alone. Using a well known determinantal identity we may now write the complete density of $\tilde R_{1.2}$ as
\[
c''\,\bigl|\,[\tilde R_{1.2} - \Omega_{12}\Omega_{22}^{-1}]\,\Omega_{22}\,[\tilde R_{1.2} - \Omega_{12}\Omega_{22}^{-1}]^t + H_{11}^{-1}\,\bigr|^{-\frac{1}{2}(n+p-r)},
\]
where $c''$ is as defined in assertion 4 of section 2.