Proof of the Halmos-Savage Factorization Theorem (Theorem 50).

We’ll take Lemma 51 (Lemma 1.1 of Shao) and Lemma 52 (Lemma 2.1 of
Shao) as given, prove Lemma 53, and then prove Theorem 50 given Lemmas 51,
52 and 53.
Consider the proof of Lemma 53 (assuming the truth of Lemmas 51
and 52).
Suppose that T is sufficient. Let $B \in \mathcal{B}$ and note that by sufficiency there exists a $\mathcal{B}_0$-measurable function $P(B\mid\mathcal{B}_0)$ such that for all $\theta$
$$E_\theta(I_B\mid\mathcal{B}_0) = P_\theta(B\mid\mathcal{B}_0) = P(B\mid\mathcal{B}_0) \quad\text{a.s. } P_\theta$$
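(As a quick illustration of this definition, a standard example not taken from these notes: for $X_1,\dots,X_n$ iid Bernoulli$(\theta)$, $T = \sum_{i=1}^n X_i$, and $\mathcal{B}_0 = \sigma(T)$, the conditional distribution of $(X_1,\dots,X_n)$ given $T = t$ is uniform over the $\binom{n}{t}$ binary strings summing to $t$, so for any $B$
$$E_\theta(I_B\mid\mathcal{B}_0) = \frac{\#\{x : \sum_i x_i = T,\; x \in B\}}{\binom{n}{T}} \quad\text{a.s. } P_\theta,$$
a single $\mathcal{B}_0$-measurable function that is free of $\theta$.)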
Then for $B_0 \in \mathcal{B}_0$
$$\int_{B_0} P(B\mid\mathcal{B}_0)\, dP_\theta = P_\theta(B \cap B_0) \quad\text{for all } \theta \in \Theta \tag{1}$$
by the definition of conditional expectation. Then since $\lambda = \sum_i c_i P_{\theta_i}$,
$$\int_{B_0} P(B\mid\mathcal{B}_0)\, d\lambda = \lambda(B \cap B_0)$$
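To spell out that last display (a small elaboration, not in the original): applying (1) to each $P_{\theta_i}$ and interchanging sum and integral by monotone convergence,
$$\int_{B_0} P(B\mid\mathcal{B}_0)\, d\lambda = \sum_i c_i \int_{B_0} P(B\mid\mathcal{B}_0)\, dP_{\theta_i} = \sum_i c_i\, P_{\theta_i}(B\cap B_0) = \lambda(B\cap B_0).$$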
and so (again by the definition of conditional expectation)
$$E_\lambda(I_B\mid\mathcal{B}_0) = P(B\mid\mathcal{B}_0) \quad\text{a.s. } \lambda$$
and thus (since each $P_\theta \ll \lambda$)
$$E_\lambda(I_B\mid\mathcal{B}_0) = P(B\mid\mathcal{B}_0) \quad\text{a.s. } P_\theta \text{ for all } \theta \tag{2}$$
(that is, this object serves as the conditional expectation wrt each $P_\theta$ and wrt $\lambda$).
Now consider $P_\theta'$ and $\lambda'$, respectively the restrictions of $P_\theta$ and $\lambda$ to $\mathcal{B}_0$, and the R-N derivative $\frac{dP_\theta'}{d\lambda'}$ on $(\mathcal{X}, \mathcal{B}_0)$. By Lemma 51, there exist non-negative $\mathcal{F}$-measurable functions $g_\theta$ such that
$$\frac{dP_\theta'}{d\lambda'} = g_\theta \circ T \tag{3}$$
We need to show that $g_\theta \circ T$ also serves as an R-N derivative of $P_\theta$ wrt $\lambda$. So let $B \in \mathcal{B}$.
$$\begin{aligned}
P_\theta(B) &= \int_{\mathcal{X}} P(B\mid\mathcal{B}_0)\, dP_\theta && \text{(by fact (1))}\\
&= \int_{\mathcal{X}} E_\lambda(I_B\mid\mathcal{B}_0)\, dP_\theta && \text{(by fact (2))}\\
&= \int_{\mathcal{X}} E_\lambda(I_B\mid\mathcal{B}_0)\, dP_\theta' && \text{(since } E_\lambda(I_B\mid\mathcal{B}_0)\text{ is } \mathcal{B}_0\text{-measurable)}\\
&= \int_{\mathcal{X}} E_\lambda(I_B\mid\mathcal{B}_0)\,(g_\theta\circ T)\, d\lambda' && \text{(by relationship (3))}\\
&= \int_{\mathcal{X}} E_\lambda(I_B\mid\mathcal{B}_0)\,(g_\theta\circ T)\, d\lambda\\
&= \int_{\mathcal{X}} E_\lambda(I_B\,(g_\theta\circ T)\mid\mathcal{B}_0)\, d\lambda && \text{(since } g_\theta\circ T\text{ is } \mathcal{B}_0\text{-measurable)}\\
&= \int_{\mathcal{X}} I_B\,(g_\theta\circ T)\, d\lambda && (\mathcal{X}\in\mathcal{B}_0 \text{ and the definition of conditional expectation})\\
&= \int_B g_\theta\circ T\, d\lambda
\end{aligned}$$
That is, $g_\theta\circ T$ works as the R-N derivative of $P_\theta$ wrt $\lambda$, and the first half of Lemma 53 is proved.
So conversely, suppose that there exist non-negative $\mathcal{F}$-measurable functions $g_\theta$ such that
$$\frac{dP_\theta}{d\lambda} = g_\theta\circ T \quad\text{a.s. } \lambda$$
Since $g_\theta\circ T$ is $\mathcal{B}_0$-measurable,
$$\frac{dP_\theta'}{d\lambda'} = g_\theta\circ T \quad\text{a.s. } \lambda' \tag{4}$$
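To spell out why (4) follows (a small elaboration, not in the original): for any $B_0 \in \mathcal{B}_0$,
$$P_\theta'(B_0) = P_\theta(B_0) = \int_{B_0} g_\theta\circ T\, d\lambda = \int_{B_0} g_\theta\circ T\, d\lambda',$$
the last equality holding because the integrand is $\mathcal{B}_0$-measurable, and this is exactly the defining property of $dP_\theta'/d\lambda'$.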
To show that T is sufficient, it will be enough to show that for any $B \in \mathcal{B}$
$$E_\theta(I_B\mid\mathcal{B}_0) = E_\lambda(I_B\mid\mathcal{B}_0) \quad\text{a.s. } P_\theta \text{ for all } \theta$$
(for then the single $\mathcal{B}_0$-measurable function $E_\lambda(I_B\mid\mathcal{B}_0)$, which does not depend on $\theta$, serves as the common version $P(B\mid\mathcal{B}_0)$ required by sufficiency).
So for fixed $\theta$ and $B \in \mathcal{B}$, define a measure $\mu$ on $(\mathcal{X}, \mathcal{B})$ by
$$\mu(C) = \int_C I_B\, dP_\theta = P_\theta(C\cap B)$$
Now $\mu \ll P_\theta$ and
$$\frac{d\mu}{dP_\theta} = I_B \quad\text{a.s. } P_\theta$$
For all $B_0 \in \mathcal{B}_0$,
$$\mu(B_0) = \int_{B_0} I_B\, dP_\theta = \int_{B_0} E_\theta(I_B\mid\mathcal{B}_0)\, dP_\theta \tag{5}$$
(by the definition of conditional expectation).
So for $\mu'$ the restriction of $\mu$ to $\mathcal{B}_0$,
$$\frac{d\mu'}{dP_\theta'} = E_\theta(I_B\mid\mathcal{B}_0) \quad\text{a.s. } P_\theta' \tag{6}$$
(since $E_\theta(I_B\mid\mathcal{B}_0)$ is $\mathcal{B}_0$-measurable and has the right integrals).
Then since $\mu' \ll P_\theta' \ll \lambda'$,
$$\frac{d\mu'}{d\lambda'} = \frac{d\mu'}{dP_\theta'}\,\frac{dP_\theta'}{d\lambda'} \quad\text{a.s. } \lambda'$$
So since facts (4) and (6) hold,
$$\frac{d\mu'}{d\lambda'} = E_\theta(I_B\mid\mathcal{B}_0)\,(g_\theta\circ T) \quad\text{a.s. } P_\theta' \tag{7}$$
On the other hand, we still have $\frac{d\mu}{dP_\theta} = I_B$ a.s. $P_\theta$, so that
$$\frac{d\mu}{d\lambda} = \frac{d\mu}{dP_\theta}\,\frac{dP_\theta}{d\lambda} = I_B\,(g_\theta\circ T) \quad\text{a.s. } P_\theta$$
This then implies, by the definition of conditional expectation, that
$$\frac{d\mu'}{d\lambda'} = E_\lambda(I_B\,(g_\theta\circ T)\mid\mathcal{B}_0) \quad\text{a.s. } P_\theta'$$
(The object on the right is obviously $\mathcal{B}_0$-measurable, and if we multiply by $I_{B_0}$ and integrate $d\lambda'$, we get $\int I_{B\cap B_0}\,(g_\theta\circ T)\, d\lambda = \mu(B_0)$.) So, since $g_\theta\circ T$ is $\mathcal{B}_0$-measurable and factors out of the conditional expectation,
$$\frac{d\mu'}{d\lambda'} = E_\lambda(I_B\mid\mathcal{B}_0)\,(g_\theta\circ T) \quad\text{a.s. } P_\theta' \tag{8}$$
So from (7) and (8)
$$E_\theta(I_B\mid\mathcal{B}_0)\,(g_\theta\circ T) = E_\lambda(I_B\mid\mathcal{B}_0)\,(g_\theta\circ T) \quad\text{a.s. } P_\theta$$
This is enough to imply that
$$E_\theta(I_B\mid\mathcal{B}_0) = E_\lambda(I_B\mid\mathcal{B}_0) \quad\text{a.s. } P_\theta \tag{9}$$
Why? Let $A = \{x \mid g_\theta(T(x)) = 0\}$ and note that
$$P_\theta(A) = \int_A g_\theta\circ T\, d\lambda = 0$$
So $\mathcal{X}\setminus A$ has $P_\theta$ probability 1, and so does $\{x \mid E_\theta(I_B\mid\mathcal{B}_0)\,(g_\theta\circ T) = E_\lambda(I_B\mid\mathcal{B}_0)\,(g_\theta\circ T)\} \cap (\mathcal{X}\setminus A)$. And on this set, $E_\theta(I_B\mid\mathcal{B}_0) = E_\lambda(I_B\mid\mathcal{B}_0)$.
So (9) holds and Lemma 53 has been proved (conditional on Lemmas 51 and
52).
Finally, assume that Lemmas 51, 52, and 53 have been established,
and consider the proof of Theorem 50 (the H-S Factorization Theorem).
Suppose that T is sufficient. Since $P_\theta \ll \lambda \ll \nu$, it follows that
$$\frac{dP_\theta}{d\nu} = \frac{dP_\theta}{d\lambda}\,\frac{d\lambda}{d\nu} \quad\text{a.e. } \nu$$
(and both functions on the right of this equality are non-negative). By Lemma 53, there exists a non-negative function $g_\theta$ such that
$$\frac{dP_\theta}{d\lambda} = g_\theta\circ T \quad\text{a.s. } \lambda$$
Let $\mathcal{X}^* = \{x \mid \frac{d\lambda}{d\nu}(x) > 0\}$. I claim that, as measures on $\mathcal{X}^*$, $\lambda$ and $\nu$ are equivalent. Clearly $\lambda \ll \nu$ on $\mathcal{X}^*$. So suppose that $B$ is a $\mathcal{B}$-measurable subset of $\mathcal{X}^*$ that has $\lambda(B) = 0$. But
$$\lambda(B) = \int_B \frac{d\lambda}{d\nu}\, d\nu$$
while since $B \subset \mathcal{X}^*$ the integrand here is positive. So it must be that $\nu(B) = 0$. Thus $\lambda$ and $\nu$ are equivalent on $\mathcal{X}^*$.
So then consider the two functions
$$\frac{dP_\theta}{d\nu} = \frac{dP_\theta}{d\lambda}\,\frac{d\lambda}{d\nu} \qquad\text{and}\qquad (g_\theta\circ T)\,\frac{d\lambda}{d\nu}.$$
Letting
$$D = \mathcal{X}^* \cap \left\{x \,\Big|\, \frac{dP_\theta}{d\lambda}(x) \neq g_\theta(T(x))\right\}$$
(outside a $\nu$-null set, the two functions above can differ only on $D$), since the second set in this intersection has $\lambda$ probability 0 and $\lambda$ and $\nu$ are equivalent on $\mathcal{X}^* \supset D$, we have $\nu(D) = 0$, and thus
$$\frac{dP_\theta}{d\nu} = (g_\theta\circ T)\,\frac{d\lambda}{d\nu} \quad\text{a.e. } \nu,$$
and taking $h = \frac{d\lambda}{d\nu}$ the forward implication is proved.
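As a concrete instance of the factorization just obtained (a standard example, not part of these notes): for $X_1,\dots,X_n$ iid Poisson$(\theta)$ and $\nu$ counting measure on $\{0,1,2,\dots\}^n$,
$$\frac{dP_\theta}{d\nu}(x) = \prod_{i=1}^n \frac{e^{-\theta}\theta^{x_i}}{x_i!} = \underbrace{e^{-n\theta}\,\theta^{\sum_i x_i}}_{g_\theta(T(x))}\;\underbrace{\prod_{i=1}^n \frac{1}{x_i!}}_{h(x)}, \qquad T(x) = \sum_{i=1}^n x_i,$$
so $h$ is free of $\theta$ and $g_\theta$ depends on $x$ only through $T(x)$, exhibiting $T$ as sufficient.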
Conversely, suppose that the factorization holds, that is, that
$$f_\theta = \frac{dP_\theta}{d\nu} = (g_\theta\circ T)\, h \quad\text{a.e. } \nu.$$
Now
$$\frac{d\lambda}{d\nu} = \sum_{i=1}^{\infty} c_i\, \frac{dP_{\theta_i}}{d\nu} = \sum_{i=1}^{\infty} c_i\, h\,(g_{\theta_i}\circ T) = h \sum_{i=1}^{\infty} c_i\,(g_{\theta_i}\circ T) = h\,(k\circ T)$$
for $k = \sum_{i=1}^{\infty} c_i\, g_{\theta_i}$. Then the fact that $P_\theta \ll \lambda \ll \nu$ again implies that
$$\frac{dP_\theta}{d\nu} = \frac{dP_\theta}{d\lambda}\,\frac{d\lambda}{d\nu} \quad\text{a.e. } \nu$$
so that
$$(g_\theta\circ T)\, h = \frac{dP_\theta}{d\lambda}\, h\,(k\circ T) \quad\text{a.e. } \nu.$$
Then define
$$\tilde g_\theta(t) = \begin{cases} \dfrac{g_\theta(t)}{k(t)} & \text{if } k(t) > 0\\[4pt] 0 & \text{otherwise.}\end{cases}$$
On $\mathcal{X}^*$ we have $\frac{d\lambda}{d\nu} = h\,(k\circ T) > 0$, so that $h > 0$ and $k\circ T > 0$ there, and hence $\frac{dP_\theta}{d\lambda} = \tilde g_\theta\circ T$ a.e. $\nu$ (and so a.e. $\lambda$) on $\mathcal{X}^*$; and this set has $\lambda$ probability 1, i.e.
$$\frac{dP_\theta}{d\lambda} = \tilde g_\theta\circ T \quad\text{a.e. } \lambda,$$
so that Lemma 53 then implies that T is sufficient.
So Theorem 50 has been proved (conditional on Lemmas 51-53).
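As a second standard illustration of how the theorem is applied (again not part of these notes): for $X_1,\dots,X_n$ iid $N(\theta,1)$ and $\nu$ Lebesgue measure on $\mathbb{R}^n$,
$$\frac{dP_\theta}{d\nu}(x) = (2\pi)^{-n/2}\exp\!\Big(-\tfrac{1}{2}\sum_{i=1}^n x_i^2\Big)\,\exp\!\Big(\theta\sum_{i=1}^n x_i - \tfrac{n\theta^2}{2}\Big) = h(x)\, g_\theta(T(x))$$
with $T(x) = \sum_{i=1}^n x_i$, $g_\theta(t) = \exp(\theta t - n\theta^2/2)$, and $h(x) = (2\pi)^{-n/2}\exp(-\tfrac{1}{2}\sum_i x_i^2)$, so Theorem 50 shows that $T$ (equivalently $\bar{x}$) is sufficient for $\theta$.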