Probability: Solutions to Select Problems

Davar Khoshnevisan
Department of Mathematics
University of Utah
Salt Lake City, UT 84112–0090
davar@math.utah.edu
http://www.math.utah.edu/~davar
October 7, 2013
Chapter 6
Independence
6.1. First consider the case that $f' \ge 0$. For all $\omega \in \Omega$,
\[
f(X(\omega)) = \int_0^{X(\omega)} f'(x)\,dx
= \int_{-\infty}^{\infty} \mathbf{1}_{\{0 \le x \le X\}}(\omega)\, f'(x)\,dx.
\]
Assuming that we can handle product-measurability issues, it follows
from Fubini–Tonelli that
\[
\mathrm{E}[f(X)]
= \mathrm{E}\left[ \int_{-\infty}^{\infty} \mathbf{1}_{\{0 \le x \le X\}}\, f'(x)\,dx \right]
= \int_0^{\infty} \mathrm{P}\{X \ge x\}\, f'(x)\,dx.
\]
This is the desired result when $f' \ge 0$. In general, we write
$f' = f'_+ - f'_-$ and define $f_{(\pm)}(x) = \int_0^x f'_\pm(z)\,dz$.
The preceding development yields
\[
\mathrm{E}\left[ f_{(\pm)}(X) \right]
= \int_0^{\infty} f'_\pm(x)\, \mathrm{P}\{X \ge x\}\,dx.
\]
But $f_{(+)} - f_{(-)} = f$ (why?), whence the Problem. It suffices to prove the
asserted product measurability.

Evidently, $f'$ is a product-measurable function of $(x,\omega)$; so is
$x \mapsto \mathbf{1}_{[0,\infty)}(x)$. So it suffices to prove that
$(x,\omega) \mapsto \mathbf{1}_{\{X \ge x\}}(\omega)$ is product-measurable.
Define
\[
I_n(x,\omega) = \sum_{1 \le i < j}
\mathbf{1}_{[i/n,(i+1)/n)}(x)\, \mathbf{1}_{[j/n,(j+1)/n)}(X(\omega)).
\]
Each $I_n$ is manifestly product-measurable. Therefore, so is
$\mathbf{1}_{\{X > x\}}(\omega) = \lim_{n\to\infty} I_n(x,\omega)$. Therefore,
$\mathbf{1}_{\{y < X \le x\}} = \mathbf{1}_{\{y < X\}} - \mathbf{1}_{\{x < X\}}$
is measurable for all $x > y$; hence, so is
$\mathbf{1}_{\{X = x\}} = \lim_{y \uparrow x} \mathbf{1}_{\{y < X \le x\}}$.
Finally, we see that
$\mathbf{1}_{\{X \ge x\}}(\omega) = \mathbf{1}_{\{X > x\}}(\omega) + \mathbf{1}_{\{X = x\}}(\omega)$
is measurable.
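As a sanity check, the identity $\mathrm{E}[f(X)] = \int_0^\infty f'(x)\,\mathrm{P}\{X \ge x\}\,dx$ can be verified numerically. The sketch below uses the illustrative choices $X \sim \text{Exponential}(1)$ and $f(x) = x^2$ (neither taken from the problem), for which $\mathrm{P}\{X \ge x\} = e^{-x}$ and $\mathrm{E}[f(X)] = \mathrm{E}[X^2] = 2$ in closed form.

```python
import math

# Numerical check of E[f(X)] = ∫_0^∞ f'(x) P{X ≥ x} dx for a nonnegative X.
# Illustrative choices (not from the text): X ~ Exponential(1), f(x) = x^2,
# so P{X ≥ x} = e^{-x}, f'(x) = 2x, and E[f(X)] = E[X^2] = 2.

def survival(x):          # P{X ≥ x} for the Exponential(1) law
    return math.exp(-x)

def f_prime(x):           # f'(x) when f(x) = x^2
    return 2.0 * x

# Trapezoid rule on [0, 50]; the tail beyond 50 contributes only ~e^{-50}.
n, hi = 200_000, 50.0
dx = hi / n
vals = [f_prime(i * dx) * survival(i * dx) for i in range(n + 1)]
rhs = (sum(vals) - 0.5 * (vals[0] + vals[-1])) * dx

lhs = 2.0                 # E[X^2] for Exponential(1), computed in closed form
print(abs(lhs - rhs))     # close to 0
```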
6.3. Note that
\[
\int_0^{\infty} \int_0^{\infty} \mathrm{P}\{X \ge x\,,\, Y \ge y\}\,dx\,dy
= \mathrm{E}\left[ \int_0^{\infty} \int_0^{\infty}
\mathbf{1}_{\{X \ge x\} \cap \{Y \ge y\}}\,dx\,dy \right]
= \mathrm{E}[XY].
\]
The result follows from this and Lemma 6.9.
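This identity, too, can be checked numerically. The sketch below uses the illustrative choice (not from the text) of $X$ and $Y$ independent Exponential$(1)$ variables, so that $\mathrm{P}\{X \ge x\,,\, Y \ge y\} = e^{-x}e^{-y}$ and $\mathrm{E}[XY] = \mathrm{E}X\,\mathrm{E}Y = 1$.

```python
import math

# Numerical check of E[XY] = ∫_0^∞ ∫_0^∞ P{X ≥ x, Y ≥ y} dx dy.
# Illustrative choice (not from the text): X, Y independent Exponential(1),
# so P{X ≥ x, Y ≥ y} = e^{-x} e^{-y} and E[XY] = E[X] E[Y] = 1.

n, hi = 2_000, 40.0
dx = hi / n

# The joint survival function factors, so the double integral is the square
# of a one-dimensional trapezoid-rule integral of e^{-x} over [0, 40].
vals = [math.exp(-i * dx) for i in range(n + 1)]
one_dim = (sum(vals) - 0.5 * (vals[0] + vals[-1])) * dx
rhs = one_dim ** 2

lhs = 1.0                 # E[XY] for independent Exp(1) variables
print(abs(lhs - rhs))     # close to 0
```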
6.5. According to Problem 1.3 of Chapter 1 we can construct a probability
space $(\Omega, \mathcal{F}, \mathrm{P})$ on which there are events $A$, $B$, $C$
that are pairwise independent but not independent. Define $X := \mathbf{1}_A$,
$Y := \mathbf{1}_B$, and $Z := \mathbf{1}_C$ to finish.
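A concrete instance is the classical two-coin example of pairwise-independent but not independent events (this standard example is not necessarily the construction used in Problem 1.3); the sketch below verifies its properties exactly with rational arithmetic.

```python
from itertools import product
from fractions import Fraction

# A standard example of pairwise-independent but not independent events
# (possibly different from the construction in Problem 1.3): toss two fair
# coins; A = "first coin is heads", B = "second coin is heads",
# C = "the two coins agree".
omega = list(product("HT", repeat=2))   # four equally likely outcomes
p = Fraction(1, 4)

def prob(event):
    return p * len([w for w in omega if event(w)])

A = lambda w: w[0] == "H"
B = lambda w: w[1] == "H"
C = lambda w: w[0] == w[1]

# Pairwise independence: P(E ∩ F) = P(E) P(F) for each of the three pairs.
for E, F in [(A, B), (A, C), (B, C)]:
    assert prob(lambda w: E(w) and F(w)) == prob(E) * prob(F)

# But not (mutually) independent: P(A ∩ B ∩ C) ≠ P(A) P(B) P(C).
triple = prob(lambda w: A(w) and B(w) and C(w))
print(triple, prob(A) * prob(B) * prob(C))   # 1/4 versus 1/8
```

Setting $X := \mathbf{1}_A$, $Y := \mathbf{1}_B$, $Z := \mathbf{1}_C$ then gives pairwise-independent random variables that are not independent, as in the solution.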
6.8. To prove Cor. 6.19 we apply Lemma 6.13 to find that if $X$ and $Y$ are
independent then $\mathrm{E}[XY] = \mathrm{E}X\,\mathrm{E}Y$. Equivalently,
$\mathrm{Cov}(X\,,Y) = 0$. Next, suppose $\mathrm{E}[X_i] = 0$ and
$X_i \in L^2(\mathrm{P})$. Then,
\[
\mathrm{Var}\left( \sum_{i=1}^n X_i \right)
= \mathrm{E}\left[ \left( \sum_{i=1}^n X_i \right)^{\!2}\;\right]
= \mathrm{E}\left[ \sum_{i=1}^n X_i^2
  + \mathop{\sum\sum}_{1 \le i \ne j \le n} X_i X_j \right]
= \sum_{i=1}^n \mathrm{Var}(X_i)
  + \mathop{\sum\sum}_{1 \le i \ne j \le n} \mathrm{Cov}(X_i\,,X_j).
\]
If $\mathrm{E}[X_i] \ne 0$, then replace $X_i$ by $Y_i := X_i - \mathrm{E}[X_i]$
to find that
\[
\mathrm{Var}\left( \sum_{i=1}^n Y_i \right)
= \sum_{i=1}^n \mathrm{Var}(Y_i)
  + \mathop{\sum\sum}_{1 \le i \ne j \le n} \mathrm{Cov}(Y_i\,,Y_j).
\]
Now let $Y$ and $Z$ be random variables, and $a, b \in \mathbf{R}$. Then, you
can check that
\[
\mathrm{Var}(Y + a) = \mathrm{Var}(Y)
\qquad\text{and}\qquad
\mathrm{Cov}(Y + a\,,Z + b) = \mathrm{Cov}(Y\,,Z).
\]
Together, these remarks prove Corollary 6.20.
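Both the variance expansion and the translation invariance hold for any distribution, so they can be checked on a small made-up data set viewed as a uniform law over its rows (the data below are illustrative, not from the text):

```python
# Check of Var(ΣX_i) = Σ Var(X_i) + ΣΣ_{i≠j} Cov(X_i, X_j), together with
# Var(Y + a) = Var(Y) and Cov(Y + a, Z + b) = Cov(Y, Z), on an illustrative
# data set viewed as a uniform distribution over its rows.

data = [  # each row is one outcome of (X_1, X_2, X_3); values are made up
    (1.0, 2.0, 0.5),
    (0.0, 1.0, 1.5),
    (2.0, 0.0, 1.0),
    (1.0, 3.0, 2.0),
]

def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):          # population covariance under the uniform law
    mx, my = mean(xs), mean(ys)
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / len(xs)

def var(xs):
    return cov(xs, xs)

cols = list(zip(*data))   # the three coordinate random variables
total = [sum(row) for row in data]

lhs = var(total)
rhs = sum(var(c) for c in cols) + sum(
    cov(cols[i], cols[j]) for i in range(3) for j in range(3) if i != j
)
print(abs(lhs - rhs))     # ~0 up to floating-point rounding

# Translation invariance used for Corollary 6.20:
Y, Z = cols[0], cols[1]
shifted_Y = [y + 5.0 for y in Y]
shifted_Z = [z - 3.0 for z in Z]
print(abs(var(shifted_Y) - var(Y)), abs(cov(shifted_Y, shifted_Z) - cov(Y, Z)))
```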
6.9. Let $Z$ denote an independent $N(0,1)$ random variable, and define
$X_n := X + (Z/n)$. The variable $X_n$ has all of the desired properties.
6.10. (i) $\Rightarrow$ (ii): If $X_1 \in L^p(\mathrm{P})$ then
\[
\sum_{n=1}^{\infty} \mathrm{P}\{|X_n| > \varepsilon n^{1/p}\}
= \sum_{n=1}^{\infty} \mathrm{P}\{|X_1| > \varepsilon n^{1/p}\}
\le \mathrm{E}\{|X_1|^p\}/\varepsilon^p.
\]
So by the Borel–Cantelli lemma, with probability
one, $|X_n| \le \varepsilon n^{1/p}$ for all $n$ large. This proves that
$|X_n|/n^{1/p} \to 0$ a.s.

(ii) $\Rightarrow$ (iii): This follows from the following real-variable fact:
if $a_n/n^{\rho} \to 0$ for some $\rho > 0$, then
$\max_{1 \le j \le n}(a_j/n^{\rho}) \to 0$.

Proof: If not, then $\max_{j \le n} a_j > \varepsilon n^{\rho}$ infinitely
often. But there exists $n_0$ such that $|a_n| \le (\varepsilon/2) n^{\rho}$
for all $n \ge n_0$. So $\max_{j \le n_0} a_j > \varepsilon n^{\rho}$ for
infinitely many $n$'s, which is patently nonsense.

(iii) $\Rightarrow$ (i): Because (iii) implies (ii), it suffices to prove that
(ii) implies (i). But this too is Borel–Cantelli (as in
(i) $\Rightarrow$ (ii)), this time using the half of the lemma that requires
independence.
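The real-variable fact used in (ii) $\Rightarrow$ (iii) can be illustrated numerically. The sketch below uses the illustrative choices (not from the text) $\rho = 1$ and the non-monotone sequence $a_n = \sqrt{n}\,(1 + \sin n)$, for which $a_n/n \to 0$, and watches the running maximum divided by $n^{\rho}$ shrink.

```python
import math

# Illustration of: if a_n / n^ρ → 0 then max_{1≤j≤n} a_j / n^ρ → 0.
# Illustrative choices (not from the text): ρ = 1 and the non-monotone
# sequence a_n = sqrt(n) · (1 + sin n), for which a_n / n → 0.

rho = 1.0
a = [math.sqrt(n) * (1.0 + math.sin(n)) for n in range(1, 100_001)]

running_max = []          # running_max[n-1] = max_{1≤j≤n} a_j
best = float("-inf")
for x in a:
    best = max(best, x)
    running_max.append(best)

checkpoints = (10, 100, 1_000, 10_000, 100_000)
ratios = [running_max[n - 1] / n ** rho for n in checkpoints]
print(ratios)             # shrinking toward 0
```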