Math 136 / Stat 219 - Stochastic Processes
Homework Set 2, Autumn 2011, Due: October 12
1. Exercise 1.3.21.
ANS: (i) The triangle inequality for the L^q norm is ∥X + Y∥q ≤ ∥X∥q + ∥Y∥q for X, Y ∈ L^q. So
|∥Xn∥q − ∥X∥q| ≤ ∥Xn − X∥q, and since Xn → X in q.m. means ∥Xn − X∥q → 0, it can be concluded that
∥Xn∥q → ∥X∥q, which is equivalent to E|Xn|^q → E|X|^q.
(ii) Since q ≥ 1, Corollary 1.3.19 indicates Xn → X in 1-mean, thus E|Xn − X| → 0. Applying Jensen's inequality
with g(x) = |x| to the R.V. Xn − X, we obtain E|Xn − X| ≥ |E(Xn − X)|, so EXn → EX.
(iii) Let Xn have the distribution P(Xn = −1) = P(Xn = 1) = 0.5 for all n, and let X = 0. It is clear that
EXn = 0 = EX, but E|Xn − X| = 1 for every n, so EXn → EX does not imply that Xn converges to X in L^1.
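A quick numerical sketch of the counterexample in (iii) (the sample size and seed are arbitrary choices): the sample mean of Xn is near 0 = EX, while the sample mean of |Xn − X| stays near 1.

```python
import numpy as np

rng = np.random.default_rng(0)

# Counterexample from (iii): X_n = +/-1 with probability 1/2 each, X = 0.
n_samples = 100_000
xn = rng.choice([-1.0, 1.0], size=n_samples)   # samples of X_n (same law for every n)
x = 0.0                                        # the limit candidate X

print("E[X_n]      ~", xn.mean())              # close to 0 = E[X]
print("E|X_n - X|  ~", np.abs(xn - x).mean())  # stays near 1, so no L^1 convergence
```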
2. Exercise 1.4.12.
ANS: Fix any bounded function g that is continuous on the range of f(X), i.e., on the set
{f(X(ω)) : ω ∈ Ω}, and let h(x) = g(f(x)). Then h is bounded and, being the composition of the continuous
map f with g, continuous on the range of X. By Proposition 1.4.11 and the assumption that Xn → X in law,
we obtain E(h(Xn)) → E(h(X)). That is to say, Eg(f(Xn)) → Eg(f(X)) for every bounded g that is continuous
on the range of f(X). Applying Proposition 1.4.11 again, we conclude that f(Xn) → f(X) in law, as
required.
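A small simulation can illustrate the statement (the specific choices of X, Xn and f below are mine, not part of the exercise): the empirical cdfs of f(Xn) and f(X) get close as n grows.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustration of Exercise 1.4.12: X ~ N(0,1), X_n = X + U_n/n -> X in law,
# and f(x) = x^2.  These choices are purely for illustration.
m = 200_000
x = rng.standard_normal(m)
f = lambda t: t ** 2
grid = np.linspace(0.0, 4.0, 41)               # points at which the cdfs are compared

ecdf_fx = (f(x)[:, None] <= grid).mean(axis=0)
for n in (1, 10, 100):
    xn = x + rng.uniform(-1.0, 1.0, size=m) / n
    ecdf_fxn = (f(xn)[:, None] <= grid).mean(axis=0)
    print(f"n={n:4d}  max_t |P(f(Xn) <= t) - P(f(X) <= t)| ~ "
          f"{np.abs(ecdf_fxn - ecdf_fx).max():.4f}")
```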
3. Exercise 1.4.14.
ANS: Note that the event {Mn ≤ x} is the same as the event {Ti ≤ x for all 1 ≤ i ≤ n}. Therefore,
P(Mn ≤ x) = P(Ti ≤ x for all i = 1, 2, . . . , n) = ∏_{i=1}^n P(Ti ≤ x), where in the last step we used
independence of the Ti's. Thus, P(Mn ≤ x) = ∏_{i=1}^n F_{Ti}(x) = (1 − e^{−λx})^n. So we have
P(Mn − an ≤ y) = [1 − e^{−λ(y+an)}]^n
             = exp{n log[1 − e^{−λ(y+an)}]}      (for a limit to exist we will need an → +∞)
             = exp{−n e^{−λ(y+an)} (1 + o(1))},
using log(1 − u) = −u(1 + o(1)) as u = e^{−λ(y+an)} → 0. So if e^{λ an} = n, then P(Mn − an ≤ y) → e^{−e^{−λy}}, which is a distribution function on the real line. Therefore a possible solution is an = log(n)/λ, and in this case M∞ has the distribution function e^{−e^{−λy}}.
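As a sanity check on this limit, the short sketch below (the values of λ, n and y are arbitrary choices) compares the exact probability [1 − e^{−λ(y+an)}]^n, with an = log(n)/λ, against the limiting value e^{−e^{−λy}}.

```python
import numpy as np

# Check of the limit derived above: with a_n = log(n)/lambda, the exact value
# P(M_n - a_n <= y) = (1 - e^{-lambda(y + a_n)})^n approaches exp(-exp(-lambda*y)).
# lambda = 1.5 and the y-values below are arbitrary choices.
lam = 1.5
for y in (-1.0, 0.0, 1.0, 2.0):
    limit = np.exp(-np.exp(-lam * y))
    row = [f"y={y:5.1f}", f"limit={limit:.5f}"]
    for n in (10, 100, 10_000):
        a_n = np.log(n) / lam
        # the cdf is 0 whenever y + a_n < 0, hence the max with 0
        exact = max(1.0 - np.exp(-lam * (y + a_n)), 0.0) ** n
        row.append(f"n={n}: {exact:.5f}")
    print("  ".join(row))
```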
Question: What is EMn ?
4. Exercise 1.4.17.
ANS: (a) Let X be a Bernoulli random variable with P(X = 1) = P(X = 0) = 0.5, and let Y = 1 − X. Then X
and Y have the same law, X =^d Y, but P(X = Y) = P(X = 0.5) = 0.
(b) Let Xn have the uniform distribution on [0, 1/n] and X = 0. Then Xn → X in law, but each Xn
has a density while X does not.
(c) (In this part, let [x] denote the integer part of x, i.e., the greatest integer that does not
exceed x.) Since Zp takes positive integer values,
P(pZp > t) = P(Zp > t/p) = P(Zp > [t/p]) = (1 − p)^{[t/p]} → e^{−t}   as p → 0.
Thus P(pZp ≤ t) → 1 − e^{−t} for every t ∈ [0, +∞), which is the cdf of Exp(1). This means the cdf of pZp
converges to the cdf of T pointwise, and hence pZp → T in law, as required.
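A brief empirical check of this limit (assuming, as in the computation above, that Zp counts the number of trials up to and including the first success; the values of p and t are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)

# Empirical check of part (c): for a geometric Z_p, P(p*Z_p > t) -> e^{-t} as p -> 0.
# numpy's rng.geometric has support {1, 2, ...}, matching P(Z_p > k) = (1 - p)^k above.
for p in (0.5, 0.1, 0.01):
    z = rng.geometric(p, size=200_000)
    for t in (0.5, 1.0, 2.0):
        print(f"p={p:5.2f}  t={t:3.1f}  P(pZ_p > t) ~ {(p * z > t).mean():.4f}"
              f"   e^-t = {np.exp(-t):.4f}")
```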
(d) Since both fn(x) and f∞(x) are probability density functions, it must be true that
∫_R fn(x) dx = ∫_R f∞(x) dx = 1, which means ∫_R (f∞(x) − fn(x)) I_{f∞ > fn} dx = ∫_R (fn(x) − f∞(x)) I_{fn > f∞} dx. Therefore
∫_R |f∞(x) − fn(x)| dx = 2 ∫_R (f∞(x) − fn(x)) I_{f∞ > fn} dx = ∫_R gn(x) f∞(x) dx.
Since the right-hand side of the last equation is equal to Egn(X∞), gn(x) → 0 pointwise as n → ∞ and
gn(x) ∈ [0, 2], it follows from Corollary 1.4.28 that ∫_R gn(x) f∞(x) dx = Egn(X∞) → 0 as n → ∞.
Let Fn(x), n ∈ N ∪ {∞}, denote the cdf of Xn. Then for every x ∈ R,
|Fn(x) − F∞(x)| = |∫_{(−∞,x]} (fn(u) − f∞(u)) du| ≤ ∫_{(−∞,x]} |fn(u) − f∞(u)| du ≤ ∫_R |fn(u) − f∞(u)| du → 0.
Thus Xn → X∞ in law.
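The following sketch illustrates the conclusion of (d) on a concrete family of densities of my own choosing (fn the N(0, 1 + 1/n) density and f∞ the N(0, 1) density); it approximates ∫|fn − f∞| and sup_x |Fn(x) − F∞(x)| on a grid and shows both going to 0.

```python
import numpy as np

# Illustration of part (d): f_n = N(0, 1 + 1/n) density, f_inf = N(0, 1) density
# (example chosen for illustration).  Pointwise f_n -> f_inf, and both the L^1
# distance and the sup-distance of the cdfs (approximated on a grid) vanish.
x = np.linspace(-10.0, 10.0, 20_001)
dx = x[1] - x[0]

def normal_pdf(x, var):
    return np.exp(-x ** 2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

f_inf = normal_pdf(x, 1.0)
F_inf = np.cumsum(f_inf) * dx                  # grid approximation of the cdf
for n in (1, 10, 100):
    f_n = normal_pdf(x, 1.0 + 1.0 / n)
    F_n = np.cumsum(f_n) * dx
    l1 = np.abs(f_n - f_inf).sum() * dx
    sup_cdf = np.abs(F_n - F_inf).max()
    print(f"n={n:4d}  int|f_n - f_inf| ~ {l1:.4f}   sup|F_n - F_inf| ~ {sup_cdf:.4f}")
```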
5. Exercise 1.4.44.
ANS: We first show that if N is a R.V. that takes non-negative integer values, then EN = ∑_{j=1}^∞ P(N ≥ j).
In order to see this, note that
EN = ∑_{i=1}^∞ i P(N = i) = ∑_{i=1}^∞ ∑_{j=1}^i P(N = i) = ∑_{j=1}^∞ ∑_{i=j}^∞ P(N = i) = ∑_{j=1}^∞ P(N ≥ j),
where exchanging the order of summation is allowed because all terms are non-negative.
Write X = ∑_{i=1}^N ξi = ∑_{i=1}^∞ ξi I_{N ≥ i}. Since the ξi are positive, the monotone convergence theorem
and the independence of N and the ξi indicate that
EX = ∑_{i=1}^∞ E[ξi I_{N ≥ i}]              (Mon.)
   = ∑_{i=1}^∞ Eξi E[I_{N ≥ i}]             (indep.)
   = ∑_{i=1}^∞ Eξi P(N ≥ i)
   = Eξ1 ∑_{i=1}^∞ P(N ≥ i)                 (ξi identically distributed)
   = Eξ1 EN.
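A Monte Carlo sanity check of the identity EX = Eξ1 EN under the independence assumption (the choices ξi ∼ Exp(1) and N ∼ Poisson(4) below are arbitrary, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(5)

# Check of E[xi_1 + ... + xi_N] = E[xi_1] * E[N] when N is independent of the xi_i.
# Here xi_i ~ Exp(1) (mean 1) and N ~ Poisson(4) (mean 4), chosen for illustration.
reps = 200_000
n = rng.poisson(4, size=reps)                      # samples of N
xi = rng.exponential(1.0, size=(reps, n.max()))    # more xi's than ever needed per row
mask = np.arange(n.max()) < n[:, None]             # keep the first N_k terms in row k
x = (xi * mask).sum(axis=1)                        # X = xi_1 + ... + xi_N

print("E[X]            ~", round(x.mean(), 4))
print("E[xi_1] * E[N]  =", 1.0 * 4)
```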
Question: Suppose we do not assume that N is independent of the ξi, but instead assume that
{N ≤ k} ∈ σ(ξ1, . . . , ξk) for all k ∈ N. Would the same result hold?