Solutions - review problems for exam 2 - Math 468/568, Spring 15
1. Calls arrive at a call center following a Poisson process with rate 4/min.
(a) What is the probability that no calls come in during a 2 minute period?
Solution: The number of calls in a 2 min period has a Poisson distribution
with parameter 4 · 2 = 8. So the probability of no calls is exp(−8).
(b) What is the expected number of calls that come in during a 2 minute
period?
Solution: As in (a), the number of calls in a 2 min period has a Poisson
distribution with parameter 8. So the mean is 8.
(c) Suppose we know that 3 calls come in during the first minute of a 2
minute period. What is the expected number of calls during the 2 minute
period?
Solution: The number of calls in the first minute and the number in the
second minute are independent random variables. So the mean number in
the second min given there are 3 in the first min is just the mean number in
the second min with no conditioning. This is 4. So the answer is 3 + 4 = 7.
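As a sanity check, the answers in parts (a)-(c) can be verified with a short Monte Carlo simulation. This is just an illustrative sketch using the Python standard library; the helper name poisson_count is ours, not part of the problem.

```python
import math
import random

random.seed(0)
rate = 4.0        # calls per minute
trials = 200_000

def poisson_count(rate, horizon):
    """Count arrivals in [0, horizon] by summing Exp(rate) inter-arrival gaps."""
    t, n = 0.0, 0
    while True:
        t += random.expovariate(rate)
        if t > horizon:
            return n
        n += 1

first = [poisson_count(rate, 1.0) for _ in range(trials)]   # calls in minute 1
second = [poisson_count(rate, 1.0) for _ in range(trials)]  # calls in minute 2 (independent)
totals = [a + b for a, b in zip(first, second)]

p_none = sum(c == 0 for c in totals) / trials
mean_total = sum(totals) / trials
cond = [tot for a, tot in zip(first, totals) if a == 3]     # condition on 3 calls in minute 1
cond_mean = sum(cond) / len(cond)

print(p_none)      # near exp(-8) ≈ 0.000335, part (a)
print(mean_total)  # near 8, part (b)
print(cond_mean)   # near 7, part (c)
```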
2. Let Nt be a Poisson process with rate λ. Let Mt = exp(aNt + bt) where
a and b are constants. Recall that if X has a Poisson distribution with
parameter λ, then its moment generating function is
E[exp(uX)] = exp((e^u − 1)λ)
(a) Find a relation between a and b that implies Mt is a martingale.
Solution: Let t > s. Since Nt is a Markov process, it suffices to show
E[Mt |Ms ] = Ms . We use the fact that Ns and Nt − Ns are independent. So
E[Mt |Ms ] = E[exp(aNt + bt)|Ms ]
= exp(bt) E[exp(aNs + a(Nt − Ns ))|Ms ]
= exp(bt) exp(aNs ) E[exp(a(Nt − Ns ))|Ms ]
= exp(bt) exp(aNs ) E[exp(a(Nt − Ns ))]
= exp(bt) exp(aNs ) exp((e^a − 1)λ(t − s))
= Ms exp(b(t − s) + (e^a − 1)λ(t − s))
So we need b(t − s) + (e^a − 1)λ(t − s) = 0, which is just b = −(e^a − 1)λ.
(b) Fix a positive integer k and let T be the first time that Nt reaches k,
so T = min{t : Nt = k}. Use the optional sampling theorem to compute
E[exp(bT )]. You need not show the hypotheses are satisfied.
Solution: The optional sampling theorem says E[MT ] = E[M0 ] = 1. NT is
always k, so MT = exp(ak + bT ). So E[exp(ak + bT )] = 1, i.e., E[exp(bT )] =
exp(−ak).
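A quick simulation sketch (Python standard library, with illustrative choices a = 0.5 and k = 3 that are ours, not from the problem) confirms E[exp(bT)] = exp(−ak) when b = −(e^a − 1)λ:

```python
import math
import random

random.seed(1)
lam = 4.0                      # Poisson rate
a, k = 0.5, 3                  # illustrative parameter choices
b = -(math.exp(a) - 1) * lam   # martingale condition from part (a)
trials = 100_000

# T, the time of the k-th arrival, is a sum of k independent Exp(lam) gaps.
est = 0.0
for _ in range(trials):
    T = sum(random.expovariate(lam) for _ in range(k))
    est += math.exp(b * T)
est /= trials

print(est, math.exp(-a * k))   # both near 0.223
```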
3. A continuous time Markov chain has state space {1, 2, 3}. Its rate matrix is

A = [ −1   1   0
       2  −4   2
       0   1  −1 ]
Matlab gives (to two decimal places)

exp(A) = [ 0.58  0.20  0.22
           0.40  0.20  0.40
           0.22  0.20  0.58 ]
(a). If we start in state 2, find the expected value of the time of the first
transition.
Solution: The time to the first transition has an exponential distribution
with parameter α(2) = −A22 = 4. This has mean 1/4.
(b). If we start in state 2, find the distribution of X1 , i.e., find P (X1 = j)
for j = 1, 2, 3.
Solution: This is the second row of exp(A). So P (X1 = j|X0 = 2) =
0.4, 0.2, 0.4 for j = 1, 2, 3.
(c). If we start in state 2, find the distribution of X2 , i.e., find P (X2 = j)
for j = 1, 2, 3. (Don’t do this by using matlab to compute exp(2A). Just use
what is given.)
Solution: We use the distribution of X1 from the previous part. So the
distribution of X2 is (0.4, 0.2, 0.4) exp(A). This works out to be
(0.4, 0.2, 0.4) again: as part (d) shows, this is the stationary distribution,
so it is preserved.
(d). If we start in state 2, find the limiting distribution of Xt , i.e., find the
limit as t goes to ∞ of P (Xt = j) for j = 1, 2, 3.
Solution: We need to find the stationary distribution π. It satisfies πA = 0.
A little algebra gives (2, 1, 2) as a solution. The sum must be 1, so π =
(2/5, 1/5, 2/5). This is the long time distribution of Xt no matter what state
we start in.
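The claims in parts (c) and (d) can be checked numerically with plain Python, using the rate matrix and the rounded exp(A) given above:

```python
# Check part (d): pi = (2/5, 1/5, 2/5) satisfies pi A = 0, and repeated
# one-step transitions drive the chain's distribution to pi.
A = [[-1, 1, 0],
     [2, -4, 2],
     [0, 1, -1]]
pi = [2 / 5, 1 / 5, 2 / 5]

piA = [sum(pi[i] * A[i][j] for i in range(3)) for j in range(3)]
print(piA)   # essentially [0, 0, 0]

P = [[0.58, 0.20, 0.22],   # exp(A) as given, rounded to two decimals
     [0.40, 0.20, 0.40],
     [0.22, 0.20, 0.58]]

dist = [0.0, 1.0, 0.0]     # start in state 2
for _ in range(50):
    dist = [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]
print(dist)   # close to [0.4, 0.2, 0.4], as in parts (c) and (d)
```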
(e). Let T be the time of the first jump. If we start in state 2, find the
distribution of XT . Recall the convention that XT is the state immediately
after it has made the jump.
Solution: It will be in either state 1 or 3. The probabilities are α(2, 1)/α(2)
and α(2, 3)/α(2). These are both 1/2.
4. A continuous time Markov chain has state space {0, 1, 2, · · ·}. For x > 0,
the chain can jump from state x to only x + 1 or 0. The rates are
α(x, x + 1) = r and α(x, 0) = 1. From x = 0 the chain can only jump to 1
with rate α(0, 1) = r. Find the values of r for which the chain is transient.
Solution: The associated discrete time Markov chain has

p(x, x + 1) = r/(1 + r), p(x, 0) = 1/(1 + r) for x > 0,

and p(0, 1) = 1. The original continuous time chain is transient if and only
if this discrete time chain is transient. The latter is transient if and only if
there is a solution to

a(x) = Σ_{y ≠ x} p(x, y) a(y),   a(0) = 1,   inf_x a(x) = 0
Think of a(x) as the probability the chain eventually reaches 0 given that it
starts at x. So we have, for x ≥ 1,

a(x) = (r/(1 + r)) a(x + 1) + (1/(1 + r)) a(0)
And since p(0, 1) = 1 we also have a(0) = a(1), so a(1) = 1. Using a(0) = 1
and solving for a(x + 1) gives, for x ≥ 1,

a(x + 1) = ((1 + r)/r) a(x) − 1/r
Use this equation with x = 1, 2, 3, · · · to solve for a(2), a(3), a(4), · · · and you
find they are all 1. So all the a(x) are 1. So there is no nontrivial solution
no matter what r is. So the chain is recurrent for all r.
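A few lines of Python make the conclusion concrete: iterating the recursion above from a(1) = 1 leaves every a(x) equal to 1, for any r (the sample values of r below are arbitrary):

```python
# Iterate a(x+1) = ((1+r)/r) a(x) - 1/r starting from a(1) = 1.
# Every a(x) stays equal to 1 whatever r > 0 is, so there is no
# solution with inf a(x) = 0 and the chain is recurrent.
devs = []
for r in (0.5, 1.0, 10.0):   # arbitrary sample rates
    a = 1.0                  # a(1) = 1
    vals = [a]
    for _ in range(20):      # compute a(2), ..., a(21)
        a = (1 + r) / r * a - 1 / r
        vals.append(a)
    devs.append(max(abs(v - 1) for v in vals))
print(devs)   # all essentially zero
```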
5. (corrected) Let Wt be a standard Brownian motion, i.e., σ^2 = 1 and
there is no drift. Let Mt = exp(t/2) cos(Wt ). If Z has a standard normal
distribution, then E[cos(uZ)] = exp(−u^2/2). You may assume this fact.
Show that Mt is a martingale. Hint:
cos(A + B) = cos(A) cos(B) − sin(A) sin(B).
Solution: Let t > s. Since it is Markov we need to show E[Mt |Ws ] = Ms .
E[Mt |Ws ] = E[exp(t/2) cos(Ws + (Wt − Ws ))|Ws ]
= exp(t/2)E[cos(Ws ) cos(Wt − Ws )|Ws ] − exp(t/2)E[sin(Ws ) sin(Wt − Ws )|Ws ]
= exp(t/2) cos(Ws )E[cos(Wt − Ws )] − exp(t/2) sin(Ws )E[sin(Wt − Ws )]
= exp(t/2) cos(Ws )E[cos(Wt − Ws )]
We have used the fact that E[sin(Wt − Ws )] = 0 since the density of Wt − Ws is
an even function. Since (Wt − Ws )/√(t − s) has a standard normal distribution,
E[cos(Wt − Ws )] = exp(−(t − s)/2). So the above is

= exp(t/2 − (t − s)/2) cos(Ws ) = exp(s/2) cos(Ws ) = Ms
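A Monte Carlo sketch (Python standard library) of the martingale property: E[Mt] should equal M0 = 1 for every t, since E[cos(Wt)] = exp(−t/2) when Wt ~ N(0, t). The sample times below are arbitrary.

```python
import math
import random

random.seed(2)
trials = 400_000

# Estimate E[exp(t/2) cos(W_t)] for a few times t; each should be near 1.
results = []
for t in (0.5, 1.0, 2.0):
    est = sum(math.cos(random.gauss(0.0, math.sqrt(t)))
              for _ in range(trials)) / trials
    results.append(math.exp(t / 2) * est)

print(results)   # each entry near 1
```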
6. (corrected) Let Wt be a standard Brownian motion, i.e., σ^2 = 1 and there
is no drift. The following will be useful in this problem. Let Z be a standard
normal RV. Then E[exp(uZ)] = exp(u^2/2). Differentiating this with respect
to u gives E[Z exp(uZ)] = u exp(u^2/2).
(a). Let t > s > 0. Compute E[Ws exp(Wt )].
Solution:
E[Ws exp(Wt )] = E[Ws exp(Ws ) exp(Wt − Ws )]
= E[Ws exp(Ws )]E[exp(Wt − Ws )]
Now Z = Ws /√s is a standard normal. So

E[Ws exp(Ws )] = E[√s Z exp(√s Z)] = √s · √s exp(s/2) = s exp(s/2)

Similarly, E[exp(Wt − Ws )] = exp((t − s)/2). So the answer is

s exp(s/2) exp((t − s)/2) = s exp(t/2)
(b). Let t > s > 0. Compute E[Wt exp(Wt )|Ws ].
Solution:

E[Wt exp(Wt )|Ws ] = E[(Ws + (Wt − Ws )) exp(Ws + (Wt − Ws ))|Ws ]
= E[Ws exp(Ws + (Wt − Ws ))|Ws ] + E[(Wt − Ws ) exp(Ws + (Wt − Ws ))|Ws ]
= Ws exp(Ws ) E[exp(Wt − Ws )] + exp(Ws ) E[(Wt − Ws ) exp(Wt − Ws )]
= Ws exp(Ws ) exp((t − s)/2) + exp(Ws )(t − s) exp((t − s)/2)
(c). Find the expected value of the random variable in part (b).
Solution: E[E[Wt exp(Wt )|Ws ]] = E[Wt exp(Wt )] = t exp(t/2).
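The formulas from parts (a) and (c) can be sanity-checked by simulation, building Wt from Ws plus an independent increment. The values s = 1, t = 2 are illustrative choices of ours.

```python
import math
import random

random.seed(3)
s, t = 1.0, 2.0           # illustrative times with t > s > 0
trials = 400_000

sum_a = sum_c = 0.0
for _ in range(trials):
    ws = random.gauss(0.0, math.sqrt(s))            # W_s ~ N(0, s)
    wt = ws + random.gauss(0.0, math.sqrt(t - s))   # W_t via independent increment
    sum_a += ws * math.exp(wt)
    sum_c += wt * math.exp(wt)

est_a = sum_a / trials
est_c = sum_c / trials
print(est_a, s * math.exp(t / 2))   # part (a): near s exp(t/2)
print(est_c, t * math.exp(t / 2))   # part (c): near t exp(t/2)
```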
7. Let Xt be a continuous time stochastic process. (The state space could
be discrete or all real numbers.) Xt has independent increments, i.e., for
0 < s1 < t1 < s2 < t2 < · · · < sn < tn , the random variables Xt1 − Xs1 ,
Xt2 − Xs2 , ..., Xtn − Xsn are independent. Also, if 0 < s1 < t1 < s2 < t2 and
t2 − s2 = t1 − s1 , then Xt1 − Xs1 and Xt2 − Xs2 have the same distribution.
Let µ = E[X1 ] and σ 2 = var(X1 ). Find formulas for the mean and variance
of Xt . Hint: The Poisson process and Brownian motion with or without drift
are examples of such a process.
Solution: Let n be a positive integer. Let In,k = Xk/n − X(k−1)/n . So

X1 = In,1 + In,2 + · · · + In,n

By the assumptions, the In,k for a fixed value of n have the same distribution
and they are independent. Since X1 has mean µ, this implies the mean of
In,k is µ/n. By the independence of the increments, the variance of X1 is the
sum of the variances of the In,k . So the variance of In,k is σ^2/n.
Now consider Xm/n for any m. It is the sum of In,k with k = 1, 2, · · · , m.
So it has mean µm/n and variance σ^2 m/n. Thus Xt has mean µt and variance
σ^2 t for all rational t. Assuming the mean and variance are continuous in t,
we conclude

E[Xt ] = µt,   var(Xt ) = σ^2 t
Note that these equations hold for Brownian motion and they hold for the
Poisson process with µ = λ and σ^2 = λ.
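As an illustration, a simulation of Brownian motion with drift (one example of such a process; the parameters µ = 1.5, σ = 2 and the sample times are arbitrary choices of ours) reproduces the linear growth of the mean and variance:

```python
import math
import random

random.seed(4)
mu, sigma = 1.5, 2.0      # arbitrary drift and diffusion parameters
trials = 200_000

# X_t = mu*t + sigma*W_t has independent stationary increments,
# so E[X_t] = mu*t and var(X_t) = sigma^2 * t.
results = []
for t in (0.5, 2.0):
    xs = [mu * t + sigma * random.gauss(0.0, math.sqrt(t)) for _ in range(trials)]
    mean = sum(xs) / trials
    var = sum((x - mean) ** 2 for x in xs) / trials
    results.append((t, mean, var))

print(results)   # mean near mu*t, variance near sigma^2 * t for each t
```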