18.440: Probability and Random Variables Quiz 2 Solutions

Name:
ID:
18.440: Probability and Random Variables
Quiz 2
Solutions
• You will have 50 minutes to complete this test.
• No calculators, notes, or books are permitted.
• If a question calls for a numerical answer, you do not need to multiply everything out. (For
example, it is fine to write something like (0.9)7!/(3!2!) as your answer.)
• Don’t forget to write your name on the top of every page.
• Please show your work and explain your answer. We will not award full credit for the correct
numerical answer without proper explanation. Good luck!
Problem 1 (35 points) Three types of particles (first, second, and third type) are created according to N_1(t), N_2(t), N_3(t), t ≥ 0, independent Poisson processes with λ = 1.
• (5 points) What is the law of \sum_{i=1}^{3} N_i(1)? (Give the name and density.)
• (5 points) What is the probability that N_1(1) + N_2(2) equals 10?
• (5 points) Write down the density function for the amount of time until a first particle (of any kind) is created.
• (10 points) Compute the probability that after one hour 3 particles are created, knowing that no particles of the first kind were.
• (10 points) What is the conditional probability distribution of (N_1(1) + N_2(2))^4, knowing N_1(1) + N_2(2) = 5N_3(1)?
Proof. (1) The sum of independent Poisson random variables is Poisson with parameter equal to the sum of the parameters. Hence \sum_{i=1}^{3} N_i(1) ∼ Poisson(3), and so

P(\sum_{i=1}^{3} N_i(1) = k) = e^{-3} 3^k / k!.
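A quick Monte Carlo sanity check of this pmf (a Python sketch, assuming numpy is available; the sample size, seed, and choice k = 4 are arbitrary):

    import math
    import numpy as np
    rng = np.random.default_rng(0)
    # each N_i(1) is Poisson(1); their sum should be Poisson(3)
    total = rng.poisson(lam=1.0, size=(100_000, 3)).sum(axis=1)
    k = 4
    print((total == k).mean())                        # empirical P(sum = k)
    print(math.exp(-3) * 3**k / math.factorial(k))    # e^{-3} 3^k / k! ~ 0.168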
(2) As above we have that N_1(1) + N_2(2) ∼ Poisson(1 + 2), hence

P(N_1(1) + N_2(2) = 10) = e^{-3} 3^{10} / 10!.
(3) Let T_1 be the first arrival time of a particle from N_1(t) + N_2(t) + N_3(t). We know that N_1(t) + N_2(t) + N_3(t) is a Poisson process with rate the sum of the rates, in this case 3. Also the first arrival time of a Poisson process with rate λ is distributed according to Exponential(λ). So we conclude that

f_{T_1}(t) = 3e^{-3t} if t > 0, and 0 otherwise.
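The superposition property can be checked by simulating the three first-arrival times directly (Python sketch, numpy assumed):

    import numpy as np
    rng = np.random.default_rng(0)
    # first arrival of each rate-1 process is Exponential(1); the overall
    # first arrival is their minimum, which should be Exponential(3)
    t1 = rng.exponential(scale=1.0, size=(100_000, 3)).min(axis=1)
    print(t1.mean())                        # ~ 1/3, the Exponential(3) mean
    print((t1 > 0.5).mean(), np.exp(-1.5))  # tail P(T_1 > t) = e^{-3t} at t = 0.5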
(4) The question asks to find P(N_1(1) + N_2(1) + N_3(1) = 3 | N_1(1) = 0). Using Bayes' rule and independence we get

P(N_1(1) + N_2(1) + N_3(1) = 3 | N_1(1) = 0)
= P(N_1(1) + N_2(1) + N_3(1) = 3, N_1(1) = 0) / P(N_1(1) = 0)
= P(N_2(1) + N_3(1) = 3, N_1(1) = 0) / P(N_1(1) = 0)
= P(N_2(1) + N_3(1) = 3) P(N_1(1) = 0) / P(N_1(1) = 0)
= P(N_2(1) + N_3(1) = 3) = e^{-2} 2^3 / 3!.
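A simulation sketch of this conditional probability (numpy assumed; one conditions by keeping only samples with N_1(1) = 0):

    import math
    import numpy as np
    rng = np.random.default_rng(0)
    n = rng.poisson(lam=1.0, size=(1_000_000, 3))   # (N_1(1), N_2(1), N_3(1))
    cond = n[:, 0] == 0                             # condition on N_1(1) = 0
    hit = cond & (n.sum(axis=1) == 3)               # ... and 3 particles in total
    print(hit.sum() / cond.sum())                   # empirical conditional probability
    print(math.exp(-2) * 2**3 / math.factorial(3))  # e^{-2} 2^3 / 3! ~ 0.180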
(5) We have that N_1(1) and N_2(2) are non-negative integer random variables, so (N_1(1) + N_2(2))^4 is a random variable supported on {k^4 : k ∈ Z_{≥0}}. So to determine the distribution it suffices to find

P((N_1(1) + N_2(2))^4 = k^4 | N_1(1) + N_2(2) = 5N_3(1)) = P(N_1(1) + N_2(2) = k | N_1(1) + N_2(2) = 5N_3(1)).

Using Bayes' rule and independence we get

P(N_1(1) + N_2(2) = k | N_1(1) + N_2(2) = 5N_3(1))
= P(N_1(1) + N_2(2) = k, N_1(1) + N_2(2) = 5N_3(1)) / P(N_1(1) + N_2(2) = 5N_3(1))
= P(N_1(1) + N_2(2) = k, k = 5N_3(1)) / P(N_1(1) + N_2(2) = 5N_3(1))
= P(N_1(1) + N_2(2) = k) P(5N_3(1) = k) / P(N_1(1) + N_2(2) = 5N_3(1)).
Now we compute the denominator and get

B = P(N_1(1) + N_2(2) = 5N_3(1)) = \sum_{m=0}^{∞} P(N_1(1) + N_2(2) = 5m, N_3(1) = m)
= \sum_{m=0}^{∞} P(N_1(1) + N_2(2) = 5m) P(N_3(1) = m) = \sum_{m=0}^{∞} e^{-3} (3^{5m} / (5m)!) · e^{-1} (1^m / m!).
On the other hand we have for the numerator

A(k) = P(N_1(1) + N_2(2) = k) P(5N_3(1) = k) = e^{-3} (3^{5m} / (5m)!) · e^{-1} (1^m / m!) if k = 5m, and 0 if 5 ∤ k.
We thus conclude that the conditional distribution is

P((N_1(1) + N_2(2))^4 = k^4 | N_1(1) + N_2(2) = 5N_3(1)) = A(k) / B,

and 0 otherwise.
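Since the series for B converges very rapidly, A(k)/B is easy to evaluate by truncating the sum; a Python sketch (the helper name A and the truncation point 30 are choices of convenience):

    import math
    def A(k):
        # P(N_1(1) + N_2(2) = k) * P(5 N_3(1) = k); nonzero only when 5 divides k
        if k % 5:
            return 0.0
        m = k // 5
        return math.exp(-3) * 3**k / math.factorial(k) * math.exp(-1) / math.factorial(m)
    B = sum(A(5 * m) for m in range(30))        # the series terms decay very fast
    print([A(5 * m) / B for m in range(4)])     # conditional pmf at k = 0, 5, 10, 15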
Problem 2 (20 points) Let X_1, X_2, X_3, X_4 be independent uniform variables on [0, 1]. Write Y = X_1 + X_2 + X_3 and Z = X_1 + X_2.
• (5 points) Compute the density function of Z.
• (5 points) Compute the probability that {Y < 1/2}.
• (5 points) Compute Cov(Z, Y).
• (5 points) Are Z and Y independent? Why?
Proof. (1) Since the X_i are supported in [0, 1] we know X_1 + X_2 is supported in [0, 2]. Next we have that

f_Z(y) = \int_{y-1}^{1} f_{X_1}(a) f_{X_2}(y − a) da = 2 − y if 1 ≤ y ≤ 2,
f_Z(y) = \int_{0}^{y} f_{X_1}(a) f_{X_2}(y − a) da = y if 0 ≤ y ≤ 1,
f_Z(y) = 0 otherwise.
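A quick simulation check of this triangular density (Python, numpy assumed; the point 1/2 is an arbitrary test point):

    import numpy as np
    rng = np.random.default_rng(0)
    z = rng.random(1_000_000) + rng.random(1_000_000)   # samples of Z = X_1 + X_2
    # P(Z <= 1/2) should equal the integral of f_Z(y) = y from 0 to 1/2, i.e. 1/8
    print((z <= 0.5).mean(), 1 / 8)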
(2) We have

P(Y < 1/2) = P(X_1 + X_2 + X_3 < 1/2) = \int_0^{1/2} \int_0^{1/2 − x_1} \int_0^{1/2 − x_1 − x_2} dx_3 dx_2 dx_1
= \int_0^{1/2} \int_0^{1/2 − x_1} (1/2 − x_1 − x_2) dx_2 dx_1 = \int_0^{1/2} (1/2)(1/2 − x_1)^2 dx_1 = (1/2) \int_0^{1/2} u^2 du = 1/48.
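A Monte Carlo check of the value 1/48 (numpy assumed):

    import numpy as np
    rng = np.random.default_rng(0)
    x = rng.random((1_000_000, 3))                # (X_1, X_2, X_3) uniform on [0, 1]
    print((x.sum(axis=1) < 0.5).mean(), 1 / 48)   # both ~ 0.0208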
(3) By linearity of expectation we know

E[Z] = E[X_1] + E[X_2] = 1/2 + 1/2 = 1, and similarly E[Y] = 3/2.

Next we have

E[ZY] = \int_0^1 \int_0^1 \int_0^1 (x_1 + x_2)(x_1 + x_2 + x_3) dx_3 dx_2 dx_1
= \int_0^1 \int_0^1 [(x_1 + x_2)^2 + (1/2)(x_1 + x_2)] dx_2 dx_1
= \int_0^1 [x_1^2 + (3/2) x_1 + 7/12] dx_1 = 1/3 + 3/4 + 7/12 = 20/12 = 5/3.

Consequently,

Cov(Z, Y) = E[ZY] − E[Z]E[Y] = 5/3 − 3/2 = 1/6.
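A Monte Carlo check of Cov(Z, Y) = 1/6 (numpy assumed; np.cov returns the sample covariance matrix):

    import numpy as np
    rng = np.random.default_rng(0)
    x = rng.random((1_000_000, 3))
    z = x[:, 0] + x[:, 1]                  # Z = X_1 + X_2
    y = x.sum(axis=1)                      # Y = X_1 + X_2 + X_3
    print(np.cov(z, y)[0, 1], 1 / 6)       # sample covariance vs. exact 1/6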
(4) It is clear that Y and Z are not independent, as otherwise their covariance would be 0, which we saw is not the case. Intuitively, Y ≥ Z, so knowing Z gives a lower bound on Y, and hence the two cannot be independent.
Problem 3 (25 points) Consider a sequence of independent tosses of a coin that is biased so that it comes up heads with probability 2/3 and tails with probability 1/3. Let X_i be 1 if the i-th toss is heads, and 0 otherwise.
(1) (5 points) Compute E[X_i] and Var(X_i).
(2) (5 points) Compute Var(X_1 + 2X_2 + 3X_3 + 4X_4 + 5X_5).
(3) (5 points) Let Y = \sum_{i=1}^{40000} X_i. Approximate the probability that Y ≥ 27667.
(4) (5 points) Let Z = \sum_{i=1}^{20000} X_i − \sum_{i=20001}^{40000} X_i. What can you say about the distribution of Z/200?
(5) (5 points) Show that (Y/200, Z/200) are approximately independent.
Proof. (1) Set p = 2/3. Then

E[X_i] = 1 × P(heads on i-th toss) + 0 = p,
Var(X_i) = E[X_i^2] − E[X_i]^2 = E[X_i] − E[X_i]^2 = p(1 − p).
(2) We use that for independent random variables X, Y one has

Var(aX + bY) = a^2 Var(X) + b^2 Var(Y),

to get, in view of part (1),

Var(X_1 + 2X_2 + 3X_3 + 4X_4 + 5X_5) = \sum_{i=1}^{5} i^2 Var(X_i) = p(1 − p)(1 + 4 + 9 + 16 + 25) = 55 p(1 − p).
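A simulation check of this variance (numpy assumed; the weight vector 1, ..., 5 is built with np.arange):

    import numpy as np
    rng = np.random.default_rng(0)
    p = 2 / 3
    tosses = rng.random((1_000_000, 5)) < p       # five independent biased tosses
    w = (tosses * np.arange(1, 6)).sum(axis=1)    # X_1 + 2 X_2 + ... + 5 X_5
    print(w.var(), 55 * p * (1 - p))              # both ~ 110/9 ~ 12.22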
(3) Using part (1) and linearity of expectation and of variance for independent variables, we have

E[Y] = 40000 × p = 80000/3 ≈ 26667, as well as Var(Y) = 40000 × p(1 − p) = 80000/9 ≈ 8889.

Consequently we have by De Moivre–Laplace

P((Y − 26667)/\sqrt{8889} ≥ a) ≈ 1 − Φ(a),

where Φ denotes the standard Gaussian cdf, i.e.

Φ(y) = \int_{−∞}^{y} (1/\sqrt{2π}) e^{−x^2/2} dx.
Consequently, we have

P(Y ≥ 27667) = P((Y − 26667)/\sqrt{8889} ≥ (27667 − 26667)/\sqrt{8889}) ≈ 1 − Φ(a),

where a = 1000/\sqrt{8889}.
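Numerically a = 1000/\sqrt{8889} ≈ 10.6, so the approximated probability is essentially 0; a short evaluation using the error function (Python sketch):

    import math
    a = 1000 / math.sqrt(8889)
    # standard normal cdf via the error function: Phi(y) = (1 + erf(y/sqrt(2)))/2
    tail = 1 - 0.5 * (1 + math.erf(a / math.sqrt(2)))
    print(a, tail)   # a ~ 10.6, so the tail probability is essentially 0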
(4) Let T_1 = \sum_{i=1}^{20000} X_i and T_2 = \sum_{i=20001}^{40000} X_i. Similarly to our work above we know that E[T_i] = 40000/3 and Var(T_i) = 40000/9. By De Moivre–Laplace we know that the distribution of

(T_i − 40000/3) / (200/3)

is approximately Gaussian(0, 1) = N(0, 1). Consequently the distribution of T_i/200 is approximately N(200/3, 1/9). Also, it is clear the T_i are independent, as they depend on disjoint collections of tosses. Thus

Z/200 = T_1/200 − T_2/200 ≈ A − B,
where A, B are independent N(200/3, 1/9). Using that if X ∼ N(µ_1, σ_1^2) and Y ∼ N(µ_2, σ_2^2) are independent, one has

aX + bY ∼ N(aµ_1 + bµ_2, a^2 σ_1^2 + b^2 σ_2^2),

we conclude that A − B ∼ N(0, 2/9). So the distribution of Z/200 is approximately N(0, 2/9).
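A simulation check, using that each T_i is Binomial(20000, 2/3) (numpy assumed):

    import numpy as np
    rng = np.random.default_rng(0)
    # T_1, T_2 are sums of 20000 Bernoulli(2/3) tosses, i.e. Binomial(20000, 2/3)
    t1 = rng.binomial(20_000, 2 / 3, size=200_000)
    t2 = rng.binomial(20_000, 2 / 3, size=200_000)
    z200 = (t1 - t2) / 200
    print(z200.mean(), z200.var(), 2 / 9)   # mean ~ 0, variance ~ 2/9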
(5) We have Y/200 = T_1/200 + T_2/200, so the distribution of Y/200 is approximately N(400/3, 2/9). We know that two jointly Gaussian random variables are independent if and only if their covariance is 0. Since (Y/200, Z/200) is approximately jointly Gaussian, the two would be approximately independent if their covariance is 0. But we have

Cov(Y/200, Z/200) = (1/40000)(E[YZ] − E[Y]E[Z]) = (1/40000)(E[(T_1 + T_2)(T_1 − T_2)] − E[T_1 + T_2]E[T_1 − T_2])
= (1/40000)(E[T_1^2] − E[T_2^2] − E[T_1]^2 + E[T_2]^2) = 0,

where in the last line we used that T_1 and T_2 have the same distribution and hence the same moments.
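The approximate independence can be probed by estimating the correlation of Y/200 and Z/200 (Python sketch, numpy assumed):

    import numpy as np
    rng = np.random.default_rng(0)
    t1 = rng.binomial(20_000, 2 / 3, size=200_000)
    t2 = rng.binomial(20_000, 2 / 3, size=200_000)
    y200 = (t1 + t2) / 200                    # Y/200
    z200 = (t1 - t2) / 200                    # Z/200
    print(np.corrcoef(y200, z200)[0, 1])      # ~ 0, consistent with independence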
Problem 4 (10 points) Let X and Y be continuous random variables with joint probability density

f_{X,Y}(a, b) = 10a^3 + 21b^5 on a + b ≤ 1, a ≥ 0, b ≥ 0, and 0 otherwise.
(1) (5 points) Find the probability distribution of Z = 2Y + X.
(2) (5 points) Find the probability distribution of W = (X + Y )2 + 1.
Proof. (1): We calculate the density of Z. Since X + Y ≤ 1 on the support, we have Z = 2Y + X = (X + Y) + Y ≤ 2, so Z is supported on [0, 2]. Then we have

f_Z(x) = \int f_{X,Y}(x − 2b, b) db = \int_{max(0, x−1)}^{min(1, x/2)} [10(x − 2b)^3 + 21b^5] db,

where the limits come from the support constraints b ≥ 0, x − 2b ≥ 0, and (x − 2b) + b ≤ 1 (i.e. b ≥ x − 1). Setting B = max(0, x − 1) and A = min(1, x/2), the above becomes

(7A^6 − 7B^6)/2 − 20(A^4 − B^4) + 10x^3(A − B) − 30x^2(A^2 − B^2) + 40x(A^3 − B^3)

for 0 ≤ x ≤ 2, and f_Z(x) = 0 otherwise.
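As a sanity check on the limits of integration, one can verify numerically that f_Z integrates to 1 over [0, 2]; a Python sketch (numpy assumed; the helper name f_Z and the trapezoid-rule discretization are mine):

    import numpy as np
    def f_Z(x):
        # f_Z(x) = integral of 10(x - 2b)^3 + 21 b^5 over b in [B, A]
        A, B = min(1.0, x / 2), max(0.0, x - 1.0)
        if A <= B:
            return 0.0
        b = np.linspace(B, A, 2001)
        y = 10 * (x - 2 * b) ** 3 + 21 * b ** 5
        return float(((y[:-1] + y[1:]) / 2 * np.diff(b)).sum())   # trapezoid rule
    xs = np.linspace(0.0, 2.0, 2001)
    vals = np.array([f_Z(x) for x in xs])
    print(((vals[:-1] + vals[1:]) / 2 * np.diff(xs)).sum())       # ~ 1.0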
(2): We calculate the density of W. Write S = X + Y, with density f_S(s) = \int_0^∞ f_{X,Y}(a, s − a) da. Since 0 ≤ S ≤ 1 on the support, W = S^2 + 1 is supported on [1, 2]. As W = S^2 + 1 is increasing in S ≥ 0, the change of variables s = \sqrt{x − 1} gives, for 1 < x ≤ 2,

f_W(x) = f_S(\sqrt{x − 1}) · 1/(2\sqrt{x − 1}),

and f_W is 0 otherwise. We calculate f_S(\sqrt{x − 1}) to be

\int_{max(0, \sqrt{x−1} − 1)}^{min(1, \sqrt{x−1})} [10a^3 + 21(\sqrt{x − 1} − a)^5] da.

Setting A = min(1, \sqrt{x − 1}) and B = max(0, \sqrt{x − 1} − 1), we get that the above equals

(10/4)(A^4 − B^4) + \sum_{i=0}^{5} \binom{5}{i} (−1)^i 21 (\sqrt{x − 1})^{5−i} (A^{i+1} − B^{i+1}) / (i + 1),

so f_W(x) is this expression divided by 2\sqrt{x − 1}.
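With the factor 1/(2\sqrt{x − 1}) in place, one can check numerically that f_W integrates to 1 over [1, 2]; a Python sketch (numpy assumed; the helper name f_W is mine):

    import numpy as np
    def f_W(x):
        s = np.sqrt(x - 1.0)                 # s = sqrt(x - 1), the value of X + Y
        A, B = min(1.0, s), max(0.0, s - 1.0)
        if A <= B:
            return 0.0
        a = np.linspace(B, A, 2001)
        y = 10 * a ** 3 + 21 * (s - a) ** 5
        inner = float(((y[:-1] + y[1:]) / 2 * np.diff(a)).sum())  # f_S(s)
        return inner / (2 * s)               # change-of-variables factor 1/(2 sqrt(x-1))
    xs = np.linspace(1.0, 2.0, 2001)
    vals = np.array([f_W(x) for x in xs])
    print(((vals[:-1] + vals[1:]) / 2 * np.diff(xs)).sum())       # ~ 1.0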
Problem 5 (10 points) A model for the movement of a stock supposes that if the present price
of the stock is s, then after one period, it will be either us with probability p, or ds with probability
1 − p. Assuming that successive movements are independent, approximate the probability that the
stock’s price will be up at least 30 percent after the next 1000 periods if u = 1.012, d = 0.990, p = 0.52.
Proof. Let X be the number of up jumps of the stock price and 1000 − X the number of down jumps. X is Binomial(1000, p). Then we have that the price of the stock after 1000 periods is

S_final = u^X d^{1000−X} s.
Consequently, we are interested in

P(S_final ≥ 1.3s) = P(u^X d^{1000−X} ≥ 1.3),

which upon taking logs is just

P(X log(u) + (1000 − X) log(d) ≥ log(1.3)).

We fix a = log(u) and b = log(d). Then, since a > b, our probability becomes

P(Xa + (1000 − X)b ≥ log(1.3)) = P(X ≥ (log(1.3) − 1000b) / (a − b)).
Using the De Moivre–Laplace formula we know that

P((X − 1000p)/\sqrt{1000p(1 − p)} ≥ y) ≈ 1 − Φ(y),

where

Φ(y) = \int_{−∞}^{y} (1/\sqrt{2π}) e^{−x^2/2} dx.

Consequently we have that if C = (log(1.3) − 1000b)/(a − b), then

P(X ≥ C) = P((X − 1000p)/\sqrt{1000p(1 − p)} ≥ (C − 1000p)/\sqrt{1000p(1 − p)}) ≈ 1 − Φ(C′),

where

C′ = (C − 1000p)/\sqrt{1000p(1 − p)}.

This is the desired approximation.
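Plugging in the numbers gives C ≈ 469, C′ ≈ −3.2, and a probability of roughly 0.9993; a short evaluation plus a direct Monte Carlo check (Python, numpy assumed; the names Cp and mc are mine):

    import math
    import numpy as np
    u, d, p = 1.012, 0.990, 0.52
    a, b = math.log(u), math.log(d)
    C = (math.log(1.3) - 1000 * b) / (a - b)
    Cp = (C - 1000 * p) / math.sqrt(1000 * p * (1 - p))
    approx = 1 - 0.5 * (1 + math.erf(Cp / math.sqrt(2)))   # 1 - Phi(C')
    rng = np.random.default_rng(0)
    X = rng.binomial(1000, p, size=1_000_000)              # number of up moves
    mc = (u ** X * d ** (1000 - X) >= 1.3).mean()          # direct simulation
    print(C, Cp, approx, mc)   # C ~ 469, C' ~ -3.2, probability ~ 0.999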