FinMath September Review Exercises for Greg Lawler’s first lecture

Exercise 1. Consider the following game. You start by rolling one die and it comes up value
D, and then you flip D coins. Your final result is X = 2H where H is the number of heads
that you flip.
• Find E[D], E[X].
It is easy to see that E[D] = 7/2. To compute E[X] one can either write
down all the possibilities or use an expectation version of the law of total
probability. Note that if D = k, then the expected number of heads is k/2
and hence E[X | D = k] = k. One can use the rule
E[X] = Σ_{k=1}^{6} P{D = k} E[X | D = k]

and find that the answer is the same as E[D], namely 7/2.
• Find E[X/D].
One way is to write down all the joint probabilities P{D = k, X = 2j}; this is tedious but straightforward. Alternatively, condition on D: since E[X | D = k] = k, we have E[X/D | D = k] = E[X | D = k]/k = 1 for every k, and hence E[X/D] = 1.
• What is the probability that X is odd?
This is to see if you are awake at this point: the answer is 0, since X = 2H is always even.
• Given that D = 1 what is the probability that X = 0?
P{D = 1, X = 0} = P{D = 1} P{X = 0 | D = 1} = (1/6)(1/2) = 1/12,

P{X = 0 | D = 1} = P{D = 1, X = 0} / P{D = 1} = (1/12) / (1/6) = 1/2.
• Given that X = 0 what is the probability that D = 1?
P{X = 0} = Σ_{j=1}^{6} P{D = j} P{X = 0 | D = j}
         = (1/6) 2^{−1} + (1/6) 2^{−2} + · · · + (1/6) 2^{−6}
         = (1/6) [1 − 2^{−6}] = 63/384,

P{D = 1 | X = 0} = P{X = 0, D = 1} / P{X = 0} = (1/12) / (63/384) = 32/63.
• Which is larger, Var[D] or Var[X] (try answering without computing and then computing to verify your guess)?
The random variable X has the randomness of D as well as the randomness of the coin flips, so intuitively one expects X to have more variance. One can verify this by direct (tedious, but straightforward) computation, or by the decomposition Var[X] = E[Var(X | D)] + Var(E[X | D]). Since Var(X | D = k) = 4 Var(H | D = k) = k and E[X | D = k] = k, this gives Var[X] = E[D] + Var[D] = 7/2 + 35/12 = 77/12, which is indeed larger than Var[D] = 35/12.
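For readers who want a numerical check of every quantity in this exercise, here is a minimal Python simulation sketch (not part of the original solution; the sample size is an arbitrary choice):

    import random

    def trial():
        d = random.randint(1, 6)                           # roll one die
        h = sum(random.random() < 0.5 for _ in range(d))   # flip d coins, count heads
        return d, 2 * h                                    # X = 2H

    N = 200_000
    samples = [trial() for _ in range(N)]
    ed = sum(d for d, x in samples) / N
    ex = sum(x for d, x in samples) / N
    exd = sum(x / d for d, x in samples) / N
    vard = sum(d * d for d, x in samples) / N - ed ** 2
    varx = sum(x * x for d, x in samples) / N - ex ** 2
    print(ed, ex, exd)   # both means ≈ 3.5, and E[X/D] ≈ 1.0
    print(vard, varx)    # ≈ 35/12 ≈ 2.92 and ≈ 77/12 ≈ 6.42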
Exercise 2. Suppose we play the following variant of Let’s Make a Deal. Suppose there are
four doors and two prizes put behind different doors. The prizes are worth $500 and $1000
respectively. There is nothing behind the other two doors. Assume that the game setters
choose the doors to put the prizes behind randomly. You get to choose one door. From
watching the show ahead of time, you have seen that the host always selects one door that is
not selected by the player and does not contain a prize and opens it. The host never opens
the fourth door unless she has to. If two of the first three doors have no prize and have not
been chosen by the player, she will randomly choose which of those two to open. After she
opens the door, you may change your choice. Under each of the following scenarios, give
your best strategy and your conditional expected winnings.
• You chose door 1 and the host opened door 2.
For integers 1 ≤ j < k ≤ 4, let Kj,k be the event that the prizes are behind
doors j and k. Note that P(Kj,k ) = 1/6 for each possibility (since there are
six possibilities). Let D be the event that door two was opened. Assuming
our choice of door one:
P(D | K1,4) = 1/2,    P(D | K1,3) = 1,    P(D | K3,4) = 1,

and P(D | Kj,k) = 0 for the other three events (the host cannot open a prize door). Therefore,

P(D) = (1/6)(1/2) + (1/6)(1) + (1/6)(1) = 5/12,

P(K1,4 | D) = P(K1,4 ∩ D) / P(D) = (1/12) / (5/12) = 1/5,
and similarly,
P(K1,3 | D) = P(K3,4 | D) = (1/6) / (5/12) = 2/5.
Given this, there is a 2/5 + 2/5 = 4/5 probability that there is a prize behind door 3, and that is what we should choose. Since a prize behind door 3 is equally likely to be the $500 or the $1000 one, the conditional expected winnings are (4/5)(750) = 600 dollars.
• You chose door 1 and the host opened door 4.
In this case, you know that the prizes are behind doors 2 and 3 (the host opens door 4 only when she has to, i.e., when doors 2 and 3 both hold prizes). Choose either of these two doors; your expected winnings are $750.
• You chose door 4 and the host opened door 2.
In this case
P(D | K1,4) = 1/2,    P(D | K1,3) = 1,    P(D | K3,4) = 1/2,

and P(D | Kj,k) = 0 for the other three events. Hence

P(D) = (1/6)(1/2) + (1/6)(1) + (1/6)(1/2) = 4/12,

P(K1,4 | D) = P(K1,4 ∩ D) / P(D) = (1/12) / (4/12) = 1/4,

and similarly,

P(K1,3 | D) = 1/2,    P(K3,4 | D) = 1/4.
The probability of a prize behind door 1 is 1/2 + 1/4 = 3/4; behind door 3 it is also 3/4; and behind door 4, our original choice, it is 1/4 + 1/4 = 1/2. We should switch to door 1 or door 3, with conditional expected winnings (3/4)(750) = 562.5 dollars.
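Both scenarios can be checked by simulation. The sketch below (my own code, with an arbitrary trial count) places the prizes at random, applies the host's rules, conditions by rejection on which door was opened, and estimates the posterior probability of a prize behind each door:

    import random

    def posterior(pick, opened_door, trials=200_000):
        counts = [0] * 5   # counts[d] = trials in which door d held a prize
        hits = 0
        for _ in range(trials):
            prizes = random.sample([1, 2, 3, 4], 2)
            # host opens a non-chosen, non-prize door; door 4 only if forced
            options = [d for d in [1, 2, 3, 4] if d != pick and d not in prizes]
            if len(options) > 1 and 4 in options:
                options.remove(4)          # never open door 4 unless she has to
            opened = random.choice(options)
            if opened != opened_door:
                continue                   # condition on the observed door
            hits += 1
            for d in prizes:
                counts[d] += 1
        return [counts[d] / hits for d in [1, 2, 3, 4]]

    print(posterior(1, 2))   # roughly [3/5, 0, 4/5, 3/5]
    print(posterior(4, 2))   # roughly [3/4, 0, 3/4, 1/2]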
Exercise 3. As part of the freshman orientation at the University of Chicago, the new
students are tested to see if they have extrasensory perception (ESP). Each student is asked
ten consecutive true-false questions about a subject they know nothing about. However, there
is another person in the room who does know the answer and thinks the answer as hard as
she can. After administering the test to all the students, there were two students who got all
10 questions right. Given this, which of the following should you conclude:
• These students have ESP and were able to determine the answer from the brainwaves
of the woman who knew the answer.
• These students cheated on the test.
• Nothing unexpected happened.
You may assume that the entering class at University of Chicago has 1400 students.
The probability that a student gets all ten correct by chance is 2−10 = 1/1024.
There are about 1400 new freshmen each year, so the expected number of students who get all ten correct by guessing is 1400/1024 ≈ 1.37. Having two students get all ten correct is not surprising at all and shows nothing about ESP or cheating.
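To quantify "not surprising": the chance that at least two of the 1400 students ace the test by pure guessing can be computed directly from the binomial distribution, or from a Poisson approximation with mean 1400/1024. A short sketch:

    from math import exp

    n, p = 1400, 1 / 1024
    lam = n * p                                             # expected number of perfect scores
    exact = 1 - (1 - p) ** n - n * p * (1 - p) ** (n - 1)   # binomial P{at least 2}
    approx = 1 - exp(-lam) * (1 + lam)                      # Poisson approximation
    print(lam, exact, approx)                               # ≈ 1.37, ≈ 0.40, ≈ 0.40

So roughly a 40% chance of seeing two or more perfect scores by luck alone.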
Exercise 4. You flip a coin until the first time you have flipped three heads in a row. Find
the expected number of flips.
Partition the space into four events: T, HT, HHT, HHH corresponding to the events: first
flip is tails, first flip is heads and second is tails, first two flips are heads and third is tails,
first three flips are heads, respectively. Then
P(T) = 1/2,    P(HT) = 1/4,    P(HHT) = P(HHH) = 1/8.
Let X be the total number of flips. Then
E[X] = P(T ) E[X | T ] + P(HT ) E[X | HT ]
+P(HHT ) E[X | HHT ] + P(HHH) E[X | HHH].
Using the reasoning as in class we can see that
E[X | T] = 1 + E[X],
E[X | HT] = 2 + E[X],
E[X | HHT] = 3 + E[X],
E[X | HHH] = 3.
Therefore, if e = E[X],
e=
1
1
1
1
(1 + e) + (2 + e) + (3 + e) + · 3.
2
4
8
8
Solving gives e = 14.
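A quick simulation sketch (sample size arbitrary) confirming e = 14:

    import random

    def flips_until_hhh():
        run = count = 0
        while run < 3:
            count += 1
            run = run + 1 if random.random() < 0.5 else 0   # extend or reset the run
        return count

    N = 200_000
    print(sum(flips_until_hhh() for _ in range(N)) / N)     # ≈ 14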
Exercise 5. You have managed to convince a very rich person to play the following game.
You roll two dice. If the sum is 7, he gives you $1. If the sum is 8, you give him $1. If
anything else comes up, no money exchanges hands. You have only $5 and will keep playing
the game as long as you still have money. Assuming your opponent never quits, what is
the probability that you will never go bankrupt? (Hint: let q(k) be the probability of going
bankrupt given that you currently have $k. Write an equation for q(k).)
Suppose we start with k dollars. Consider the event that our bankroll ever goes down to k − 1 dollars. If this happens, then the conditional probability of going bankrupt is q(k − 1). Therefore,

P{bankrupt starting with $k} = P{drop to $(k − 1)} q(k − 1).

But the probability of dropping to $(k − 1) from $k is the same as the probability of going bankrupt starting with $1. Hence, q(k) = q(k − 1) q(1), and recursively we see that q(k) = r^k where r = q(1). We need to determine r. Using the law of total probability, we can see that
q(k) = (1/6) q(k + 1) + (5/36) q(k − 1) + (1 − 1/6 − 5/36) q(k).

With a little algebra, this gives

q(k) − q(k + 1) = (5/6) [q(k − 1) − q(k)],

or

r^k − r^{k+1} = (5/6) [r^{k−1} − r^k],    i.e.,    r − r² = (5/6)(1 − r).
Solving this quadratic equation gives r = 1 (which does not make sense: winning a round is more likely than losing one, so bankruptcy is not certain) or r = 5/6. The probability of not going bankrupt starting with $5 is

1 − (5/6)^5 ≈ .598.
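A Monte Carlo sketch of this game (my own code, not part of the original solution). Since "never go bankrupt" cannot be simulated literally, each path is capped at a large number of steps; because the game drifts upward, the truncation bias is negligible:

    import random

    def survives(start=5, max_steps=5_000):
        k = start
        for _ in range(max_steps):
            r = random.random()
            if r < 6 / 36:       # sum is 7: win $1
                k += 1
            elif r < 11 / 36:    # sum is 8: lose $1
                k -= 1
            if k == 0:
                return False
        return True              # still solvent after many steps: count as never bankrupt

    N = 10_000
    print(sum(survives() for _ in range(N)) / N)   # ≈ 1 - (5/6)**5 ≈ 0.598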
Exercise 6 (From Quantnet forum). Suppose two players play a game. Player A flips coins
until she gets two heads in a row. Player B flips coins until he gets three heads in a row.
Player B wins if he flips fewer coins than Player A. What is the probability that Player
B wins? [Hint: assume they flip at the same time. Let q(j, k), j = 0, 1, k = 0, 1, 2 be the
probability that Player B wins given that currently A has j consecutive heads and B has k
consecutive heads. Write equations for q(j, k).]
The equations are
q(0, 0) = (1/4) q(0, 0) + (1/4) q(1, 0) + (1/4) q(1, 1) + (1/4) q(0, 1),

q(1, 0) = (1/4) q(0, 0) + (1/4) q(0, 1),

q(0, 1) = (1/4) q(0, 0) + (1/4) q(0, 2) + (1/4) q(1, 0) + (1/4) q(1, 2),

q(1, 1) = (1/4) q(0, 0) + (1/4) q(0, 2),

q(0, 2) = 1/2 + (1/4) q(0, 0) + (1/4) q(1, 0),

q(1, 2) = 1/4 + (1/4) q(0, 0).

This gives six equations in six unknowns, so choose your favorite way to solve them; the probability we want is q(0, 0).
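For instance, one can solve the system numerically (a sketch using NumPy; the row order matches the equations above, moving all unknowns to the left-hand side):

    import numpy as np

    # Unknowns ordered: q(0,0), q(1,0), q(0,1), q(1,1), q(0,2), q(1,2)
    A = np.array([
        [3/4,  -1/4, -1/4, -1/4,  0,    0   ],   # q(0,0) equation
        [-1/4,  1,   -1/4,  0,    0,    0   ],   # q(1,0)
        [-1/4, -1/4,  1,    0,   -1/4, -1/4 ],   # q(0,1)
        [-1/4,  0,    0,    1,   -1/4,  0   ],   # q(1,1)
        [-1/4, -1/4,  0,    0,    1,    0   ],   # q(0,2)
        [-1/4,  0,    0,    0,    0,    1   ],   # q(1,2)
    ])
    b = np.array([0, 0, 0, 0, 1/2, 1/4])
    q = np.linalg.solve(A, b)
    print(q[0])   # q(0,0) = 361/1699 ≈ 0.2125 (solving the system by hand gives the fraction)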
Exercise 7. Suppose that X has a uniform distribution on the interval [0, 2].
• Give the distribution function, density, mean, and variance of X.
The density is f(x) = 1/2 for 0 < x < 2. The distribution function is

F(x) = ∫_{−∞}^{x} f(t) dt =
    0,      x ≤ 0,
    x/2,    0 < x < 2,
    1,      x ≥ 2.
The mean is clearly equal to 1. Also,

E[X²] = ∫_{−∞}^{∞} x² f(x) dx = ∫_{0}^{2} (x²/2) dx = 4/3,

Var[X] = E[X²] − E[X]² = 4/3 − 1 = 1/3.
• Find the moment generating function and characteristic function of X.
M(t) = E[e^{tX}] = ∫_{0}^{2} e^{tx} (dx/2) = (e^{2t} − 1) / (2t).

The formula on the right is valid for t ≠ 0. For t = 0, one can take the limit as t → 0, or just look at the definition, to see that M(0) = 1. Similarly,

φ(t) = E[e^{itX}] = (e^{2it} − 1) / (2it).
• Let Y = X 3 . Find the density of Y .
We start by computing the distribution function. For 0 < t < 8,

F_Y(t) = P{Y ≤ t} = P{X ≤ t^{1/3}} = F(t^{1/3}) = t^{1/3} / 2.

Differentiating gives the density

f_Y(t) = F_Y′(t) = 1 / (6 t^{2/3}),    0 < t < 8.
• Suppose that X1 , X2 are independent each with the uniform distribution on [0, 2]. True
or false: X1 + X2 has a uniform distribution on [0, 4].
The answer is false. There are a number of ways to see this. One way (which I leave to you to check) is to note that X1 + X2 has moment generating function

M_{X1}(t) M_{X2}(t) = [ (e^{2t} − 1) / (2t) ]².

The moment generating function for a uniform on [0, 4] can be computed as above to give

(e^{4t} − 1) / (4t).

These are not the same function.
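Another way is a quick empirical check (a sketch; the sample size is arbitrary): the sum has a triangular, not flat, density, which already shows up in a single tail probability:

    import random

    N = 200_000
    s = [random.uniform(0, 2) + random.uniform(0, 2) for _ in range(N)]
    # If the sum were uniform on [0, 4], P{sum <= 1} would be 1/4; the triangular
    # density of X1 + X2 gives P{sum <= 1} = 1/8 instead.
    print(sum(x <= 1 for x in s) / N)   # ≈ 0.125, not 0.25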
Exercise 8. Do a computer simulation of the following. Suppose one is catching fish on a
Saturday morning. One starts at 8:00 and stops at 12:00. The waiting time between fish
caught is exponential with parameter λ = .9 (time is measured in hours). Do many simulations
to see how many fish are caught by noon in order to get an estimate for
p(k) = P{exactly k fish caught in four hours}.
Compare your simulation to a Poisson random variable with parameter 3.6. (Why are we
choosing parameter 3.6?)
I will not do the computer simulation here. We choose 3.6 because it equals λt
where λ = .9 and t = 4.
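For readers who want to try it, here is a minimal sketch of such a simulation (my own variable names; the trial count is arbitrary):

    import math
    import random
    from collections import Counter

    lam, horizon, trials = 0.9, 4.0, 100_000
    counts = Counter()
    for _ in range(trials):
        t, k = 0.0, 0
        while True:
            t += random.expovariate(lam)   # exponential waiting time, rate 0.9 per hour
            if t > horizon:
                break
            k += 1
        counts[k] += 1

    mu = lam * horizon                     # Poisson parameter 3.6
    for k in range(10):
        sim = counts[k] / trials
        poi = math.exp(-mu) * mu ** k / math.factorial(k)
        print(k, round(sim, 4), round(poi, 4))   # the two columns should agree closely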
Exercise 9. Suppose X1 , X2 , X3 are independent random variables each normal with mean
zero and variances 1, 4, 9, respectively. Find the following probabilities to four decimal places.
• P{X1 /X3 > 0}
Since X1 , X3 are both symmetric about the origin, so is X1 /X3 and this
probability equals 1/2.
• P{X3 > X1 + X2 + 1}
This is the same as
P{X3 − X1 − X2 > 1}.
The random variable X3 − X1 − X2 is normal with variance 9 + 1 + 4 = 14.
Hence if N denotes a standard normal,
P{X3 − X1 − X2 > 1} = P{√14 N > 1} = P{N > 1/√14} = Φ(−1/√14).
The last value needs to be found by a computer or table.
• P{X1² + (X2/2)² + (X3/3)² ≤ 11.35}.

Note that X1² + (X2/2)² + (X3/3)²
is the sum of squares of three independent standard normals. Hence it
has a χ2 distribution with three degrees of freedom. By checking a table
(and noticing that I chose a good number) one sees that it is about .99.
To get to four decimal places, one needs a computer package.
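Assuming SciPy is available, the numerical values can be read off as follows (a sketch; the first probability is exactly 1/2 and needs no computation):

    from math import sqrt
    from scipy.stats import norm, chi2

    print(0.5)                                  # P{X1/X3 > 0} is exactly 1/2
    print(round(norm.cdf(-1 / sqrt(14)), 4))    # Phi(-1/sqrt(14)), roughly 0.3946
    print(round(chi2.cdf(11.35, df=3), 4))      # roughly 0.9900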
Exercise 10. Suppose X and Y are independent random variables with exponential distributions with parameters λ1 and λ2 . Let Z = min{X, Y }, W = max{X, Y }. Recall that
P{X ≥ t} = e^{−λ1 t},    P{Y ≥ t} = e^{−λ2 t}.
• Show that Z has an exponential distribution. What is the parameter?
P{min{X, Y} ≥ t} = P{X ≥ t, Y ≥ t} = P{X ≥ t} P{Y ≥ t} = e^{−(λ1 + λ2) t}.
Hence Z has an exponential distribution with parameter λ1 + λ2 .
• Does W have an exponential distribution? Why or why not?
P{max{X, Y} ≤ t} = P{X ≤ t, Y ≤ t} = P{X ≤ t} P{Y ≤ t} = [1 − e^{−λ1 t}][1 − e^{−λ2 t}].
The right-hand side cannot be written in the form 1 − e^{−λt} for any λ. Therefore W does not have an exponential distribution.
• Suppose λ1 = λ2. Does W − Z have an exponential distribution?
The answer is yes. Let us think it out first. Consider X, Y as the times
until events occur. The minimum is the time that the first one occurs
and the maximum is the time that the second one occurs. After the first
one occurs, the memoryless property of the exponential states that the
amount of time until the second occurs is the same as starting afresh. It
does not matter which happened first, since the rates are the same. There
are several ways to make this precise. We will compute
P{W − Z ≤ t} = P{W − Z ≤ t, X < Y } + P{W − Z ≤ t, Y < X}.
By symmetry the two terms on the right-hand side are equal. We will
compute the first
P{W − Z ≤ t, X < Y} = ∫_{0}^{∞} P{s ≤ Y ≤ s + t} dP{X ≤ s}
                    = ∫_{0}^{∞} [e^{−λs} − e^{−λ(s+t)}] λ e^{−λs} ds
                    = [1 − e^{−λt}] ∫_{0}^{∞} e^{−λs} λ e^{−λs} ds
                    = [1 − e^{−λt}] / 2.
Therefore,
P{W − Z ≤ t} = 1 − e^{−λt}.
• How about if λ1 ≠ λ2?
The answer is no. If one wants to verify this one can compute P{W −Z ≤ t}
as in the last part and see that it does not equal the distribution function
of an exponential. One must take a little care in the computation, because
in this case
P{W − Z ≤ t, X < Y} ≠ P{W − Z ≤ t, Y < X}.
However, the computation above can be followed.
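Both of the last two parts can be checked by simulation. The sketch below uses the fact that an exponential random variable has variance equal to the square of its mean; the identity holds empirically when λ1 = λ2 and fails otherwise (the rates 1 and 3 are arbitrary choices of mine):

    import random

    def gap(l1, l2):
        x, y = random.expovariate(l1), random.expovariate(l2)
        return max(x, y) - min(x, y)   # W - Z

    N = 200_000
    for l1, l2 in [(1.0, 1.0), (1.0, 3.0)]:
        d = [gap(l1, l2) for _ in range(N)]
        m = sum(d) / N
        v = sum(t * t for t in d) / N - m * m
        print(l1, l2, m, v)   # variance ≈ mean² only in the equal-rate case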
Exercise 11. Let X1 , X2 , . . . be independent random variables, each normal with mean µ and variance σ². Let

X̄n = (X1 + · · · + Xn) / n,    Sn² = (1/(n − 1)) Σ_{j=1}^{n} [Xj − X̄n]².
Give the distributions of the following random variables (including the values of the parameters).
X̄n,    Sn²/σ²,    (1/σ²) Σ_{j=1}^{n} (Xj − µ)²,    √n X̄n / Sn.

The goal of this was just to see if you are reading carefully.

• Normal with mean µ and variance σ²/n.
• This is not a distribution with a standard name, although we can say that if Y = Sn²/σ², then (n − 1)Y has a χ² distribution with n − 1 degrees of freedom.
• χ² with n degrees of freedom.
• This is not a nice distribution either, although √n (X̄n − µ)/Sn does have a t distribution with n − 1 degrees of freedom.
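A simulation sketch of the second bullet (my own parameter choices), checking that (n − 1) Sn²/σ² behaves like a χ² random variable with n − 1 degrees of freedom by matching its mean n − 1 and variance 2(n − 1):

    import random

    mu, sigma, n, trials = 2.0, 1.5, 8, 50_000
    vals = []
    for _ in range(trials):
        xs = [random.gauss(mu, sigma) for _ in range(n)]
        xbar = sum(xs) / n
        sn2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)   # sample variance
        vals.append((n - 1) * sn2 / sigma ** 2)
    m = sum(vals) / trials
    v = sum(x * x for x in vals) / trials - m * m
    print(m, v)   # ≈ 7 and ≈ 14, matching a chi-square with n - 1 = 7 df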
Exercise 12. Let X, Y denote the values of two independent die rolls (so each takes values
1, ..., 6), and let Z = X + Y and let W = 1 or 2 depending on whether Z is even or odd.
Give the following conditional expectations. In each case, be explicit about how many different
values it takes.
E[X | Y ]
Since X and Y are independent, E[X | Y ] = E[X] = 7/2.
E[X + Z | Y ]
Using the rules for conditional expectation,
E[X + Z | Y ] = E[2X + Y | Y ] = 2 E[X | Y ] + E[Y | Y ] = 2E[X] + Y = 7 + Y.
There are six distinct values the random variable can take on depending on
the value of Y.
E[Z | W ]

For each value of X, the parity of Z is determined by Y, which is even or odd with probability 1/2 each; hence X is independent of W, and similarly Y is independent of W. Therefore

E[Z | W] = E[X | W] + E[Y | W] = E[X] + E[Y] = 7.

(One can also check directly that E[Z | W = 1] = E[Z | W = 2] = 7: the distribution of Z is symmetric about 7, and so are both the set of even values and the set of odd values.) The conditional expectation takes only one value.
E[Y | W ]

Note that by symmetry, E[Y | W] = E[X | W]. Also,

E[Y | W] = E[Z − X | W] = E[Z | W] − E[X | W] = E[Z | W] − E[Y | W].

Solving, we get E[Y | W] = E[Z | W]/2 = 7/2. It takes only one value, which is consistent with the independence of Y and W noted below.
E[Z | X + Y ]
This is just to see if you are awake. Since Z = X + Y the answer is X + Y .
E[W | Y ].
This is a bit tricky. If you see one die, the conditional probability that the sum will be even is 1/2 regardless of the value that die takes on. Hence,

E[W | Y] = E[W] = 3/2.
E[3X² + 2X | X]

The random variable 3X² + 2X is a function of X (measurable with respect to X) and hence

E[3X² + 2X | X] = 3X² + 2X.

There are six possible values depending on the value of X.
Are Y and Z independent random variables?
No. For example, P{Y = 1, Z = 12} = 0, while P{Y = 1} P{Z = 12} > 0.
Are Y and W independent random variables?
As we described above, yes.
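Since there are only 36 equally likely outcomes, all of these conditional expectations can be verified by exact enumeration. A sketch:

    from itertools import product
    from fractions import Fraction

    outcomes = list(product(range(1, 7), repeat=2))   # 36 equally likely (X, Y) pairs

    def cond_exp(f, event):
        pts = [(x, y) for x, y in outcomes if event(x, y)]
        return Fraction(sum(f(x, y) for x, y in pts), len(pts))

    # E[Z | W]: condition on the parity of Z = X + Y
    print(cond_exp(lambda x, y: x + y, lambda x, y: (x + y) % 2 == 0))   # 7
    print(cond_exp(lambda x, y: x + y, lambda x, y: (x + y) % 2 == 1))   # 7
    # E[Y | W]
    print(cond_exp(lambda x, y: y, lambda x, y: (x + y) % 2 == 0))       # 7/2
    print(cond_exp(lambda x, y: y, lambda x, y: (x + y) % 2 == 1))       # 7/2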
Exercise 13. In the previous exercise find Cov(X, Z), Cov(X, Y ), Cov(Z, W ). Since X, Y
are independent,
Cov(X, Y ) = E[XY ] − E[X] E[Y ] = 0.
To compute Cov(X, Z), note that

E[XZ] = E[X² + XY] = E[X²] + E[X] E[Y] = 91/6 + (7/2)²

(E[X²] = 91/6 is a simple calculation). Since E[X] = 7/2 and E[Z] = 7 = 7/2 + 7/2,

E[X] E[Z] = 2 (7/2)²,

Cov(X, Z) = E[XZ] − E[X] E[Z] = 91/6 − (7/2)² = 35/12.

To compute E[ZW] one could write out all the possibilities. However, we have seen that Y and W are independent and hence E[Y W] = E[Y] E[W]. Similarly, E[XW] = E[X] E[W]. Hence

E[ZW] = E[(X + Y) W] = E[XW] + E[Y W] = (E[X] + E[Y]) E[W] = E[Z] E[W].

Therefore, Cov(Z, W) = 0.
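The same enumeration as in the previous exercise verifies the covariances exactly. A sketch:

    from itertools import product
    from fractions import Fraction

    outcomes = list(product(range(1, 7), repeat=2))

    def E(f):
        return Fraction(sum(f(x, y) for x, y in outcomes), 36)

    def cov(f, g):
        return E(lambda x, y: f(x, y) * g(x, y)) - E(f) * E(g)

    X = lambda x, y: x
    Y = lambda x, y: y
    Z = lambda x, y: x + y
    W = lambda x, y: 1 if (x + y) % 2 == 0 else 2
    print(cov(X, Z), cov(X, Y), cov(Z, W))   # 35/12, 0, 0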
Exercise 14. Suppose Z1 , Z2 , Z3 are independent standard normals and
X = 2Z1 + 3Z2 + 4Z3 ,
Y = Z1 − Z3 ,
W = X − Y.
Why does (W, X, Y ) have a joint normal distribution? Find the covariance matrix. Are W, X
independent? Are W, Y independent?
We can write
W = Z1 + 3Z2 + 5Z3 .
Therefore, (W, X, Y) has a joint normal distribution by the definition (they are linear combinations of independent standard normals).
E[X²] = 2² + 3² + 4² = 29,
E[Y²] = 1² + 1² = 2,
E[XY] = (2 · 1) + (4 · (−1)) = −2,
E[W²] = 1² + 3² + 5² = 35,
E[XW] = (2 · 1) + (3 · 3) + (4 · 5) = 31,
E[Y W] = (1 · 1) + ((−1) · 5) = −4.

The covariance matrix is

Γ = ( 35  31  −4
      31  29  −2
      −4  −2   2 ).
Since the covariances are not equal to zero, neither W, X nor W, Y are independent.
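Writing (W, X, Y) = A (Z1, Z2, Z3) for the coefficient matrix A, the covariance matrix is A Aᵀ, which a one-line NumPy check confirms (a sketch):

    import numpy as np

    A = np.array([[1.0, 3.0,  5.0],    # W = Z1 + 3 Z2 + 5 Z3
                  [2.0, 3.0,  4.0],    # X = 2 Z1 + 3 Z2 + 4 Z3
                  [1.0, 0.0, -1.0]])   # Y = Z1 - Z3
    print(A @ A.T)   # [[35, 31, -4], [31, 29, -2], [-4, -2, 2]]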
Exercise 15. In the previous example, let R = Z1 − Z2 , Q = Z1 + Z2 . Are R and Q
independent?
Note that (R, Q) has a centered joint normal distribution, and E[RQ] = E[Z1²] − E[Z2²] = 0. Since uncorrelated jointly normal random variables are independent, R and Q are independent.
Exercise 16. Find two random variables X, Y that are not independent that satisfy E[XY ] =
E[X] E[Y ]. (They better not have a joint normal distribution!)
There are many possibilities, but we actually saw an example in these exercises: in Exercises 12 and 13, Z and W satisfy E[ZW] = E[Z] E[W], yet it is easy to see that they are not independent.