Chapter #7 – Properties of Expectation

Question #4: Let π‘“π‘‹π‘Œ (π‘₯, 𝑦) = 1/𝑦 when 0 < 𝑦 < 1 and 0 < π‘₯ < 𝑦 and π‘“π‘‹π‘Œ (π‘₯, 𝑦) = 0 otherwise.
(a) Verify that π‘“π‘‹π‘Œ (π‘₯, 𝑦) is a density function, (b) find 𝐸[π‘‹π‘Œ], (c) find 𝐸[𝑋], (d) find 𝐸[π‘Œ], and
(e) compute the covariance between 𝑋 and π‘Œ given by πΆπ‘œπ‘£(𝑋, π‘Œ).
a) We must show that ∫_0^1 ∫_0^y (1/y) dx dy = 1. Indeed,
∫_0^1 ∫_0^y (1/y) dx dy = ∫_0^1 [x/y]_0^y dy = ∫_0^1 1 dy = [y]_0^1 = 1.
b) From Proposition 2.1, we know that if the random variables X and Y have the joint
density function f_XY(x, y), then E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f_XY(x, y) dx dy. Here we
are given g(x, y) = xy, so we have E(XY) = ∫_0^1 ∫_0^y xy (1/y) dx dy = ∫_0^1 ∫_0^y x dx dy
= (1/2) ∫_0^1 [x^2]_0^y dy = (1/2) ∫_0^1 y^2 dy = (1/6) [y^3]_0^1 = 1/6.
c) We are now given g(x, y) = x, so we can calculate E(X) = ∫_0^1 ∫_0^y (x/y) dx dy
= (1/2) ∫_0^1 [x^2/y]_0^y dy = (1/2) ∫_0^1 y dy = (1/4) [y^2]_0^1 = 1/4.
d) We have g(x, y) = y, so E(Y) = ∫_0^1 ∫_0^y y (1/y) dx dy = ∫_0^1 [x]_0^y dy
= ∫_0^1 y dy = (1/2) [y^2]_0^1 = 1/2.
e) To derive a computational formula, we have Cov(X, Y) = E[(X − E(X))(Y − E(Y))] =
E[XY − X E(Y) − Y E(X) + E(X)E(Y)] = E(XY) − E(X)E(Y) − E(X)E(Y) + E(X)E(Y) =
E(XY) − E(X)E(Y). We therefore see that for these random variables
Cov(X, Y) = E(XY) − E(X)E(Y) = 1/6 − (1/4)(1/2) = 1/6 − 1/8 = (8 − 6)/48 = 2/48 = 1/24.
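These four answers can be sanity-checked with a short Monte Carlo sketch in Python (not part of the original solution). It relies on the fact, read off from the joint density, that the marginal of Y is f_Y(y) = ∫_0^y (1/y) dx = 1 on (0, 1), i.e. Y ~ Unif(0, 1), and that X | Y = y ~ Unif(0, y):

```python
import random

# Sample (X, Y) from f(x, y) = 1/y on 0 < x < y < 1 via the marginal of Y
# (uniform on (0, 1)) and the conditional of X given Y (uniform on (0, y)).
random.seed(1)
n = 200_000
xs, ys = [], []
for _ in range(n):
    y = random.random()           # Y ~ Unif(0, 1)
    x = random.uniform(0, y)      # X | Y = y ~ Unif(0, y)
    xs.append(x)
    ys.append(y)

e_xy = sum(x * y for x, y in zip(xs, ys)) / n   # should be near 1/6
e_x = sum(xs) / n                               # should be near 1/4
e_y = sum(ys) / n                               # should be near 1/2
cov = e_xy - e_x * e_y                          # should be near 1/24
```

With 200,000 samples the four estimates should land within about ±0.01 of 1/6, 1/4, 1/2, and 1/24 respectively.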
Question #5: The county hospital is located at the center of a square whose sides are 3 miles
wide. If an accident occurs within this square, then the hospital sends out an ambulance. The
road network is rectangular, so the travel distance from the hospital, whose coordinates are
(0,0) to the point (π‘₯, 𝑦) is |π‘₯| + |𝑦|. If an accident occurs at a point that is uniformly
distributed in the square, find the expected travel distance of the ambulance.
• We first let Z = |X| + |Y|, so we would like to compute E(Z). Since the sides of the
square have length 3, we have that −3/2 < X < 3/2 and −3/2 < Y < 3/2, which implies that
|X| < 3/2 and |Y| < 3/2. Finally, note that X, Y ~ UNIF(−3/2, 3/2), so each has density 1/3
on that interval. Therefore, we have that
E(Z) = E(|X| + |Y|) = E(|X|) + E(|Y|) = ∫_{−3/2}^{3/2} (|x|/3) dx + ∫_{−3/2}^{3/2} (|y|/3) dy
= 2 ∫_0^{3/2} (x/3) dx + 2 ∫_0^{3/2} (y/3) dy = 2 [x^2/6]_0^{3/2} + 2 [y^2/6]_0^{3/2}
= 2(9/24) + 2(9/24) = 36/24 = 3/2.
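The answer of 3/2 miles is easy to confirm numerically (a Monte Carlo sketch, not part of the original solution): drop accident locations uniformly in the square and average the rectilinear distance to the hospital at the center.

```python
import random

# Average |X| + |Y| over uniformly random points in the 3-by-3 square
# centered at the origin; the expected travel distance should be 3/2.
random.seed(2)
n = 200_000
total = 0.0
for _ in range(n):
    x = random.uniform(-1.5, 1.5)
    y = random.uniform(-1.5, 1.5)
    total += abs(x) + abs(y)
avg_dist = total / n
```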
Question #6: A fair die is rolled 10 times. Calculate the expected sum of the 10 rolls.
• We have a sum X of 10 independent random variables X_i, where X_i is the value of the
ith roll. We therefore have that
E(X) = E(∑_{i=1}^{10} X_i) = ∑_{i=1}^{10} E(X_i) = ∑_{i=1}^{10} [1·(1/6) + ⋯ + 6·(1/6)]
= ∑_{i=1}^{10} (21/6) = 10(21/6) = 35.
We could also have found this by noting that E(X_i) = 3.5 for all i = 1, 2, …, 10, so
that E(X) = ∑_{i=1}^{10} E(X_i) = ∑_{i=1}^{10} 3.5 = 10(3.5) = 35.
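The arithmetic is compact enough to express directly in code (a trivial sketch of the linearity argument):

```python
# Linearity of expectation: the expected sum of 10 fair-die rolls is
# 10 times the mean of a single roll, (1 + 2 + ... + 6)/6 = 3.5.
single_roll_mean = sum(range(1, 7)) / 6   # 3.5
expected_sum = 10 * single_roll_mean      # 35.0
```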
Question #9: A total of 𝑛 balls, numbered 1 through 𝑛, are put into 𝑛 urns, also numbered 1
through 𝑛 in such a way that ball 𝑖 is equally likely to go into any of the urns 1,2, … , 𝑖. Find
(a) the expected number of urns that are empty, and (b) the probability that none of the urns is empty.
a) Let 𝑋 be the number of empty urns and define an indicator variable 𝑋𝑖 = 1 if urn 𝑖 is
empty and 𝑋𝑖 = 0 otherwise. We must find the expected value of this indicator
random variable, which is simply the probability that it is one: 𝐸(𝑋𝑖 ) = 𝑃(𝑋𝑖 = 1).
Since ball i is equally likely to land in any of the urns 1, 2, …, i, the probability that
the ith ball does not land in urn i is (1 − 1/i). Similarly, the probability that the (i+1)st
ball does not land in urn i is (1 − 1/(i+1)). (Balls numbered less than i can never land in
urn i.) Using this reasoning, we calculate that
E(X_i) = P(X_i = 1) = (1 − 1/i)(1 − 1/(i+1))(1 − 1/(i+2)) ⋯ (1 − 1/n)
= ((i−1)/i)(i/(i+1)) ⋯ ((n−1)/n) = (i−1)/n.
We can then use this to calculate
E(X) = E(∑_{i=1}^{n} X_i) = ∑_{i=1}^{n} E(X_i) = ∑_{i=1}^{n} (i−1)/n = (1/n) ∑_{i=0}^{n−1} i
= (1/n)(n(n−1)/2) = (n−1)/2.
b) For all of the urns to have at least one ball in them, the nth ball must be dropped into
the nth urn, which occurs with probability 1/n (urn n can receive no other ball). Similarly,
the (n−1)st ball must be dropped into the (n−1)st urn, which has probability 1/(n−1), and so
on. We can therefore calculate that
P(no empty urns) = (1/n)(1/(n−1)) ⋯ (1/2)(1) = 1/n!.
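Both answers can be checked by simulating the scheme directly (a sketch, with n = 4 chosen arbitrarily): drop ball i into a uniformly chosen urn among 1, …, i and track the empty urns.

```python
import random

# Simulate the urn scheme for n = 4: ball i lands uniformly in urns 1..i.
# Expected empty urns is (n - 1)/2 = 1.5; P(no empty urn) = 1/4! ≈ 0.0417.
random.seed(3)
n, trials = 4, 200_000
empty_total = 0
none_empty = 0
for _ in range(trials):
    occupied = set()
    for i in range(1, n + 1):
        occupied.add(random.randint(1, i))   # urn receiving ball i
    empty = n - len(occupied)
    empty_total += empty
    none_empty += (empty == 0)

avg_empty = empty_total / trials
p_none = none_empty / trials
```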
Question #11: Consider 𝑛 independent flips of a coin having probability 𝑝 of landing on
heads. Say that a changeover occurs whenever an outcome differs from the one preceding it.
For instance, if 𝑛 = 5 and the outcome is 𝐻𝐻𝑇𝐻𝑇, then there are 3 changeovers. Find the
expected number of changeovers. Hint: Express the number of changeovers as the sum of a
total of 𝑛 − 1 Bernoulli random variables.
• Let X_i = 1 if a changeover occurs on the ith flip and X_i = 0 otherwise. Thus, E(X_i) =
P(X_i = 1) = P(flip i−1 is H, flip i is T) + P(flip i−1 is T, flip i is H) = p(1 − p) + (1 − p)p =
2p(1 − p) whenever i ≥ 2. If we let X denote the number of changeovers, then we
have that E(X) = E(∑_{i=2}^{n} X_i) = ∑_{i=2}^{n} E(X_i) = ∑_{i=2}^{n} 2p(1 − p) = 2(n − 1)p(1 − p).
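A quick simulation confirms the formula (a sketch; n = 10 and p = 0.3 are arbitrary example values):

```python
import random

# Count changeovers in n flips of a p-coin and compare the average
# against the formula 2(n - 1)p(1 - p).
random.seed(4)
n, p, trials = 10, 0.3, 200_000
total = 0
for _ in range(trials):
    flips = [random.random() < p for _ in range(n)]
    total += sum(flips[i] != flips[i - 1] for i in range(1, n))
avg_changeovers = total / trials   # formula gives 2 * 9 * 0.3 * 0.7 = 3.78
```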
Question #20: In an urn containing 𝑛 balls, the ith ball has weight π‘Š(𝑖), 𝑖 = 1, … , 𝑛. The balls
are removed without replacement, one at a time, according to: At each selection, the
probability that a given ball in the urn is chosen is equal to its weight divided by the sum of
the weights remaining in the urn. For instance, if at some time 𝑖1 , … , π‘–π‘Ÿ is the set of balls
remaining in the urn, then the next selection will be i_j with probability
W(i_j) / ∑_{k=1}^{r} W(i_k), j = 1, …, r.
Compute the expected number of balls that are withdrawn before ball number 1 is removed.
• Let X_j = 1 if ball j is removed before ball 1 and X_j = 0 otherwise. Considering only
balls j and 1, ball j is selected first with probability W(j)/(W(j) + W(1)), so P(X_j = 1) =
W(j)/(W(j) + W(1)). The expected number of balls withdrawn before ball 1 is therefore
E(∑_{j≠1} X_j) = ∑_{j≠1} E(X_j) = ∑_{j≠1} P(X_j = 1) = ∑_{j≠1} W(j)/(W(j) + W(1)).
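The sum formula can be checked by simulating the weighted removals (a sketch; the four weights are an arbitrary example, with ball 1 carrying weight 5):

```python
import random

# Remove balls one at a time with probability proportional to weight and
# count how many come out before ball number 1 (index 0 below).
random.seed(5)
weights = [5.0, 1.0, 2.0, 3.0]   # hypothetical W(1), ..., W(4)
trials = 200_000
total_before = 0
for _ in range(trials):
    remaining = list(range(len(weights)))
    while True:
        ball = random.choices(remaining,
                              weights=[weights[i] for i in remaining])[0]
        if ball == 0:            # ball number 1 has been removed
            break
        remaining.remove(ball)
        total_before += 1

est = total_before / trials
exact = sum(w / (w + weights[0]) for w in weights[1:])   # 1/6 + 2/7 + 3/8
```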
Question #33: If 𝐸(𝑋) = 1 and π‘‰π‘Žπ‘Ÿ(𝑋) = 5, find (a) 𝐸[(2 + 𝑋)2 ], and (b) π‘‰π‘Žπ‘Ÿ(4 + 3𝑋).
a) We calculate that E[(2 + X)^2] = E(4 + 4X + X^2) = E(4) + 4E(X) + E(X^2) =
4 + 4(1) + Var(X) + [E(X)]^2 = 4 + 4 + 5 + 1 = 14, where E(X^2) = Var(X) + [E(X)]^2 = 6
follows from the identity Var(X) = E(X^2) − [E(X)]^2.
b) π‘‰π‘Žπ‘Ÿ(4 + 3𝑋) = π‘‰π‘Žπ‘Ÿ(3𝑋) = 32 π‘‰π‘Žπ‘Ÿ(𝑋) = 9(5) = 45.
Question #38: If the joint density f_XY(x, y) = (2e^{−2x})/x when 0 ≤ x < ∞ and 0 ≤ y ≤ x, and
f_XY(x, y) = 0 otherwise, compute Cov(X, Y).
• Since Cov(X, Y) = E(XY) − E(X)E(Y), we must compute the following:
o E(XY) = ∫_0^∞ ∫_0^x xy · (2e^{−2x}/x) dy dx = ∫_0^∞ ∫_0^x 2y e^{−2x} dy dx
= ∫_0^∞ [y^2 e^{−2x}]_0^x dx = ∫_0^∞ x^2 e^{−2x} dx = (1/2)(1/2)^2 ∫_0^∞ t^2 e^{−t} dt
= Γ(3)/8 = 1/4, with the substitution t = 2x, dt = 2 dx.
o Since f_X(x) = ∫_0^x (2e^{−2x}/x) dy = 2e^{−2x}, we have E(X) = ∫_0^∞ 2x e^{−2x} dx = 1/2
by integration by parts with u = x, dv = 2e^{−2x} dx, so that du = dx, v = −e^{−2x}.
o E(Y) = ∫_0^∞ ∫_0^x y · (2e^{−2x}/x) dy dx = ∫_0^∞ [y^2 e^{−2x}/x]_0^x dx = ∫_0^∞ x e^{−2x} dx
= (1/4) ∫_0^∞ t e^{−t} dt = Γ(2)/4 = 1/4, again with t = 2x, dt = 2 dx.
• Thus, Cov(X, Y) = E(XY) − E(X)E(Y) = 1/4 − (1/2)(1/4) = 1/4 − 1/8 = 1/8.
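Since the marginal f_X(x) = 2e^{−2x} is an Exponential density with rate 2, and given X = x the conditional density of Y is constant 1/x on (0, x), the covariance can be checked by simulation (a sketch, not part of the original solution):

```python
import random

# X ~ Exp(rate 2); given X = x, Y is uniform on (0, x).  Together these
# reproduce the joint density (1/x) * 2e^{-2x} on 0 <= y <= x.
random.seed(7)
n = 200_000
xs = [random.expovariate(2.0) for _ in range(n)]
ys = [random.uniform(0, x) for x in xs]

e_xy = sum(x * y for x, y in zip(xs, ys)) / n   # should be near 1/4
e_x = sum(xs) / n                               # should be near 1/2
e_y = sum(ys) / n                               # should be near 1/4
cov = e_xy - e_x * e_y                          # should be near 1/8
```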
Question #41: A pond contains 100 fish, of which 30 are carp. If 20 fish are caught, what are
the mean and variance of the number of carp among the 20?
• Let X denote the number of carp among the 20 fish caught. We assume that each of
the C(100, 20) ways to catch 20 fish from 100 is equally likely, so it is clear that
X ~ HYPG(n = 20, N = 100, m = 30). This implies that E(X) = nm/N = (20 · 30)/100 = 6, while
Var(X) = (nm/N)[(n−1)(m−1)/(N−1) + 1 − nm/N] = 6[(19 · 29)/99 + 1 − 6] = 6(56/99) = 112/33.
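The hypergeometric mean and variance can be verified by simulating the catch without replacement (a sketch, not part of the original solution):

```python
import random

# Pond of 100 fish, 30 carp (coded 1); catch 20 without replacement and
# record the number of carp.  Mean should be near 6, variance near 112/33.
random.seed(8)
pond = [1] * 30 + [0] * 70
trials = 100_000
counts = [sum(random.sample(pond, 20)) for _ in range(trials)]

mean = sum(counts) / trials
var = sum((c - mean) ** 2 for c in counts) / trials
```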
Question #22: Suppose that 𝑋1, 𝑋2 and 𝑋3 are independent Poisson random variables with
respective parameters πœ†1 , πœ†2 and πœ†3 . Let 𝑋 = 𝑋1 + 𝑋2 and π‘Œ = 𝑋2 + 𝑋3 . Calculate (a) 𝐸(𝑋)
and 𝐸(π‘Œ), and (b) πΆπ‘œπ‘£(𝑋, π‘Œ).
a) 𝐸(𝑋) = 𝐸(𝑋1 + 𝑋2 ) = 𝐸(𝑋1 ) + 𝐸(𝑋2 ) = πœ†1 + πœ†2 and 𝐸(π‘Œ) = 𝐸(𝑋2 + 𝑋3 ) = πœ†2 + πœ†3
since whenever a random variable 𝑍~𝑃𝑂𝐼𝑆(πœ†), we have that 𝐸(𝑍) = π‘‰π‘Žπ‘Ÿ(𝑍) = πœ†.
b) πΆπ‘œπ‘£(𝑋, π‘Œ) = πΆπ‘œπ‘£(𝑋1 + 𝑋2 , 𝑋2 + 𝑋3 ) = πΆπ‘œπ‘£(𝑋1 , 𝑋2 ) + πΆπ‘œπ‘£(𝑋1 , 𝑋3 ) + πΆπ‘œπ‘£(𝑋2 . 𝑋2 ) +
πΆπ‘œπ‘£(𝑋2 , 𝑋3 ) = 0 + 0 + π‘‰π‘Žπ‘Ÿ(𝑋2 ) + 0 = πœ†2 since the three random variables are
independent (implying that their covariance is zero) and πΆπ‘œπ‘£(𝑍, 𝑍) = π‘‰π‘Žπ‘Ÿ(𝑍).
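The result Cov(X, Y) = λ2 can be checked by simulation; Python's standard library has no Poisson sampler, so the sketch below uses Knuth's multiplication method (the rates 1, 2, and 0.5 are arbitrary example values):

```python
import math
import random

def poisson(lam):
    """Sample a Poisson(lam) variate via Knuth's multiplication method."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

random.seed(9)
lam1, lam2, lam3 = 1.0, 2.0, 0.5
trials = 200_000
xs, ys = [], []
for _ in range(trials):
    a, b, c = poisson(lam1), poisson(lam2), poisson(lam3)
    xs.append(a + b)   # X = X1 + X2
    ys.append(b + c)   # Y = X2 + X3

mx = sum(xs) / trials
my = sum(ys) / trials
cov = sum(x * y for x, y in zip(xs, ys)) / trials - mx * my   # near lam2
```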