SOLUTIONS TO SELF-QUIZZES AND SELF-TESTS
CHAPTER 10
FUNDAMENTALS OF PROBABILITY WITH STOCHASTIC PROCESSES, FOURTH EDITION
SAEED GHAHRAMANI
Western New England University, Springfield, Massachusetts, USA
A CHAPMAN & HALL BOOK
SOLUTIONS TO SELF-QUIZ PROBLEMS
Section 10.1
1. Let X be the number of disk storage wallets that Shiante should search to find the
DVD. For i = 1, 2, . . ., 11, let Xi = 1 if the DVD is stored in the ith wallet that
Shiante searches, and Xi = 0, otherwise. Then
X = 1 · X1 + 2 · X2 + · · · + 11 · X11 ,
and, therefore,
$$
E(X) = 1\cdot E(X_1) + 2\cdot E(X_2) + \cdots + 11\cdot E(X_{11})
= 1\cdot\frac{1}{11} + 2\cdot\frac{1}{11} + \cdots + 11\cdot\frac{1}{11}
= \frac{1}{11}(1 + 2 + \cdots + 11) = \frac{1}{11}\cdot\frac{11\times 12}{2} = 6.
$$
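As an optional numerical sanity check (not part of the original solution), the following Python sketch simulates the search; the only assumption carried over is that the DVD is equally likely to be in any of the 11 wallets, and the trial count is arbitrary.

```python
import random

# The DVD is equally likely to be in any of the 11 wallets; Shiante searches
# them in order, so the number of wallets searched equals the DVD's position.
trials = 200_000
total = sum(random.randint(1, 11) for _ in range(trials))
print(total / trials)  # should be close to the exact answer, 6
```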
2. By the Cauchy-Schwarz inequality,
$$
E(XY) \le \sqrt{E(X^2)\,E(Y^2)}.
$$
Note that XY = X and Y² = Y. Since X is nonnegative and, avoiding trivial cases, E(X) > 0, we have
$$
E(X) = E(XY) \le \sqrt{E(X^2)\,E(Y^2)}.
$$
However,
$$
E(Y^2) = E(Y) = P(X > 0).
$$
Thus
$$
E(X) \le \sqrt{E(X^2)\,P(X > 0)}.
$$
This implies that
$$
E(X^2)\,P(X > 0) \ge \big(E(X)\big)^2.
$$
Hence
$$
P(X > 0) \ge \frac{\big(E(X)\big)^2}{E(X^2)},
$$
or
$$
1 - P(X > 0) \le 1 - \frac{\big(E(X)\big)^2}{E(X^2)},
$$
which establishes the relation
$$
P(X = 0) \le \frac{E(X^2) - \big(E(X)\big)^2}{E(X^2)} = \frac{\mathrm{Var}(X)}{E(X^2)}.
$$
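The bound P(X = 0) ≤ Var(X)/E(X²) can be checked numerically for concrete nonnegative random variables. The sketch below uses Poisson(μ) variables purely as an illustrative example; the choice of distribution and of the values of μ is mine, not the problem's.

```python
import math

# Check P(X = 0) <= Var(X) / E(X^2) for X ~ Poisson(mu), for several mu.
for mu in [0.1, 0.5, 1.0, 2.0, 5.0]:
    p_zero = math.exp(-mu)          # P(X = 0)
    var_x = mu                      # Var(X) for a Poisson random variable
    ex2 = mu + mu**2                # E(X^2) = Var(X) + (E X)^2
    assert p_zero <= var_x / ex2 + 1e-12
    print(f"mu={mu}: P(X=0)={p_zero:.4f}  bound={var_x/ex2:.4f}")
```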
Section 10.2
1. Clearly, E(Xi) = E(Xj) = 1/4. For 1 ≤ i ≤ 8, let Ai be the event that the ith card drawn is a heart. Then
$$
X_iX_j =
\begin{cases}
1 & \text{if } A_iA_j \text{ occurs}\\
0 & \text{otherwise.}
\end{cases}
$$
Therefore,
$$
E(X_iX_j) = P(A_iA_j) = P(A_j \mid A_i)P(A_i) = \frac{12}{51}\cdot\frac{1}{4} = \frac{3}{51}.
$$
Hence
$$
\mathrm{Cov}(X_i, X_j) = E(X_iX_j) - E(X_i)E(X_j) = \frac{3}{51} - \frac{1}{16} = -\frac{1}{272}.
$$
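A hedged simulation check of this covariance (assuming only a standard 52-card deck and sampling without replacement; the choice i = 1, j = 2 and the trial count are illustrative):

```python
import random

# Estimate Cov(X_i, X_j) for i = 1, j = 2: indicators that the 1st and 2nd of
# 8 cards drawn without replacement from a standard deck are hearts.
deck = [1] * 13 + [0] * 39          # 1 = heart, 0 = any other suit
trials = 200_000
s1 = s2 = s12 = 0
for _ in range(trials):
    draw = random.sample(deck, 8)
    s1 += draw[0]
    s2 += draw[1]
    s12 += draw[0] * draw[1]
cov = s12 / trials - (s1 / trials) * (s2 / trials)
print(cov, -1 / 272)                # estimate should be near -1/272 ≈ -0.00368
```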
2. Suppose that X and Y are the lifetimes of two devices each with mean 1/λ. V is the
time when the first device dies; U − V is the additional lifetime of the other device,
which, by the memoryless property of the exponential, is itself an exponential random
variable with mean 1/λ independently of V . So, by independence of U − V and V ,
Cov(U − V, V ) = 0. Hence
Cov(U, V ) = Cov(U − V, V ) + Cov(V, V ) = 0 + Var(V ).
Note that
$$
P(V > t) = P\big(\min(X, Y) > t\big) = P(X > t, Y > t) = P(X > t)P(Y > t) = e^{-\lambda t}\cdot e^{-\lambda t} = e^{-2\lambda t}
$$
implies that V is exponential with mean 1/(2λ) and variance 1/(4λ²). Therefore,
$$
\mathrm{Cov}(U, V) = \mathrm{Var}(V) = \frac{1}{4\lambda^2}.
$$
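A short simulation of Cov(U, V), with U = max(X, Y) and V = min(X, Y); the rate λ = 2 and the sample size are arbitrary illustrative choices.

```python
import numpy as np

# U = max(X, Y), V = min(X, Y) for independent exponential lifetimes.
rng = np.random.default_rng(0)
lam = 2.0                                  # illustrative rate, not from the problem
x = rng.exponential(1 / lam, size=1_000_000)
y = rng.exponential(1 / lam, size=1_000_000)
u, v = np.maximum(x, y), np.minimum(x, y)
print(np.cov(u, v)[0, 1])                  # simulated Cov(U, V)
print(1 / (4 * lam**2))                    # exact value 1/(4*lambda^2)
```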
Section 10.3
1. With these answers, we have
$$
\mathrm{Cov}(X, Y) = E(XY) - E(X)E(Y) = 23 - 4 = 19,
$$
$$
\mathrm{Var}(X) = 13 - 4 = 9, \qquad \mathrm{Var}(Y) = 40 - 4 = 36.
$$
Therefore,
$$
\rho(X, Y) = \frac{\mathrm{Cov}(X, Y)}{\sigma_X\sigma_Y} = \frac{19}{18} > 1.
$$
This is not possible, since −1 ≤ ρ(X, Y) ≤ 1.
2. Note that
$$
\mathrm{Var}(X_1 + X_2 + \cdots + X_n) = \sum_{i=1}^{n}\mathrm{Var}(X_i) + 2\sum_{i<j}\mathrm{Cov}(X_i, X_j)
= 4n + 2\sum_{i=1}^{n-1}\sum_{j=i+1}^{n}\mathrm{Cov}(X_i, X_j).
$$
Now
$$
\rho(X_i, X_j) = \frac{\mathrm{Cov}(X_i, X_j)}{\sigma_{X_i}\sigma_{X_j}} = \frac{\mathrm{Cov}(X_i, X_j)}{4} = -\frac{1}{16}
$$
implies that Cov(Xi, Xj) = −1/4. Since $\sum_{i=1}^{n-1}\sum_{j=i+1}^{n}\mathrm{Cov}(X_i, X_j)$ has
$$
(n-1) + (n-2) + \cdots + 1 = \frac{(n-1)n}{2}
$$
identical terms, we have
$$
\mathrm{Var}(X_1 + X_2 + \cdots + X_n) = 4n - \frac{(n-1)n}{4} = \frac{n(17 - n)}{4}.
$$
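The original problem specifies only the common variance 4 and pairwise correlation −1/16. Purely as a sanity check, the sketch below assumes in addition that the Xi are jointly normal with those moments (an assumption made only so that they can be sampled) and uses an arbitrary n.

```python
import numpy as np

# Sample (X_1,...,X_n) as multivariate normal with Var(X_i) = 4 and
# Cov(X_i, X_j) = -1/4 (i.e. rho = -1/16), then check Var(X_1 + ... + X_n).
rng = np.random.default_rng(1)
n = 5                                            # illustrative choice of n
cov = np.full((n, n), -0.25) + np.eye(n) * 4.25  # diagonal 4, off-diagonal -1/4
samples = rng.multivariate_normal(np.zeros(n), cov, size=500_000)
print(samples.sum(axis=1).var())                 # simulated variance of the sum
print(n * (17 - n) / 4)                          # formula n(17-n)/4 = 15 for n = 5
```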
Section 10.4
1. Due to the memoryless property of exponential random variables, it does not matter
who cried last or how many times before. Let X be the time until Hannah cries next
and Y be the time until Joshua cries next; X and Y are independent exponential random variables with parameters 10 and 7, respectively. By Remark 10.2, the desired
probability is
$$
P(Y < X) = \frac{7}{7 + 10} = \frac{7}{17}.
$$
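A quick Monte Carlo check of P(Y < X) = 7/17, using the same rates 10 and 7 (the sample size is arbitrary):

```python
import numpy as np

# X ~ Exponential(rate 10), Y ~ Exponential(rate 7); estimate P(Y < X).
rng = np.random.default_rng(2)
x = rng.exponential(1 / 10, size=1_000_000)
y = rng.exponential(1 / 7, size=1_000_000)
print((y < x).mean(), 7 / 17)   # estimate should be near 7/17 ≈ 0.4118
```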
2. Let midnight be labeled t = 0. It should be clear that the percentage of the cars
parked illegally in the time interval (0, 30) and caught is equal to the percentage of
all cars parked illegally and caught. Between 0 and 30, let Y be the time a random
car is illegally parked on the street; Y is a uniform random variable over (0, 30).
Suppose that the car parked illegally at Y is left parked for X units of time. Let A be
the event that it is caught. The desired percentage is 100 · P (A).
$$
P(A) = \int_0^{30} P(A \mid Y = y)\,\frac{1}{30}\,dy
= \int_0^{30} P(X > 30 - y)\,\frac{1}{30}\,dy
= \int_0^{30} e^{-(1/10)(30 - y)}\cdot\frac{1}{30}\,dy \approx 0.317.
$$
Therefore, the percentage of the cars that are parked illegally and caught is 31.7%.
Note that, as expected, the final answer does not depend on λ.
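The integral can be confirmed numerically; the sketch below evaluates it with a midpoint Riemann sum and also with the closed form (1 − e⁻³)/3, which follows from the substitution u = 30 − y.

```python
import math

# P(A) = (1/30) * integral_0^30 exp(-(30 - y)/10) dy, evaluated two ways.
n = 100_000
h = 30 / n
riemann = sum(math.exp(-(30 - (k + 0.5) * h) / 10) * h for k in range(n)) / 30
closed_form = (1 - math.exp(-3)) / 3
print(riemann, closed_form)   # both ≈ 0.3167, i.e. about 31.7%
```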
Section 10.5
1. The conditional probability density function of Y, given that X = 0.97, is normal with mean
$$
\mu_Y + \rho\,\frac{\sigma_Y}{\sigma_X}(x - \mu_X) = 0.53 + (0.2)\,\frac{0.19}{0.28}\,(0.97 - 0.95) \approx 0.5327
$$
and standard deviation
$$
\sigma_Y\sqrt{1 - \rho^2} = (0.19)\sqrt{1 - (0.2)^2} \approx 0.1862.
$$
Therefore, the desired probability is calculated as follows:
$$
P(Y \ge 0.60 \mid X = 0.97)
= P\Big(\frac{Y - 0.5327}{0.1862} \ge \frac{0.60 - 0.5327}{0.1862}\;\Big|\;X = 0.97\Big)
= P(Z \ge 0.36) = 1 - \Phi(0.36) = 1 - 0.6406 = 0.3594,
$$
where Z is a standard normal random variable and Φ is its distribution function.
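The same probability can be computed without a normal table; the small script below recomputes the conditional mean and standard deviation and then Φ via math.erf. It returns about 0.359, slightly different from 0.3594 only because z is not rounded to 0.36.

```python
import math

# Conditional mean and standard deviation of Y given X = 0.97 for the
# bivariate normal with the stated parameters, then P(Y >= 0.60 | X = 0.97).
mu_x, mu_y, sd_x, sd_y, rho = 0.95, 0.53, 0.28, 0.19, 0.2
m = mu_y + rho * (sd_y / sd_x) * (0.97 - mu_x)        # ≈ 0.5327
s = sd_y * math.sqrt(1 - rho**2)                      # ≈ 0.1862
z = (0.60 - m) / s
phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))          # standard normal CDF at z
print(m, s, 1 - phi)                                  # probability ≈ 0.359
```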
2. By (10.24), f(x, y) is maximum if and only if Q(x, y) is minimum. Let
$$
z_1 = \frac{x - \mu_X}{\sigma_X} \qquad\text{and}\qquad z_2 = \frac{y - \mu_Y}{\sigma_Y}.
$$
Then |ρ| ≤ 1 implies that
$$
Q(x, y) = z_1^2 - 2\rho z_1 z_2 + z_2^2 \ge z_1^2 - 2|\rho z_1 z_2| + z_2^2
\ge z_1^2 - 2|z_1 z_2| + z_2^2 = \big(|z_1| - |z_2|\big)^2 \ge 0.
$$
This inequality shows that Q is minimum if Q(x, y) = 0. This happens at x = μX and y = μY. Therefore, (μX, μY) is the point at which the maximum of f is obtained.
SOLUTIONS TO SELF-TEST PROBLEMS
1. Let X be the period between two consecutive calls made to extension 1247; X is an
exponential random variable with mean 1/λ. We have
$$
P(N = i) = P\big(N_2(X) = i\big)
= \int_0^\infty P\big(N_2(X) = i \mid X = x\big)\,\lambda e^{-\lambda x}\,dx
= \int_0^\infty P\big(N_2(x) = i\big)\,\lambda e^{-\lambda x}\,dx
$$
$$
= \int_0^\infty \frac{e^{-\mu x}(\mu x)^i}{i!}\,\lambda e^{-\lambda x}\,dx
= \frac{\mu^i\lambda}{i!}\int_0^\infty x^i e^{-(\lambda + \mu)x}\,dx.
$$
Making the substitution (λ + μ)x = y yields
$$
P(N = i) = \frac{\mu^i\lambda}{i!}\int_0^\infty \frac{y^i}{(\lambda + \mu)^i}\,e^{-y}\cdot\frac{1}{\lambda + \mu}\,dy
= \frac{\mu^i\lambda}{i!\,(\lambda + \mu)^{i+1}}\int_0^\infty e^{-y}y^i\,dy
= \frac{\mu^i\lambda\,\Gamma(i + 1)}{i!\,(\lambda + \mu)^{i+1}}
$$
$$
= \Big(\frac{\mu}{\lambda + \mu}\Big)^i\frac{\lambda}{\lambda + \mu}
= \Big(1 - \frac{\lambda}{\lambda + \mu}\Big)^i\frac{\lambda}{\lambda + \mu}, \qquad i = 0, 1, 2, \ldots.
$$
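A simulation check of the derived geometric distribution of N (the rates λ = 1.5 and μ = 2 and the sample size are illustrative choices, not from the problem):

```python
import numpy as np

# N = number of events of a rate-mu Poisson process during an Exp(lambda) time X.
# Its pmf should match (mu/(lam+mu))^i * lam/(lam+mu).
rng = np.random.default_rng(3)
lam, mu = 1.5, 2.0                        # illustrative rates
x = rng.exponential(1 / lam, size=500_000)
n = rng.poisson(mu * x)
for i in range(5):
    exact = (mu / (lam + mu))**i * lam / (lam + mu)
    print(i, (n == i).mean(), round(exact, 5))
```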
2. Let X be the annual calamity loss of the business in a random year. Let Y be the
annual calamity loss of the business that year paid by the insurance company. We are
interested in E(Y ):
$$
E(Y) = E\big[E(Y \mid X)\big]
= \int_{0.25}^{0.4} E(Y \mid X = x)f(x)\,dx + \int_{0.4}^{\infty} E(Y \mid X = x)f(x)\,dx
$$
$$
= \int_{0.25}^{0.4} x f(x)\,dx + \int_{0.4}^{\infty} (0.4)f(x)\,dx
= \int_{0.25}^{0.4} x\cdot\frac{1}{64x^5}\,dx + (0.4)\int_{0.4}^{\infty} \frac{1}{64x^5}\,dx
$$
$$
\approx 0.251953 + 0.061035 = 0.312988.
$$
Therefore, the expected value of the annual loss of the business paid by the insurance
company is $312,988.
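Since f(x) = 1/(64x⁵) for x ≥ 0.25, both integrals have elementary antiderivatives; the snippet below evaluates them in closed form as a check of the arithmetic.

```python
# E(Y) = integral_{0.25}^{0.4} x f(x) dx + 0.4 * integral_{0.4}^{inf} f(x) dx
# with f(x) = 1/(64 x^5) for x >= 0.25.
first = (0.25**-3 - 0.4**-3) / 192     # antiderivative of x/(64 x^5) is -1/(192 x^3)
second = 0.4 * (0.4**-4) / 256         # antiderivative of 1/(64 x^5) is -1/(256 x^4)
print(first, second, first + second)   # ≈ 0.251953, 0.061035, 0.312988
```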
3. Note that
$$
\mathrm{Var}(X_1 + X_2 + \cdots + X_n) = \sum_{i=1}^{n}\mathrm{Var}(X_i) + 2\sum_{i<j}\mathrm{Cov}(X_i, X_j),
$$
where $\sum_{i<j}\mathrm{Cov}(X_i, X_j) = \sum_{i=1}^{n-1}\sum_{j=i+1}^{n}\mathrm{Cov}(X_i, X_j)$ has
$$
(n-1) + (n-2) + \cdots + 1 = \frac{(n-1)n}{2}
$$
terms, and Cov(Xi, Xj) = ρ(Xi, Xj)σ_{Xi}σ_{Xj} = aσ². Hence
$$
\mathrm{Var}(X_1 + X_2 + \cdots + X_n) \ge 0
$$
yields
$$
n\sigma^2 + (n-1)n\,a\sigma^2 \ge 0,
$$
which implies that 1 + (n − 1)a ≥ 0, or a ≥ −1/(n − 1).
4. Clearly X is exponential with parameter λ. Let Y1, Y2, and Y3 be the lifetimes of the electron guns of Dan's monitor. Then Y1, Y2, and Y3 are independent exponential random variables each with parameter λ. Hence Y = min(Y1, Y2, Y3) is exponential with parameter 3λ, and min(X, Y), the minimum of independent exponentials with parameters λ and 3λ, is exponential with parameter 4λ. This gives
$$
P(X < Y) = \frac{\lambda}{\lambda + 3\lambda} = \frac{1}{4}; \qquad
E\big[\min(X, Y)\big] = \frac{1}{4\lambda}.
$$
To find E[max(X, Y)], note that, by Remark 6.4,
$$
E\big[\max(X, Y)\big] = \int_0^\infty P\big(\max(X, Y) > t\big)\,dt
= \int_0^\infty \Big[1 - P\big(\max(X, Y) \le t\big)\Big]\,dt
$$
$$
= \int_0^\infty \big[1 - P(X \le t, Y \le t)\big]\,dt
= \int_0^\infty \big[1 - P(X \le t)P(Y \le t)\big]\,dt
= \int_0^\infty \big[1 - (1 - e^{-\lambda t})(1 - e^{-3\lambda t})\big]\,dt
$$
$$
= \Big[-\frac{1}{\lambda}e^{-\lambda t} - \frac{1}{3\lambda}e^{-3\lambda t} + \frac{1}{4\lambda}e^{-4\lambda t}\Big]_0^\infty
= \frac{13}{12\lambda}.
$$
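A simulation check with λ = 1 (an arbitrary illustrative rate):

```python
import numpy as np

# X ~ Exp(lam); Y = min of three independent Exp(lam) lifetimes, i.e. Exp(3*lam).
rng = np.random.default_rng(4)
lam = 1.0                                      # illustrative rate
x = rng.exponential(1 / lam, size=1_000_000)
y = rng.exponential(1 / lam, size=(1_000_000, 3)).min(axis=1)
print((x < y).mean(), 1 / 4)                   # P(X < Y) = 1/4
print(np.minimum(x, y).mean(), 1 / (4 * lam))  # E[min(X, Y)] = 1/(4*lam)
print(np.maximum(x, y).mean(), 13 / (12 * lam))  # E[max(X, Y)] = 13/(12*lam)
```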
5. Let X be the time until the seismograph placed in the New Mexico desert is replaced. Let L be its lifetime; L is an exponential random variable with parameter λ = 1/7. Let
$$
Y =
\begin{cases}
1 & \text{if } L \ge 5\\
0 & \text{if } L < 5.
\end{cases}
$$
We are interested in E(X). We have
$$
E(X) = E\big[E(X \mid Y)\big] = E(X \mid Y = 0)P(Y = 0) + E(X \mid Y = 1)P(Y = 1)
= E(L \mid L < 5)P(L < 5) + E(X \mid L \ge 5)P(L \ge 5).
$$
Clearly,
$$
P(L < 5) = \int_0^5 \frac{1}{7}e^{-t/7}\,dt = 1 - e^{-5/7} \approx 0.51, \qquad
P(L \ge 5) \approx 1 - 0.51 = 0.49, \qquad
E(X \mid L \ge 5) = 5.
$$
To find E(L | L < 5), first we find the distribution function of L given that L < 5:
$$
P(L \le t \mid L < 5) =
\begin{cases}
0 & t < 0\\[4pt]
\dfrac{P(L \le t)}{P(L < 5)} & 0 \le t < 5\\[4pt]
1 & t \ge 5.
\end{cases}
$$
So
$$
P(L \le t \mid L < 5) =
\begin{cases}
0 & t < 0\\[4pt]
\dfrac{1 - e^{-t/7}}{1 - e^{-5/7}} & 0 \le t < 5\\[4pt]
1 & t \ge 5.
\end{cases}
$$
Let f_{L|L<5} be the conditional probability density function of L given L < 5. It is obtained by differentiating the right side of the distribution function of L given that L < 5. Thus
$$
f_{L\mid L<5}(t) =
\begin{cases}
\dfrac{(1/7)e^{-t/7}}{1 - e^{-5/7}} & 0 \le t < 5\\[4pt]
0 & \text{otherwise.}
\end{cases}
$$
Therefore,
$$
E(L \mid L < 5) = \frac{1/7}{1 - e^{-5/7}}\int_0^5 t e^{-t/7}\,dt
= \frac{1}{7(1 - e^{-5/7})}\Big[-(7t + 49)e^{-t/7}\Big]_0^5 \approx 2.205.
$$
So the quantity we are interested in is
$$
E(X) = E(L \mid L < 5)P(L < 5) + E(X \mid L \ge 5)P(L \ge 5)
= (2.205)(0.51) + (5)(0.49) \approx 3.57.
$$
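Note that X = min(L, 5): the seismograph is replaced either when it fails or at age 5, whichever comes first. This restatement is mine, not the original's, but it is equivalent to the conditional-expectation computation above and gives a one-line simulation check.

```python
import numpy as np

# Replacement time is min(L, 5) with L exponential with mean 7.
rng = np.random.default_rng(5)
L = rng.exponential(7, size=1_000_000)
print(np.minimum(L, 5).mean())   # should be close to 3.57
```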
[Figure 10a. Region R of Self-Test Problem #6 of Chapter 10.]
6. Region R is the shaded area of Figure 10a. Since the area of R is 9π − 4π = 5π, the joint probability density function of X and Y is given by
$$
f(x, y) =
\begin{cases}
\dfrac{1}{5\pi} & \text{if } (x, y) \in R\\[4pt]
0 & \text{otherwise.}
\end{cases}
$$
X and Y are dependent. For instance, since the square (0, 1) × (0, 1) lies entirely inside the inner circle of radius 2 (Figure 10b marks the line x = 1), we have P(0 < X < 1, 0 < Y < 1) = 0, whereas P(0 < X < 1) > 0 and P(0 < Y < 1) > 0; hence P(0 < X < 1, 0 < Y < 1) ≠ P(0 < X < 1)P(0 < Y < 1). To show that X and Y are uncorrelated, we need to calculate E(X), E(Y), and E(XY). To calculate
$$
E(X) = \iint_R x f(x, y)\,dx\,dy,
$$
we convert from Cartesian to polar coordinates. We have x = r cos θ, y = r sin θ,
and dx dy = r dr dθ. So
$$
E(X) = \int_0^{2\pi}\!\!\int_2^3 (r\cos\theta)\cdot\frac{1}{5\pi}\,r\,dr\,d\theta
= \frac{1}{5\pi}\int_0^{2\pi}\!\!\int_2^3 r^2\cos\theta\,dr\,d\theta
= \frac{19}{15\pi}\int_0^{2\pi}\cos\theta\,d\theta
= \frac{19}{15\pi}\Big[\sin\theta\Big]_0^{2\pi} = 0.
$$
[Figure 10b. The region R with the line x = 1 marked and the relevant regions shaded.]
Similarly E(Y ) = 0. The fact that E(X) = E(Y ) = 0 must be evident since X and
Y assume positive and negative values with the same probabilities. Now we calculate
E(XY ). We have
$$
E(XY) = \int_0^{2\pi}\!\!\int_2^3 (r\cos\theta)(r\sin\theta)\cdot\frac{1}{5\pi}\,r\,dr\,d\theta
= \frac{1}{5\pi}\int_0^{2\pi}\sin\theta\cos\theta\Big(\int_2^3 r^3\,dr\Big)d\theta
$$
$$
= \frac{13}{4\pi}\int_0^{2\pi}\sin\theta\cos\theta\,d\theta
= \frac{13}{4\pi}\int_0^{2\pi}\frac{1}{2}\sin 2\theta\,d\theta
= \frac{13}{8\pi}\Big[-\frac{1}{2}\cos 2\theta\Big]_0^{2\pi}
= \frac{13}{8\pi}\Big(-\frac{1}{2} + \frac{1}{2}\Big) = 0.
$$
Hence
Cov(X, Y ) = E(XY ) − E(X)E(Y ) = 0 − 0 · 0 = 0.
This shows that X and Y are uncorrelated.
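A simulation sketch that samples uniformly from the annulus by rejection and checks both conclusions; the sample size is arbitrary, and the dependence check uses the events 0 < X < 1 and 0 < Y < 1 discussed above.

```python
import numpy as np

# Sample uniformly from the annulus 2 < sqrt(x^2 + y^2) < 3 by rejection from
# the enclosing square, then check Cov(X, Y) ≈ 0 and the dependence.
rng = np.random.default_rng(6)
pts = rng.uniform(-3, 3, size=(2_000_000, 2))
r2 = (pts**2).sum(axis=1)
pts = pts[(r2 > 4) & (r2 < 9)]
x, y = pts[:, 0], pts[:, 1]
print(np.cov(x, y)[0, 1])                                        # ≈ 0 (uncorrelated)
print(((0 < x) & (x < 1)).mean() * ((0 < y) & (y < 1)).mean())   # product > 0
print((((0 < x) & (x < 1)) & ((0 < y) & (y < 1))).mean())        # joint probability = 0
```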
7. Let N(t) be the number of claims received by the insurance company at or prior to time t. Let Λ be the average number of claims during a one-month period. We are given that {N(t) : t ≥ 0} is a Poisson process with parameter Λ, where Λ is a gamma random variable with parameters n and λ. To calculate the distribution of N(12), the desired random variable, we condition on Λ:
$$
P\big(N(12) = i\big) = \int_0^\infty P\big(N(12) = i \mid \Lambda = x\big)\,\frac{\lambda e^{-\lambda x}(\lambda x)^{n-1}}{(n-1)!}\,dx
= \int_0^\infty \frac{e^{-12x}(12x)^i}{i!}\cdot\frac{\lambda e^{-\lambda x}(\lambda x)^{n-1}}{(n-1)!}\,dx
$$
$$
= \frac{12^i\lambda^n}{i!\,(n-1)!}\int_0^\infty e^{-(\lambda + 12)x}x^{n+i-1}\,dx.
$$
Now let y = (λ + 12)x. Then
$$
P\big(N(12) = i\big) = \frac{12^i\lambda^n}{i!\,(n-1)!}\cdot\frac{1}{(\lambda + 12)^{n+i}}\int_0^\infty e^{-y}y^{n+i-1}\,dy
= \frac{12^i\lambda^n}{i!\,(n-1)!}\cdot\frac{\Gamma(n + i)}{(\lambda + 12)^{n+i}}
$$
$$
= \frac{(n + i - 1)!}{i!\,(n-1)!}\Big(\frac{\lambda}{\lambda + 12}\Big)^n\Big(\frac{12}{\lambda + 12}\Big)^i
= \binom{n + i - 1}{n - 1}\Big(\frac{\lambda}{\lambda + 12}\Big)^n\Big(\frac{12}{\lambda + 12}\Big)^i, \qquad i = 0, 1, 2, \ldots.
$$
For i ≥ 0, letting j = n + i, we have shown that
$$
P\big(n + N(12) = j\big) = \binom{j - 1}{n - 1}\Big(\frac{\lambda}{\lambda + 12}\Big)^n\Big(\frac{12}{\lambda + 12}\Big)^{j - n}, \qquad j = n, n + 1, n + 2, \ldots.
$$
This shows that n + N(12) is negative binomial with parameters n and λ/(λ + 12).
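A simulation check of this mixed Poisson computation; the parameter values n = 3 and λ = 24 are illustrative choices, not from the problem.

```python
import numpy as np
from math import comb

# Lambda ~ Gamma(shape n, rate lam); given Lambda, N(12) ~ Poisson(12 * Lambda).
# Compare the empirical pmf of N(12) with the formula derived above.
rng = np.random.default_rng(7)
n, lam = 3, 24.0                                  # illustrative parameter values
Lam = rng.gamma(shape=n, scale=1 / lam, size=500_000)
N = rng.poisson(12 * Lam)
p = lam / (lam + 12)
for i in range(5):
    exact = comb(n + i - 1, n - 1) * p**n * (1 - p)**i
    print(i, (N == i).mean(), round(exact, 5))
```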
8. For i = 1, 2, ..., 10, j = i + 1, i + 2, ..., 11, and k = j + 1, j + 2, ..., 12, let Xijk = k if the ith, the jth, and the kth movies withdrawn are Marlon Brando movies, and let Xijk = 0 otherwise. Then
$$
X = \sum_{i=1}^{10}\sum_{j=i+1}^{11}\sum_{k=j+1}^{12} X_{ijk}
$$
is the number of movies Professor Jackson withdraws until all Brando movies are pulled out. The desired quantity is
$$
E(X) = E\Big(\sum_{i=1}^{10}\sum_{j=i+1}^{11}\sum_{k=j+1}^{12} X_{ijk}\Big)
= \sum_{i=1}^{10}\sum_{j=i+1}^{11}\sum_{k=j+1}^{12} E(X_{ijk})
= \sum_{i=1}^{10}\sum_{j=i+1}^{11}\sum_{k=j+1}^{12} k\cdot\frac{1}{\binom{12}{3}}
= \frac{1}{220}\sum_{i=1}^{10}\sum_{j=i+1}^{11}\sum_{k=j+1}^{12} k.
$$
Now
$$
\sum_{k=j+1}^{12} k = \sum_{k=1}^{12} k - \sum_{k=1}^{j} k
= \frac{12\times 13}{2} - \frac{j(j+1)}{2} = \frac{1}{2}(156 - j^2 - j).
$$
So
$$
\sum_{j=i+1}^{11}\sum_{k=j+1}^{12} k = \frac{1}{2}\sum_{j=i+1}^{11}(156 - j^2 - j)
= \frac{1}{2}\Big[\sum_{j=1}^{11}(156 - j^2 - j) - \sum_{j=1}^{i}(156 - j^2 - j)\Big]
$$
$$
= \frac{1}{2}\Big[1716 - \frac{11\times 12\times 23}{6} - \frac{11\times 12}{2} - 156i + \frac{i(i+1)(2i+1)}{6} + \frac{i(i+1)}{2}\Big]
= 572 - 78i + \frac{i(i+1)(i+2)}{6}.
$$
Therefore,
$$
E(X) = \frac{1}{220}\sum_{i=1}^{10}\Big[572 - 78i + \frac{i(i+1)(i+2)}{6}\Big]
= 26 - \frac{78}{220}\cdot\frac{10\times 11}{2} + \frac{1}{220}\sum_{i=1}^{10}\frac{i(i+1)(i+2)}{6}
= 26 - \frac{39}{2} + \frac{1}{1320}\cdot 4290 = \frac{39}{4} = 9.75.
$$
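Equivalently, X is the largest of the 3 positions (out of 12) occupied by the Brando movies, which makes a simulation check very short; this equivalence is my observation, not part of the original solution.

```python
import random

# The 3 Brando movies occupy 3 random positions among the 12 withdrawals;
# X is the position of the last of them.
trials = 200_000
total = sum(max(random.sample(range(1, 13), 3)) for _ in range(trials))
print(total / trials)   # should be close to 9.75
```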
9. Let X1 = 1 for counting the first person that Amber will meet. Let X2 be the number of people Amber will meet until she finds a person whose birthday is different from the birthday of the first person she met. Clearly, X2 is a geometric random variable with probability of success p = 364/365. Let X3 be the number of people Amber will meet until she finds a person whose birthday is different from the first two birthdays that she has already recorded. Then X3 is geometric with p = 363/365. In general, for 2 ≤ i ≤ 365, let Xi be the number of people Amber will meet until she finds a person whose birthday is different from the first i − 1 birthdays she has already recorded. Then Xi is a geometric random variable with
$$
p = \frac{365 - (i - 1)}{365} = \frac{366 - i}{365}.
$$
Clearly, X, the number of people that Amber will need to meet in order to achieve her goal, satisfies
$$
X = X_1 + X_2 + X_3 + \cdots + X_{365}.
$$
Since the expected value of a geometric random variable with parameter p is 1/p, we have
$$
E(X) = E(X_1) + E(X_2) + E(X_3) + \cdots + E(X_{365})
= 1 + \frac{365}{364} + \frac{365}{363} + \cdots + \frac{365}{1}
= 1 + 365\sum_{i=1}^{364}\frac{1}{i} \approx 1 + 365(6.47574252996) \approx 2364.65.
$$
Now, since the variance of a geometric random variable with parameter p is (1 − p)/p², and since X1, X2, ..., X365 are independent random variables,
$$
\mathrm{Var}(X) = \sum_{i=1}^{365}\mathrm{Var}(X_i) = \sum_{i=2}^{365}\mathrm{Var}(X_i)
= \sum_{i=2}^{365}\frac{1 - \dfrac{366 - i}{365}}{\Big(\dfrac{366 - i}{365}\Big)^2}
= \sum_{i=2}^{365}\frac{365(i - 1)}{(366 - i)^2} \approx 216{,}417.19.
$$
Therefore, σX = √Var(X) ≈ 465.20.
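The two sums can be evaluated exactly with a few lines of Python as a check of the quoted numerical values.

```python
import math

# Exact evaluation of the sums derived above for the birthday "coupon collector".
EX = 1 + 365 * sum(1 / i for i in range(1, 365))              # 1 + 365 * H_364
VarX = sum(365 * (i - 1) / (366 - i)**2 for i in range(2, 366))
print(EX, math.sqrt(VarX))    # ≈ 2364.65 and ≈ 465.2
```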
10. The set of possible outcomes of this experiment is
{NNNN, 6NNN, N6NN, NN6N, NNN6, 66NN, 6N6N, 6NN6, N66N, N6N6, NN66, 666N, 66N6, 6N66, N666, 6666}.
The probability of obtaining NNNN is (5/6)⁴ = 625/1296. The probability of each of the possible outcomes 6NNN, N6NN, NN6N, and NNN6 is (1/6)(5/6)³ = 125/1296. The probability of each of the possible outcomes 66NN, 6N6N, 6NN6, N66N, N6N6, and NN66 is (1/6)²(5/6)² = 25/1296. The probability of each of the possible outcomes 666N, 66N6, 6N66, and N666 is (1/6)³(5/6) = 5/1296. Finally, the probability of obtaining 6666 is (1/6)⁴ = 1/1296. Based on these probabilities, for example,
$$
P(X = 2, Y = 2) = P\big(\{66NN, N66N, NN66\}\big)
= \frac{25}{1296} + \frac{25}{1296} + \frac{25}{1296} = \frac{75}{1296}.
$$
Similar calculations yield the following table, which represents p(x, y), the joint
probability mass function of X and Y , p X (x), the marginal probability mass function
of X, and pY (y), the marginal probability mass function of Y .
 x \ y      y = 0       y = 1       y = 2       y = 3       y = 4      pX(x)
 x = 0    625/1296        0           0           0           0      625/1296
 x = 1        0        500/1296       0           0           0      500/1296
 x = 2        0         75/1296    75/1296        0           0      150/1296
 x = 3        0            0       10/1296     10/1296        0       20/1296
 x = 4        0            0          0           0         1/1296     1/1296
 pY(y)    625/1296     575/1296    85/1296     10/1296      1/1296
To calculate the correlation coefficient of X and Y , among other quantities, we need
to calculate Cov(X, Y ) and hence E(XY ). So first we calculate the probability mass
function of XY , which is represented in the following table.
 i             0         1         2       3       4         6       8       9       12      16
 P(XY = i)  625/1296  500/1296  75/1296    0    75/1296   10/1296    0    10/1296    0     1/1296
Now we will calculate all the quantities that we need to find ρ(X, Y ). We have
$$
E(X) = 0\cdot\frac{625}{1296} + 1\cdot\frac{500}{1296} + 2\cdot\frac{150}{1296} + 3\cdot\frac{20}{1296} + 4\cdot\frac{1}{1296}
= \frac{864}{1296} = \frac{2}{3},
$$
$$
E(X^2) = 0^2\cdot\frac{625}{1296} + 1^2\cdot\frac{500}{1296} + 2^2\cdot\frac{150}{1296} + 3^2\cdot\frac{20}{1296} + 4^2\cdot\frac{1}{1296}
= \frac{1296}{1296} = 1,
$$
$$
\mathrm{Var}(X) = E(X^2) - \big(E(X)\big)^2 = 1 - \Big(\frac{2}{3}\Big)^2 = \frac{5}{9}, \qquad
\sigma_X = \frac{\sqrt{5}}{3}.
$$
$$
E(Y) = 0\cdot\frac{625}{1296} + 1\cdot\frac{575}{1296} + 2\cdot\frac{85}{1296} + 3\cdot\frac{10}{1296} + 4\cdot\frac{1}{1296}
= \frac{779}{1296},
$$
$$
E(Y^2) = 0^2\cdot\frac{625}{1296} + 1^2\cdot\frac{575}{1296} + 2^2\cdot\frac{85}{1296} + 3^2\cdot\frac{10}{1296} + 4^2\cdot\frac{1}{1296}
= \frac{1021}{1296},
$$
$$
\mathrm{Var}(Y) = E(Y^2) - \big(E(Y)\big)^2 = \frac{1021}{1296} - \Big(\frac{779}{1296}\Big)^2 = \frac{716{,}375}{1{,}679{,}616}, \qquad
\sigma_Y = \sqrt{\frac{716{,}375}{1{,}679{,}616}} = \frac{6480\sqrt{28{,}655}}{1{,}679{,}616}.
$$
$$
E(XY) = 0\cdot\frac{625}{1296} + 1\cdot\frac{500}{1296} + 2\cdot\frac{75}{1296} + 3\cdot 0 + 4\cdot\frac{75}{1296}
+ 6\cdot\frac{10}{1296} + 8\cdot 0 + 9\cdot\frac{10}{1296} + 12\cdot 0 + 16\cdot\frac{1}{1296} = \frac{31}{36}.
$$
Hence
$$
\mathrm{Cov}(X, Y) = E(XY) - E(X)E(Y) = \frac{31}{36} - \frac{2}{3}\cdot\frac{779}{1296} = \frac{895}{1944},
$$
$$
\rho(X, Y) = \frac{\mathrm{Cov}(X, Y)}{\sigma_X\sigma_Y}
= \frac{895/1944}{\dfrac{\sqrt{5}}{3}\cdot\dfrac{6480\sqrt{28{,}655}}{1{,}679{,}616}} \approx 0.946.
$$
As expected, this shows that there is almost a linear relation, hence a strong positive
correlation, between X and Y .
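As a check of the arithmetic, ρ(X, Y) can be recomputed directly from the joint table (entries given as multiples of 1/1296):

```python
import math

# Recompute rho(X, Y) from the joint pmf table (probabilities times 1296).
table = {(0, 0): 625, (1, 1): 500, (2, 1): 75, (2, 2): 75,
         (3, 2): 10, (3, 3): 10, (4, 4): 1}
EX  = sum(x * p for (x, y), p in table.items()) / 1296
EY  = sum(y * p for (x, y), p in table.items()) / 1296
EX2 = sum(x * x * p for (x, y), p in table.items()) / 1296
EY2 = sum(y * y * p for (x, y), p in table.items()) / 1296
EXY = sum(x * y * p for (x, y), p in table.items()) / 1296
rho = (EXY - EX * EY) / math.sqrt((EX2 - EX**2) * (EY2 - EY**2))
print(rho)   # ≈ 0.946
```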