Homework 5 Solutions

1. Ross 4.47 p. 193
a) The number of guilty votes is a binomial random variable with parameters n (the number
of judges) and p (the probability that a judge votes guilty, which depends on whether the
defendant is truly guilty or innocent).
Hence, we have for a guilty defendant
(i) with 9 judges:
P(\text{declared guilty}) = \sum_{i=5}^{9} \binom{9}{i} (0.7)^i (0.3)^{9-i} \approx 0.9012
(ii) with 8 judges:
P(\text{declared guilty}) = \sum_{i=5}^{8} \binom{8}{i} (0.7)^i (0.3)^{8-i} \approx 0.8059
(iii) with 7 judges:
P(\text{declared guilty}) = \sum_{i=4}^{7} \binom{7}{i} (0.7)^i (0.3)^{7-i} \approx 0.8740
b) Similarly, for an innocent defendant, we have
(i) with 9 judges:
P(\text{declared guilty}) = \sum_{i=5}^{9} \binom{9}{i} (0.3)^i (0.7)^{9-i} \approx 0.0988
(ii) with 8 judges:
P(\text{declared guilty}) = \sum_{i=5}^{8} \binom{8}{i} (0.3)^i (0.7)^{8-i} \approx 0.0580
(iii) with 7 judges:
P(\text{declared guilty}) = \sum_{i=4}^{7} \binom{7}{i} (0.3)^i (0.7)^{7-i} \approx 0.1260
c) The defense attorney can make up to 2 peremptory challenges from a panel of 9 judges.
The number of challenges made should be chosen to yield the lowest probability of being
declared guilty; hence, if the defense attorney is 60% certain that his client is guilty, we
have
(i) with 9 judges: P(\text{declared guilty}) = (0.6)(0.9012) + (0.4)(0.0988) = 0.5802
(ii) with 8 judges: P(\text{declared guilty}) = (0.6)(0.8059) + (0.4)(0.0580) = 0.5067
(iii) with 7 judges: P(\text{declared guilty}) = (0.6)(0.8740) + (0.4)(0.1260) = 0.5748
So the defense attorney should make exactly 1 challenge in order to ensure that his client
has the highest probability of being found not guilty.
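These tail probabilities and the 60/40 mixture can be reproduced with a minimal Python sketch (the function name `tail_prob` is just illustrative):

```python
from math import comb

def tail_prob(n, p, k):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# (panel size, votes needed to convict) for 0, 1, and 2 challenges
panels = [(9, 5), (8, 5), (7, 4)]

for n, k in panels:
    pg = tail_prob(n, 0.7, k)        # defendant truly guilty
    pi = tail_prob(n, 0.3, k)        # defendant innocent
    mixed = 0.6 * pg + 0.4 * pi      # attorney 60% certain of guilt
    print(n, round(pg, 4), round(pi, 4), round(mixed, 4))
```

The 8-judge panel gives the smallest mixture, confirming that one challenge is optimal.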
2. Ross 4.53 p. 194
a) If we assume independence of birthdays for spouses and that all birthdays are equally
likely, then the probability that an arbitrary couple were both born on April 30 is just
(1/365)^2 (using 1/366 would be acceptable as well). Note that the exact distribution of
the number of such couples is binomial with n = 80{,}000 and p = (1/365)^2. However,
because we have n large and p small, the number of such couples is approximately
Poisson with parameter \lambda = 80{,}000/365^2 \approx 0.6. Hence, the probability that at least one
couple were both born on April 30 is approximately 1 - e^{-0.6} \approx 0.4512.
b) On the other hand, the probability that an arbitrary couple were born on the same day of
the year is 1/365 (or 1/366 if you prefer). Using the same reasoning as before, the
number of such couples will be approximately Poisson with parameter
\lambda = 80{,}000/365 \approx 219.18, and hence, the probability that at least one couple share the
same birthday is approximately 1 - e^{-219.18} \approx 1.
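The quality of the Poisson approximation in part a) can be checked directly; note the solution rounds \lambda to 0.6, giving 0.4512, while keeping \lambda = 80{,}000/365^2 gives about 0.4515, essentially identical to the exact binomial answer:

```python
from math import exp

n = 80_000                 # number of couples
p = (1 / 365) ** 2         # both spouses born on April 30
lam = n * p                # Poisson parameter, roughly 0.6

approx = 1 - exp(-lam)     # Poisson approximation to P(at least one such couple)
exact = 1 - (1 - p) ** n   # exact binomial tail for comparison
print(round(approx, 4), round(exact, 4))
```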
3. Ross 4.63 p. 195
a) If people enter a gambling casino at a rate of 1 every 2 minutes, then the exact number
that enter during a 5-minute interval has a Poisson distribution with parameter \lambda = 2.5.
Hence, the probability that nobody enters between 12:00 and 12:05 is just e^{-2.5} \approx 0.0821.
b) The probability that at least 4 people enter the casino during that interval is just 1 minus
the probability that no more than 3 people enter the casino during that interval, which is
just
1 - \left[ e^{-2.5} + 2.5e^{-2.5} + \frac{(2.5)^2}{2!}e^{-2.5} + \frac{(2.5)^3}{3!}e^{-2.5} \right] \approx 0.2424
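Both parts can be verified with a short Poisson computation in Python (`pois` is an illustrative helper name):

```python
from math import exp, factorial

lam = 2.5  # mean arrivals in 5 minutes at 1 arrival per 2 minutes

def pois(k):
    """Poisson(lam) pmf at k."""
    return lam**k * exp(-lam) / factorial(k)

p_nobody = pois(0)                                 # part (a)
p_at_least_4 = 1 - sum(pois(k) for k in range(4))  # part (b)
print(round(p_nobody, 4), round(p_at_least_4, 4))
```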
4. Ross 4.64 p. 195
a) If the suicide rate is 1 suicide per 100,000 inhabitants per month, then the actual number
of suicides in a city of 400,000 in a given month is approximately Poisson with parameter
\lambda = 4. Hence, the probability that there will be 8 or more suicides in a given month is
just 1 minus the probability that there will be no more than 7 suicides in that month
(letting N represent the number of suicides):
P(N \ge 8) = 1 - \sum_{i=0}^{7} \frac{4^i e^{-4}}{i!} \approx 0.0511
b) If X represents the number of months during the year that have 8 or more suicides, then X
is a binomial random variable with n = 12 and p = 0.0511. Hence, the probability that
there will be at least 2 months during the year that have 8 or more suicides is just
P(X \ge 2) = 1 - P(X \le 1) = 1 - \left[(0.9489)^{12} + 12(0.0511)(0.9489)^{11}\right] \approx 0.1227
c) If Y represents the first month to have 8 or more suicides, where the present month
corresponds to Y = 1, then Y is a geometric random variable with parameter p = 0.0511;
hence, we have the probability P(Y = i) = (0.9489)^{i-1}(0.0511).
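Parts a) and b) can be checked numerically (small differences in the last digit come from rounding p to 0.0511 before part b):

```python
from math import exp, factorial

lam = 4.0  # expected monthly suicides in a city of 400,000

# (a) P(N >= 8) = 1 - P(N <= 7) for N ~ Poisson(4)
p_month = 1 - sum(lam**i * exp(-lam) / factorial(i) for i in range(8))

# (b) at least 2 of 12 months with 8+ suicides, X ~ Binomial(12, p_month)
q = 1 - p_month
p_year = 1 - (q**12 + 12 * p_month * q**11)

print(round(p_month, 4), round(p_year, 4))
```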
5. Ross 4.10 Theoretical Exercises p. 198
Since X is a binomial random variable with parameters n and p, we have that
E\left[\frac{1}{X+1}\right] = \sum_{i=0}^{n} \frac{1}{i+1}\binom{n}{i} p^i (1-p)^{n-i} = \sum_{i=0}^{n} \frac{1}{i+1}\,\frac{n!}{i!\,(n-i)!}\, p^i (1-p)^{n-i}
= \frac{1}{n+1}\sum_{i=0}^{n} \frac{(n+1)!}{(i+1)!\,(n-i)!}\, p^i (1-p)^{n-i} = \frac{1}{(n+1)p}\sum_{i=0}^{n} \binom{n+1}{i+1} p^{i+1} (1-p)^{n-i}
Now, substitute in j = i + 1, so we get
E\left[\frac{1}{X+1}\right] = \frac{1}{(n+1)p}\sum_{j=1}^{n+1} \binom{n+1}{j} p^{j} (1-p)^{n+1-j}
= \frac{1}{(n+1)p}\left[\sum_{j=0}^{n+1} \binom{n+1}{j} p^{j} (1-p)^{n+1-j} - (1-p)^{n+1}\right]
The terms of the sum inside the brackets are just probabilities from a binomial distribution with
parameters n + 1 and p, and since we sum the terms from 0 to n + 1, that makes the sum itself
equal to 1; hence, we get
E\left[\frac{1}{X+1}\right] = \frac{1}{(n+1)p}\left[1 - (1-p)^{n+1}\right].
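The identity just derived can be confirmed numerically by summing E[1/(X+1)] directly against the binomial pmf and comparing with the closed form (function names are illustrative):

```python
from math import comb

def expect_inv(n, p):
    """E[1/(X+1)] summed directly against the Binomial(n, p) pmf."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) / (i + 1)
               for i in range(n + 1))

def closed_form(n, p):
    """The derived identity (1 - (1-p)^(n+1)) / ((n+1) p)."""
    return (1 - (1 - p) ** (n + 1)) / ((n + 1) * p)

# the two agree to machine precision for arbitrary (n, p)
for n, p in [(5, 0.3), (10, 0.5), (20, 0.9)]:
    assert abs(expect_inv(n, p) - closed_form(n, p)) < 1e-12
```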
6. Ross 4.20 Theoretical Exercises p. 199
Note that Y, the number of heads that occur when all n coins are tossed, has a distribution that is
approximately Poisson with parameter \lambda = np. Then X, the number of heads that occur the first
time at least one head appears, is distributed as the conditional distribution of Y given that Y > 0;
hence,
P(X = 1) = P(Y = 1 \mid Y > 0) = \frac{\lambda e^{-\lambda}}{1 - e^{-\lambda}}.
So, explanation (b) is correct.
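The conditioning step can be illustrated with a minimal sketch (\lambda = 1.5 is an arbitrary choice): normalising the Poisson pmf over k >= 1 reproduces the closed form above.

```python
from math import exp, factorial

lam = 1.5  # arbitrary illustrative value of np

# closed form: P(Y = 1 | Y > 0) = lam * e^-lam / (1 - e^-lam)
p_cond = (lam * exp(-lam)) / (1 - exp(-lam))

# the same quantity via the conditional pmf normalised over k >= 1
pmf = [lam**k * exp(-lam) / factorial(k) for k in range(50)]
p_cond2 = pmf[1] / sum(pmf[1:])

assert abs(p_cond - p_cond2) < 1e-9
```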
7.
a) Ross 4.76 p. 196
Let E1 denote the event that the mathematician first discovers that the left-hand
matchbox is empty and there are k matches in the right-hand box, and let E2 denote the
event that the mathematician first discovers that the right-hand matchbox is empty and
there are k matches in the left-hand box.
E1 will occur if and only if the (N_1 + 1)-th choice of the left-hand matchbox is made at
the (N_1 + 1 + N_2 - k)-th trial. Hence, from Equation (8.2) with p = 1/2, r = N_1 + 1, and
n = N_1 + N_2 - k + 1, we get
P(E_1) = \binom{N_1 + N_2 - k}{N_1} \left(\frac{1}{2}\right)^{N_1 + N_2 - k + 1}
Using the same logic, we would find that the probability of E_2 is
P(E_2) = \binom{N_1 + N_2 - k}{N_2} \left(\frac{1}{2}\right)^{N_1 + N_2 - k + 1}
Since the two events are mutually exclusive, the probability that, at the moment the
mathematician first discovers that one of his two matchboxes is empty, there are exactly k matches
in the other box is just the sum of the probabilities of E_1 and E_2:
\left[ \binom{N_1 + N_2 - k}{N_1} + \binom{N_1 + N_2 - k}{N_2} \right] \left(\frac{1}{2}\right)^{N_1 + N_2 - k + 1}
b) Ross 4.77 p. 196
Let E denote the event that the left-hand matchbox is emptied and there are k matches in
the right-hand box. Then E will occur if and only if the N-th choice of the left-hand
matchbox is made on the (2N - k)-th trial. Using Equation (8.2) with p = 1/2,
r = N, and n = 2N - k, we get
P(E) = \binom{2N - k - 1}{N - 1} \left(\frac{1}{2}\right)^{2N - k}.
But, by symmetry, this is the same as the probability of the event that the right-hand matchbox
is emptied and there are k matches in the left-hand box. Hence, our desired result is
2\binom{2N - k - 1}{N - 1} \left(\frac{1}{2}\right)^{2N - k}.
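The part b) formula can be checked against a simulation of the match-drawing process (names like `p_k_left` and `simulate` are illustrative; note k runs from 1 to N, since the first box to be emptied cannot leave 0 matches in the other):

```python
import random
from math import comb

def p_k_left(N, k):
    """Part (b) result: P(other box holds k matches when the first
    box is emptied) = 2*C(2N-k-1, N-1)*(1/2)^(2N-k)."""
    return comb(2 * N - k - 1, N - 1) * 0.5 ** (2 * N - k - 1)

def simulate(N, trials=200_000, seed=1):
    """Draw matches from two boxes of N uniformly at random until one
    box is emptied; record how many the other box still holds."""
    rng = random.Random(seed)
    counts = {}
    for _ in range(trials):
        boxes = [N, N]
        while True:
            b = rng.randrange(2)
            boxes[b] -= 1
            if boxes[b] == 0:
                k = boxes[1 - b]
                counts[k] = counts.get(k, 0) + 1
                break
    return {k: c / trials for k, c in counts.items()}

N = 4
freq = simulate(N)
for k in range(1, N + 1):
    assert abs(freq.get(k, 0) - p_k_left(N, k)) < 0.01
```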
8. Ross 4.80 pp. 196-97
a) The probability of the player selecting 2 numbers and both numbers being among the 20
is just
P(\text{winning}) = \binom{20}{2} \Big/ \binom{80}{2} = \frac{19}{316} \approx 0.0601
So, the fair payoff would be the value x that gives an expected payoff of 0:
x\left(\frac{19}{316}\right) - (1)\left(1 - \frac{19}{316}\right) = 0
Solving for x gives us x = 297/19 \approx 15.63; hence, the fair payoff should be about $15.63
for every $1 bet made.
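This calculation is short enough to verify in a couple of lines of Python:

```python
from math import comb

# chance that both of the player's 2 numbers are among the 20 drawn from 80
p_win = comb(20, 2) / comb(80, 2)    # = 19/316
# fair payoff x per $1 bet solves x*p_win - 1*(1 - p_win) = 0
x = (1 - p_win) / p_win
print(round(p_win, 4), round(x, 2))  # 0.0601 15.63
```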
b) The number of winning numbers selected by the player (k) in this case follows a
hypergeometric distribution with m = 20 and N = 80. Hence, we have that
P_{n,k} = \frac{\binom{20}{k}\binom{60}{n-k}}{\binom{80}{n}}.
c) Calculating P_{10,k} gives us the following table

Keno Payoffs/Probabilities in 10-Number Bets
No. of matches | Payoff/$1 bet | Probability
0-4            | -1            | 9.3534 x 10^-1
5              | 1             | 5.1428 x 10^-2
6              | 17            | 1.1479 x 10^-2
7              | 179           | 1.6111 x 10^-3
8              | 1,299         | 1.3542 x 10^-4
9              | 2,599         | 6.1206 x 10^-6
10             | 24,999        | 1.1221 x 10^-7
The expected payoff using these numbers ends up being -0.2057 \approx -0.21; in other
words, you lose an average of $0.21 for every $1 10-number bet that you make.
(Side note: to speed things up, these calculations were done in R; to reproduce the results, paste
the command lines (not the output lines beginning with [1]) into the R console.
To get the hypergeometric probabilities (in the syntax, m represents the quantity “winning
numbers,” n represents the quantity “losing numbers,” and k represents the quantity “your
selected numbers”):
dhyper(0:10, m=20, n=60, k=10)
[1] 4.579070e-02 1.795714e-01 2.952568e-01 2.674024e-01 1.473189e-01
[6] 5.142769e-02 1.147939e-02 1.611143e-03 1.354194e-04 6.120649e-06
[11] 1.122119e-07
To get the expected payoff (the vector “payoff” contains a payoff corresponding to each possible
number of matches, the multiplication in the second command gives you what can be described
as crossproducts, and sum adds up the crossproducts):
payoff <- c(-1, -1, -1, -1, -1, 1, 17, 179, 1299, 2599, 24999)
sum(payoff * dhyper(0:10, m=20, n=60, k=10))
[1] -0.2057456
End: Side note)