
March 6 Homework Solutions
Math 151, Winter 2012

Chapter 6 Problems (pages 287-291)

Problem 31. According to the U.S. National Center for Health Statistics, 25.2 percent of males and 23.6 percent of females never eat breakfast. Suppose that random samples of 200 men and 200 women are chosen. Approximate the probability that

(a) at least 110 of these 400 people never eat breakfast.

Let $M$ denote the number of men who never eat breakfast and $W$ the number of women who never eat breakfast. Note that $M$ is a binomial random variable with $p = .252$ and $n = 200$, and $W$ is a binomial random variable with $p = .236$ and $n = 200$. We compute
$$E[M] = 200(.252) = 50.4, \qquad E[W] = 200(.236) = 47.2,$$
$$\mathrm{Var}(M) = 200(.252)(1 - .252) \approx 37.7, \qquad \mathrm{Var}(W) = 200(.236)(1 - .236) \approx 36.1.$$
Thus we may approximate $M$ by a normal random variable with $\mu = 50.4$ and $\sigma^2 \approx 37.7$, and $W$ by a normal random variable with $\mu = 47.2$ and $\sigma^2 \approx 36.1$. We want to compute $P\{110 \le M + W\}$. We may approximate $M + W$ by a normal random variable with $\mu = 50.4 + 47.2 = 97.6$ and $\sigma^2 \approx 37.7 + 36.1 = 73.8$. Hence
$$P\{110 \le M + W\} = P\left\{\frac{110 - 97.6}{\sqrt{73.8}} \le \frac{M + W - 97.6}{\sqrt{73.8}}\right\} \approx 1 - \Phi\left(\frac{110 - 97.6}{\sqrt{73.8}}\right) \approx 1 - \Phi(1.44) = 1 - .9251 = .0749.$$

(b) the number of the women who never eat breakfast is at least as large as the number of the men who never eat breakfast.

We want to compute $P\{M \le W\} = P\{M - W \le 0\}$. We may approximate $M - W$ by a normal random variable with $\mu = 50.4 - 47.2 = 3.2$ and $\sigma^2 = 37.7 + 36.1 = 73.8$. Hence
$$P\{M \le W\} = P\{M - W \le 0\} = P\left\{\frac{M - W - 3.2}{\sqrt{73.8}} \le \frac{-3.2}{\sqrt{73.8}}\right\} \approx \Phi\left(\frac{-3.2}{\sqrt{73.8}}\right) \approx 1 - \Phi(.37) = 1 - .6443 = .3557.$$

Problem 44. If $X_1, X_2, X_3$ are independent random variables that are uniformly distributed over $(0, 1)$, compute the probability that the largest of the three is greater than the sum of the other two.

Note that if, for example, $X_1 \ge X_2 + X_3$, then $X_1$ is automatically the largest of the three. We want to compute
$$P\{X_1 \ge X_2 + X_3\} + P\{X_2 \ge X_1 + X_3\} + P\{X_3 \ge X_1 + X_2\}.$$
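As a quick numerical sanity check of this probability, it can be estimated by Monte Carlo simulation (a sketch, not part of the original solution; the function name and trial count are our own choices):

```python
import random

def prob_largest_exceeds_sum(trials=200_000, seed=0):
    """Estimate P{max(X1, X2, X3) >= sum of the other two} for iid Uniform(0,1)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        xs = sorted(rng.random() for _ in range(3))
        # The largest exceeds the sum of the other two exactly when
        # xs[2] >= xs[0] + xs[1].
        if xs[2] >= xs[0] + xs[1]:
            hits += 1
    return hits / trials
```

With 200,000 trials the estimate should fall within about 0.01 of the exact value 1/2 computed below.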
By symmetry, these three terms are all equal, so it suffices to compute the first term $P\{X_1 \ge X_2 + X_3\}$. Recall from Example 3a on page 252 that
$$f_{X_2+X_3}(y) = \begin{cases} y & \text{if } 0 \le y \le 1, \\ 2 - y & \text{if } 1 < y \le 2, \\ 0 & \text{otherwise.} \end{cases}$$
Hence
$$P\{X_1 \ge X_2 + X_3\} = \int_0^1 \int_0^x f_{X_1}(x) f_{X_2+X_3}(y)\,dy\,dx = \int_0^1 \int_0^x 1 \cdot y\,dy\,dx = \int_0^1 \frac{1}{2}x^2\,dx = \frac{1}{6}.$$
So the probability that the largest of the three is greater than the sum of the other two is
$$P\{X_1 \ge X_2 + X_3\} + P\{X_2 \ge X_1 + X_3\} + P\{X_3 \ge X_1 + X_2\} = \frac{1}{6} + \frac{1}{6} + \frac{1}{6} = \frac{1}{2}.$$

Chapter 6 Theoretical Exercises (pages 291-293)

Problem 19. Let $X_1, X_2, X_3$ be independent and identically distributed continuous random variables.

(a) Compute $P\{X_1 > X_2 \mid X_1 > X_3\}$.

We consider 7 mutually exclusive cases: the first six are
$$X_1 > X_2 > X_3, \quad X_1 > X_3 > X_2, \quad X_2 > X_1 > X_3, \quad X_2 > X_3 > X_1, \quad X_3 > X_1 > X_2, \quad X_3 > X_2 > X_1,$$
and the seventh case is where at least two of $X_1, X_2, X_3$ are equal. Since the $X_i$ are continuous random variables, the probability that at least two of them are equal is zero. Since $X_1, X_2, X_3$ are independent and identically distributed, the first six cases are all equally likely. This means that $P\{X_1 > X_2 > X_3\} = 1/6$, and similarly for the other orderings. Hence
$$P\{X_1 > X_2 \mid X_1 > X_3\} = \frac{P\{X_1 > X_2 > X_3\} + P\{X_1 > X_3 > X_2\}}{P\{X_1 > X_2 > X_3\} + P\{X_1 > X_3 > X_2\} + P\{X_2 > X_1 > X_3\}} = \frac{2}{3}.$$

(b) Compute $P\{X_1 > X_2 \mid X_1 < X_3\}$.
$$P\{X_1 > X_2 \mid X_1 < X_3\} = \frac{P\{X_3 > X_1 > X_2\}}{P\{X_2 > X_3 > X_1\} + P\{X_3 > X_1 > X_2\} + P\{X_3 > X_2 > X_1\}} = \frac{1}{3}.$$

(c) Compute $P\{X_1 > X_2 \mid X_2 > X_3\}$.
$$P\{X_1 > X_2 \mid X_2 > X_3\} = \frac{P\{X_1 > X_2 > X_3\}}{P\{X_1 > X_2 > X_3\} + P\{X_2 > X_1 > X_3\} + P\{X_2 > X_3 > X_1\}} = \frac{1}{3}.$$

(d) Compute $P\{X_1 > X_2 \mid X_2 < X_3\}$.
$$P\{X_1 > X_2 \mid X_2 < X_3\} = \frac{P\{X_1 > X_3 > X_2\} + P\{X_3 > X_1 > X_2\}}{P\{X_1 > X_3 > X_2\} + P\{X_3 > X_1 > X_2\} + P\{X_3 > X_2 > X_1\}} = \frac{2}{3}.$$

Problem 28. Show that the median of a sample of size $2n+1$ from a uniform distribution on $(0, 1)$ has a beta distribution with parameters $(n + 1, n + 1)$.
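Before the derivation, a simulation sketch (not part of the original solution) that checks this claim by comparing the sample median's mean and variance with those of a Beta$(n+1, n+1)$ distribution:

```python
import random

def simulate_medians(n, trials=100_000, seed=0):
    """Draw the median of 2n+1 iid Uniform(0,1) samples, `trials` times."""
    rng = random.Random(seed)
    meds = []
    for _ in range(trials):
        xs = sorted(rng.random() for _ in range(2 * n + 1))
        meds.append(xs[n])  # the middle order statistic
    return meds

# A Beta(n+1, n+1) random variable has mean 1/2 and variance 1/(4(2n+3)),
# so the simulated medians should match those moments.
```

For example, with $n = 2$ the empirical mean should be close to $1/2$ and the empirical variance close to $1/28$.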
Let $X$ denote the median of the independent and identically distributed random variables $X_1, \dots, X_{2n+1}$. Consider equation 6.2 on page 272. By replacing $n$ with $2n+1$ and choosing $j = n+1$, we get that the probability density function of $X$ is
$$f_X(x) = \frac{(2n+1)!}{n!\,n!} (F(x))^n (1 - F(x))^n f(x),$$
where $f$ is the common probability density function and $F$ is the common cumulative distribution function of the $X_i$'s. Since $X_1, \dots, X_{2n+1}$ are uniformly distributed on $(0, 1)$, this means that $f(x) = 1$ for $0 \le x \le 1$ and $F(x) = x$ for $0 \le x \le 1$. Hence
$$f_X(x) = \begin{cases} \frac{(2n+1)!}{n!\,n!} x^n (1-x)^n & \text{for } 0 \le x \le 1, \\ 0 & \text{otherwise.} \end{cases}$$
Now, consider the beta distribution on page 218. Note that
$$B(n+1, n+1) = \int_0^1 x^n (1-x)^n\,dx = \frac{n}{n+1} \int_0^1 x^{n+1}(1-x)^{n-1}\,dx \quad \text{(after integrating by parts)}$$
$$= \dots = \frac{n!}{(2n) \cdots (n+1)} \int_0^1 x^{2n}\,dx \quad \text{(after integrating by parts repeatedly)}$$
$$= \frac{n!}{(2n+1)(2n) \cdots (n+1)} = \frac{n!\,n!}{(2n+1)!}.$$
Hence
$$f_X(x) = \begin{cases} \frac{(2n+1)!}{n!\,n!} x^n (1-x)^n & \text{for } 0 \le x \le 1 \\ 0 & \text{otherwise} \end{cases} = \begin{cases} \frac{1}{B(n+1,n+1)} x^n (1-x)^n & \text{for } 0 \le x \le 1 \\ 0 & \text{otherwise,} \end{cases}$$
so $X$ has a beta distribution with parameters $(n+1, n+1)$.

Chapter 7 Problems (pages 373-379)

Problem 1. A player throws a fair die and simultaneously flips a fair coin. If the coin lands heads, she wins twice the value that appears on the die; if it lands tails, she wins one-half of that value. Determine her expected winnings.

Let $X$ denote the value on the die, let $Y$ be 1 if the coin lands heads and 0 if it lands tails, and let $g(X, Y)$ denote the winnings. Then her expected winnings are
$$E[\text{winnings}] = \sum_{x=1}^{6} \sum_{y=0}^{1} g(x,y)\,p(x,y) = \sum_{x=1}^{6} \left( 2x \cdot p(x,1) + \frac{1}{2}x \cdot p(x,0) \right) = \sum_{x=1}^{6} \left( 2x \cdot \frac{1}{12} + \frac{1}{2}x \cdot \frac{1}{12} \right) = \frac{5}{24} \sum_{x=1}^{6} x = \frac{5}{24} \cdot 21 = 4.375.$$

Problem 12. A group of $n$ men and $n$ women is lined up at random.

(a) Find the expected number of men who have a woman next to them.

Label the people in order 1 through $2n$ and let $X_i = 1$ if the $i$-th person is a man standing next to a woman and $X_i = 0$ otherwise.
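The indicator computation that follows can be cross-checked by exact enumeration over all equally likely gender patterns for small $n$ (a sketch, not part of the original solution; the closed form $(3n^2 - n)/(4n - 2)$ appears below):

```python
from itertools import combinations

def expected_men_next_to_woman(n):
    """Exact E[# men adjacent to at least one woman] in a random line of n men, n women.

    All C(2n, n) gender patterns are equally likely, so average over them.
    """
    total = count = 0
    for men in combinations(range(2 * n), n):
        men_set = set(men)
        c = 0
        for i in men_set:
            # A man counts if some in-bounds neighbor is a woman.
            neighbors = [j for j in (i - 1, i + 1) if 0 <= j < 2 * n]
            if any(j not in men_set for j in neighbors):
                c += 1
        total += c
        count += 1
    return total / count

# Closed form derived in the solution: (3n^2 - n) / (4n - 2).
```

For $n = 2$ this gives $5/3$, and for $n = 3$ it gives $2.4$, matching the formula.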
We want to compute
$$E\left[\sum_{i=1}^{2n} X_i\right] = \sum_{i=1}^{2n} E[X_i] = \sum_{i=1}^{2n} P\{X_i = 1\}.$$
Note $X_1 = 1$ only if the first person is male and the second is female. There are $2n(2n-1)$ ways to choose the first two people. There are $n$ ways to choose the first person to be male and $n$ ways to choose the second person to be female, and hence $n^2$ ways we can have $X_1 = 1$. Hence
$$P\{X_1 = 1\} = \frac{n^2}{2n(2n-1)} = \frac{n}{4n-2}.$$
Similarly $P\{X_{2n} = 1\} = \frac{n}{4n-2}$.

Now let's find $P\{X_i = 1\}$ for $1 < i < 2n$. There are $2n(2n-1)(2n-2)$ ways to choose the $(i-1)$-th, $i$-th, and $(i+1)$-th persons. We have $X_i = 1$ if the three people chosen are female-male-female, female-male-male, or male-male-female. Hence there are
$$n \cdot n \cdot (n-1) + n \cdot n \cdot (n-1) + n \cdot (n-1) \cdot n = 3n^2(n-1)$$
ways we can have $X_i = 1$. Hence
$$P\{X_i = 1\} = \frac{3n^2(n-1)}{2n(2n-1)(2n-2)} = \frac{3n}{8n-4}.$$
Hence
$$E\left[\sum_{i=1}^{2n} X_i\right] = \sum_{i=1}^{2n} P\{X_i = 1\} = 2 \cdot \frac{n}{4n-2} + (2n-2) \cdot \frac{3n}{8n-4} = \frac{3n^2 - n}{4n-2}.$$

(b) Repeat part (a), but now assuming that the group is randomly seated at a round table.

Label the people in order 1 through $2n$ and let $X_i = 1$ if the $i$-th person is a man seated next to a woman and $X_i = 0$ otherwise. We want to compute
$$E\left[\sum_{i=1}^{2n} X_i\right] = \sum_{i=1}^{2n} E[X_i] = \sum_{i=1}^{2n} P\{X_i = 1\}.$$
Since the group is now seated in a circle, for every $1 \le i \le 2n$ we have
$$P\{X_i = 1\} = \frac{3n}{8n-4};$$
this computation is the same as the one we used in part (a) for $1 < i < 2n$. Hence
$$E\left[\sum_{i=1}^{2n} X_i\right] = \sum_{i=1}^{2n} P\{X_i = 1\} = 2n \cdot \frac{3n}{8n-4} = \frac{3n^2}{4n-2}.$$

Problem 19. A certain region is inhabited by $r$ distinct types of a certain species of insect. Each insect caught will, independently of the types of the previous catches, be of type $i$ with probability
$$P_i, \quad i = 1, \dots, r, \qquad \sum_{i=1}^{r} P_i = 1.$$

(a) Compute the mean number of insects that are caught before the first type 1 catch.

Let $X$ denote the number of insects caught before the first type 1 catch. Then
$$P\{X = x\} = (1 - P_1)^x P_1.$$
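This is the distribution of the number of failures before the first success in independent Bernoulli$(P_1)$ trials, so its mean is $(1 - P_1)/P_1$. A quick simulation sketch (not part of the original solution):

```python
import random

def mean_catches_before_type1(p1, trials=200_000, seed=1):
    """Estimate the mean number of insects caught before the first type-1 catch.

    Only whether each catch is of type 1 (probability p1) matters, so each
    catch is a Bernoulli(p1) trial; count failures before the first success.
    """
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        while rng.random() >= p1:  # this catch is not type 1
            total += 1
    return total / trials

# Geometric mean of failures before the first success: (1 - p1) / p1.
```

For example, $P_1 = 0.25$ should give an estimate near $(1 - 0.25)/0.25 = 3$.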
Hence
$$E[X] = \sum_{x=0}^{\infty} x(1-P_1)^x P_1 = P_1 \sum_{x=0}^{\infty} x(1-P_1)^x = P_1 \cdot \frac{1-P_1}{P_1^2} = \frac{1-P_1}{P_1},$$
using the formula
$$\sum_{n=1}^{\infty} n z^n = \frac{z}{(1-z)^2} \quad \text{for } |z| < 1.$$

(b) Compute the mean number of types of insects that are caught before the first type 1 catch.

Let $X_i$ denote the number of insects of type $i$ caught before the first type 1 catch. Let $g(0) = 0$ and let $g(x) = 1$ for positive integers $x > 0$. We want to compute
$$E\left[\sum_{i=2}^{r} g(X_i)\right] = \sum_{i=2}^{r} E[g(X_i)] = \sum_{i=2}^{r} P\{g(X_i) = 1\} = \sum_{i=2}^{r} P\{X_i \ge 1\}.$$
Let $X$ denote the number of insects caught before the first type 1 catch, as in part (a). Then, since the events $\{X_i \ge 1, X = x\}$ are mutually exclusive and have union $\{X_i \ge 1\}$,
$$P\{X_i \ge 1\} = \sum_{x=0}^{\infty} P\{X_i \ge 1, X = x\} = \sum_{x=0}^{\infty} \left( P\{X = x\} - P\{X_i = 0, X = x\} \right) = \sum_{x=0}^{\infty} \left( (1-P_1)^x P_1 - (1-P_1-P_i)^x P_1 \right)$$
$$= \frac{P_1}{P_1} - \frac{P_1}{P_1 + P_i} = \frac{P_i}{P_1 + P_i},$$
using the formula for a geometric series, twice. Hence
$$E\left[\sum_{i=2}^{r} g(X_i)\right] = \sum_{i=2}^{r} P\{X_i \ge 1\} = \sum_{i=2}^{r} \frac{P_i}{P_1 + P_i}.$$

Problem 24. A bottle initially contains $m$ large pills and $n$ small pills. Each day, a patient randomly chooses one of the pills. If a small pill is chosen, then that pill is eaten. If a large pill is chosen, then the pill is broken in two; one part is returned to the bottle (and is now considered a small pill) and the other part is then eaten.

(a) Let $X$ denote the number of small pills in the bottle after the last large pill has been chosen and its smaller half returned. Find $E[X]$.

Label the small pills initially present 1 through $n$ and label the small pills created by splitting a large one $n+1$ through $n+m$. Let $I_i = 1$ if the $i$-th pill remains after the last large pill is chosen and let $I_i = 0$ otherwise. Then $X = \sum_{i=1}^{m+n} I_i$, so
$$E[X] = \sum_{i=1}^{n+m} E[I_i] = \sum_{i=1}^{n+m} P\{I_i = 1\}.$$
We will calculate $P\{I_i = 1\}$ by considering the two cases $1 \le i \le n$ and $n+1 \le i \le n+m$ separately.

If $1 \le i \le n$, then the $i$-th small pill is initially present. Pretend we keep choosing pills until all of them are gone.
It suffices to consider the order in which the $i$-th pill and the $m$ large pills are chosen. There are $m+1$ of these pills, so the probability that the $i$-th pill is chosen last among them is
$$P\{I_i = 1\} = \frac{1}{m+1}.$$
If $n+1 \le i \le n+m$, then the $i$-th small pill is formed by breaking a large pill in two. It suffices to consider the order in which the large pills and the $i$-th small pill are chosen. Label the large pills 1 through $m$ in the order in which they are initially chosen. Let $J_i$ denote this label for the large pill corresponding to the $i$-th pill; that is, the $i$-th small pill is formed when the $J_i$-th large pill is broken. By conditioning on the value of $J_i$, we get
$$P\{I_i = 1\} = \sum_{j=1}^{m} P(\{I_i = 1\} \mid \{J_i = j\}) P\{J_i = j\}.$$
The probability that the large pill corresponding to the $i$-th small pill is labeled $j$ out of the $m$ large pills is $P\{J_i = j\} = 1/m$. Once the $j$-th large pill is broken to form the $i$-th small pill, $m - j$ large pills and the $i$-th small pill remain, so the probability that the $i$-th small pill is chosen last among these $m - j + 1$ pills is
$$P(\{I_i = 1\} \mid \{J_i = j\}) = \frac{1}{m-j+1}.$$
Hence, letting $k = m - j + 1$,
$$P\{I_i = 1\} = \sum_{j=1}^{m} P(\{I_i = 1\} \mid \{J_i = j\}) P\{J_i = j\} = \sum_{j=1}^{m} \frac{1}{m-j+1} \cdot \frac{1}{m} = \sum_{k=1}^{m} \frac{1}{km}.$$
Therefore
$$E[X] = \sum_{i=1}^{n+m} P\{I_i = 1\} = \sum_{i=1}^{n} P\{I_i = 1\} + \sum_{i=n+1}^{n+m} P\{I_i = 1\} = n \cdot \frac{1}{m+1} + m \sum_{k=1}^{m} \frac{1}{km} = \frac{n}{m+1} + \sum_{k=1}^{m} \frac{1}{k}.$$

(b) Let $Y$ denote the day on which the last large pill is chosen. Find $E[Y]$.

There are a total of $n + 2m$ days. On the $Y$-th day the last large pill is chosen, and for the remaining $X$ days small pills are chosen; here $X$ is the number of small pills in the bottle after the last large pill has been chosen, as in part (a). Thus $X + Y = n + 2m$, so
$$E[Y] = E[n + 2m - X] = n + 2m - E[X] = n + 2m - \frac{n}{m+1} - \sum_{k=1}^{m} \frac{1}{k}.$$

Problem 26. If $X_1, X_2, \dots, X_n$ are independent and identically distributed random variables having uniform distributions over $(0, 1)$, find

(a) $E[\max(X_1, \dots, X_n)]$.

We want to compute
$$E[\max(X_1, \dots, X_n)] = \int_0^1 \cdots \int_0^1 \max(x_1, \dots, x_n)\,dx_1 \cdots dx_n.$$
We are integrating over the region where $0 \le x_1, x_2, \dots, x_n \le 1$. We can break up this region into the $n!$ regions corresponding to each of the $n!$ possible orderings of $x_1, \dots, x_n$, plus a negligible region of zero volume where at least two of the $x_i$'s are equal. Since the $X_1, \dots, X_n$ are independent and identically distributed, the integrals over each of the $n!$ different regions have the same value, so we need only consider the region where $x_1 < x_2 < \dots < x_n$. We have
$$E[\max(X_1, \dots, X_n)] = n! \int_{x_1 < \dots < x_n} \max(x_1, \dots, x_n)\,dx_1 \cdots dx_n = n! \int_{x_1 < \dots < x_n} x_n\,dx_1 \cdots dx_n$$
since the max in this region is $x_n$. Integrating out one variable at a time,
$$= n! \int_0^1 \int_0^{x_n} \cdots \int_0^{x_2} x_n\,dx_1 \cdots dx_n = n! \int_0^1 \int_0^{x_n} \cdots \int_0^{x_3} x_n x_2\,dx_2 \cdots dx_n = n! \int_0^1 \int_0^{x_n} \cdots \int_0^{x_4} \frac{1}{2} x_n x_3^2\,dx_3 \cdots dx_n$$
$$= \dots = n! \int_0^1 x_n \cdot \frac{x_n^{n-1}}{(n-1)!}\,dx_n = n \int_0^1 x_n^n\,dx_n = \frac{n}{n+1}.$$

(b) $E[\min(X_1, \dots, X_n)]$.

The clever way to do this problem is to note that
$$\min(X_1, \dots, X_n) = 1 - \max(1 - X_1, \dots, 1 - X_n).$$
Taking the expectation of both sides yields
$$E[\min(X_1, \dots, X_n)] = 1 - E[\max(1 - X_1, \dots, 1 - X_n)] = 1 - \frac{n}{n+1} = \frac{1}{n+1}$$
by part (a), since each $1 - X_i$ has the same distribution as $X_i$.

Alternatively, as in part (a), we can express $E[\min(X_1, \dots, X_n)]$ as a definite integral and then break this integral up into a sum of $n!$ integrals with equal values, corresponding to each of the $n!$ orderings of $x_1, \dots, x_n$. We obtain
$$E[\min(X_1, \dots, X_n)] = n! \int_{x_1 < \dots < x_n} \min(x_1, \dots, x_n)\,dx_1 \cdots dx_n = n! \int_{x_1 < \dots < x_n} x_1\,dx_1 \cdots dx_n$$
since the min in this region is $x_1$. Integrating out one variable at a time,
$$= n! \int_0^1 \int_0^{x_n} \cdots \int_0^{x_2} x_1\,dx_1 \cdots dx_n = n! \int_0^1 \int_0^{x_n} \cdots \int_0^{x_3} \frac{1}{2} x_2^2\,dx_2 \cdots dx_n = \dots = n! \cdot \frac{1}{n!} \int_0^1 x_n^n\,dx_n = \frac{1}{n+1}.$$

Chapter 7 Theoretical Exercises (pages 380-384)

Problem 14. For Example 2i, show that the variance of the number of coupons needed to amass a full set is equal to
$$\sum_{i=1}^{N-1} \frac{iN}{(N-i)^2}.$$
When $N$ is large, this can be shown to be approximately equal (in the sense that their ratio approaches 1 as $N \to \infty$) to $N^2 \pi^2 / 6$.

Recall from Example 2i on page 303 that we can write $X = X_0 + X_1 + \dots + X_{N-1}$. Since the $X_i$ are independent (and $X_0 = 1$ is constant, so it contributes no variance), we have
$$\mathrm{Var}(X) = \sum_{i=1}^{N-1} \mathrm{Var}(X_i).$$
Recall
$$P\{X_i = k\} = \left(\frac{i}{N}\right)^{k-1} \frac{N-i}{N} \quad \text{for } k \ge 1,$$
and $E[X_i] = N/(N-i)$. We compute
$$E[X_i^2] = \frac{N-i}{N} \sum_{k=1}^{\infty} k^2 \left(\frac{i}{N}\right)^{k-1} = \frac{N-i}{N} \cdot \frac{1 + i/N}{(1 - i/N)^3} = \frac{N(N+i)}{(N-i)^2},$$
where the second equality follows since
$$\sum_{k=1}^{\infty} k^2 x^{k-1} = \frac{1+x}{(1-x)^3} \quad \text{for } |x| < 1.$$
Hence
$$\mathrm{Var}(X_i) = E[X_i^2] - E[X_i]^2 = \frac{N(N+i)}{(N-i)^2} - \frac{N^2}{(N-i)^2} = \frac{iN}{(N-i)^2},$$
and
$$\mathrm{Var}(X) = \sum_{i=1}^{N-1} \mathrm{Var}(X_i) = \sum_{i=1}^{N-1} \frac{iN}{(N-i)^2}.$$
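The closed form can be checked against a direct simulation of the coupon-collecting process (a sketch, not part of the original solution; function names and trial counts are our own choices):

```python
import random

def coupon_var_exact(N):
    """The variance formula derived above: sum_{i=1}^{N-1} i*N/(N-i)^2."""
    return sum(i * N / (N - i) ** 2 for i in range(1, N))

def coupon_var_simulated(N, trials=20_000, seed=2):
    """Sample variance of the number of draws needed to collect all N coupons."""
    rng = random.Random(seed)
    draws = []
    for _ in range(trials):
        seen = set()
        count = 0
        while len(seen) < N:
            seen.add(rng.randrange(N))  # each coupon type equally likely
            count += 1
        draws.append(count)
    mean = sum(draws) / trials
    return sum((d - mean) ** 2 for d in draws) / (trials - 1)
```

For $N = 2$ the formula gives exactly 2 (the variance of a geometric waiting time with success probability $1/2$), and for moderate $N$ the simulated variance should agree with the formula to within a few percent.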