Final – Practice Exam

Question #1: If $X_1$ and $X_2$ denote a random sample of size $n = 2$ from the population with density function $f_X(x) = \frac{1}{x^2}\mathbf{1}\{x > 1\}$, then a) find the joint density of $U = X_1 X_2$ and $V = X_2$, and b) find the marginal probability density function of $U$.

a) By the independence of the two random variables $X_1$ and $X_2$, their joint density is given by $f_{X_1 X_2}(x_1, x_2) = f_{X_1}(x_1) f_{X_2}(x_2) = x_1^{-2} x_2^{-2} = (x_1 x_2)^{-2}$ whenever $x_1 > 1$ and $x_2 > 1$. Then $u = x_1 x_2$ and $v = x_2$ imply that $x_2 = v$ and $x_1 = \frac{u}{v}$, so we can compute the Jacobian as $J = \det\begin{bmatrix} \frac{\partial x_1}{\partial u} & \frac{\partial x_1}{\partial v} \\ \frac{\partial x_2}{\partial u} & \frac{\partial x_2}{\partial v} \end{bmatrix} = \det\begin{bmatrix} \frac{1}{v} & -\frac{u}{v^2} \\ 0 & 1 \end{bmatrix} = \frac{1}{v}$. This implies that the joint density of $U$ and $V$ is $f_{UV}(u, v) = f_{X_1 X_2}\left(\frac{u}{v}, v\right)|J| = \left(\frac{u}{v} \cdot v\right)^{-2}\frac{1}{v} = \frac{1}{u^2 v}$ whenever $v > 1$ and $\frac{u}{v} > 1$, that is, $u > v > 1$. Note that although the density factors as $u^{-2} \cdot v^{-1}$, the support $\{(u, v) : 1 < v < u\}$ is not a product set, so $U$ and $V$ are not independent.

b) We have $f_U(u) = \int_1^u f_{UV}(u, v)\,dv = \int_1^u \frac{1}{u^2 v}\,dv = \frac{1}{u^2}\left[\ln(v)\right]_1^u = \frac{\ln(u)}{u^2}$ if $u > 1$.
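As a quick numerical sanity check of part (b), the Python sketch below simulates the setup: it draws $X_1, X_2$ by inverting the CDF $F_X(x) = 1 - 1/x$ and compares the empirical CDF of $U = X_1 X_2$ with the analytic CDF $F_U(u_0) = 1 - \frac{1 + \ln(u_0)}{u_0}$ obtained by integrating $\ln(u)/u^2$ over $(1, u_0]$. The sample size, seed, and evaluation points are arbitrary illustrative choices, not part of the exam solution.

```python
# Monte Carlo check of Question #1(b): f_U(u) = ln(u)/u^2 for u > 1.
import numpy as np

rng = np.random.default_rng(0)
n_sim = 1_000_000  # illustrative sample size

# Inverse-CDF sampling: F_X(x) = 1 - 1/x for x > 1, so X = 1/(1 - W) with W ~ Unif(0, 1).
x1 = 1.0 / (1.0 - rng.uniform(size=n_sim))
x2 = 1.0 / (1.0 - rng.uniform(size=n_sim))
u = x1 * x2

def cdf_u(u0):
    # Analytic CDF of U obtained by integrating ln(u)/u^2 from 1 to u0.
    return 1.0 - (1.0 + np.log(u0)) / u0

for u0 in [2.0, 5.0, 20.0]:
    print(u0, np.mean(u <= u0), cdf_u(u0))  # u0, empirical CDF, analytic CDF
```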
Question #2: Find the Maximum Likelihood Estimators (MLE) of $\theta$ and $\alpha$ based on a random sample $X_1, \dots, X_n$ from $f_X(x; \theta, \alpha) = \frac{\alpha}{\theta^\alpha} x^{\alpha - 1}\mathbf{1}\{0 \le x \le \theta\}$ where $\theta > 0$ and $\alpha > 0$.

• We first note that the likelihood function is given by $L(\theta, \alpha) = \prod_{i=1}^n f_X(x_i; \theta, \alpha) = \prod_{i=1}^n \alpha\theta^{-\alpha} x_i^{\alpha - 1}\mathbf{1}\{0 \le x_i \le \theta\} = \alpha^n \theta^{-n\alpha}\left(\prod_{i=1}^n x_i\right)^{\alpha - 1}\mathbf{1}\{0 \le x_{1:n}\}\mathbf{1}\{x_{n:n} \le \theta\}$. Then by inspection, we see that $\hat\theta_{MLE} = X_{n:n}$. To find the MLE of $\alpha$, we compute $\ln[L(\theta, \alpha)] = n\ln(\alpha) - n\alpha\ln(\theta) + (\alpha - 1)\sum_{i=1}^n \ln(x_i)$ so that the derivative with respect to $\alpha$ is $\frac{\partial}{\partial\alpha}\ln[L(\theta, \alpha)] = \frac{n}{\alpha} - n\ln(\theta) + \sum_{i=1}^n \ln(x_i) = 0 \rightarrow \alpha = \frac{n}{n\ln(\theta) - \sum_{i=1}^n \ln(x_i)}$. Since the second derivative $-\frac{n}{\alpha^2}$ is negative, this critical point is a maximum, so $\hat\alpha_{MLE} = \frac{n}{n\ln(X_{n:n}) - \sum_{i=1}^n \ln(X_i)}$.

Question #3: If $X_1, \dots, X_n$ is a random sample from $N(\mu, \sigma^2)$, then a) find a UMVUE for $\mu$ when $\sigma^2$ is known, and b) find a UMVUE for $\sigma^2$ when $\mu$ is known.

a) We have previously shown that the Normal distribution is a member of the Regular Exponential Class (REC), where $S_1 = \sum_{i=1}^n X_i$ and $S_2 = \sum_{i=1}^n X_i^2$ are jointly complete sufficient statistics for the unknown parameters $\mu$ and $\sigma^2$. Since $E(S_1) = E\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n E(X_i) = \sum_{i=1}^n \mu = n\mu$, the statistic $T = \frac{1}{n}S_1 = \frac{1}{n}\sum_{i=1}^n X_i = \bar X$ will be unbiased for the unknown parameter $\mu$. The Lehmann-Scheffé Theorem then guarantees that $T = \bar X$ is a UMVUE for $\mu$ when $\sigma^2$ is known.

b) Since $E(S_2) = E\left(\sum_{i=1}^n X_i^2\right) = \sum_{i=1}^n E(X_i^2) = \sum_{i=1}^n \left[Var(X_i) + E(X_i)^2\right] = \sum_{i=1}^n (\sigma^2 + \mu^2) = n(\sigma^2 + \mu^2)$, the statistic $T = \frac{1}{n}S_2 - \mu^2$ will be unbiased for the unknown parameter $\sigma^2$ when $\mu$ is known. The Lehmann-Scheffé Theorem states that $T$ is a UMVUE for $\sigma^2$.

Question #4: Consider a random sample $X_1, \dots, X_n$ from $f_X(x; \theta, \alpha) = \frac{\alpha}{\theta^\alpha}x^{\alpha - 1}\mathbf{1}\{0 \le x \le \theta\}$, where $\theta > 0$ is unknown but $\alpha > 0$ is known. Find the constants $1 < a < b$ depending on the values of $n$ and $\alpha$ such that $(aX_{n:n}, bX_{n:n})$ is a 90% confidence interval for $\theta$.

• We first compute the CDF of the population, $F_X(x; \theta, \alpha) = \int_0^x f_X(t; \theta, \alpha)\,dt = \frac{\alpha}{\theta^\alpha}\int_0^x t^{\alpha - 1}\,dt = \frac{\alpha}{\theta^\alpha}\left[\frac{1}{\alpha}t^\alpha\right]_0^x = \left(\frac{x}{\theta}\right)^\alpha$ whenever $0 \le x \le \theta$. Then we use this to compute the CDF of the largest order statistic, $F_{X_{n:n}}(x) = P(X_{n:n} \le x) = P(X_i \le x)^n = \left(\frac{x}{\theta}\right)^{n\alpha}$. In order for $(aX_{n:n}, bX_{n:n})$ to be a 90% confidence interval for $\theta$, it must be the case that $P(aX_{n:n} \ge \theta) = 0.05$ and $P(bX_{n:n} \le \theta) = 0.05$. We solve each of these two equations for the unknown constants, so $P(aX_{n:n} \ge \theta) = 0.05 \rightarrow P\left(X_{n:n} \ge \frac{\theta}{a}\right) = 0.05 \rightarrow P\left(X_{n:n} \le \frac{\theta}{a}\right) = 0.95 \rightarrow F_{X_{n:n}}\left(\frac{\theta}{a}\right) = 0.95 \rightarrow \left(\frac{\theta/a}{\theta}\right)^{n\alpha} = 0.95 \rightarrow \frac{1}{a^{n\alpha}} = 0.95 \rightarrow a = \left(\frac{1}{0.95}\right)^{1/(n\alpha)} = \left(\frac{20}{19}\right)^{1/(n\alpha)}$. Similarly, $P(bX_{n:n} \le \theta) = 0.05 \rightarrow F_{X_{n:n}}\left(\frac{\theta}{b}\right) = 0.05 \rightarrow \frac{1}{b^{n\alpha}} = 0.05$, so we find that $b = (20)^{1/(n\alpha)}$.
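The constants found in Question #4 can be checked by simulation. The Python sketch below draws samples by inverting $F_X(x) = (x/\theta)^\alpha$ and estimates the coverage probability of $(aX_{n:n}, bX_{n:n})$; the particular values of $\theta$, $\alpha$, $n$, the seed, and the replication count are arbitrary assumptions made for illustration.

```python
# Simulation sketch of Question #4: coverage of (a*X_max, b*X_max) with
# a = (20/19)^(1/(n*alpha)) and b = 20^(1/(n*alpha)).
import numpy as np

rng = np.random.default_rng(1)
theta, alpha, n, reps = 2.5, 3.0, 10, 100_000  # illustrative values

a = (20 / 19) ** (1 / (n * alpha))
b = 20 ** (1 / (n * alpha))

# Inverse-CDF sampling: F(x) = (x/theta)^alpha on [0, theta], so X = theta * W^(1/alpha).
x = theta * rng.uniform(size=(reps, n)) ** (1 / alpha)
x_max = x.max(axis=1)

coverage = np.mean((a * x_max < theta) & (theta < b * x_max))
print(coverage)  # should be close to 0.90
```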
Question #5: If $X_1$ and $X_2$ are independent and identically distributed from $EXP(\theta)$ such that their density is $f_X(x) = \frac{1}{\theta}e^{-x/\theta}\mathbf{1}\{x > 0\}$, then find the joint density of $U = X_1$ and $V = \frac{X_2}{X_1}$.

• The transformation $u = x_1$ and $v = \frac{x_2}{x_1}$ inverts to $x_1 = u$ and $x_2 = uv$, so we have $f_{UV}(u, v) = f_{X_1 X_2}(u, uv)\left|\det\begin{bmatrix} 1 & 0 \\ v & u \end{bmatrix}\right| = \left[\frac{1}{\theta}e^{-u/\theta}\right]\left[\frac{1}{\theta}e^{-uv/\theta}\right]u = \frac{u}{\theta^2}e^{-u(1+v)/\theta}$ whenever we have that $u > 0$ and $uv > 0 \rightarrow v > 0$.

Question #6: Let $X_1, \dots, X_n$ be a random sample from $WEI(\theta, \beta)$ with known $\beta$ such that their common density is $f_X(x; \theta, \beta) = \frac{\beta}{\theta^\beta}x^{\beta - 1}e^{-(x/\theta)^\beta}\mathbf{1}\{x > 0\}$. Find the MLE of $\theta$.

• We have $L(\theta, \beta) = \prod_{i=1}^n \frac{\beta}{\theta^\beta}x_i^{\beta - 1}e^{-(x_i/\theta)^\beta} = \left(\beta\theta^{-\beta}\right)^n\left(\prod_{i=1}^n x_i\right)^{\beta - 1}e^{-\frac{1}{\theta^\beta}\sum_{i=1}^n x_i^\beta}$ so that $\ln[L(\theta, \beta)] = n\ln(\beta) - n\beta\ln(\theta) + (\beta - 1)\sum_{i=1}^n \ln(x_i) - \frac{1}{\theta^\beta}\sum_{i=1}^n x_i^\beta$. Then we can calculate that $\frac{\partial}{\partial\theta}\ln[L(\theta, \beta)] = -\frac{n\beta}{\theta} + \frac{\beta}{\theta^{\beta+1}}\sum_{i=1}^n x_i^\beta = 0 \rightarrow \frac{n\beta}{\theta} = \frac{\beta}{\theta^{\beta+1}}\sum_{i=1}^n x_i^\beta \rightarrow \theta^\beta = \frac{1}{n}\sum_{i=1}^n x_i^\beta$. Therefore, we have found $\hat\theta_{MLE} = \left(\frac{1}{n}\sum_{i=1}^n X_i^\beta\right)^{1/\beta}$.

Question #7: Use the Cramér-Rao Lower Bound to show that $\bar X$ is a UMVUE for $\theta$ based on a random sample of size $n$ from an $EXP(\theta)$ distribution where $f_X(x; \theta) = \frac{1}{\theta}e^{-x/\theta}\mathbf{1}\{x > 0\}$.

• We first find the Cramér-Rao Lower Bound; since $\tau(\theta) = \theta$, we have $[\tau'(\theta)]^2 = 1$. Then we have $\ln f(x; \theta) = -\ln(\theta) - \frac{x}{\theta}$ so that $\frac{\partial}{\partial\theta}\ln f(x; \theta) = -\frac{1}{\theta} + \frac{x}{\theta^2}$ and $\frac{\partial^2}{\partial\theta^2}\ln f(x; \theta) = \frac{1}{\theta^2} - \frac{2x}{\theta^3}$. Finally, $-E\left[\frac{\partial^2}{\partial\theta^2}\ln f(X; \theta)\right] = -\frac{1}{\theta^2} + \frac{2}{\theta^3}E(X) = \frac{2\theta}{\theta^3} - \frac{1}{\theta^2} = \frac{1}{\theta^2}$ implies that $CRLB = \frac{\theta^2}{n}$. To verify that $\bar X$ is a UMVUE for $\theta$, we first show that $E(\bar X) = E\left(\frac{1}{n}\sum_{i=1}^n X_i\right) = \frac{1}{n}\sum_{i=1}^n E(X_i) = \frac{1}{n}n\theta = \theta$. Then we check that the variance achieves the Cramér-Rao Lower Bound from above, so $Var(\bar X) = Var\left(\frac{1}{n}\sum_{i=1}^n X_i\right) = \frac{1}{n^2}\sum_{i=1}^n Var(X_i) = \frac{1}{n^2}n\theta^2 = \frac{\theta^2}{n} = CRLB$. Thus, $\bar X$ is a UMVUE for $\theta$.

Question #8: Use the Lehmann-Scheffé Theorem to show that $\bar X$ is a UMVUE for $\theta$ based on a random sample of size $n$ from an $EXP(\theta)$ distribution where $f_X(x; \theta) = \frac{1}{\theta}e^{-x/\theta}\mathbf{1}\{x > 0\}$.

• We know that $EXP(\theta)$ is a member of the Regular Exponential Class (REC) with $t_1(x) = x$, so the statistic $S = \sum_{i=1}^n t_1(X_i) = \sum_{i=1}^n X_i$ is complete sufficient for the parameter $\theta$. Then $E(S) = E\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n E(X_i) = n\theta$ implies that the estimator $T = \frac{1}{n}S = \frac{1}{n}\sum_{i=1}^n X_i = \bar X$ is unbiased for $\theta$, so the Lehmann-Scheffé Theorem guarantees that it will be the Uniform Minimum Variance Unbiased Estimator of $\theta$.

Question #9: Find a $100(1 - \alpha)\%$ confidence interval for $\theta$ based on a random sample of size $n$ from an $EXP(\theta)$ distribution where $f_X(x) = \frac{1}{\theta}e^{-x/\theta}\mathbf{1}\{x > 0\}$. Use the facts that $EXP(\theta) \equiv GAMMA(\theta, 1)$, that $\theta$ is a scale parameter, and that $GAMMA(2, n) \equiv \chi^2(2n)$.

• Since $\theta$ is a scale parameter, we know that $Q = \frac{\hat\theta_{MLE}}{\theta} = \frac{\bar X}{\theta} = \frac{\sum_{i=1}^n X_i}{n\theta}$ will be a pivotal quantity. Note that we used the fact that $\hat\theta_{MLE} = \bar X$, which was derived in a previous exercise. We begin by noting that since each $X_i \sim EXP(\theta) \equiv GAMMA(\theta, 1)$, we can conclude that $A = \sum_{i=1}^n X_i \sim GAMMA(\theta, n)$. In order to obtain a chi-square distributed pivotal quantity, we use the modified pivot $Q = \frac{2n\bar X}{\theta} = \frac{2}{\theta}\sum_{i=1}^n X_i$. To verify its distribution, we use the CDF technique, so $F_Q(q) = P(Q \le q) = P\left(\frac{2}{\theta}\sum_{i=1}^n X_i \le q\right) = P\left(\sum_{i=1}^n X_i \le \frac{\theta q}{2}\right) = F_A\left(\frac{\theta q}{2}\right)$, which implies that $f_Q(q) = \frac{d}{dq}F_A\left(\frac{\theta q}{2}\right) = f_A\left(\frac{\theta q}{2}\right)\frac{\theta}{2} = \frac{1}{\theta^n\Gamma(n)}\left(\frac{\theta q}{2}\right)^{n-1}e^{-q/2}\,\frac{\theta}{2} = \frac{1}{2^n\Gamma(n)}q^{n-1}e^{-q/2}$. This proves that $Q = \frac{2}{\theta}\sum_{i=1}^n X_i = \frac{2n\bar X}{\theta} \sim \chi^2(2n)$, so $P\left[\chi^2_{\alpha/2}(2n) < \frac{2n\bar X}{\theta} < \chi^2_{1 - \alpha/2}(2n)\right] = 1 - \alpha \rightarrow P\left[\frac{2n\bar X}{\chi^2_{1-\alpha/2}(2n)} < \theta < \frac{2n\bar X}{\chi^2_{\alpha/2}(2n)}\right] = 1 - \alpha$ is the desired $100(1 - \alpha)\%$ confidence interval for the unknown parameter $\theta$.
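The chi-square interval from Question #9 can likewise be verified numerically. The Python sketch below (assuming SciPy is available for the $\chi^2$ quantiles) estimates the coverage of $\left(\frac{2n\bar X}{\chi^2_{1-\alpha/2}(2n)}, \frac{2n\bar X}{\chi^2_{\alpha/2}(2n)}\right)$ for exponential data; the chosen $\theta$, $n$, $\alpha$, seed, and replication count are illustrative only.

```python
# Simulation sketch of Question #9: coverage of the chi-square pivot interval for theta.
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(2)
theta, n, alpha, reps = 4.0, 15, 0.10, 50_000  # illustrative values

lo_q = chi2.ppf(alpha / 2, df=2 * n)        # chi^2_{alpha/2}(2n)
hi_q = chi2.ppf(1 - alpha / 2, df=2 * n)    # chi^2_{1-alpha/2}(2n)

x = rng.exponential(scale=theta, size=(reps, n))
s = x.sum(axis=1)                           # 2*s/theta ~ chi^2(2n)

lower = 2 * s / hi_q
upper = 2 * s / lo_q
print(np.mean((lower < theta) & (theta < upper)))  # should be close to 1 - alpha = 0.90
```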
Question #10: Let $X_1, \dots, X_n$ be independent and identically distributed from the population with CDF given by $F_X(x) = \frac{1}{1 + e^{-x}}$. Find the limiting distribution of $Y_n = X_{1:n} + \ln(n)$.

• We have that $F_{Y_n}(y) = P(Y_n \le y) = P(X_{1:n} + \ln(n) \le y) = P(X_{1:n} \le y - \ln(n)) = 1 - P(X_{1:n} > y - \ln(n)) = 1 - P(X_i > y - \ln(n))^n = 1 - \left(1 - F_X(y - \ln(n))\right)^n = 1 - \left(1 - \frac{1}{1 + e^{-y + \ln(n)}}\right)^n = 1 - \left(\frac{e^{\ln(n) - y}}{1 + e^{\ln(n) - y}}\right)^n = 1 - \left(\frac{ne^{-y}}{1 + ne^{-y}}\right)^n = 1 - \left(\frac{1}{1 + \frac{e^y}{n}}\right)^n = 1 - \left(1 + \frac{e^y}{n}\right)^{-n}$. Then $\lim_{n\to\infty}F_{Y_n}(y) = 1 - \lim_{n\to\infty}\left(1 + \frac{e^y}{n}\right)^{-n} = 1 - e^{-e^y}$ is the limiting distribution, since $\lim_{n\to\infty}\left(1 + \frac{a}{n}\right)^{bn} = e^{ab}$ for all real numbers $a$ and $b$.

Question #11: Let $X_1, \dots, X_n$ be independent and identically distributed from the population with a PDF given by $f_X(x) = \frac{1}{2}\mathbf{1}\{-1 < x < 1\}$. Approximate $P\left(\sum_{i=1}^{100} X_i \le 0\right)$ using $\Phi(z)$.

• From the given density, we know that $X \sim UNIF(-1, 1)$ so $E(X) = 0$ and $Var(X) = \frac{1}{3}$. Then if we define $S = \sum_{i=1}^{100} X_i$, we have that $E(S) = \sum_{i=1}^{100} E(X_i) = 0$ and $Var(S) = \sum_{i=1}^{100} Var(X_i) = \frac{100}{3}$. These facts allow us to compute $P\left(\sum_{i=1}^{100} X_i \le 0\right) = P(S \le 0) = P\left(\frac{S - 0}{\sqrt{100/3}} \le \frac{0 - 0}{\sqrt{100/3}}\right) \approx P(Z \le 0) = \Phi(0) = \frac{1}{2}$.

Question #12: Suppose that $X$ is a random variable with density $f_X(x) = \frac{1}{2}e^{-|x|}$ for $x \in \mathbb{R}$. Compute the Moment Generating Function (MGF) of $X$ and use it to find $E(X^2)$.

• By definition, we have $M_X(t) = E(e^{tX}) = \int_{-\infty}^\infty e^{tx}f_X(x)\,dx = \frac{1}{2}\int_{-\infty}^\infty e^{tx}e^{-|x|}\,dx = \frac{1}{2}\int_0^\infty e^{tx}e^{-x}\,dx + \frac{1}{2}\int_{-\infty}^0 e^{tx}e^{x}\,dx = \frac{1}{2}\int_0^\infty e^{x(t-1)}\,dx + \frac{1}{2}\int_{-\infty}^0 e^{x(t+1)}\,dx$. After integrating and collecting like terms, we obtain $M_X(t) = \frac{1}{2}\left[(t+1)^{-1} - (t-1)^{-1}\right]$. We then compute $\frac{d}{dt}M_X(t) = \frac{1}{2}\left[-(t+1)^{-2} + (t-1)^{-2}\right]$ and $\frac{d^2}{dt^2}M_X(t) = \frac{1}{2}\left[\frac{2}{(t+1)^3} - \frac{2}{(t-1)^3}\right]$ so that $E(X^2) = \frac{d^2}{dt^2}M_X(0) = \frac{1}{2}\left[\frac{2}{(0+1)^3} - \frac{2}{(0-1)^3}\right] = \frac{1}{2}(2 + 2) = 2$.

Question #13: Let $X$ be a random variable with density $f_X(x) = \frac{1}{4}\mathbf{1}\{-2 < x < 2\}$. Compute the probability density function of the transformed random variable $Y = X^3$.

• We have $F_Y(y) = P(Y \le y) = P(X^3 \le y) = P\left(X \le y^{1/3}\right) = F_X\left(y^{1/3}\right)$ so that $f_Y(y) = \frac{d}{dy}F_X\left(y^{1/3}\right) = f_X\left(y^{1/3}\right)\frac{1}{3}y^{-2/3} = \frac{1}{4}\cdot\frac{1}{3}y^{-2/3} = \frac{1}{12y^{2/3}}$ if $-2 < y^{1/3} < 2 \rightarrow -8 < y < 8$.

Question #14: Let $X_1, \dots, X_n$ be a random sample from the density $f_X(x) = \theta x^{\theta - 1}$ whenever $0 < x < 1$ and zero otherwise. Find a complete sufficient statistic for the parameter $\theta$.

• We begin by verifying that the density is a member of the Regular Exponential Class (REC) by showing that $f_X(x) = \exp\{\ln(\theta x^{\theta - 1})\} = \exp\{\ln(\theta) + (\theta - 1)\ln(x)\} = \theta\exp\{(\theta - 1)\ln(x)\}$, where $q_1(\theta) = \theta - 1$ and $t_1(x) = \ln(x)$. Then we know that the statistic $S = \sum_{i=1}^n t_1(X_i) = \sum_{i=1}^n \ln(X_i)$ is complete sufficient for the parameter $\theta$.

Question #15: Let $X_1, \dots, X_n$ be a random sample from the density $f_X(x) = \theta x^{\theta - 1}$ whenever $0 < x < 1$ and zero otherwise. Find the Maximum Likelihood Estimator (MLE) for $\theta > 0$.

• We have $L(\theta) = \prod_{i=1}^n f_X(x_i) = \prod_{i=1}^n \theta x_i^{\theta - 1} = \theta^n\left(\prod_{i=1}^n x_i\right)^{\theta - 1}$ so that $\ln[L(\theta)] = n\ln(\theta) + (\theta - 1)\sum_{i=1}^n \ln(x_i)$ and $\frac{d}{d\theta}\ln[L(\theta)] = \frac{n}{\theta} + \sum_{i=1}^n \ln(x_i) = 0 \rightarrow \theta = \frac{-n}{\sum_{i=1}^n \ln(x_i)}$. Thus, we have found that the Maximum Likelihood Estimator is $\hat\theta_{MLE} = \frac{-n}{\sum_{i=1}^n \ln(X_i)}$.

Question #16: Let $X_1, \dots, X_n$ be a random sample from the density $f_X(x) = \theta x^{\theta - 1}$ whenever $0 < x < 1$ and zero otherwise. Find the Method of Moments Estimator (MME) for $\theta > 0$.

• We have $E(X) = \int_0^1 x f_X(x)\,dx = \int_0^1 x\theta x^{\theta - 1}\,dx = \theta\int_0^1 x^\theta\,dx = \frac{\theta}{\theta + 1}\left[x^{\theta + 1}\right]_0^1 = \frac{\theta}{\theta + 1}$, so setting $\mu_1' = m_1' \rightarrow \frac{\theta}{\theta + 1} = \bar X \rightarrow \theta(1 - \bar X) = \bar X$, and the desired estimator is $\hat\theta_{MME} = \frac{\bar X}{1 - \bar X}$.
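For Questions #15 and #16, a short simulation can contrast the two estimators. The Python sketch below draws samples by inverting $F_X(x) = x^\theta$ and compares the empirical mean and spread of $\hat\theta_{MLE} = -n/\sum\ln(X_i)$ and $\hat\theta_{MME} = \bar X/(1 - \bar X)$; the value of $\theta$, the sample size, the seed, and the replication count are arbitrary choices made for illustration.

```python
# Sketch comparing the MLE (Question #15) and MME (Question #16) for theta
# under f(x; theta) = theta * x^(theta - 1) on (0, 1).
import numpy as np

rng = np.random.default_rng(3)
theta, n, reps = 2.0, 50, 20_000  # illustrative values

# Inverse-CDF sampling: F(x) = x^theta on (0, 1), so X = W^(1/theta).
x = rng.uniform(size=(reps, n)) ** (1 / theta)

mle = -n / np.log(x).sum(axis=1)   # -n / sum(ln X_i)
xbar = x.mean(axis=1)
mme = xbar / (1 - xbar)            # xbar / (1 - xbar)

print("MLE mean/var:", mle.mean(), mle.var())  # compare the center and spread
print("MME mean/var:", mme.mean(), mme.var())  # of the two estimators
```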