Homework 3 – STAT 543

On campus: Due Friday, February 2, by 5:00 pm (TA's office); you may also turn in the assignment in class on the same Friday.
Distance students: Due Wednesday, February 7, by 12:00 pm (TA's email).

1. Suppose that the random variables Y_1, ..., Y_n satisfy

   Y_i = βx_i + ε_i,  i = 1, ..., n,

   where x_1, ..., x_n are fixed constants and ε_1, ..., ε_n are iid N(0, σ²); here we assume σ² > 0 is known.

   (a) Find the MLE of β.
   (b) Find the distribution of the MLE.
   (c) Find the CRLB for estimating β. (Hint: you will have to work with the joint distribution f(y_1, ..., y_n | β) directly, since Y_1, ..., Y_n are not iid.)
   (d) Show that the MLE is the UMVUE of β.

2. Problem 7.40, Casella & Berger. (You may assume 0 < p < 1 is the parameter space.)

3. Problem 7.41, Casella & Berger. Hint: by Jensen's inequality, (∑_{i=1}^n a_i / n)² ≤ ∑_{i=1}^n a_i² / n.

4. Prove that for an estimator T of a parameter function γ(θ), it holds that

   MSE_θ(T) = Var_θ(T) + [b_θ(T)]²,

   where MSE_θ(T) = E_θ{[T − γ(θ)]²} is the mean-squared error of T and b_θ(T) = E_θ(T) − γ(θ) is the bias of T.

5. Problem 7.49, Casella & Berger.

6. Let X_1, ..., X_n be iid Bernoulli(θ), θ ∈ (0, 1). Find the Bayes estimator of θ with respect to the uniform(0, 1) prior under the loss function

   L(t, θ) = (t − θ)² / [θ(1 − θ)].
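
Optional, ungraded sanity check for Problem 4: the short Python sketch below simulates a deliberately biased estimator and compares its Monte Carlo MSE with Var + bias². It only illustrates the stated identity numerically and is not a substitute for the proof; the specific choices (estimating γ(θ) = θ from N(θ, 1) data with T = sample mean + 0.3, n = 25) are arbitrary placeholders, not part of the assignment.

```python
# Numerical illustration of MSE_theta(T) = Var_theta(T) + [b_theta(T)]^2.
# All concrete values below (theta, offset, n, n_reps) are placeholders.
import numpy as np

rng = np.random.default_rng(0)

theta = 1.0        # true parameter; here gamma(theta) = theta
offset = 0.3       # deliberate bias added to the sample mean
n, n_reps = 25, 200_000

samples = rng.normal(loc=theta, scale=1.0, size=(n_reps, n))
T = samples.mean(axis=1) + offset          # biased estimator T of theta

mse = np.mean((T - theta) ** 2)            # E_theta{[T - gamma(theta)]^2}
var = np.var(T)                            # Var_theta(T)
bias = np.mean(T) - theta                  # b_theta(T) = E_theta(T) - gamma(theta)

print(f"Monte Carlo MSE : {mse:.5f}")
print(f"Var + bias^2    : {var + bias**2:.5f}")   # should agree with the MSE
```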