Econ0701-2280 (A/B) Introductory Econometrics
Tutorial Problem Set 3 Answer Key
September 28, 2019

1 Question 1 (Properties of the BLUE estimator)

For the simple linear regression model $y = \beta_0 + \beta_1 x + u$, we can obtain the OLS estimators $\hat{\beta}_0$ and $\hat{\beta}_1$, where $\hat{\beta}_0$ and $\hat{\beta}_1$ estimate $\beta_0$ and $\beta_1$ respectively. Please prove:

1. $\hat{\beta}_1 = \dfrac{\sum_{i=1}^{n} x_i (y_i - \bar{y})}{\sum_{i=1}^{n} x_i (x_i - \bar{x})}$, and so $\hat{\beta}_1 = \dfrac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}$.

Proof. We first derive $\hat{\beta}_1$. The simple linear regression model
$$y = \beta_0 + \beta_1 x + u$$
suggests
$$u = y - \beta_0 - \beta_1 x.$$
We assume $E(u) = 0$ and $E(u \mid x) = 0$ (see footnote 1). The assumption $E(u) = 0$ suggests
$$E(y - \beta_0 - \beta_1 x) = 0, \qquad (1)$$
and the assumption $E(u \mid x) = 0$ implies $E(ux) = 0$, which further suggests
$$E\big(x (y - \beta_0 - \beta_1 x)\big) = 0. \qquad (2)$$

Footnote 1: As long as the intercept $\beta_0$ is included in the equation, it is without loss of generality to assume $E(u) = 0$.

Given a sample of data, we choose $\hat{\beta}_0$ and $\hat{\beta}_1$ to solve the sample counterparts of (1) and (2), i.e.,
$$\frac{1}{n} \sum_{i=1}^{n} \big(y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i\big) = 0, \qquad (3)$$
$$\frac{1}{n} \sum_{i=1}^{n} x_i \big(y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i\big) = 0. \qquad (4)$$

Equation (3) can be rewritten as
$$\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}. \qquad (5)$$

Plugging (5) into (4),
$$\sum_{i=1}^{n} x_i \big(y_i - \bar{y} + \hat{\beta}_1 \bar{x} - \hat{\beta}_1 x_i\big) = 0
\iff \sum_{i=1}^{n} x_i (y_i - \bar{y}) - \hat{\beta}_1 \sum_{i=1}^{n} x_i (x_i - \bar{x}) = 0
\iff \hat{\beta}_1 = \frac{\sum_{i=1}^{n} x_i (y_i - \bar{y})}{\sum_{i=1}^{n} x_i (x_i - \bar{x})}.$$

Moreover,
$$\sum_{i=1}^{n} x_i (y_i - \bar{y}) = \sum_{i=1}^{n} x_i y_i - \bar{y} \sum_{i=1}^{n} x_i = \sum_{i=1}^{n} x_i y_i - n \bar{y} \bar{x},$$
while
$$\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}) = \sum_{i=1}^{n} x_i y_i - 2 n \bar{x} \bar{y} + n \bar{x} \bar{y} = \sum_{i=1}^{n} x_i y_i - n \bar{y} \bar{x}.$$
That is,
$$\sum_{i=1}^{n} x_i (y_i - \bar{y}) = \sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y}).$$
Likewise,
$$\sum_{i=1}^{n} x_i (x_i - \bar{x}) = \sum_{i=1}^{n} x_i^2 - n \bar{x}^2 = \sum_{i=1}^{n} x_i^2 - 2 \bar{x} \sum_{i=1}^{n} x_i + n \bar{x}^2 = \sum_{i=1}^{n} (x_i - \bar{x})^2.$$
Therefore,
$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n} x_i (y_i - \bar{y})}{\sum_{i=1}^{n} x_i (x_i - \bar{x})} = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}.$$

2. $E\big(\hat{\beta}_1\big) = \beta_1$.

Proof. Using the result of part 1 and the fact that $\sum_{i=1}^{n} (x_i - \bar{x})\bar{y} = 0$,
$$\hat{\beta}_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x}) y_i}{\sum_{i=1}^{n} (x_i - \bar{x})^2}
= \frac{\sum_{i=1}^{n} (x_i - \bar{x}) (\beta_0 + \beta_1 x_i + u_i)}{\sum_{i=1}^{n} (x_i - \bar{x})^2}
= \beta_0 \frac{\sum_{i=1}^{n} (x_i - \bar{x})}{\sum_{i=1}^{n} (x_i - \bar{x})^2}
+ \beta_1 \frac{\sum_{i=1}^{n} (x_i - \bar{x}) x_i}{\sum_{i=1}^{n} (x_i - \bar{x})^2}
+ \frac{\sum_{i=1}^{n} (x_i - \bar{x}) u_i}{\sum_{i=1}^{n} (x_i - \bar{x})^2}
= \beta_1 + \frac{\sum_{i=1}^{n} (x_i - \bar{x}) u_i}{\sum_{i=1}^{n} (x_i - \bar{x})^2},$$
since $\sum_{i=1}^{n} (x_i - \bar{x}) = 0$ and $\sum_{i=1}^{n} (x_i - \bar{x}) x_i = \sum_{i=1}^{n} (x_i - \bar{x})^2$. Treating the $x_i$ as fixed (i.e., taking expectations conditional on $x$),
$$E\big(\hat{\beta}_1\big) = \beta_1 + E\!\left(\frac{\sum_{i=1}^{n} (x_i - \bar{x}) u_i}{\sum_{i=1}^{n} (x_i - \bar{x})^2}\right)
= \beta_1 + \frac{\sum_{i=1}^{n} (x_i - \bar{x}) E(u_i)}{\sum_{i=1}^{n} (x_i - \bar{x})^2} = \beta_1 + 0 = \beta_1.$$

3. $\operatorname{Var}\big(\hat{\beta}_1\big) = \dfrac{\sigma^2}{SST_x}$, where $SST_x = \sum_{i=1}^{n} (x_i - \bar{x})^2$.

Proof. From part 2,
$$\hat{\beta}_1 - \beta_1 = \frac{\sum_{i=1}^{n} (x_i - \bar{x}) u_i}{\sum_{i=1}^{n} (x_i - \bar{x})^2},$$
so, with the $u_i$ homoskedastic and uncorrelated conditional on $x$,
$$\operatorname{Var}\big(\hat{\beta}_1\big) = \operatorname{Var}\big(\hat{\beta}_1 - \beta_1\big)
= \operatorname{Var}\!\left(\frac{\sum_{i=1}^{n} (x_i - \bar{x}) u_i}{\sum_{i=1}^{n} (x_i - \bar{x})^2}\right)
= \frac{\sum_{i=1}^{n} (x_i - \bar{x})^2\, \sigma^2}{\big(\sum_{i=1}^{n} (x_i - \bar{x})^2\big)^2}
= \frac{\sigma^2}{SST_x}.$$

4. $\operatorname{Cov}\big(\hat{\beta}_0, \hat{\beta}_1\big) < 0$ when $\bar{x} > 0$.

Proof.
$$\operatorname{Cov}\big(\hat{\beta}_0, \hat{\beta}_1\big)
= \operatorname{Cov}\big(\bar{y} - \hat{\beta}_1 \bar{x},\, \hat{\beta}_1\big)
= \operatorname{Cov}\big(\bar{y}, \hat{\beta}_1\big) - \bar{x}\, \operatorname{Cov}\big(\hat{\beta}_1, \hat{\beta}_1\big)
= -\bar{x}\, \operatorname{Var}\big(\hat{\beta}_1\big)
= -\bar{x}\, \frac{\sigma^2}{SST_x} < 0,$$
where $\operatorname{Cov}\big(\bar{y}, \hat{\beta}_1\big) = 0$ because, conditional on $x$, it equals $\operatorname{Cov}\big(\bar{u}, \sum_{i=1}^{n} (x_i - \bar{x}) u_i / SST_x\big) = \sigma^2 \sum_{i=1}^{n} (x_i - \bar{x}) / (n\, SST_x) = 0$.

2 Question 2 (Special case of the OLS estimator)

For the simple linear regression model $y = \beta_0 + \beta_1 x + u$, we can obtain the OLS estimators $\hat{\beta}_0$ and $\hat{\beta}_1$, where $\hat{\beta}_0$ and $\hat{\beta}_1$ estimate $\beta_0$ and $\beta_1$ respectively. In the case $\hat{\beta}_1 = 0$, please judge the validity of the following statements and explain.

1. $\hat{\beta}_0 = 0$.

Answer: $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x} = \bar{y}$, so $\hat{\beta}_0 = 0$ if and only if $\bar{y} = 0$. The statement is not true in general.

2. $\beta_1 = 0$.

Answer: This is untrue in general because $\beta_1$ is the population parameter, whose value is unobservable. We do not know the true value of $\beta_1$; a zero estimate in one sample does not imply that the population slope is zero.

3. $SSR = 0$.

Answer:
$$SSR = \sum_{i=1}^{n} (y_i - \hat{y}_i)^2 = \sum_{i=1}^{n} \big(y_i - \hat{\beta}_0 - \hat{\beta}_1 x_i\big)^2 = \sum_{i=1}^{n} (y_i - \bar{y})^2 = SST.$$
The statement $SSR = 0$ is true only if $SST = 0$, which means there is no variation in the value of $y$: all data points lie on a horizontal line.

4. $R^2 = 0$.

Answer: Since $SSR = SST$,
$$R^2 = \frac{SSE}{SST} = \frac{SST - SSR}{SST} = \frac{SST - SST}{SST} = 0.$$
This statement is correct. Graphically speaking, $\hat{\beta}_1 = 0$ implies a horizontal fitted line: the independent variable $x$ gives no explanatory power for the dependent variable $y$.
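
Remark (numerical check of Question 1). The algebraic results above can be verified by simulation. The sketch below is not part of the original answer key; it is a minimal Monte Carlo check assuming NumPy is available, and the parameter values and variable names (beta0, beta1, sigma, n, reps) are chosen here purely for illustration. The regressor values are held fixed across replications, matching the conditional-on-x derivation, so the simulated variance of $\hat{\beta}_1$ should be close to $\sigma^2/SST_x$ and the simulated covariance of $(\hat{\beta}_0, \hat{\beta}_1)$ close to $-\bar{x}\sigma^2/SST_x$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative population parameters (not from the problem set)
beta0, beta1, sigma = 2.0, 0.5, 1.0
n, reps = 50, 20000

# Fixed regressor values across replications, as in the conditional-on-x proofs
x = rng.uniform(1.0, 10.0, size=n)
xbar = x.mean()
sst_x = np.sum((x - xbar) ** 2)

b0_draws = np.empty(reps)
b1_draws = np.empty(reps)
for r in range(reps):
    u = rng.normal(0.0, sigma, size=n)
    y = beta0 + beta1 * x + u
    ybar = y.mean()
    # The two algebraically equivalent slope formulas from part 1
    b1_a = np.sum(x * (y - ybar)) / np.sum(x * (x - xbar))
    b1_b = np.sum((x - xbar) * (y - ybar)) / sst_x
    assert np.isclose(b1_a, b1_b)
    b1_draws[r] = b1_b
    b0_draws[r] = ybar - b1_b * xbar   # equation (5)

# Part 2: unbiasedness
print("mean of b1 estimates:", b1_draws.mean(), "vs beta1 =", beta1)
# Part 3: variance formula
print("var of b1 estimates: ", b1_draws.var(), "vs sigma^2/SST_x =", sigma**2 / sst_x)
# Part 4: negative covariance when xbar > 0
print("cov(b0, b1):", np.cov(b0_draws, b1_draws)[0, 1],
      "vs -xbar*sigma^2/SST_x =", -xbar * sigma**2 / sst_x)
```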
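
Remark (numerical check of Question 2). Similarly, the claims in Question 2 can be illustrated with a tiny hand-picked data set (ours, not from the problem set) in which the fitted slope is exactly zero: the intercept estimate equals $\bar{y}$ rather than 0, $SSR = SST$, and $R^2 = 0$.

```python
import numpy as np

# Data chosen so that sum((x - xbar)*(y - ybar)) = 0, hence the OLS slope is exactly zero
x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0, 1.0])

xbar, ybar = x.mean(), y.mean()
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
b0 = ybar - b1 * xbar

yhat = b0 + b1 * x
sst = np.sum((y - ybar) ** 2)
ssr = np.sum((y - yhat) ** 2)   # residual sum of squares
r2 = 1.0 - ssr / sst

print("b1 =", b1)                      # 0.0
print("b0 =", b0, "= ybar =", ybar)    # equals ybar, not 0
print("SSR =", ssr, "SST =", sst)      # equal when b1 = 0
print("R^2 =", r2)                     # 0.0
```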