Sample Exam Questions in Introduction to Econometrics

These questions are gathered from many econometrics exams and exercises I have seen. There may be some mistakes; perhaps trying the questions before looking at my answers would be most beneficial, and you might be able to catch the places where my answers are wrong.

1. (Inception Exam, Aj. Pongsa's section, June 2003) You get estimates from $Y_i = \alpha + \beta X_i + u_i$, based on $n = 100$ observations, as $\hat{\alpha} = 2$, $\hat{\beta} = 0.8$, $\hat{\sigma}^2 = 4$. You know that the correlation between $x$ and $y$ is $r_{xy} = 0.6$. Test the hypothesis that $\beta = 0$.

To build the t-statistic, start from
\[
\hat{\beta} = \frac{\sum_{i=1}^{n}(X_i-\bar{X})(Y_i-\bar{Y})}{\sum_{i=1}^{n}(X_i-\bar{X})^2},
\qquad
r_{xy} = \frac{\sum_{i=1}^{n}(X_i-\bar{X})(Y_i-\bar{Y})}{\sqrt{\sum_{i=1}^{n}(X_i-\bar{X})^2}\,\sqrt{\sum_{i=1}^{n}(Y_i-\bar{Y})^2}}.
\]
Thus
\[
r_{xy} = \hat{\beta}\,\frac{\sqrt{\sum_{i=1}^{n}(X_i-\bar{X})^2}}{\sqrt{\sum_{i=1}^{n}(Y_i-\bar{Y})^2}} = \hat{\beta}\,\frac{S_X}{S_Y}
\;\Longrightarrow\;
0.6 = 0.8\,\frac{S_X}{S_Y}
\;\Longrightarrow\;
\frac{S_Y}{S_X} = \frac{4}{3},
\quad
\frac{S_Y^2}{S_X^2} = \frac{16}{9},
\]
where $S_X^2 = \sum_{i=1}^{n}(X_i-\bar{X})^2/(n-1)$ and $S_Y^2 = \sum_{i=1}^{n}(Y_i-\bar{Y})^2/(n-1)$. Next, $r_{xy}^2 = 1 - RSS/TSS$ with $RSS = \sum_{i=1}^{n}\hat{u}_i^2 = (n-2)\hat{\sigma}^2$ and $TSS = \sum_{i=1}^{n}(Y_i-\bar{Y})^2 = (n-1)S_Y^2$, so
\[
0.36 = 1 - \frac{(n-2)\hat{\sigma}^2}{(n-1)S_Y^2}
\;\Longrightarrow\;
S_Y^2 = \frac{(n-2)\hat{\sigma}^2}{0.64\,(n-1)} = \frac{98\cdot 4}{0.64\cdot 99},
\qquad
S_X^2 = \frac{9}{16}\,S_Y^2 = \frac{9}{16}\cdot\frac{98\cdot 4}{0.64\cdot 99}.
\]
Since $\widehat{var}(\hat{\beta}) = \hat{\sigma}^2/\sum_{i=1}^{n}(X_i-\bar{X})^2 = \hat{\sigma}^2/[(n-1)S_X^2]$,
\[
t_{cal} = \frac{\hat{\beta}}{\sqrt{\widehat{var}(\hat{\beta})}}
= \hat{\beta}\,\sqrt{\frac{(n-1)S_X^2}{\hat{\sigma}^2}}
= 0.8\sqrt{\frac{99}{4}\cdot\frac{9}{16}\cdot\frac{98\cdot 4}{0.64\cdot 99}}
= \frac{3}{4}\sqrt{98} \approx 7.4,
\]
which far exceeds the 5% critical value of the $t_{98}$ distribution (about 1.98). Reject the null hypothesis that $\beta = 0$.

2. (Compre June 2004) A student turns in an assignment showing the following estimated bivariate linear regression,
\[
\hat{Y} = 5 + 2X, \qquad F\text{-stat} = 25, \qquad n = 102, \qquad \sum_{i=1}^{n}(Y_i-\bar{Y})^2 = 10,
\]
but neglects to write anything else. Using the reported information, answer the following questions.

(a) What is the $R^2$?
(b) What is the standard error of the slope coefficient?
(c) What is the standard error of the regression?

(a)
\[
F_{cal} = \frac{R^2/(k-1)}{(1-R^2)/(n-k)} = \frac{R^2/(2-1)}{(1-R^2)/(102-2)} = 25
\;\Longrightarrow\;
\frac{R^2}{1-R^2} = 0.25
\;\Longrightarrow\;
R^2 = 0.2.
\]

(b) Since $F_{1,100} = t_{df=100}^2$,
\[
25 = \left(\frac{2}{s.e.(\hat{\beta})}\right)^2
\;\Longrightarrow\;
s.e.(\hat{\beta}) = \frac{2}{5} = 0.4.
\]

(c) The standard error of the regression is $\hat{\sigma} = \sqrt{RSS/(n-2)}$. Because $R^2 = 1 - RSS/TSS$, the F-statistic can also be written in terms of sums of squares,
\[
F_{cal} = \frac{(TSS-RSS)/(k-1)}{RSS/(n-k)} = \frac{10-RSS}{RSS}\,(100) = 25
\;\Longrightarrow\;
RSS = 8,
\]
which confirms $R^2 = 1 - 8/10 = 0.2$ and gives
\[
\hat{\sigma} = \sqrt{\frac{RSS}{n-2}} = \sqrt{\frac{8}{100}} = \frac{\sqrt{2}}{5} \approx 0.28.
\]

3. (Goldberger Ch. 23.1) These results were found for a LS regression of $Y$ = executive salaries on $X_1$ = sales and $X_2$ = profits, across a sample of 102 firms:
\[
\hat{y}_i = \underset{(0.83)}{0.5}\,x_{1i} + \underset{(0.83)}{0.4}\,x_{2i},
\qquad
\hat{u}'\hat{u} = 250,
\qquad
X'X = \begin{pmatrix} 10 & 8 \\ 8 & 10 \end{pmatrix}.
\]
(All variables had been expressed as deviations about means for convenience.) Assume that the model applies to the salary function $E(y) = \beta_1 x_1 + \beta_2 x_2$. Evidently, the high collinearity between sales and profits has prevented precise estimation of the parameters of the salary function. To eliminate this problem, it has been proposed that we proceed as follows. First, regress profits on sales and obtain the residuals $x_2^*$. Second, regress $y$ on $x_1$ and $x_2^*$ to estimate the parameters of the salary function. Denote the results of the second step by $\hat{y} = c_1 x_1 + c_2 x_2^*$.

(a) Calculate $c_1$ and $c_2$ and calculate their standard errors.
(b) Evaluate the proposal as a device for eliminating collinearity.
(c) Evaluate the proposal as a device for obtaining more precise parameter estimates.

This question is difficult and long, but it is worth seeing what happens when this scheme is used to try to alleviate the multicollinearity problem. This and the following questions should be done using matrix algebra.
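As a check on the matrix algebra, here is a minimal numerical sketch in Python/numpy (my own, not part of the exam). It assumes that, because the variables are in deviation form with no intercept, the normal equations let us recover $X'y = (X'X)\hat{b}$ and $y'y = \hat{u}'\hat{u} + \hat{b}'X'y$ from the reported figures, and it then carries out the two-step proposal on those cross-products.

import numpy as np

# Reported summary statistics (deviation form, no intercept), n = 102 firms
n = 102
XtX = np.array([[10.0, 8.0],
                [8.0, 10.0]])   # X'X for x1 = sales, x2 = profits
b = np.array([0.5, 0.4])        # reported LS coefficients
uu = 250.0                      # reported residual sum of squares u'u

# Recover the missing cross-products from the normal equations
Xty = XtX @ b                   # X'y = (X'X) b
yty = uu + b @ Xty              # y'y = u'u + b'X'y

# Step 1: regress x2 (profits) on x1 (sales); keep the residuals x2* = x2 - d*x1
d = XtX[0, 1] / XtX[0, 0]

# Cross-products of the step-2 regressors z = (x1, x2*); note x1'x2* = 0
ZtZ = np.array([[XtX[0, 0], 0.0],
                [0.0, XtX[1, 1] - d * XtX[0, 1]]])
Zty = np.array([Xty[0], Xty[1] - d * Xty[0]])

# Step 2: regress y on (x1, x2*)
c = np.linalg.solve(ZtZ, Zty)
rss = yty - c @ Zty                        # same residual sum of squares as the original fit
se_c = np.sqrt(rss / (n - 2) * np.diag(np.linalg.inv(ZtZ)))

print("c =", c)                            # [0.82, 0.40]
print("s.e.(c) =", np.round(se_c, 2))      # [0.50, 0.83]

The printout already hints at parts (b) and (c): $c_2$ and its standard error coincide with the original $\hat{\beta}_2$ and its standard error, while $c_1$ is just the simple regression coefficient of $y$ on $x_1$, so its smaller standard error does not mean that $\beta_1$ has been estimated more precisely.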
4. (Goldberger Ch. 23.2) The CR (classical regression) model applies and the $X$ matrix shows high collinearity. The sample size is doubled by getting two observations on $y$, rather than one, at each of the rows of the original $X$.

(a) What happens to the degree of collinearity?

Another interesting question. Increasing the sample size is said to alleviate the multicollinearity problem by reducing the degree of micronumerosity. But what happens when the sample is enlarged with identical observations?

(b) What happens to the variance of the coefficients?
(c) Comment on the claim that more data is no remedy for the multicollinearity problem if the data are simply "more of the same".
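A quick simulation makes the point concrete. The sketch below (Python/numpy; the collinear regressors and sample size are made up for illustration and are not part of the question) stacks an exact copy of the design matrix underneath itself and compares a collinearity measure for the columns with $\mathrm{Var}(\hat{b})/\sigma^2 = \mathrm{diag}[(X'X)^{-1}]$.

import numpy as np

rng = np.random.default_rng(0)

# Made-up design with two highly collinear regressors (illustration only)
n = 50
x1 = rng.normal(size=n)
x2 = 0.95 * x1 + 0.05 * rng.normal(size=n)   # nearly proportional to x1
X = np.column_stack([x1, x2])

# "More of the same": duplicate every row of X (two y-observations per row)
X_doubled = np.vstack([X, X])

for label, M in [("original", X), ("doubled", X_doubled)]:
    XtX = M.T @ M
    cos12 = XtX[0, 1] / np.sqrt(XtX[0, 0] * XtX[1, 1])  # cosine between the columns
    v = np.diag(np.linalg.inv(XtX))                      # Var(b_j) / sigma^2
    print(label, " collinearity =", round(cos12, 4),
          " Var(b)/sigma^2 =", np.round(v, 4))

Duplicating the rows leaves the cosine between the columns, and hence the degree of collinearity, exactly unchanged, but it doubles $X'X$ and therefore halves $\sigma^2(X'X)^{-1}$, which is what parts (a) to (c) are driving at.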