Econ 326 (002)
Winter Session, Term II, January 2021
M. Vaney
Problem Set 1 - Solutions
1. Show the following equalities hold $\left(\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i;\; \bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i\right)$:

(a) $\sum_{i=1}^{n}(x_i - \bar{x}) = 0$

$$\sum_{i=1}^{n}(x_i - \bar{x}) = \sum_{i=1}^{n} x_i - \sum_{i=1}^{n}\bar{x} = \sum_{i=1}^{n} x_i - n\bar{x} = \sum_{i=1}^{n} x_i - \sum_{i=1}^{n} x_i = 0$$

(b) $\sum_{i=1}^{n} x_i(y_i - \bar{y}) = \sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})$

$$\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y}) = \sum_{i=1}^{n}\left[x_i(y_i - \bar{y}) - \bar{x}(y_i - \bar{y})\right] = \sum_{i=1}^{n} x_i(y_i - \bar{y}) - \bar{x}\sum_{i=1}^{n}(y_i - \bar{y})$$

but $\sum_{i=1}^{n}(y_i - \bar{y}) = \sum_{i=1}^{n} y_i - n\bar{y} = n\bar{y} - n\bar{y} = 0$, so

$$\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y}) = \sum_{i=1}^{n} x_i(y_i - \bar{y})$$
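Both identities are easy to confirm on data. The following is a minimal NumPy sketch (a verification aid, not part of the original solution) that checks (a) and (b) on an arbitrary random sample; `np.isclose` absorbs floating-point rounding.

```python
# Numerical check of the question 1 identities (verification sketch).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=50)
y = rng.normal(size=50)
xbar, ybar = x.mean(), y.mean()

# (a) deviations from the mean sum to zero
print(np.isclose((x - xbar).sum(), 0.0))             # True

# (b) the two cross-product sums coincide
print(np.isclose((x * (y - ybar)).sum(),
                 ((x - xbar) * (y - ybar)).sum()))   # True
```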
2. Consider the following random sample of size $n = 5$ drawn from a population described by the population regression function $y = \beta_0 + \beta_1 x + u$:

| $x_i$ | 2 | 5 | 8 | 10 | 15 |
|---|---|---|---|---|---|
| $y_i$ | 8 | 14 | 16 | 18 | 24 |

(a) Compute $\bar{x}$ and $\bar{y}$.

$$\bar{x} = 8 \qquad \bar{y} = 16$$

(b) Complete a table that shows $(x_i - \bar{x})$, $(y_i - \bar{y})$, $(x_i - \bar{x})^2$ and $(x_i - \bar{x})(y_i - \bar{y})$ for each observation.

| $x_i$ | $y_i$ | $(x_i - \bar{x})$ | $(y_i - \bar{y})$ | $(x_i - \bar{x})^2$ | $(x_i - \bar{x})(y_i - \bar{y})$ |
|---|---|---|---|---|---|
| 2 | 8 | -6 | -8 | 36 | 48 |
| 5 | 14 | -3 | -2 | 9 | 6 |
| 8 | 16 | 0 | 0 | 0 | 0 |
| 10 | 18 | 2 | 2 | 4 | 4 |
| 15 | 24 | 7 | 8 | 49 | 56 |

The column sums give $\sum_{i=1}^{5}(x_i - \bar{x})^2 = 98$ and $\sum_{i=1}^{5}(x_i - \bar{x})(y_i - \bar{y}) = 114$.

(c) Compute estimators $\hat{\beta}_1$ and $\hat{\beta}_0$.

$$\hat{\beta}_1 = \frac{\sum_{i=1}^{5}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{5}(x_i - \bar{x})^2} = \frac{114}{98} = 1.16$$

$$\hat{\beta}_0 = \bar{y} - \hat{\beta}_1\bar{x} = 16 - 1.16(8) = 6.69$$

(d) Complete a table that shows $\hat{y}_i$, $\hat{u}_i$ and $x_i\hat{u}_i$ for each observation.

| $x_i$ | $y_i$ | $\hat{y}_i$ | $\hat{u}_i$ | $x_i\hat{u}_i$ |
|---|---|---|---|---|
| 2 | 8 | 9.02 | -1.02 | -2.04 |
| 5 | 14 | 12.51 | 1.49 | 7.45 |
| 8 | 16 | 16.00 | 0 | 0 |
| 10 | 18 | 18.33 | -0.33 | -3.27 |
| 15 | 24 | 24.14 | -0.14 | -2.14 |
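These values can be reproduced with a few lines of NumPy; the snippet below is a verification sketch, not part of the original solutions.

```python
# OLS by hand for the question 2 sample (verification sketch).
import numpy as np

x = np.array([2.0, 5.0, 8.0, 10.0, 15.0])
y = np.array([8.0, 14.0, 16.0, 18.0, 24.0])

xbar, ybar = x.mean(), y.mean()                                  # 8, 16
b1 = ((x - xbar) * (y - ybar)).sum() / ((x - xbar) ** 2).sum()   # 114/98
b0 = ybar - b1 * xbar

yhat = b0 + b1 * x            # fitted values
uhat = y - yhat               # residuals
print(round(b1, 4), round(b0, 4))   # 1.1633 6.6939
print(yhat.round(2))                # ≈ [ 9.02 12.51 16.   18.33 24.14]
print(uhat.round(2))                # ≈ [-1.02  1.49  0.   -0.33 -0.14]
print((x * uhat).round(2))          # ≈ [-2.04  7.45  0.   -3.27 -2.14]
```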
3. For the regression in question #2:

| $x_i$ | $y_i$ | $\hat{y}_i$ | $\hat{u}_i$ | $(y_i - \bar{y})^2$ | $\hat{u}_i^2$ | $x_i^2$ |
|---|---|---|---|---|---|---|
| 2 | 8 | 9.02 | -1.02 | 64 | 1.0404 | 4 |
| 5 | 14 | 12.51 | 1.49 | 4 | 2.2201 | 25 |
| 8 | 16 | 16.00 | 0 | 0 | 0 | 64 |
| 10 | 18 | 18.33 | -0.33 | 4 | 0.1089 | 100 |
| 15 | 24 | 24.14 | -0.14 | 64 | 0.0196 | 225 |

(a) Determine the $R^2$ for the regression. SST $= 136$; SSE $= 132.612$; SSR $= 3.388$.

$$R^2 = \frac{SSE}{SST} = \frac{132.612}{136} = .975$$

(b) Compute the Standard Error of the regression (estimate of the standard deviation of the disturbances).

$$\hat{\sigma}_u^2 = \frac{\sum_{i=1}^{n}\hat{u}_i^2}{n-2} = \frac{SSR}{n-2} = \frac{3.388}{5-2} = 1.13$$

$$\sqrt{\hat{\sigma}_u^2} = \sqrt{1.13} = 1.06$$

(c) Compute the variance of the $\hat{\beta}_1$ OLS estimator.

$$var(\hat{\beta}_1) = \frac{\hat{\sigma}_u^2}{\sum_{i=1}^{5}(x_i - \bar{x})^2} = \frac{1.13}{98} = .0115$$

(d) Compute the variance of the $\hat{\beta}_0$ OLS estimator.

$$var(\hat{\beta}_0) = \frac{\hat{\sigma}_u^2\sum_{i=1}^{n} x_i^2}{n\sum_{i=1}^{n}(x_i - \bar{x})^2} = \frac{(1.13)(418)}{(5)(98)} = .963$$
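Again a short NumPy check (a sketch, not part of the original solutions); note that in this problem set SSR denotes the residual sum of squares and SSE the explained sum of squares, consistent with $R^2 = SSE/SST$ above.

```python
# Sums of squares, R^2 and estimator variances for question 3
# (verification sketch; same sample as question 2).
import numpy as np

x = np.array([2.0, 5.0, 8.0, 10.0, 15.0])
y = np.array([8.0, 14.0, 16.0, 18.0, 24.0])
b1 = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
b0 = y.mean() - b1 * x.mean()
uhat = y - (b0 + b1 * x)

SST = ((y - y.mean()) ** 2).sum()    # 136
SSR = (uhat ** 2).sum()              # ≈ 3.388 (residual sum of squares)
SSE = SST - SSR                      # ≈ 132.612 (explained sum of squares)
R2 = SSE / SST                       # ≈ .975

sigma2_u = SSR / (len(x) - 2)                      # ≈ 1.13
var_b1 = sigma2_u / ((x - x.mean()) ** 2).sum()    # ≈ .0115
var_b0 = sigma2_u * (x ** 2).sum() / (len(x) * ((x - x.mean()) ** 2).sum())
print(round(R2, 3), round(var_b1, 4), round(var_b0, 3))  # 0.975 0.0115 0.963
```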
4. For the regression in question #2 and using results from question #3:

(a) Construct the 90% confidence interval for $\beta_0$.

For the 90% confidence interval, $\alpha = .10$. We will use the critical value $t^{crit}_{\alpha/2,\,n-2} = t^{crit}_{.05,\,3} = 2.353$. The OLS estimate of $\beta_0$ is used as the center point and we add and subtract $t^{crit}_{\alpha/2,\,n-2}\sqrt{var(\hat{\beta}_0)}$:

$$\hat{\beta}_0 - t^{crit}_{\alpha/2,\,n-2}\sqrt{var(\hat{\beta}_0)} \;\le\; \beta_0 \;\le\; \hat{\beta}_0 + t^{crit}_{\alpha/2,\,n-2}\sqrt{var(\hat{\beta}_0)}$$

$$6.69 - (2.353)\sqrt{.963} \;\le\; \beta_0 \;\le\; 6.69 + (2.353)\sqrt{.963}$$

$$4.38 \;\le\; \beta_0 \;\le\; 9.00$$

(b) Test the hypothesis that the slope coefficient is equal to zero at the $\alpha = .05$ level of significance. Be sure to specify both null and alternative hypotheses.

We are conducting a two-tailed test with hypotheses:

$$H_0: \beta_1 = 0 \qquad H_1: \beta_1 \ne 0$$

Using a t-test:

$$t^{calc} = \frac{\hat{\beta}_1 - \beta_1^{null}}{\sqrt{var(\hat{\beta}_1)}} = \frac{1.16 - 0}{\sqrt{.0115}} = 10.817$$

The test statistic follows a $T$-distribution with $n - 2 = 3$ degrees of freedom. At the $\alpha = .05$ level of significance the critical value is

$$t^{crit}_{\alpha/2,\,n-2} = t^{crit}_{.025,\,3} = 3.182$$

Since $|t^{calc}| = 10.817 > 3.182$, the null hypothesis can be rejected at the $\alpha = .05$ level of significance.
The slope coefficient is significantly different from zero.
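The interval and the test can be checked with Student-t quantiles from SciPy (verification sketch, not part of the original solutions; estimates and variances are carried over from questions 2 and 3).

```python
# 90% CI for beta_0 and t-test of the slope for question 4
# (verification sketch).
import numpy as np
from scipy import stats

b0, b1 = 6.6939, 1.1633
var_b0, var_b1 = 0.963, 0.0115
df = 3  # n - 2

# (a) 90% CI: alpha = .10, so .05 in each tail
tcrit = stats.t.ppf(0.95, df)            # 2.353
half = tcrit * np.sqrt(var_b0)
print(b0 - half, b0 + half)              # ≈ 4.38, 9.00

# (b) two-tailed test of H0: beta_1 = 0 at alpha = .05
tcalc = (b1 - 0.0) / np.sqrt(var_b1)     # ≈ 10.8
tcrit_05 = stats.t.ppf(0.975, df)        # 3.182
print(abs(tcalc) > tcrit_05)             # True -> reject H0
```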
5. Consider the savings function

$$sav = \beta_0 + \beta_1\, inc + u, \qquad u = \sqrt{inc}\cdot e$$

where $e$ is a random variable with $E\{e\} = 0$ and $var(e) = \sigma_e^2$. Assume that $e$ is independent of $inc$.

(a) Show that $E\{u|inc\} = 0$. Is the conditional mean assumption satisfied?

$$E\{u|inc\} = E\{\sqrt{inc}\cdot e\,|\,inc\}$$

Conditioning on $inc$, $\sqrt{inc}$ will be known and constant, so

$$E\{u|inc\} = \sqrt{inc}\,E\{e|inc\}$$

and with $e$ independent of $inc$, $E\{e|inc\} = E\{e\} = 0$, so

$$E\{u|inc\} = 0$$

The conditional mean assumption is satisfied.

(b) Show that $var(u|inc) = \sigma_e^2\, inc$. Is the assumption of homoskedasticity satisfied?

$$var(u|inc) = var(\sqrt{inc}\cdot e\,|\,inc)$$

and again, with the conditioning on $inc$,

$$= \left(\sqrt{inc}\right)^2 var(e|inc) = (inc)\,var(e|inc) = \sigma_e^2\, inc$$

This is not constant; the variance will increase with income, so homoskedasticity is not satisfied.
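A quick simulation makes the heteroskedasticity visible (illustrative sketch; the income distribution and $\sigma_e$ below are made-up values, not from the problem).

```python
# Simulating u = sqrt(inc)*e for question 5 (illustrative sketch;
# the income range and sigma_e are hypothetical choices).
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
inc = rng.uniform(10.0, 100.0, size=n)   # hypothetical income draws
e = rng.normal(0.0, 2.0, size=n)         # sigma_e = 2, independent of inc
u = np.sqrt(inc) * e

low, high = inc < 30.0, inc > 70.0
print(u[low].mean(), u[high].mean())     # both ≈ 0: E{u|inc} = 0 holds
print(u[low].var(), u[high].var())       # variance is larger at high incomes
print(4.0 * inc[low].mean(), 4.0 * inc[high].mean())  # ≈ sigma_e^2 * inc
```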
6. $\hat{\beta}_0$ is the o.l.s. estimator of $\beta_0$ from the population regression function $y = \beta_0 + \beta_1 x + u$. Prove the following:

(a) $E\{\hat{\beta}_0\} = \beta_0$

$$\hat{\beta}_0 = \bar{y} - \hat{\beta}_1\bar{x}$$

From the population regression function (PRF)

$$y_i = \beta_0 + \beta_1 x_i + u_i$$

taking the average over a sample of size $n$

$$\bar{y} = \beta_0 + \beta_1\bar{x} + \frac{\sum_{i=1}^{n} u_i}{n}$$

and substituting this into the expression for $\hat{\beta}_0$

$$\hat{\beta}_0 = \beta_0 + \beta_1\bar{x} + \frac{\sum_{i=1}^{n} u_i}{n} - \hat{\beta}_1\bar{x} = \beta_0 - (\hat{\beta}_1 - \beta_1)\bar{x} + \frac{\sum_{i=1}^{n} u_i}{n}$$

and taking expectations conditional on $x_1, \ldots, x_n$

$$E\{\hat{\beta}_0\} = \beta_0 - \bar{x}\,E\{(\hat{\beta}_1 - \beta_1)\} + \frac{1}{n}\sum_{i=1}^{n} E\{u_i\}$$

and with $E\{(\hat{\beta}_1 - \beta_1)\} = 0$ ($\hat{\beta}_1$ is an unbiased estimator of $\beta_1$) and $E\{u_i\} = 0$ we have

$$E\{\hat{\beta}_0\} = \beta_0 \qquad \text{(unbiased)}$$
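The result can be illustrated by simulation (a sketch with a made-up error distribution; the fixed regressors are borrowed from question 2).

```python
# Monte Carlo illustration of E{b0_hat} = beta_0 for question 6
# (illustrative sketch; beta values and error distribution are made up).
import numpy as np

rng = np.random.default_rng(2)
beta0, beta1, reps = 6.7, 1.16, 50_000
x = np.array([2.0, 5.0, 8.0, 10.0, 15.0])   # fixed regressors, as in #2

b0_draws = np.empty(reps)
for r in range(reps):
    u = rng.normal(0.0, 1.0, size=x.size)   # E{u_i} = 0
    y = beta0 + beta1 * x + u
    b1 = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
    b0_draws[r] = y.mean() - b1 * x.mean()

print(b0_draws.mean())   # ≈ 6.7 = beta0, consistent with unbiasedness
```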
7. Consider a random sample of size $n = 6$:

$$\{(x_i, y_i) : i = 1, 2, 3, 4, 5, 6\}$$

(a) What are the 2 restrictions placed on the residual terms ($\hat{u}_i$) in constructing o.l.s. estimators $\hat{\beta}_0$ and $\hat{\beta}_1$?

$$\sum_{i=1}^{6}\hat{u}_i = 0 \qquad \text{and} \qquad \sum_{i=1}^{6} x_i\hat{u}_i = 0$$

(b) What number, $\nu$, of the $\hat{u}_i$ are independent?

There are 2 restrictions so there will be $\nu = n - 2 = 4$ independent residuals.

(c) Let $x_i = i$ for $i = 1, \ldots, 6$ (i.e. $x_1 = 1, x_2 = 2, \ldots, x_6 = 6$). Let the first $\nu$ residuals be independent. Use the restrictions from (a) to write expressions for the dependent residuals ($\hat{u}_{\nu+1}, \ldots, \hat{u}_6$) as a function of the independent residuals ($\hat{u}_1, \ldots, \hat{u}_\nu$).

From the first restriction we can write an expression for $\hat{u}_6$ as a function of $\hat{u}_1, \ldots, \hat{u}_5$:

$$\sum_{i=1}^{6}\hat{u}_i = 0 \implies \hat{u}_6 = -\sum_{i=1}^{5}\hat{u}_i = -\hat{u}_1 - \hat{u}_2 - \hat{u}_3 - \hat{u}_4 - \hat{u}_5$$

and the second restriction along with the values of the $x_i$'s gives

$$\sum_{i=1}^{6} x_i\hat{u}_i = 0$$

Using the known values of $x_i$ $(1, 2, \ldots, 6)$:

$$\hat{u}_1 + 2\hat{u}_2 + 3\hat{u}_3 + 4\hat{u}_4 + 5\hat{u}_5 + 6\hat{u}_6 = 0$$

Substituting the expression for $\hat{u}_6$ into this:

$$\hat{u}_1 + 2\hat{u}_2 + 3\hat{u}_3 + 4\hat{u}_4 + 5\hat{u}_5 + 6(-\hat{u}_1 - \hat{u}_2 - \hat{u}_3 - \hat{u}_4 - \hat{u}_5) = 0$$

$$\hat{u}_1 + 2\hat{u}_2 + 3\hat{u}_3 + 4\hat{u}_4 + 5\hat{u}_5 - 6\hat{u}_1 - 6\hat{u}_2 - 6\hat{u}_3 - 6\hat{u}_4 - 6\hat{u}_5 = 0$$

$$-5\hat{u}_1 - 4\hat{u}_2 - 3\hat{u}_3 - 2\hat{u}_4 - \hat{u}_5 = 0$$

which gives

$$\hat{u}_5 = -5\hat{u}_1 - 4\hat{u}_2 - 3\hat{u}_3 - 2\hat{u}_4$$

and

$$\hat{u}_6 = -\hat{u}_1 - \hat{u}_2 - \hat{u}_3 - \hat{u}_4 - (-5\hat{u}_1 - 4\hat{u}_2 - 3\hat{u}_3 - 2\hat{u}_4) = 4\hat{u}_1 + 3\hat{u}_2 + 2\hat{u}_3 + \hat{u}_4$$

The two dependent residuals as functions of the other 4:

$$\hat{u}_5 = -5\hat{u}_1 - 4\hat{u}_2 - 3\hat{u}_3 - 2\hat{u}_4$$

$$\hat{u}_6 = 4\hat{u}_1 + 3\hat{u}_2 + 2\hat{u}_3 + \hat{u}_4$$

The same problem is solved again, this time keeping the 4 known residuals together in a summation. Separating out the two unknown residuals from the summation we can write the first restriction as

$$\left(\sum_{i=1}^{4}\hat{u}_i\right) + \hat{u}_5 + \hat{u}_6 = 0$$

and the second restriction

$$\sum_{i=1}^{6} x_i\hat{u}_i = 0, \quad \text{with } x_i = i \text{ for all } i, \text{ becomes } \sum_{i=1}^{6} i\,\hat{u}_i = 0$$

$$\left(\sum_{i=1}^{4} i\,\hat{u}_i\right) + 5\hat{u}_5 + 6\hat{u}_6 = 0$$

This gives 2 equations in the 2 unknowns $\hat{u}_5$ and $\hat{u}_6$:

$$\left(\sum_{i=1}^{4}\hat{u}_i\right) + \hat{u}_5 + \hat{u}_6 = 0$$

$$\left(\sum_{i=1}^{4} i\,\hat{u}_i\right) + 5\hat{u}_5 + 6\hat{u}_6 = 0$$

Taking 5 times the first equation and subtracting the second (eliminating $\hat{u}_5$ and solving for $\hat{u}_6$):

$$0 = \left(\sum_{i=1}^{4} 5\hat{u}_i + 5\hat{u}_5 + 5\hat{u}_6\right) - \left(\sum_{i=1}^{4} i\,\hat{u}_i + 5\hat{u}_5 + 6\hat{u}_6\right)$$

$$\hat{u}_6 = \sum_{i=1}^{4} 5\hat{u}_i - \sum_{i=1}^{4} i\,\hat{u}_i = \sum_{i=1}^{4}(5 - i)\hat{u}_i$$

$$\hat{u}_6 = 4\hat{u}_1 + 3\hat{u}_2 + 2\hat{u}_3 + \hat{u}_4$$

Taking 6 times the first equation and subtracting the second (eliminating $\hat{u}_6$ and solving for $\hat{u}_5$):

$$0 = \left(\sum_{i=1}^{4} 6\hat{u}_i + 6\hat{u}_5 + 6\hat{u}_6\right) - \left(\sum_{i=1}^{4} i\,\hat{u}_i + 5\hat{u}_5 + 6\hat{u}_6\right) = \sum_{i=1}^{4}(6 - i)\hat{u}_i + \hat{u}_5$$

$$\hat{u}_5 = \sum_{i=1}^{4}(i - 6)\hat{u}_i = -5\hat{u}_1 - 4\hat{u}_2 - 3\hat{u}_3 - 2\hat{u}_4$$
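The derived expressions can be verified against an actual OLS fit with $x_i = i$ (verification sketch; the $y$ values below are made up for illustration).

```python
# Checking the residual restrictions and the dependent-residual formulas
# of question 7 (verification sketch; y values are hypothetical).
import numpy as np

x = np.arange(1.0, 7.0)                        # x_i = i, i = 1..6
y = np.array([2.0, 3.5, 3.0, 5.0, 6.5, 6.0])   # made-up sample

b1 = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
b0 = y.mean() - b1 * x.mean()
u = y - (b0 + b1 * x)                          # OLS residuals

# The two restrictions hold (up to floating-point noise):
print(u.sum(), (x * u).sum())                  # both ≈ 0

# u5 and u6 follow from u1..u4 exactly as derived:
u1, u2, u3, u4 = u[:4]
print(np.isclose(u[4], -5*u1 - 4*u2 - 3*u3 - 2*u4))   # True
print(np.isclose(u[5], 4*u1 + 3*u2 + 2*u3 + u4))      # True
```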
8. Given

$$y = \beta_0 + \beta_1 x + u$$

where $E\{u\} = \alpha_0$, then adding and subtracting $\alpha_0$ on the RHS

$$y = \beta_0 + \beta_1 x + u + \alpha_0 - \alpha_0$$

$$y = (\beta_0 + \alpha_0) + \beta_1 x + (u - \alpha_0)$$

Let the new error be $e = u - \alpha_0$; then $E\{e\} = E\{u - \alpha_0\} = \alpha_0 - \alpha_0 = 0$.
The new intercept is $(\beta_0 + \alpha_0)$.
The slope term is still $\beta_1$.
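Numerically, an error with nonzero mean simply shifts the estimated intercept (illustrative sketch; all parameter values below are made up).

```python
# Question 8 illustration: a nonzero error mean is absorbed by the intercept
# (sketch; beta0, beta1 and alpha0 are hypothetical values).
import numpy as np

rng = np.random.default_rng(3)
beta0, beta1, alpha0, n = 2.0, 0.5, 3.0, 200_000
x = rng.uniform(0.0, 10.0, size=n)
u = alpha0 + rng.normal(0.0, 1.0, size=n)   # E{u} = alpha0, not 0
y = beta0 + beta1 * x + u

b1 = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
b0 = y.mean() - b1 * x.mean()
print(b1)   # ≈ 0.5 = beta1 (slope unaffected)
print(b0)   # ≈ 5.0 = beta0 + alpha0 (intercept absorbs the error mean)
```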
9. Consider the simple regression model $y = \beta_0 + \beta_1 x + u$ and o.l.s. estimators $\hat{\beta}_0$ and $\hat{\beta}_1$ (unbiased). Let $\tilde{\beta}_1$ be the estimator of $\beta_1$ obtained when the intercept is assumed to be zero: $y_i = \tilde{\beta}_1 x_i + e_i$.

(a) Find $E\{\tilde{\beta}_1\}$. Is $\tilde{\beta}_1$ always a biased estimator of $\beta_1$?

For a regression through the origin the residuals will be $e_i = y_i - \tilde{\beta}_1 x_i$. Using o.l.s.

$$\min_{\tilde{\beta}_1}\sum_{i=1}^{n} e_i^2 = \min_{\tilde{\beta}_1}\sum_{i=1}^{n}(y_i - \tilde{\beta}_1 x_i)^2$$

$$\frac{\partial}{\partial\tilde{\beta}_1}: \quad -2\sum_{i=1}^{n}(y_i - \tilde{\beta}_1 x_i)x_i = 0$$

$$\tilde{\beta}_1 = \frac{\sum_{i=1}^{n} x_i y_i}{\sum_{i=1}^{n} x_i^2}$$

Before taking expectations, we can substitute $y_i = \beta_0 + \beta_1 x_i + u_i$ from the PRF:

$$\tilde{\beta}_1 = \frac{\sum_{i=1}^{n} x_i(\beta_0 + \beta_1 x_i + u_i)}{\sum_{i=1}^{n} x_i^2} = \frac{\beta_0\sum_{i=1}^{n} x_i + \beta_1\sum_{i=1}^{n} x_i^2 + \sum_{i=1}^{n} x_i u_i}{\sum_{i=1}^{n} x_i^2}$$

$$\tilde{\beta}_1 = \beta_0\frac{\sum_{i=1}^{n} x_i}{\sum_{i=1}^{n} x_i^2} + \beta_1 + \frac{\sum_{i=1}^{n} x_i u_i}{\sum_{i=1}^{n} x_i^2}$$

Taking expectations conditional on the $x_i$:

$$E\{\tilde{\beta}_1\} = \beta_0\frac{\sum_{i=1}^{n} x_i}{\sum_{i=1}^{n} x_i^2} + \beta_1$$

The estimator will be biased unless $\beta_0 = 0$ or $\bar{x} = 0$.

(b) Compare the variances of the two estimators for $\beta_1$ ($\hat{\beta}_1$ and $\tilde{\beta}_1$):

We know that

$$var(\hat{\beta}_1) = \frac{\sigma^2}{\sum_{i=1}^{n}(x_i - \bar{x})^2}$$

To find $var(\tilde{\beta}_1)$ we can use

$$\tilde{\beta}_1 = \beta_0\frac{\sum_{i=1}^{n} x_i}{\sum_{i=1}^{n} x_i^2} + \beta_1 + \frac{\sum_{i=1}^{n} x_i u_i}{\sum_{i=1}^{n} x_i^2}$$

With the population parameters $\beta_0$ and $\beta_1$ constant and conditional on the $x$'s

$$var(\tilde{\beta}_1) = var\left(\frac{\sum_{i=1}^{n} x_i u_i}{\sum_{i=1}^{n} x_i^2}\right) = \frac{1}{\left(\sum_{i=1}^{n} x_i^2\right)^2}\,var\left(\sum_{i=1}^{n} x_i u_i\right) = \frac{1}{\left(\sum_{i=1}^{n} x_i^2\right)^2}\sum_{i=1}^{n} x_i^2\, var(u_i)$$

$$= \frac{1}{\left(\sum_{i=1}^{n} x_i^2\right)^2}\sum_{i=1}^{n} x_i^2\sigma^2 = \frac{\sigma^2\sum_{i=1}^{n} x_i^2}{\left(\sum_{i=1}^{n} x_i^2\right)^2} = \frac{\sigma^2}{\sum_{i=1}^{n} x_i^2}$$

Comparing the variances: the denominator of $var(\hat{\beta}_1)$ can be written as $\sum_{i=1}^{n}(x_i - \bar{x})^2 = \sum_{i=1}^{n} x_i^2 - n\bar{x}^2$ and provided $\bar{x} \ne 0$, this will be less than the denominator of $var(\tilde{\beta}_1)$, which is $\sum_{i=1}^{n} x_i^2$. Hence $var(\tilde{\beta}_1) \le var(\hat{\beta}_1)$.
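A Monte Carlo comparison shows the trade-off: the through-origin estimator is biased when $\beta_0 \ne 0$ and $\bar{x} \ne 0$, but has the smaller variance (illustrative sketch; all parameter values are made up).

```python
# Bias/variance comparison of the two slope estimators in question 9
# (illustrative sketch; parameter values are hypothetical).
import numpy as np

rng = np.random.default_rng(4)
beta0, beta1, n, reps = 2.0, 1.5, 30, 20_000
x = rng.uniform(1.0, 5.0, size=n)    # xbar != 0, so bias is expected

ols = np.empty(reps)
origin = np.empty(reps)
for r in range(reps):
    u = rng.normal(0.0, 1.0, size=n)
    y = beta0 + beta1 * x + u
    ols[r] = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
    origin[r] = (x * y).sum() / (x ** 2).sum()   # through-origin estimator

print(ols.mean(), origin.mean())   # ≈ 1.5 vs biased upward
print(ols.var(), origin.var())     # through-origin variance is smaller
```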
10. The simple regression model $y = \beta_0 + \beta_1 x + u$ gives o.l.s. estimators $\hat{\beta}_0$ and $\hat{\beta}_1$.

(a) Consider a regression of $c_1 y_i$ on $c_2 x_i$. Find the estimators $\tilde{\beta}_0$ and $\tilde{\beta}_1$ and show that they can be expressed as functions of $\hat{\beta}_0$ and $\hat{\beta}_1$.

Given $c_1$ and $c_2$ are constants, the average of $c_1 y_i$ is $c_1\bar{y}$ and the average of $c_2 x_i$ will be $c_2\bar{x}$. We can write:

$$\tilde{\beta}_1 = \frac{\sum_{i=1}^{n}(c_2 x_i - c_2\bar{x})(c_1 y_i - c_1\bar{y})}{\sum_{i=1}^{n}(c_2 x_i - c_2\bar{x})^2}$$

Factoring out $c_1$ and $c_2$:

$$\tilde{\beta}_1 = \frac{\sum_{i=1}^{n} c_2 c_1(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n} c_2^2(x_i - \bar{x})^2} = \frac{c_2 c_1\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{c_2^2\sum_{i=1}^{n}(x_i - \bar{x})^2} = \frac{c_1}{c_2}\cdot\frac{\sum_{i=1}^{n}(x_i - \bar{x})(y_i - \bar{y})}{\sum_{i=1}^{n}(x_i - \bar{x})^2} = \frac{c_1}{c_2}\hat{\beta}_1$$

For $\tilde{\beta}_0$:

$$\tilde{\beta}_0 = c_1\bar{y} - \tilde{\beta}_1 c_2\bar{x} = c_1\bar{y} - \frac{c_1}{c_2}\hat{\beta}_1 c_2\bar{x} = c_1\bar{y} - c_1\hat{\beta}_1\bar{x} = c_1(\bar{y} - \hat{\beta}_1\bar{x}) = c_1\hat{\beta}_0$$

(b) What does this imply about the effect of a linear scaling of variables, such as a change of units, on estimates?

The slope coefficient will be scaled in direct proportion to any scaling of the dependent variable ($c_1$), and in an inverse fashion to scaling of the explanatory variable ($\frac{1}{c_2}$). The intercept term will be scaled in direct proportion to the dependent variable scaling ($c_1$).

(c) Consider a regression of $c_1 + y_i$ on $c_2 + x_i$. Find the estimators $\tilde{\beta}_0$ and $\tilde{\beta}_1$ and show that they can be expressed as functions of $\hat{\beta}_0$ and $\hat{\beta}_1$.

Similar to (a) above. Note that the average of $c_1 + y_i$ will be $c_1 + \bar{y}$ (likewise for $c_2 + x_i$). The equation for $\tilde{\beta}_1$ will have $\left((c_2 + x_i) - (c_2 + \bar{x})\right) = (x_i - \bar{x})$ and $\left((c_1 + y_i) - (c_1 + \bar{y})\right) = (y_i - \bar{y})$. Both $c_1$ and $c_2$ will drop out of the equation for $\tilde{\beta}_1$, so we find that $\tilde{\beta}_1 = \hat{\beta}_1$. For $\tilde{\beta}_0$:

$$\tilde{\beta}_0 = (c_1 + \bar{y}) - \tilde{\beta}_1(c_2 + \bar{x}) = \bar{y} - \tilde{\beta}_1\bar{x} + c_1 - \tilde{\beta}_1 c_2$$

and with $\tilde{\beta}_1 = \hat{\beta}_1$:

$$\tilde{\beta}_0 = \bar{y} - \hat{\beta}_1\bar{x} + c_1 - \hat{\beta}_1 c_2 = \hat{\beta}_0 + c_1 - c_2\hat{\beta}_1$$
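Both scaling results are easy to confirm numerically (verification sketch; reuses the question 2 sample with arbitrary constants $c_1$ and $c_2$).

```python
# Checking the scaling results of question 10 on the question 2 sample
# (verification sketch; c1 and c2 are arbitrary constants).
import numpy as np

def ols(x, y):
    """Return (b0, b1) from a simple OLS regression of y on x."""
    b1 = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
    return y.mean() - b1 * x.mean(), b1

x = np.array([2.0, 5.0, 8.0, 10.0, 15.0])
y = np.array([8.0, 14.0, 16.0, 18.0, 24.0])
c1, c2 = 3.0, 2.0
b0, b1 = ols(x, y)

# (a) multiplicative rescaling: slope -> (c1/c2)*b1, intercept -> c1*b0
s0, s1 = ols(c2 * x, c1 * y)
print(np.isclose(s1, (c1 / c2) * b1), np.isclose(s0, c1 * b0))   # True True

# (c) additive shifts: slope unchanged, intercept -> b0 + c1 - c2*b1
a0, a1 = ols(c2 + x, c1 + y)
print(np.isclose(a1, b1), np.isclose(a0, b0 + c1 - c2 * b1))     # True True
```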