Final Exam in Econometrics
ECON 837
April 15th, 2015
• Part 1 [20 points]: solve both exercises.
Exercise 1 [10]
Consider the linear regression model

yi = x′i θ0 + ui ,  i = 1, · · · , n   (1)

where xi ∈ Rd can be partitioned into two subvectors x(1)i ∈ Rd1 and x(2)i ∈ Rd2 (d = d1 + d2) with corresponding coefficient vectors θ0,1 and θ0,2 . Suppose that the object of principal interest is θ0,1 . Suppose that xi is endogenous and that a vector of mean-zero instruments zi ∈ Rdz is available such that E[zi ui ] = 0 and for which also E[zi x(2)′i ] = 0. Assume iid data.
If d1 ≤ dz < d then E[zi x′i ] cannot possibly have full column rank because the linear system is underidentified. Moreover, zi is uncorrelated with the regressors x(2)i .
Show that θ0,1 can nevertheless be consistently estimated by proposing a consistent estimator θ̂1 of θ0,1 , and derive its limit distribution.
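As an illustration of the setup (not part of the exam), the following Monte Carlo sketch uses a hypothetical DGP with d1 = d2 = dz = 1; the true values, coefficients, and error structure below are all assumptions made for the simulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical DGP: x1 is endogenous, z is a valid instrument for x1,
# and z is uncorrelated with the other regressor x2 (as in the exam).
z = rng.normal(size=n)
x2 = rng.normal(size=n)
u = rng.normal(size=n)
x1 = 0.8 * z + 0.5 * u + rng.normal(size=n)   # endogenous: correlated with u
theta01, theta02 = 1.0, -2.0                  # assumed true coefficients
y = theta01 * x1 + theta02 * x2 + u

# Since E[z u] = 0 and E[z x2'] = 0, E[z y] = E[z x1'] theta01, so the
# just-identified IV estimator (z'x1)^{-1} z'y recovers theta01 even
# though theta02 is not identified by z.
theta1_iv = (z @ y) / (z @ x1)
theta1_ols = (x1 @ y) / (x1 @ x1)   # inconsistent: ignores the endogeneity
print(theta1_iv, theta1_ols)
```

The simulation only illustrates consistency of the IV moment; the exam still asks for the formal argument and the limit distribution.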
Exercise 2 [10]
Questions (a) and (b) are independent.
(a) Let X1 , X2 , ..., Xn be an iid random sample from a population with mean µ and variance σ². Find an unbiased estimator for µ². Explain your reasoning carefully.
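One standard candidate (sketched here as a Monte Carlo check, not the required analytical argument) uses E[X̄²] = µ² + σ²/n, so X̄² − S²/n is unbiased for µ²; the values µ = 2, σ = 3, n = 10 below are assumptions for the simulation.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 2.0, 3.0, 10, 200_000

# Draw reps independent samples of size n and evaluate the candidate
# estimator on each; its Monte Carlo mean should sit near mu^2 = 4.
x = rng.normal(mu, sigma, size=(reps, n))
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)     # unbiased sample variance
ests = xbar ** 2 - s2 / n      # candidate unbiased estimator of mu^2
print(ests.mean())
```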
(b) Let X1 , X2 , ... be a random sequence that converges in probability to a constant a. Assume that Pr[Xi > 0] = 1 for all i. Without using the Slutsky Theorem, show that the random sequence Y1 , Y2 , ... defined by Yi = √Xi converges in probability to √a.
• Part 2 [40 points]: solve 2 out of the following 3 exercises.
Exercise 3 [20]
Suppose the DGP is yi = α + βxi + εi , but yi and xi are observed with error. In particular,
you observe y∗i = yi + νi and x∗i = xi + ηi . Assume that

(xi , εi , νi , ηi )′ ∼ iid with mean (µ, 0, 0, 0)′ and covariance matrix diag(σx², σε², σν², ση²).
Consider the least squares estimator of β in the regression of yi∗ on x∗i and a constant.
(a) Suppose ση² = 0 and σν² ≠ 0. Is the least squares estimator of β consistent? Show it.
(b) Suppose ση² ≠ 0 and σν² ≠ 0. Is the least squares estimator of β consistent? Show it.
(c) Which measurement error is causing the problem? Explain.
(d) Now suppose you have access to another set of measurements on the variable that is causing the problem. This new measurement is subject to errors ξi that are independent of xi , εi , ηi , and νi . Can you use this to get a consistent estimate of β? Explain.
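The three cases can be seen numerically in a Monte Carlo sketch (an illustration, not part of the exam); the true values α = 1, β = 2 and unit error variances below are assumptions made for the simulation.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000
alpha, beta = 1.0, 2.0          # assumed true values for the sketch

x = rng.normal(3.0, 1.0, size=n)
eps = rng.normal(size=n)
y = alpha + beta * x + eps

nu = rng.normal(size=n)         # measurement error in y
eta = rng.normal(size=n)        # measurement error in x
xi_err = rng.normal(size=n)     # error in a second measurement of x (part d)

y_star = y + nu
x_star = x + eta
x_star2 = x + xi_err            # hypothetical second measurement of x

def slope(xv, yv):
    xc, yc = xv - xv.mean(), yv - yv.mean()
    return (xc @ yc) / (xc @ xc)   # OLS slope in a regression with a constant

b_a = slope(x, y_star)          # (a): error in y only -> still consistent
b_b = slope(x_star, y_star)     # (b): error in x -> attenuated toward zero
# (d): use the second measurement as an instrument for x_star.
b_d = np.cov(x_star2, y_star)[0, 1] / np.cov(x_star2, x_star)[0, 1]
print(b_a, b_b, b_d)
```

With unit variances throughout, the attenuated slope tends to β·σx²/(σx² + ση²) = 1, while the IV slope in (d) tends back to β = 2.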
Exercise 4 [20]
Consider the linear model, y = Xβ+ϵ, with E(ϵ|X) = 0 and E(ϵϵ′ |X) = Ω, where the matrix
Ω is unknown. We want to test the simple hypothesis H0 : β1 = 0 against the alternative
H1 : β1 ≠ 0, where β1 is the first component of the vector β . The quasi-generalized least squares (hereafter QGLS) estimator is the most efficient and is defined as:

β̃ = (X′Ω̂⁻¹X)⁻¹ X′Ω̂⁻¹y
(a) Provide a test (test statistic, distribution and decision rule) for H0 that uses the QGLS
estimator β̃1 .
(b) What is the asymptotic power function of the test stated in (a)?
(c) Assume that the alternative hypothesis is local to the null hypothesis:

H2 : β1 = c/√n

for some c ≠ 0. Find the power function of the test stated in (a) as a function of c.
(d) Suppose that the applied researcher is thinking about designing a test based on the OLS
estimator, β̂ = (X ′ X)−1 X ′ y . Provide a valid test (test statistic, distribution, decision rule)
for H0 . What is the asymptotic power function of this test against alternative H1 ?
(e) Find the power function of the test stated in (d) against the local alternative H2 .
(f) Given your previous answers, which test is preferred? Justify your answer.
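To illustrate parts (d) and (e), a minimal Monte Carlo sketch (not part of the exam) of a t/Wald test based on OLS with White heteroskedasticity-robust standard errors; the DGP, the value c = 2, and the skedastic function are all assumptions for the simulation.

```python
import numpy as np

rng = np.random.default_rng(3)
n, reps, c, crit = 200, 4000, 2.0, 1.96   # 1.96 ~ 5% two-sided normal critical value

def wald_reject(b1_true):
    # One draw from a hypothetical heteroskedastic DGP (zero intercept).
    x = rng.normal(size=n)
    u = rng.normal(size=n) * np.sqrt(0.5 + x**2)
    y = b1_true * x + u
    X = np.column_stack([x, np.ones(n)])
    bhat = np.linalg.solve(X.T @ X, X.T @ y)        # OLS
    res = y - X @ bhat
    XtXi = np.linalg.inv(X.T @ X)
    V = XtXi @ (X.T * res**2) @ X @ XtXi            # White robust variance
    return abs(bhat[0]) / np.sqrt(V[0, 0]) > crit   # test of beta1 = 0

size = np.mean([wald_reject(0.0) for _ in range(reps)])
power = np.mean([wald_reject(c / np.sqrt(n)) for _ in range(reps)])
print(size, power)   # size near 0.05; power above size but well below 1
```

Under the local alternative β1 = c/√n the rejection rate settles strictly between the nominal size and one, which is the trade-off parts (c) and (e) ask you to characterize analytically.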
Exercise 5 [20]
Define the GMM objective function as Qn (θ) = gn (θ)′ Vn⁻¹ gn (θ) and the efficient GMM estimator as θ̂ = arg minθ∈Θ Qn (θ), where θ is a k × 1 parameter vector, gn (θ) are m (with m > k) sample moment conditions with

gn (θ) = (1/n) Σⁿi=1 g(xi , θ)  and  Vn = (1/n) Σⁿi=1 g(xi , θ̄)g(xi , θ̄)′

for some consistent estimator θ̄ of the true (unknown) parameter θ0 .
In general, we know that

√n(θ̂ − θ0 ) →d N (0, (Γ′ V −1 Γ)−1 ), where

Γ = E[∂g(xi , θ0 )/∂θ′]  and  V = E[g(xi , θ0 )g(xi , θ0 )′]
Suppose that we have a random sample of n observations (y1 , · · · , yn ) from a normal distribution with mean θ0 and variance θ0 .
(a) Construct two moment conditions (for the mean and the variance) in terms of the unknown parameter θ. Use these moment conditions to find the limiting expression for the variance matrix of the moment conditions V in terms of the true value θ0 and show that this matrix is diagonal. Use this matrix to define the efficient two-step GMM estimator θ̂ and derive its first-order condition.
(b) Derive the asymptotic distribution of √n(θ̂ − θ0 ) and show that its variance is given by 2θ0²/(1 + 2θ0 ).
Recall: if X is a random variable distributed as a normal with mean µ and variance σ², then:

E[(X − µ)³] = 0  and  E[(X − µ)⁴] = 3σ⁴
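The two-step procedure in (a) can be sketched numerically (an illustration, not part of the exam); the true value θ0 = 2, the grid search, and the choice of the sample mean as the preliminary estimator are assumptions for the simulation.

```python
import numpy as np

rng = np.random.default_rng(4)
theta0, n = 2.0, 100_000
y = rng.normal(theta0, np.sqrt(theta0), size=n)   # N(theta0, theta0) sample

# Stacked moments: E[y - theta] = 0 (mean) and E[(y - theta)^2 - theta] = 0 (variance).
def g(theta):
    return np.column_stack([y - theta, (y - theta) ** 2 - theta])

# Step 1: preliminary consistent estimator (sample mean) and the weight
# matrix evaluated at it; Vn should be close to diag(theta0, 2*theta0^2).
theta_bar = y.mean()
Vn = g(theta_bar).T @ g(theta_bar) / n
Vn_inv = np.linalg.inv(Vn)

# Step 2: minimize Qn(theta) = gn(theta)' Vn^{-1} gn(theta) on a grid
# (a crude sketch; a real implementation would use a numerical optimizer).
grid = np.linspace(1.5, 2.5, 1001)
Qn = []
for t in grid:
    gbar = g(t).mean(axis=0)
    Qn.append(gbar @ Vn_inv @ gbar)
theta_hat = grid[int(np.argmin(Qn))]
print(theta_hat)   # near theta0 = 2; avar = 2*theta0^2/(1+2*theta0) = 1.6 here
```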