Testing Nonlinear Restrictions
Aaron Enriquez
Rohini Ghosh
James Inwood
The Goal
Test a hypothesis of the form:
π»π‘œ : 𝐢 𝛽 = π‘ž
𝐻1 : 𝐢 𝛽 ≠ π‘ž,
where C 𝛽 is a single non-linear function.
Real World Application
Long-Run Marginal Propensity to Consume
• Consumption function with different short- and long-run MPCs:
ln Cₜ = α + β ln Yₜ + γ ln Cₜ₋₁ + εₜ
• Short-run MPC is β; long-run MPC is δ = β / (1 − γ)
• We could test the hypothesis δ = 1:
H₀: β / (1 − γ) = 1
(Greene, pg. 132)
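As a quick numeric sketch of the point estimate (all coefficient values below are made up for illustration, not estimates from any dataset):

```python
# Hypothetical estimates for illustration only (not from any dataset):
beta_hat = 0.30    # short-run MPC
gamma_hat = 0.65   # coefficient on lagged log consumption

# Long-run MPC: delta = beta / (1 - gamma)
delta_hat = beta_hat / (1 - gamma_hat)
print(delta_hat)   # about 0.857, below the hypothesized value of 1
```

Whether that gap from 1 is statistically meaningful is exactly what the Wald test on the next slides answers.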
How is it done?
• Form the Wald-like statistic:
π‘Š = 𝐢 𝛽 −π‘ž
′
−1
var 𝐢 𝛽
𝐢 𝛽 −π‘ž
~ 𝑋 2 (𝐽)
(Aadland. Inference and Prediction. pg. 7)
That variance… gross.
var[C(β̂)]
Calculating the variance of a nonlinear function is
• potentially time-consuming
• potentially tricky
So, let’s approximate it!
Taylor Series Approximation
C(β̂) ≈ C(β) + (∂C(β)/∂β)′ (β̂ − β)
Recall:
• The Taylor Series expansion of a function around a value 𝛼 is equal to:
f(x) = f(α) + f′(α)(x − α) + (f″(α)/2!)(x − α)² + ⋯
• We can usually drop the higher order terms, which leaves us with:
𝑓 π‘₯ ≈ 𝑓 𝛼 + 𝑓′(𝛼)(π‘₯ − 𝛼)
Note:
• If plim β̂ = β, we can use C(β̂) as an estimate of C(β) (Slutsky's theorem)
• This is an application of the Delta Method
(Greene. (5-34). pg. 132)
Variance Estimation
C(β̂) ≈ C(β) + (∂C(β)/∂β)′ (β̂ − β)

var[C(β̂)] ≈ (∂C(β)/∂β)′ var[β̂] (∂C(β)/∂β)

We can use s²(X′X)⁻¹ to estimate var[β̂], the variance of the estimator.
(Greene. (5-34). pg. 132)
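A minimal sketch of this delta-method variance for the long-run-MPC example. The coefficient values and the covariance matrix below are invented for illustration; in practice var(β̂) would come from the OLS output, e.g. the relevant block of s²(X′X)⁻¹:

```python
import numpy as np

# Hypothetical OLS output (illustrative values, not real estimates):
b, g = 0.30, 0.65                  # beta_hat, gamma_hat
V = np.array([[0.0040, 0.0010],    # assumed est. covariance of
              [0.0010, 0.0025]])   # (beta_hat, gamma_hat)

# C(beta) = beta / (1 - gamma); analytic gradient w.r.t. (beta, gamma)
grad = np.array([1.0 / (1 - g), b / (1 - g) ** 2])

# Delta-method variance: grad' var(beta_hat) grad
var_C = grad @ V @ grad
print(var_C)
```

The gradient here is computed analytically; for a more complicated C(β) one could difference numerically instead.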
Let’s see a math application.
Consider the nonlinear function Y = X².
We know: f(x) = x² and f′(x) = 2x.
We then have:
var(Y) ≈ (2μₓ)² var(X) = 4μₓ² σₓ²
Note:
• See the MATLAB Example.
(http://www.math.montana.edu/~parker/PattersonStats/Delta.pdf)
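A quick Monte Carlo check of this approximation (the mean, standard deviation, and sample size below are arbitrary choices; the approximation works best when σₓ is small relative to μₓ):

```python
import numpy as np

rng = np.random.default_rng(0)
mu_x, sigma_x = 3.0, 0.1                  # arbitrary illustrative values
x = rng.normal(mu_x, sigma_x, 1_000_000)

mc_var = (x ** 2).var()                   # simulated variance of Y = X^2
delta_var = 4 * mu_x ** 2 * sigma_x ** 2  # delta-method approximation
print(mc_var, delta_var)                  # the two should be close
```

For a normal X the exact variance of X² is 4μₓ²σₓ² + 2σₓ⁴, so the delta-method answer drops only the higher-order 2σₓ⁴ term.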
Testing a nonlinear hypothesis
• Generalize the Wald test to test a single nonlinear restriction on β:
π»π‘œ : 𝐢 𝛽 = π‘ž
𝐻1 : 𝐢 𝛽 ≠ π‘ž,
Where 𝐢 𝛽 is a nonlinear function of the regression coefficient(s)
• Assume
• 𝐢 𝛽 is continuously differentiable
• 𝐺=
πœ•πΆ(𝛽)
is
πœ•π›½ ′
full rank
The Wald test
• Using our estimated variance, form the Wald statistic:
W = (C(β̂) − q)′ [Est. Asy. Var(C(β̂))]⁻¹ (C(β̂) − q) ~ χ²(J) asymptotically
Note: if the sample size is large enough (large-sample properties)
• Test does not need assumption of normality
• Test is invariant to how the hypothesis is formulated
• Test-statistic has a chi-squared distribution with degrees of freedom equal to the number of
restrictions
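Putting the pieces together, here is a sketch of the single-restriction (J = 1) Wald test for H₀: β/(1 − γ) = 1, again with hypothetical numbers; the χ²(1) upper-tail probability is computed via the identity p = erfc(√(W/2)) to avoid extra dependencies:

```python
import math
import numpy as np

# Hypothetical OLS output (illustrative values, not real estimates):
b, g = 0.30, 0.65                   # beta_hat, gamma_hat
V = np.array([[0.0040, 0.0010],     # assumed est. covariance of
              [0.0010, 0.0025]])    # (beta_hat, gamma_hat)

# H0: C(beta) = beta / (1 - gamma) = 1  (long-run MPC equals one)
C = b / (1 - g)
grad = np.array([1.0 / (1 - g), b / (1 - g) ** 2])
var_C = grad @ V @ grad             # delta-method variance of C(beta_hat)

W = (C - 1.0) ** 2 / var_C          # Wald statistic, J = 1 restriction
p = math.erfc(math.sqrt(W / 2.0))   # chi-square(1) upper-tail probability
print(W, p)                         # a large p-value: fail to reject H0
```

With these made-up numbers W is small and the p-value is well above conventional significance levels, so H₀ would not be rejected.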
Generalized Wald test (multiple restrictions)
• Let 𝐢(𝛽) be a set of 𝐽 functions of the estimated parameter vector
• Let the J × K matrix of derivatives of C(β) be
G = ∂C(β)/∂β′
• Estimate of the asymptotic covariance matrix is
Est. Asy. Var[C(β̂)] = G var(β̂) G′
• Form the Wald-statistic
• Reject the null if the statistic is significantly large
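A sketch of the J-restriction version. The 3-coefficient example, the two restrictions, and every number below are assumptions invented for illustration:

```python
import numpy as np

# Hypothetical: K = 3 coefficients, J = 2 nonlinear restrictions
# H0: C(beta) = (beta1*beta2, beta2 + beta3) = (1, 1) = q
beta = np.array([2.0, 0.6, 0.5])    # illustrative estimates
V = 0.01 * np.eye(3)                # illustrative est. var(beta_hat)

C = np.array([beta[0] * beta[1], beta[1] + beta[2]])
q = np.array([1.0, 1.0])

# J x K Jacobian G = dC/dbeta'
G = np.array([[beta[1], beta[0], 0.0],
              [0.0,     1.0,    1.0]])

cov_C = G @ V @ G.T                 # Est. Asy. Var[C(beta_hat)]
d = C - q
W = d @ np.linalg.solve(cov_C, d)   # Wald statistic, ~ chi2(J = 2)
print(W)
```

Using `np.linalg.solve` rather than explicitly inverting cov_C is the usual numerically safer way to form the quadratic form.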
Wald-statistic properties
• For example, consider the restriction C(β) = q, which could be specified as
β₁β₂ = 1 or β₁β₂ − 1 = 0
• If n is large, both formulations will result in (essentially) the same statistic
• In finite samples, however, algebraically equivalent formulations can yield different Wald statistics, which could change the outcome of the test
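To illustrate the invariance point, the sketch below compares two algebraically equivalent ways of writing the long-run-MPC null, β/(1 − γ) = 1 versus β + γ = 1, using the same hypothetical numbers as in the earlier sketches; in a finite sample the two Wald statistics differ:

```python
import numpy as np

# Same hypothetical estimates as in the earlier sketches (made up):
b, g = 0.30, 0.65
V = np.array([[0.0040, 0.0010],
              [0.0010, 0.0025]])

# Two algebraically equivalent statements of the long-run-MPC null:
#   C1(beta) = beta/(1-gamma) - 1 = 0   vs.   C2(beta) = beta + gamma - 1 = 0
C1 = b / (1 - g) - 1
grad1 = np.array([1.0 / (1 - g), b / (1 - g) ** 2])
C2 = b + g - 1
grad2 = np.array([1.0, 1.0])

W1 = C1 ** 2 / (grad1 @ V @ grad1)
W2 = C2 ** 2 / (grad2 @ V @ grad2)
print(W1, W2)   # the two statistics differ in a finite sample
```

Asymptotically the two statistics converge, but in a finite sample the choice of formulation changes W, which is the non-invariance the slide warns about.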
Note:
• See the MATLAB Example.
Thank you!