Testing Nonlinear Restrictions
Aaron Enriquez, Rohini Ghosh, James Inwood

The Goal
Test a hypothesis of the form:
$$H_0: c(\beta) = q \qquad H_1: c(\beta) \neq q,$$
where $c(\beta)$ is a single nonlinear function of the parameters.

Real World Application: Long-Run Marginal Propensity to Consume
• Consumption function with different short- and long-run MPCs:
$$\ln C_t = \alpha + \beta \ln Y_t + \gamma \ln C_{t-1} + \varepsilon_t$$
• The short-run MPC is $\beta$; the long-run MPC is $\delta = \beta/(1-\gamma)$
• We could test the hypothesis $\delta = 1$:
$$H_0: \frac{\beta}{1-\gamma} = 1$$
(Greene, p. 132)

How is it done?
• Form the Wald-like statistic:
$$W = [c(\hat{\beta}) - q]' \{\operatorname{Var}[c(\hat{\beta})]\}^{-1} [c(\hat{\beta}) - q] \sim \chi^2(J),$$
where $J$ is the number of restrictions (here $J = 1$).
(Aadland, Inference and Prediction, p. 7)

That variance… gross.
Calculating $\operatorname{Var}[c(\hat{\beta})]$, the variance of a nonlinear function, is
• potentially time-consuming
• potentially tricky
So, let's approximate it!

Taylor Series Approximation
$$c(\hat{\beta}) \approx c(\beta) + \left(\frac{\partial c(\beta)}{\partial \beta}\right)'(\hat{\beta} - \beta)$$
Recall:
• The Taylor series expansion of a function around a value $\alpha$ is:
$$f(x) = f(\alpha) + f'(\alpha)(x-\alpha) + \frac{f''(\alpha)}{2!}(x-\alpha)^2 + \cdots$$
• We can usually drop the higher-order terms, which leaves us with:
$$f(x) \approx f(\alpha) + f'(\alpha)(x-\alpha)$$
Note:
• If $\operatorname{plim}\hat{\beta} = \beta$, we can use $c(\hat{\beta})$ as an estimate of $c(\beta)$ (Slutsky's theorem)
• This is an application of the Delta Method
(Greene, eq. (5-34), p. 132)

Variance Estimation
$$c(\hat{\beta}) \approx c(\beta) + \left(\frac{\partial c(\beta)}{\partial \beta}\right)'(\hat{\beta} - \beta)$$
$$\operatorname{Var}[c(\hat{\beta})] \approx \left(\frac{\partial c(\beta)}{\partial \beta}\right)' \operatorname{Var}[\hat{\beta}] \left(\frac{\partial c(\beta)}{\partial \beta}\right)$$
We can use $s^2(X'X)^{-1}$ to estimate $\operatorname{Var}[\hat{\beta}]$, the variance of the estimator.
(Greene, eq. (5-34), p. 132)

Let's see a math application.
Consider the nonlinear function $y = x^2$. We know $f(x) = x^2$ and $f'(x) = 2x$. We then have:
$$\operatorname{Var}(y) \approx (2\mu_x)^2 \operatorname{Var}(x) = 4\mu_x^2\sigma_x^2$$
Note:
• See the MATLAB example (http://www.math.montana.edu/~parker/PattersonStats/Delta.pdf). A Python sketch of this check also follows the final slide.

Testing a nonlinear hypothesis
• Generalize the Wald test to test a single nonlinear restriction on $\beta$:
$$H_0: c(\beta) = q \qquad H_1: c(\beta) \neq q,$$
where $c(\beta)$ is a nonlinear function of the regression coefficient(s)
• Assume
  • $c(\beta)$ is continuously differentiable
  • $G = \partial c(\beta)/\partial \beta'$ is full rank

The Wald test
• Using our estimated variance, form the Wald statistic:
$$W = [c(\hat{\beta}) - q]' \{\text{Est. Asy. Var}[c(\hat{\beta})]\}^{-1} [c(\hat{\beta}) - q] \overset{\text{asy}}{\sim} \chi^2(J)$$
Note: if the sample size is large enough (large-sample properties),
• the test does not need the assumption of normality
• the test is (asymptotically) invariant to how the hypothesis is formulated
• the test statistic has a chi-squared distribution with degrees of freedom equal to the number of restrictions
(A worked Python sketch of this test for the long-run MPC restriction follows the final slide.)

Generalized Wald test (multiple restrictions)
• Let $c(\beta)$ be a set of $J$ functions of the estimated parameter vector
• Let the $J \times K$ matrix of derivatives of $c(\beta)$ be $G = \partial c(\beta)/\partial \beta'$
• The estimate of the asymptotic covariance matrix is
$$\text{Est. Asy. Var}[c(\hat{\beta})] = \hat{G}\,(\operatorname{Var}[\hat{\beta}])\,\hat{G}'$$
• Form the Wald statistic
• Reject the null if the statistic is sufficiently large
(A Python sketch of the $J$-restriction case also follows the final slide.)

Wald-statistic properties
• For example, consider the restriction $c(\beta) = q$, which could be specified as $\beta_1\beta_2 = 1$ or as $\beta_1\beta_2 - 1 = 0$
• If $n$ is large, both formulations will result in essentially the same statistic
• If the Wald statistics from different formulations differ noticeably, the choice of formulation can affect the result of the test
Note:
• See the MATLAB example.

Thank you!
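Appendix: Python sketches

As a quick check of the delta-method formula on the math-application slide, here is a minimal Python sketch. The slides reference a MATLAB example; this Python version, with made-up values for $\mu_x$ and $\sigma_x$, is an assumed stand-in, not the linked example.

```python
# A minimal sketch (not the linked MATLAB example): checking the delta-method
# approximation Var(y) ≈ (2*mu_x)^2 * Var(x) for y = x^2 against simulation.
# The values mu_x = 3 and sigma_x = 0.5 are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)
mu_x, sigma_x = 3.0, 0.5

# Delta method: f(x) = x^2, f'(x) = 2x evaluated at mu_x
var_delta = (2 * mu_x) ** 2 * sigma_x ** 2      # = 4 * mu_x^2 * sigma_x^2

# Simulation check
x = rng.normal(mu_x, sigma_x, size=1_000_000)
var_sim = np.var(x ** 2)

print(f"delta-method variance: {var_delta:.3f}")
print(f"simulated variance:    {var_sim:.3f}")
```

The two numbers agree up to the higher-order terms dropped in the Taylor expansion, which shrink as $\sigma_x$ gets small relative to $\mu_x$.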
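The next sketch walks the long-run MPC example end to end: simulate data from the consumption equation, run OLS, apply the delta method to $\hat{\beta}/(1-\hat{\gamma})$, and form the single-restriction Wald statistic. The data-generating values, sample size, and variable names are assumptions for illustration; this is not Greene's data set.

```python
# A hedged sketch of the single-restriction Wald test for the long-run MPC,
# H0: beta / (1 - gamma) = 1, on simulated data (all numbers are illustrative).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 500

# Simulate ln C_t = alpha + beta*ln Y_t + gamma*ln C_{t-1} + eps_t
alpha, beta, gamma = 0.2, 0.6, 0.3
lnY = rng.normal(5.0, 0.5, size=n + 1)
lnC = np.zeros(n + 1)
lnC[0] = 4.6                                    # start near the steady state
for t in range(1, n + 1):
    lnC[t] = alpha + beta * lnY[t] + gamma * lnC[t - 1] + rng.normal(0, 0.05)

# OLS of ln C_t on [1, ln Y_t, ln C_{t-1}]
y = lnC[1:]
X = np.column_stack([np.ones(n), lnY[1:], lnC[:-1]])
b = np.linalg.solve(X.T @ X, X.T @ y)           # [alpha_hat, beta_hat, gamma_hat]
resid = y - X @ b
s2 = resid @ resid / (n - X.shape[1])
V = s2 * np.linalg.inv(X.T @ X)                 # estimated Var(beta_hat)

# Nonlinear restriction c(b) = beta/(1 - gamma), q = 1
a_hat, b_hat, g_hat = b
c_hat = b_hat / (1 - g_hat)
grad = np.array([0.0, 1 / (1 - g_hat), b_hat / (1 - g_hat) ** 2])   # dc/d(alpha,beta,gamma)
var_c = grad @ V @ grad                         # delta-method variance of c(b_hat)
W = (c_hat - 1.0) ** 2 / var_c                  # Wald statistic, chi^2(1) under H0
p_value = stats.chi2.sf(W, df=1)
print(f"long-run MPC estimate: {c_hat:.3f}, W = {W:.3f}, p = {p_value:.3f}")
```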
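Finally, a sketch of the generalized ($J$-restriction) Wald statistic $W = [c(\hat{\beta}) - q]'[\hat{G}\operatorname{Var}(\hat{\beta})\hat{G}']^{-1}[c(\hat{\beta}) - q]$, with the Jacobian $\hat{G}$ built by finite differences. The restriction function, coefficient vector, and covariance matrix below are hypothetical placeholders.

```python
# A minimal sketch of the generalized Wald test with J nonlinear restrictions.
# c(), q, and the numbers below are hypothetical; G is built by finite differences.
import numpy as np
from scipy import stats

def wald_nonlinear(b, V, c_func, q, eps=1e-6):
    """Wald statistic for H0: c(b) = q given coefficients b and their covariance V."""
    c0 = np.atleast_1d(c_func(b))
    J, K = c0.size, b.size
    G = np.zeros((J, K))
    for k in range(K):                          # numerical Jacobian, one column per coefficient
        bp = b.copy()
        bp[k] += eps
        G[:, k] = (np.atleast_1d(c_func(bp)) - c0) / eps
    diff = c0 - np.atleast_1d(q)
    W = diff @ np.linalg.solve(G @ V @ G.T, diff)
    return W, stats.chi2.sf(W, df=J)

# Example with J = 2 restrictions on K = 3 coefficients (all values made up)
def restrictions(b):
    # A long-run-MPC-style ratio and a second, arbitrary product restriction
    return np.array([b[1] / (1 - b[2]), b[1] * b[2]])

b_hat = np.array([0.2, 0.6, 0.3])
V_hat = np.diag([0.01, 0.004, 0.003])
W, p = wald_nonlinear(b_hat, V_hat, restrictions, q=np.array([1.0, 0.2]))
print(f"W = {W:.3f}, p = {p:.3f}")
```

When $c(\beta)$ is simple, as in the MPC example, an analytic gradient can replace the finite-difference Jacobian; the finite-difference version is shown here only because it extends mechanically to any differentiable set of restrictions.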