8. Hypothesis Testing in the Linear Regression Model

Econometrics I
Professor William Greene
Stern School of Business
Department of Economics
8-1/50
Econometrics I
Part 8 – Interval Estimation
and Hypothesis Testing
8-2/50
Interval Estimation
b = point estimator of β
We acknowledge the sampling variability:
b = β + sampling variability induced by ε
Estimated sampling variance: Est.Var[b] = s²(X′X)⁻¹
8-3/50
The point estimate is only the best single guess.
Form an interval, or range, of plausible values:
plausible = likely values with an acceptable degree of probability.
To assign probabilities, we require a distribution for the
variation of the estimator; hence the role of the normality
assumption for ε.
8-4/50
Confidence Interval
bk = the point estimate
Std.Err[bk] = sqrt{[σ²(X′X)⁻¹]kk} = vk
Assume normality of ε for now:
bk ~ N[βk, vk²] for the true βk
(bk - βk)/vk ~ N[0,1]
Consider a range of plausible values of βk given the point
estimate bk: bk ± sampling error.
Measured in standard error units,
|(bk - βk)/vk| < z*.
Larger z* ⇒ greater probability ("confidence").
Given normality, e.g., z* = 1.96 ⇒ 95%, z* = 1.645 ⇒ 90%.
The plausible range for βk then is bk ± z* vk.
8-5/50
Computing the Confidence Interval
Assume normality of ε for now:
bk ~ N[βk, vk²] for the true βk
(bk - βk)/vk ~ N[0,1]
vk = sqrt{[σ²(X′X)⁻¹]kk} is not known because σ² must be
estimated.
Using s² instead of σ², (bk - βk)/Est.(vk) ~ t[N-K].
(Proof: the ratio of a normal to the square root of a chi-squared
divided by its degrees of freedom is pursued in your text.)
Use critical values from the t distribution instead of the standard
normal. They will be essentially the same as the normal if N > 100.
8-6/50
Confidence Interval
Critical t[.975, 29] = 2.045
Confidence interval based on t:       1.27365 ± 2.045 × .1501
Confidence interval based on normal:  1.27365 ± 1.960 × .1501
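As a quick numerical check, a minimal Python sketch (an illustration, not part of the original slides) reproduces both intervals from the reported point estimate, standard error, and degrees of freedom:

from scipy import stats

b, se, df = 1.27365, 0.1501, 29
t_star = stats.t.ppf(0.975, df)            # 2.045, the t critical value
z_star = stats.norm.ppf(0.975)             # 1.960, the normal critical value
print(b - t_star * se, b + t_star * se)    # t-based 95% interval
print(b - z_star * se, b + z_star * se)    # normal-based 95% interval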
8-7/50
Bootstrap Confidence Interval
For a Coefficient
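The original slide's results are a figure that is not reproduced here. As an illustration of the procedure, here is a minimal Python sketch of a pairs (case-resampling) percentile bootstrap for a regression slope, using simulated data since the slide's dataset is not included; replacing the least squares step with an LAD fit gives the corresponding interval for least absolute deviations:

import numpy as np

rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + rng.normal(size=n)        # simulated sample; true slope = 0.5
X = np.column_stack([np.ones(n), x])

B = 1000
slopes = np.empty(B)
for r in range(B):
    idx = rng.integers(0, n, size=n)           # resample observations with replacement
    b_r, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
    slopes[r] = b_r[1]

lo, hi = np.percentile(slopes, [2.5, 97.5])    # percentile bootstrap 95% CI
print(lo, hi)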
8-8/50
Bootstrap CI for Least Squares
8-9/50
Bootstrap CI for Least Absolute Deviations
8-10/50
Testing a Hypothesis Using
a Confidence Interval
Given the range of plausible values,
test the hypothesis that a coefficient equals
zero or some other particular value:
Is the hypothesized value in the confidence interval,
i.e., within the range of plausible values?
If not, reject the hypothesis.
8-11/50
Test a Hypothesis About a Coefficient
8-12/50
Classical Hypothesis Testing
We are interested in using the linear regression
to support or cast doubt on the validity of a
theory about the real world counterpart to our
statistical model. The model is used to test
hypotheses about the underlying data
generating process.
8-13/50
Types of Tests
Nested models: a restriction on the parameters of a particular model
y = β1 + β2x + β3z + ε,  H0: β3 = 0
Nonnested models: e.g., different RHS variables
yt = β1 + β2xt + β3xt-1 + εt
yt = γ1 + γ2xt + γ3yt-1 + wt
Specification tests:
ε ~ N[0, σ²] vs. some other distribution
8-14/50
Methodology


Bayesian
- Prior odds compare the strength of prior beliefs in two states of the world
- Posterior odds compare revised beliefs
- Symmetrical treatment of competing ideas
- Not generally practical to carry out in meaningful situations
Classical
- "Null" hypothesis given prominence
- Propose to "reject" the null in favor of the "alternative"
- Asymmetric treatment of null and alternative
- Huge gain in practical applicability
8-15/50
Neyman – Pearson Methodology
Formulate the null and alternative hypotheses.
Define the "rejection" region = the sample evidence
that will lead to rejection of the null hypothesis.
Gather the evidence.
Assess whether the evidence falls in the rejection region or not.
8-16/50
Inference in the Linear Model
Formulating hypotheses: linear restrictions as a general framework.
Hypothesis testing: J linear restrictions.
Analytical framework: y = Xβ + ε
Hypothesis: Rβ - q = 0
Substantive restrictions: what is a "testable hypothesis?"
- A substantive restriction on the parameters
- Reduces the dimension of the parameter space
- Imposition of the restriction degrades the estimation criterion
8-17/50
Testable Implications of a Theory
Investors care about interest rates and expected inflation:
I = b1 + b2r + b3dp + e
Investors care about real interest rates:
I = c1 + c2(r-dp) + c3dp + e
No testable restrictions implied: c1 = b1, c2 = b2, c3 = b2 + b3.
Investors care only about real interest rates:
I = f1 + f2(r-dp) + f3dp + e.  Testable restriction: f3 = 0.
8-18/50
The General Linear Hypothesis: H0: Rβ - q = 0
A unifying departure point: regardless of the hypothesis, least squares is unbiased.
E[b] = β
The hypothesis makes a claim about the population: Rβ - q = 0.
Then, if the hypothesis is true, E[Rb - q] = 0.
The sample statistic, Rb - q, will not equal zero exactly.
Two possibilities:
Rb - q is small enough to attribute to sampling variability.
Rb - q is too large (by some measure) to be plausibly attributed to sampling variability.
Large Rb - q defines the rejection region.
8-19/50
Approaches to Defining the Rejection Region
(1) Imposing the restrictions leads to a loss of fit: R² must go down.
Does it go down "a lot" (i.e., significantly)?
Ru² = unrestricted model fit, Rr² = restricted model fit:
F = [(Ru² - Rr²)/J] / [(1 - Ru²)/(N-K)] ~ F[J, N-K].
(2) Is Rb - q close to 0? Base the test on the discrepancy
vector m = Rb - q.
Using the Wald criterion, m′(Var[m])⁻¹m
has a chi-squared distribution with J degrees of freedom.
But Var[m] = R[σ²(X′X)⁻¹]R′.
If we use our estimate of σ², we get an F[J, N-K] instead.
(Note, this is based on using e′e/(N-K) to estimate σ².)
These two approaches give the same statistic for the linear
model, as the sketch below illustrates.
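A minimal Python sketch (simulated data and a hypothetical two-restriction null, not the slides' application) computes F both ways:

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, K, J = 100, 4, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
y = X @ np.array([1.0, 0.5, 0.0, 0.0]) + rng.normal(size=n)  # last two slopes truly 0

b, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ b
s2 = e @ e / (n - K)

# Wald form of H0: Rb - q = 0 for the last two coefficients
R = np.array([[0., 0., 1., 0.],
              [0., 0., 0., 1.]])
m = R @ b - np.zeros(J)
Vm = R @ (s2 * np.linalg.inv(X.T @ X)) @ R.T
F_wald = (m @ np.linalg.solve(Vm, m)) / J

# Loss-of-fit form: the restricted model drops the last two columns
b_r, *_ = np.linalg.lstsq(X[:, :2], y, rcond=None)
e_r = y - X[:, :2] @ b_r
F_fit = ((e_r @ e_r - e @ e) / J) / (e @ e / (n - K))

print(F_wald, F_fit, stats.f.sf(F_fit, J, n - K))  # the two F values agree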
8-20/50
Testing Fundamentals - I
SIZE of a test = the probability that it will incorrectly reject a "true" null hypothesis.
This is the probability of a Type I error.
Example: under the null hypothesis, an F statistic with (3,100) degrees
of freedom has an F[3,100] distribution. Even if the null is true, F will
be larger than the 5% critical value of 2.7 about 5% of the time.
8-21/50
Testing Procedures
How to determine if the statistic is 'large.'
Need a 'null distribution.'
If the hypothesis is true, then the statistic will have a certain
distribution. This tells you how likely certain values are,
and in particular, if the hypothesis is true, then 'large
values' will be unlikely.
If the observed statistic is too large, conclude that the
assumed distribution must be incorrect and the
hypothesis should be rejected.
For the linear regression model with normally distributed
disturbances, the distribution of the relevant statistic is F
with J and N-K degrees of freedom.
8-22/50
Distribution Under the Null
[Figure: density of the F[3,100] distribution]
8-23/50
A Simulation Experiment
sample    ; 1 - 100 $
matrix    ; fvalues = init(1000,1,0) $
proc$
create    ; fakelogc = rnn(-.319557, 1.54236) $     Coefficients all = 0
regress   ; quietly ; lhs = fakelogc                Compute regression
          ; rhs = one,logq,logpl_pf,logpk_pf $
calc      ; fstat = (rsqrd/3)/((1-rsqrd)/(n-4)) $   Compute F
matrix    ; fvalues(i) = fstat $                    Save 1000 Fs
endproc
execute   ; i = 1,1000 $                            1000 replications
histogram ; rhs = fvalues
          ; title = F Statistic for H0: b2=b3=b4=0 $
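For readers without the software above, a rough Python equivalent of the same experiment (generic standard normal regressors stand in for logq, logpl_pf, logpk_pf):

import numpy as np

rng = np.random.default_rng(2)
n, reps = 100, 1000
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])

fvalues = np.empty(reps)
for i in range(reps):
    y = rng.normal(-0.319557, 1.54236, size=n)  # y unrelated to X: slopes all truly 0
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    e = y - X @ b
    r2 = 1.0 - (e @ e) / np.sum((y - y.mean()) ** 2)
    fvalues[i] = (r2 / 3) / ((1 - r2) / (n - 4))

print(np.mean(fvalues > 2.7))                   # rejection rate, about .05 under H0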
8-24/50
Simulation Results
48 of the 1000 outcomes fell to the right of 2.7
in this run of the experiment.
8-25/50
Testing Fundamentals - II
POWER of a test = the probability that it will
correctly reject a "false" null hypothesis.
This is 1 - the probability of a Type II error.
The power of a test depends on the specific alternative.
8-26/50
Power of a Test
[Figure: densities of the test statistic under the null (N0) and two
alternative means (N1, N2), plotted against BETA]
Null: mean = 0. Reject if the observed mean is < -1.96 or > +1.96.
Prob(Reject null | mean = 0)  = 0.05
Prob(Reject null | mean = .5) = 0.07902
Prob(Reject null | mean = 1)  = 0.170066
The power increases as the (alternative) mean rises.
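These probabilities follow directly from the normal distribution of the statistic; a short Python sketch (illustrative) reproduces them:

from scipy import stats

for mu in (0.0, 0.5, 1.0):
    # statistic ~ N(mu, 1); reject when it falls outside (-1.96, +1.96)
    power = stats.norm.cdf(-1.96 - mu) + stats.norm.sf(1.96 - mu)
    print(mu, power)    # 0.05, about .0790, 0.170066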
8-27/50
Test Statistic
For the fit measures, use a normalized measure of
the loss of fit:
F[J, N-K] = [(Ru² - Rr²)/J] / [(1 - Ru²)/(N-K)]  ≥ 0, since Ru² ≥ Rr².
Often useful: Ru² = 1 - eu′eu/Syy and Rr² = 1 - er′er/Syy.
Insert these in F and it becomes
F[J, N-K] = [(er′er - eu′eu)/J] / [eu′eu/(N-K)]  ≥ 0, since er′er ≥ eu′eu.
8-28/50
Test Statistics
Forming test statistics:
For distance measures, use a Wald-type distance measure,
W = m′[Est.Var(m)]⁻¹m.
An important relationship between t and F:
for a single restriction, m = r′b - q, the variance is r′(Var[b])r,
and the distance measure is t = m / standard error of m.
8-29/50
An important relationship between t and F
F[J, N-K] = (Chi-squared[J]/J) / (Chi-squared[N-K]/(N-K)),
where the two chi-squared variables are independent.
If J = 1, i.e., testing a single restriction,
F[1, N-K] = (Chi-squared[1]/1) / (Chi-squared[N-K]/(N-K))
          = (N[0,1])² / (Chi-squared[N-K]/(N-K))
          = { N[0,1] / sqrt(Chi-squared[N-K]/(N-K)) }² = (t[N-K])².
For a single restriction, F[1, N-K] is the square of the t ratio.
8-30/50
Application
Time series regression:
LogG = β1 + β2 logY + β3 logPG
     + β4 logPNC + β5 logPUC + β6 logPPT
     + β7 logPN + β8 logPD + β9 logPS + ε
Period = 1960-1995. Note that all coefficients in the model are elasticities.
8-31/50
Full Model
----------------------------------------------------------------------
Ordinary     least squares regression ............
LHS=LG       Mean                 =        5.39299
             Standard deviation   =         .24878
             Number of observs.   =             36
Model size   Parameters           =              9
             Degrees of freedom   =             27
Residuals    Sum of squares       =         .00855 <*******
             Standard error of e  =         .01780 <*******
Fit          R-squared            =         .99605 <*******
             Adjusted R-squared   =         .99488 <*******
--------+-------------------------------------------------------------
Variable| Coefficient    Standard Error   t-ratio  P[|T|>t]  Mean of X
--------+-------------------------------------------------------------
Constant|   -6.95326***      1.29811       -5.356    .0000
      LY|    1.35721***       .14562        9.320    .0000    9.11093
     LPG|    -.50579***       .06200       -8.158    .0000     .67409
    LPNC|    -.01654          .19957        -.083    .9346     .44320
    LPUC|    -.12354*         .06568       -1.881    .0708     .66361
    LPPT|     .11571          .07859        1.472    .1525     .77208
     LPN|    1.10125***       .26840        4.103    .0003     .60539
     LPD|     .92018***       .27018        3.406    .0021     .43343
     LPS|   -1.09213***       .30812       -3.544    .0015     .68105
--------+-------------------------------------------------------------
8-32/50
Test About One Parameter
Is the price of public transportation really relevant? H0: β6 = 0.
Confidence interval: b6 ± t[.975, 27] × standard error
                   = .11571 ± 2.052(.07859)
                   = .11571 ± .16127 = (-.04556, .27698)
The interval contains 0.0, so do not reject the hypothesis.
Regression fit if we drop it? Without LPPT, R-squared = .99573.
Compare R²: it was .99605.
F(1,27) = [(.99605 - .99573)/1] / [(1 - .99605)/(36-9)]
        = 2.187 ≈ 1.472² (difference due to rounding)
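A two-line Python check (illustrative) of the equality up to rounding:

F = ((.99605 - .99573) / 1) / ((1 - .99605) / (36 - 9))
print(F, 1.472 ** 2)    # 2.187 vs 2.167, equal up to rounding in the R-squareds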
8-33/50
Hypothesis Test: Sum of Coefficients = 1?
(The unrestricted least squares results from 8-32 are repeated here
as the point of departure for the test.)
8-34/50
Imposing the Restriction
----------------------------------------------------------------------
Linearly restricted regression
LHS=LG       Mean                 =       5.392989
             Standard deviation   =       .2487794
             Number of observs.   =             36
Model size   Parameters           =              8  <*** 9 - 1 restriction
             Degrees of freedom   =             28
Residuals    Sum of squares       =       .0112599  <*** With the restriction
Residuals    Sum of squares       =       .0085531  <*** Without the restriction
Fit          R-squared            =       .9948020
Restrictns.  F[  1,    27] (prob) =       8.5(.01)
Note: If not using OLS or there is no constant, R-squared & F may be < 0.
--------+-------------------------------------------------------------
Variable| Coefficient    Standard Error   t-ratio  P[|T|>t]  Mean of X
--------+-------------------------------------------------------------
Constant|  -10.1507***       .78756      -12.889    .0000
      LY|    1.71582***       .08839       19.412    .0000    9.11093
     LPG|    -.45826***       .06741       -6.798    .0000     .67409
    LPNC|     .46945***       .12439        3.774    .0008     .44320
    LPUC|    -.01566          .06122        -.256    .8000     .66361
    LPPT|     .24223***       .07391        3.277    .0029     .77208
     LPN|    1.39620***       .28022        4.983    .0000     .60539
     LPD|     .23885          .15395        1.551    .1324     .43343
     LPS|   -1.63505***       .27700       -5.903    .0000     .68105
--------+-------------------------------------------------------------
F = [(.0112599 - .0085531)/1] / [.0085531/(36-9)] = 8.544691
8-35/50
Joint Hypotheses
Joint hypothesis: income elasticity = +1, own price elasticity = -1.
The hypothesis implies that logG = β1 + logY - logPg + β4 logPNC + ...
Strategy: regress logG - logY + logPg on the other variables and
compare the sums of squares.
With the two restrictions imposed:
Residuals    Sum of squares  =  .0286877
Fit          R-squared       =  .9979006
Unrestricted:
Residuals    Sum of squares  =  .0085531
Fit          R-squared       =  .9960515
F = ((.0286877 - .0085531)/2) / (.0085531/(36-9)) = 31.779951
The critical F for 95% with 2,27 degrees of freedom is 3.354.
The hypothesis is rejected.
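A short Python check (illustrative) of the F statistic and its critical value:

from scipy import stats

F = ((.0286877 - .0085531) / 2) / (.0085531 / (36 - 9))
print(F)                          # 31.78
print(stats.f.ppf(.95, 2, 27))    # 3.354, the 95% critical value, so H0 is rejected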
8-36/50
Basing the Test on R²
After building the restrictions into the model and computing
restricted and unrestricted regressions, based on the R²s:
F = ((.9960515 - .997096)/2) / ((1 - .9960515)/(36-9))
  = -3.571166 (!)
What's wrong? The unrestricted model used LHS = logG, while the
restricted one used logG - logY, so the two R²s measure fit to
different dependent variables and are not comparable. The
calculation is safe using the sums of squared residuals.
8-37/50
Wald Distance Measure
Testing more generally about a single parameter:
the sample estimate is bk; the hypothesized value is βk.
How far is βk from bk? If too far, the hypothesis is
inconsistent with the sample evidence.
Measure the distance in standard error units:
t = (bk - βk)/Est.(vk).
If t is "large" (larger than the critical value), reject the hypothesis.
8-38/50
The Wald Statistic
Most test statistics are Wald distance measures:
W = (random vector - hypothesized value)′
    [Variance of difference]⁻¹
    (random vector - hypothesized value)
  = (q - q0)′[Var(q - q0)]⁻¹(q - q0),
a normalized distance measure.
W is distributed as chi-squared[J] if (1) the distance is normally
distributed and (2) the variance matrix is the true one, not the estimate.
8-39/50
Robust Tests
The Wald test generally will (when properly constructed) be more
robust to failures of the narrow model assumptions than the t or F.
Reason: it is based on "robust" variance estimators and asymptotic
results that hold in a wide range of circumstances.
Analysis: later in the course, after developing the asymptotics.
8-40/50
Particular Cases
Some particular cases:
One coefficient equals a particular value:
F = [(b - value) / standard error of b]² = the square of the familiar t ratio.
The relationship is F[1, d.f.] = t²[d.f.].
A linear function of coefficients equals a particular value:
F = (linear function of coefficients - value)² / variance of the linear function.
Note the square of the distance in the numerator.
If the linear function is Σk wk bk, the variance is Σk Σl wk wl Cov[bk, bl].
This is the Wald statistic, and also the square of the somewhat familiar t statistic.
Several linear functions: use the Wald or F statistic; loss-of-fit measures
may be easier to compute.
8-41/50
Hypothesis Test: Sum of Coefficients
Do the three aggregate price elasticities sum to zero?
H0: β7 + β8 + β9 = 0
R = [0, 0, 0, 0, 0, 0, 1, 1, 1],  q = [0]
--------+-------------------------------------------------------------
Variable| Coefficient    Standard Error   t-ratio  P[|T|>t]  Mean of X
--------+-------------------------------------------------------------
     LPN|    1.10125***       .26840        4.103    .0003     .60539
     LPD|     .92018***       .27018        3.406    .0021     .43343
     LPS|   -1.09213***       .30812       -3.544    .0015     .68105
8-42/50
Wald Test
8-43/50
Using the Wald Statistic
Joint hypothesis:  b(LY) = 1,  b(LPG) = -1

--> Matrix ; R = [0,1,0,0,0,0,0,0,0 /
                  0,0,1,0,0,0,0,0,0]$
--> Matrix ; q = [1/-1]$
--> Matrix ; list ; m = R*b - q $
Matrix M        has  2 rows and  1 columns.
               1
     +-------------+
    1|      .35721
    2|      .49421
     +-------------+
--> Matrix ; list ; vm = R*varb*R' $
Matrix VM       has  2 rows and  2 columns.
               1             2
     +-------------+-------------+
    1|      .02120        .00291
    2|      .00291        .00384
     +-------------+-------------+
--> Matrix ; list ; w = m'<vm>m $
Matrix W        has  1 rows and  1 columns.
               1
     +-------------+
    1|    63.55962
     +-------------+
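As a cross-check, recomputing W = m′(VM)⁻¹m in Python from the listed m and VM (illustrative):

import numpy as np

m  = np.array([.35721, .49421])
vm = np.array([[.02120, .00291],
               [.00291, .00384]])
w = m @ np.linalg.solve(vm, m)    # m' inv(VM) m
print(w)                          # about 63.6, matching 63.55962 up to rounding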
8-44/50
Application: Cost Function
8-45/50
Regression Results
----------------------------------------------------------------------
Ordinary     least squares regression ............
LHS=C        Mean                 =        3.07162
             Standard deviation   =        1.54273
             Number of observs.   =            158
Model size   Parameters           =              9
             Degrees of freedom   =            149
Residuals    Sum of squares       =        2.56313
             Standard error of e  =         .13116
Fit          R-squared            =         .99314
             Adjusted R-squared   =         .99277
--------+-------------------------------------------------------------
Variable| Coefficient    Standard Error   t-ratio  P[|T|>t]  Mean of X
--------+-------------------------------------------------------------
Constant|    5.22853         3.18024        1.644    .1023
       K|    -.21055          .21757        -.968    .3348    4.25096
       L|    -.85353***       .31485       -2.711    .0075    8.97280
       F|     .18862          .20225         .933    .3525    3.39118
       Q|    -.96450***       .36798       -2.621    .0097    8.26549
      Q2|     .05250***       .00459       11.430    .0000    35.7913
      QK|     .04273          .02758        1.550    .1233    35.1677
      QL|     .11698***       .03728        3.138    .0021    74.2063
      QF|     .05950**        .02478        2.401    .0176    28.0108
--------+-------------------------------------------------------------
Note: ***, **, * = Significance at 1%, 5%, 10% level.
----------------------------------------------------------------------
8-46/50
Price Homogeneity: Only Price Ratios Matter
β2 + β3 + β4 = 1. β7 + β8 + β9 = 0.
----------------------------------------------------------------------
Linearly restricted regression ....................
LHS=C        Mean                 =        3.07162
             Standard deviation   =        1.54273
             Number of observs.   =            158
Model size   Parameters           =              7
             Degrees of freedom   =            151
Residuals    Sum of squares       =        2.85625
             Standard error of e  =         .13753
Fit          R-squared            =         .99236
Restrictns.  F[  2,   149] (prob) =      8.5(.0003)
Note: If not using OLS or there is no constant, R-squared & F may be < 0.
--------+-------------------------------------------------------------
Variable| Coefficient    Standard Error   t-ratio  P[|T|>t]  Mean of X
--------+-------------------------------------------------------------
Constant|   -7.24078***      1.01411       -7.140    .0000
       K|     .38943**        .16933        2.300    .0228    4.25096
       L|     .19130          .19530         .979    .3289    8.97280
       F|     .41927**        .20077        2.088    .0385    3.39118
       Q|     .45889***       .12402        3.700    .0003    8.26549
      Q2|     .06008***       .00441       13.612    .0000    35.7913
      QK|    -.02954          .02125       -1.390    .1665    35.1677
      QL|    -.00462          .02364        -.195    .8454    74.2063
      QF|     .03416          .02489        1.373    .1720    28.0108
--------+-------------------------------------------------------------
8-47/50
Imposing the Restrictions
Alternatively, compute the restricted regression by converting to price ratios and
imposing the restrictions directly. This is a regression of log(C/Pf) on log(Pk/Pf),
log(Pl/Pf), etc.
----------------------------------------------------------------------
Ordinary     least squares regression ............
LHS=LOGC_PF  Mean                 =        -.31956
             Standard deviation   =        1.54236
             Number of observs.   =            158
Model size   Parameters           =              7
             Degrees of freedom   =            151
Residuals    Sum of squares       =        2.85625  (restricted)
Residuals    Sum of squares       =        2.56313  (unrestricted)
--------+-------------------------------------------------------------
Variable| Coefficient    Standard Error   t-ratio  P[|T|>t]  Mean of X
--------+-------------------------------------------------------------
Constant|   -7.24078***      1.01411       -7.140    .0000
    LOGQ|     .45889***       .12402        3.700    .0003    8.26549
  LOGQSQ|     .06008***       .00441       13.612    .0000    35.7913
LOGPK_PF|     .38943**        .16933        2.300    .0228     .85979
LOGPL_PF|     .19130          .19530         .979    .3289    5.58162
LQ_LPKPF|    -.02954          .02125       -1.390    .1665    7.15696
LQ_LPLPF|    -.00462          .02364        -.195    .8454    46.1956
--------+-------------------------------------------------------------
F Statistic = ((2.85625 - 2.56313)/2) / (2.56313/(158-9)) = 8.52
(This is a problem. The underlying theory requires linear homogeneity in prices.)
8-48/50
Wald Test of the Restrictions
Chi squared = J×F
wald ; fn1 = b_k + b_l + b_f - 1
     ; fn2 = b_qk + b_ql + b_qf - 0 $
-----------------------------------------------------------
WALD procedure. Estimates and standard errors
for nonlinear functions and joint test of
nonlinear restrictions.
Wald Statistic              =       17.03978
Prob. from Chi-squared[ 2]  =         .00020
--------+--------------------------------------------------
Variable| Coefficient    Standard Error   b/St.Er.  P[|Z|>z]
--------+--------------------------------------------------
 Fncn(1)|   -1.87546***       .45621       -4.111     .0000
 Fncn(2)|     .21921***       .05353        4.095     .0000
--------+--------------------------------------------------
Note: ***, **, * = Significance at 1%, 5%, 10% level.
-----------------------------------------------------------
Recall: F Statistic = ((2.85625 - 2.56313)/2) / (2.56313/(158-9)) = 8.52
The chi-squared statistic is J×F. Here, 2F = 17.04 = the chi-squared statistic.
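A quick Python verification (illustrative) of the chi-squared = J×F relationship and its p-value:

from scipy import stats

F = ((2.85625 - 2.56313) / 2) / (2.56313 / (158 - 9))
W = 2 * F                         # chi-squared = J*F with J = 2
print(W, stats.chi2.sf(W, 2))     # about 17.04 and p = .0002, as listed above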
8-49/50
Test of Homotheticity
Cross-product terms = 0
Homothetic production: β7 = β8 = β9 = 0.
With linear homogeneity in prices already imposed, this is β6 = β7 = 0.
-----------------------------------------------------------
WALD procedure. Estimates and standard errors
for nonlinear functions and joint test of
nonlinear restrictions.
Wald Statistic              =        2.57180
Prob. from Chi-squared[ 2]  =         .27640
--------+--------------------------------------------------
Variable| Coefficient    Standard Error   b/St.Er.  P[|Z|>z]
--------+--------------------------------------------------
 Fncn(1)|    -.02954          .02125       -1.390     .1644
 Fncn(2)|    -.00462          .02364        -.195     .8451
--------+--------------------------------------------------
The F test of the two restrictions would produce
F = ((2.90490 - 2.85625)/2) / (2.85625/(158-7)) = 1.286;
note that J×F = 2.572 is the Wald statistic above.
8-50/50