Multiple Regression

Economic relationships tend to be more complicated than the basic bi-variate model
discussed so far, as there is usually more than a single influence on the dependent
variable. Such a model can take the following form:
yt   0  1 x1t   2 x2t   3 x3t .......   k xkt  ut
In theory the only limit on the number of explanatory variables is the need to have
positive degrees of freedom (i.e. n > k). However, in practice small models
(parsimonious models) are often the best. The parameter estimates are derived in the
same way as in the bi-variate case, in so far as we wish to minimise the sum of the
squared error terms over all the observations. This gives:
yt   0  1 x1t   2 x 2t  ut
ut  yt   0  1 x1t   2 x 2t
This follows the bi-variate case, but the same principle applies to more complicated
models. We then need to minimise S as before, where:
s   z 2 ; z  yt   0  1 x1t   2 x2t
The first-order conditions are:
∂S/∂α0 = −2 Σt (yt − α0 − α1x1t − α2x2t) = 0
∂S/∂α1 = −2 Σt x1t(yt − α0 − α1x1t − α2x2t) = 0
∂S/∂α2 = −2 Σt x2t(yt − α0 − α1x1t − α2x2t) = 0
where each sum runs over t = 1, …, n.
These can then be rearranged to produce expressions for the constant and slope
parameters.
 0  y  1 x1   2 x2
The expression for α1 can then be produced:
1 
cov( x1 , y ) var( x 2 )  cov( x 2 , y ) cov( x1 , x 2 )
var( x1 ) var( x 2 )  {cov( x1 , x 2 )}2
(You do not need to know how to derive this; just be aware of the effect of the
covariance between the explanatory variables.)
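The covariance expression above can be checked numerically. The following sketch generates an invented two-regressor data set (all coefficients and sample sizes are illustrative assumptions, not from the text) and confirms that the covariance formula for α1 matches the slope from an ordinary least squares fit:

```python
import numpy as np

# Hypothetical data: y depends on two correlated regressors (illustrative values only).
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)      # x1 and x2 are deliberately correlated
y = 1.0 + 2.0 * x1 - 1.5 * x2 + rng.normal(size=n)

# Sample covariance matrix: rows/columns are x1, x2, y.
cov = np.cov(np.vstack([x1, x2, y]))
var_x1, var_x2 = cov[0, 0], cov[1, 1]
cov_x1x2, cov_x1y, cov_x2y = cov[0, 1], cov[0, 2], cov[1, 2]

# Slope on x1 from the covariance formula in the text.
a1 = (cov_x1y * var_x2 - cov_x2y * cov_x1x2) / (var_x1 * var_x2 - cov_x1x2**2)

# Cross-check against OLS on [1, x1, x2].
X = np.column_stack([np.ones(n), x1, x2])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(a1, beta[1])   # the two estimates of the x1 slope agree
```

Note that when cov(x1, x2) = 0 the formula collapses to the simple bi-variate slope cov(x1, y)/var(x1); the correction terms exist precisely because the regressors co-move.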
The F-test
This test examines the joint significance of all the variables, or a group of
variables, in a multiple regression. It can also be used to test a specific restriction or
to test for structural breaks (the Chow test). The most basic F-test is for the significance of
the goodness of fit of a regression:
F = [R²/(k − 1)] / [(1 − R²)/(n − k)]
This test has a null hypothesis that the slope coefficients on the explanatory
variables are jointly equal to 0. It follows the F-distribution with degrees of
freedom equal to: F(k − 1, n − k)
To find the critical value for the above degrees of freedom, you need to look along
the top row of the tables until you find the value for k-1, then down the first column
until you find the value for n-k. The critical value can be found where these values
intersect in the table. We can also use the F-test to test for the joint significance of a
group of explanatory variables. If we take a model with three explanatory variables
and then decide to test for the joint significance of two of them:
yt   0  1 xt   2 pt   3 rt  ut
If we decide to test for the joint significance of prices pt and the real interest rate rt,
we need to run two separate regressions: one with all the explanatory variables in
(the unrestricted model) and one with just the single variable in (the restricted model). In
effect, in the latter model the two variables pt and rt are restricted to being jointly equal
to 0. Conducting the test involves the following steps, where the hypothesis is:
H0 :  2  3  0
H1 :  2   3  0
1) Run a regression of the full model as above and collect the RSS (RSS_U).
2) Run a second model as follows:
yt   0  1 xt  ut
Again collect the RSS (RSS_R), then put these values into the following formula,
where n is the number of observations and k is the number of parameters in the
unrestricted model:
F = [(RSS_R − RSS_U)/m] / [RSS_U/(n − k)]
3) Find the critical value given that the degrees of freedom are F(m, n − k) and draw the
conclusion (m is the number of restrictions, 2 in the example above).
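The three steps above can be sketched numerically. This example invents a data set in which pt and rt are truly irrelevant (all names and values are illustrative assumptions), fits the unrestricted and restricted models, and forms the F statistic from the two residual sums of squares:

```python
import numpy as np
from scipy.stats import f

# Hypothetical data (illustrative): the true model excludes p and r.
rng = np.random.default_rng(1)
n = 100
x = rng.normal(size=n)
p = rng.normal(size=n)
r = rng.normal(size=n)
y = 0.5 + 1.2 * x + rng.normal(size=n)

def rss(X, y):
    """Residual sum of squares from an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid

ones = np.ones(n)
rss_u = rss(np.column_stack([ones, x, p, r]), y)   # step 1: unrestricted model
rss_r = rss(np.column_stack([ones, x]), y)         # step 2: restricted (alpha2 = alpha3 = 0)

m, k = 2, 4   # m restrictions; k parameters in the unrestricted model
F_stat = ((rss_r - rss_u) / m) / (rss_u / (n - k))
F_crit = f.ppf(0.95, m, n - k)                     # step 3: compare to F(m, n-k)
print(F_stat, F_crit)
```

Because the restricted model is nested in the unrestricted one, RSS_R can never fall below RSS_U; the test asks whether the increase in RSS from imposing the restrictions is large relative to the unrestricted fit.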