Partial Regression and Partial Correlation

Econometrics I
Professor William Greene
Stern School of Business
Department of Economics
I have a simple question for you. Yesterday, I was estimating a regional production function with yearly dummies. The coefficients of the dummies are usually interpreted as a measure of technical change with respect to the base year (the excluded dummy variable). However, I felt it could be more interesting to redefine the dummy variables so that each coefficient measures technical change from one year to the next. You could get the same result by subtracting two coefficients in the original regression, but you would have to compute the standard error of the difference if you want to do inference.
This is the code I used in LIMDEP. My question: is this a well-known procedure?
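(The LIMDEP code is not reproduced here. Below is a minimal sketch, in Python with simulated data and invented coefficients, of the inference point the question raises: the standard error of a difference of two coefficients needs the covariance of the two estimates, not just their variances.)

```python
import numpy as np

# Hypothetical illustration: s.e. of a difference of two OLS coefficients.
# y = Xb + e; estimated Var(b) = s^2 (X'X)^{-1}.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.standard_normal((n, 2))])
y = X @ np.array([1.0, 0.5, 0.8]) + rng.standard_normal(n)

b, *_ = np.linalg.lstsq(X, y, rcond=None)
e = y - X @ b
s2 = e @ e / (n - X.shape[1])         # s^2 = e'e / (n - K)
V = s2 * np.linalg.inv(X.T @ X)       # estimated covariance matrix of b

j, k = 1, 2
diff = b[j] - b[k]
se_diff = np.sqrt(V[j, j] + V[k, k] - 2 * V[j, k])  # covariance term matters
print(diff, se_diff, diff / se_diff)  # estimate, s.e., t-ratio
```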
Frisch-Waugh (1933) Theorem
Context: The model contains two sets of variables:
$$X = [\,(1, \text{time}) : (\text{other variables})\,] = [X_1\ X_2]$$
Regression model:
$$y = X_1\beta_1 + X_2\beta_2 + \varepsilon \quad \text{(population)}$$
$$\;\;= X_1 b_1 + X_2 b_2 + e \quad \text{(sample)}$$
Problem: Find an algebraic expression for the second set of least squares coefficients, $b_2$.
Partitioned Solution
Method of solution (Why did F&W care? In 1933, matrix
computation was not trivial!)
Direct manipulation of the normal equations produces
$$(X'X)b = X'y$$
$$X = [X_1, X_2] \quad\text{so}\quad X'X = \begin{bmatrix} X_1'X_1 & X_1'X_2 \\ X_2'X_1 & X_2'X_2 \end{bmatrix} \quad\text{and}\quad X'y = \begin{bmatrix} X_1'y \\ X_2'y \end{bmatrix}$$
$$(X'X)b = \begin{bmatrix} X_1'X_1 & X_1'X_2 \\ X_2'X_1 & X_2'X_2 \end{bmatrix} \begin{bmatrix} b_1 \\ b_2 \end{bmatrix} = \begin{bmatrix} X_1'y \\ X_2'y \end{bmatrix}$$
$$X_1'X_1 b_1 + X_1'X_2 b_2 = X_1'y$$
$$X_2'X_1 b_1 + X_2'X_2 b_2 = X_2'y \;\Longrightarrow\; X_2'X_2 b_2 = X_2'y - X_2'X_1 b_1 = X_2'(y - X_1 b_1)$$
Probit = -1(%) + 5
Partitioned Solution
Direct manipulation of the normal equations produces
$$b_2 = (X_2'X_2)^{-1}X_2'(y - X_1 b_1)$$
What is this? A regression of $(y - X_1 b_1)$ on $X_2$.
If we knew $b_1$, this would be the solution for $b_2$.
An important result (perhaps not fundamental). Note the result if $X_2'X_1 = 0$.
Useful in theory? Probably. Useful in practice? Not at all.
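A minimal numeric check of this result, assuming only simulated data and numpy: plug the $b_1$ from the full regression into the expression and recover the $b_2$ from the same regression.

```python
import numpy as np

# Verifying b2 = (X2'X2)^{-1} X2'(y - X1 b1), using the b1 from the full
# regression (the point: if b1 were known, b2 would follow by simple OLS).
rng = np.random.default_rng(7)
n = 30
X1 = rng.standard_normal((n, 2))
X2 = rng.standard_normal((n, 2))
y = X1 @ [1.0, 2.0] + X2 @ [3.0, -1.0] + rng.standard_normal(n)

b = np.linalg.lstsq(np.column_stack([X1, X2]), y, rcond=None)[0]
b1, b2 = b[:2], b[2:]
b2_check = np.linalg.solve(X2.T @ X2, X2.T @ (y - X1 @ b1))
print(np.allclose(b2, b2_check))   # True
```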
Partitioned Inverse
Use of the partitioned inverse result produces a fundamental result: What is the southeast element in the inverse of the moment matrix?
$$\begin{bmatrix} X_1'X_1 & X_1'X_2 \\ X_2'X_1 & X_2'X_2 \end{bmatrix}^{-1}$$
Partitioned Inverse
The algebraic result is:
$$[\,\cdot\,]^{-1}_{(2,2)} = \{X_2'X_2 - X_2'X_1(X_1'X_1)^{-1}X_1'X_2\}^{-1} = [X_2'(I - X_1(X_1'X_1)^{-1}X_1')X_2]^{-1} = [X_2'M_1X_2]^{-1}$$
Note the appearance of an "M" matrix. How do we interpret this result?
- Note the implication for the case in which $X_1$ is a single variable. (Theorem, p. 34)
- Note the implication for the case in which $X_1$ is the constant term. (p. 35)
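A quick numeric check of the partitioned inverse claim, with simulated data (dimensions and names are arbitrary): the southeast block of $(X'X)^{-1}$ matches $[X_2'M_1X_2]^{-1}$ computed directly.

```python
import numpy as np

# Southeast block of (X'X)^{-1} equals [X2'M1X2]^{-1}.
rng = np.random.default_rng(6)
n = 40
X1 = rng.standard_normal((n, 2))
X2 = rng.standard_normal((n, 2))
X = np.column_stack([X1, X2])

full_inv = np.linalg.inv(X.T @ X)
southeast = full_inv[2:, 2:]

M1 = np.eye(n) - X1 @ np.linalg.inv(X1.T @ X1) @ X1.T  # residual maker
direct = np.linalg.inv(X2.T @ M1 @ X2)
print(np.allclose(southeast, direct))   # True
```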
Frisch-Waugh Result
Continuing the algebraic manipulation:
$$b_2 = [X_2'M_1X_2]^{-1}[X_2'M_1y]$$
This is Frisch and Waugh's famous result: the "double residual regression."
How do we interpret this? A regression of residuals on residuals.
"We get the same result whether we (1) detrend the other variables by using the residuals from a regression of them on a constant and a time trend and use the detrended data in the regression, or (2) just include a constant and a time trend in the regression and do not detrend the data."
"Detrend the data" means compute the residuals from the regressions of the variables on a constant and a time trend.
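A minimal sketch of the double residual regression with simulated data (numpy, invented coefficients): route (1), detrending with $M_1$, and route (2), the full regression, give the same slopes on $X_2$.

```python
import numpy as np

# Frisch-Waugh check: X1 plays the role of [constant, time trend];
# X2 holds the other regressors. Data are simulated.
rng = np.random.default_rng(1)
n = 100
t = np.arange(n, dtype=float)
X1 = np.column_stack([np.ones(n), t])
X2 = rng.standard_normal((n, 2))
y = X1 @ [2.0, 0.05] + X2 @ [1.0, -0.5] + rng.standard_normal(n)

# (1) Full regression of y on [X1, X2]: keep the last two coefficients.
b_full, *_ = np.linalg.lstsq(np.column_stack([X1, X2]), y, rcond=None)

# (2) Double residual regression: detrend X2 and y with M1, then regress.
M1 = np.eye(n) - X1 @ np.linalg.inv(X1.T @ X1) @ X1.T
b_fw, *_ = np.linalg.lstsq(M1 @ X2, M1 @ y, rcond=None)

print(b_full[2:])   # slopes on X2 from the full regression
print(b_fw)         # identical up to rounding
```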
Important Implications
- Isolating a single coefficient in a regression (Corollary 3.2.1, p. 34): the double residual regression.
- Regression of residuals on residuals: 'partialling' out the effect of the other variables.
- It is not necessary to 'partial' the other Xs out of $y$ because $M_1$ is idempotent. (This is a very useful result; see the sketch after this list.) I.e., $X_2'M_1'M_1y = X_2'M_1y$.
- (Orthogonal regression) Suppose $X_1$ and $X_2$ are orthogonal: $X_1'X_2 = 0$. What is $M_1X_2$?
Applying Frisch-Waugh
Using gasoline data from Notes 3.
X = [1, year, PG, Y], y = G as before.
Full least squares regression of y on X.
Detrending the Variables - Pg
Regression of Detrended G on Detrended PG and Detrended Y
Partial Regression
Important terms in this context:
- Partialling out the effect of $X_1$.
- Netting out the effect ...
- "Partial regression coefficients."
To continue belaboring the point: note the interpretation of partial regression as "net of the effect of ...".
Now follow this through for the case in which $X_1$ is just a constant term, a column of ones.
What are the residuals in a regression on a constant? What is $M_1$?
Note that this produces the result that we can do linear regression on data in mean deviation form.
'Partial regression coefficients' are the same as 'multiple regression coefficients.' This follows from the Frisch-Waugh theorem.
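A sketch of the constant-term case with simulated data: when $X_1$ is a column of ones, $M_1$ subtracts means, so regression on data in mean deviation form reproduces the slopes from a regression that includes a constant.

```python
import numpy as np

# With X1 = column of ones, M1 y = y - ybar: mean deviation form.
rng = np.random.default_rng(3)
n = 50
X = rng.standard_normal((n, 2))
y = 4.0 + X @ [1.5, -2.0] + rng.standard_normal(n)

# With a constant:
b = np.linalg.lstsq(np.column_stack([np.ones(n), X]), y, rcond=None)[0]

# In mean deviation form (no constant needed):
bd = np.linalg.lstsq(X - X.mean(axis=0), y - y.mean(), rcond=None)[0]
print(b[1:], bd)   # identical slopes
```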
Partial Correlation
Working definition: correlation between sets of residuals.
Some results on computation, based on the M matrices.
Some important considerations:
Partial correlations and coefficients can have signs and magnitudes that differ greatly from gross correlations and simple regression coefficients.
Compare the simple (gross) correlation of G and PG with the partial correlation, net of the time effect. (Could you have predicted the negative partial correlation?)
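Spelling out the working definition in the notation above: with $M_1$ built from the variables being partialled out, the partial correlation of $y$ and $x_2$ is just the simple correlation of the two residual vectors $M_1y$ and $M_1x_2$ (the residuals have mean zero when $X_1$ contains a constant):
$$r_{y2\cdot 1} = \frac{(M_1y)'(M_1x_2)}{\sqrt{(M_1y)'(M_1y)\,(M_1x_2)'(M_1x_2)}} = \frac{x_2'M_1y}{\sqrt{(y'M_1y)(x_2'M_1x_2)}}$$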
CALC;list;Cor(g,pg)$
Result = .7696572
CALC;list;cor(gstar,pgstar)$
Result = -.6589938
[Scatter plot of PG against G]
Partial Correlation
THE Application of Frisch-Waugh
The Fixed Effects Model
A regression model with a dummy variable for each individual in the sample, each observed $T_i$ times.
$$y_i = X_i\beta + d_i\alpha_i + \varepsilon_i \quad \text{for each individual}$$
$$\begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_N \end{bmatrix} = \begin{bmatrix} X_1 & d_1 & 0 & \cdots & 0 \\ X_2 & 0 & d_2 & \cdots & 0 \\ \vdots & & & \ddots & \\ X_N & 0 & 0 & \cdots & d_N \end{bmatrix} \begin{bmatrix} \beta \\ \alpha \end{bmatrix} + \varepsilon = [X, D]\begin{bmatrix} \beta \\ \alpha \end{bmatrix} + \varepsilon = Z\delta + \varepsilon$$
The dummy-variable block $D$ has $N$ columns. $N$ may be thousands; i.e., the regression has thousands of variables (coefficients).
Estimating the Fixed Effects Model
The FEM is a linear regression model, but with many independent variables:
$$\begin{bmatrix} b \\ a \end{bmatrix} = \begin{bmatrix} X'X & X'D \\ D'X & D'D \end{bmatrix}^{-1} \begin{bmatrix} X'y \\ D'y \end{bmatrix}$$
Using the Frisch-Waugh theorem,
$$b = [X'M_DX]^{-1}[X'M_Dy]$$
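A sketch of why the theorem matters here, using simulated panel data: applying $M_D$ is just demeaning within groups, so $b$ can be computed without ever building the $N$ dummy columns. The helper function below is illustrative, not from the course materials.

```python
import numpy as np

# Within estimator via Frisch-Waugh: demean by group instead of building
# N dummy variables. 'ids' marks the individual each row belongs to.
rng = np.random.default_rng(4)
N, T, K = 500, 5, 2
ids = np.repeat(np.arange(N), T)
alpha = rng.standard_normal(N)                 # individual effects
X = rng.standard_normal((N * T, K))
y = X @ [1.0, -0.5] + alpha[ids] + rng.standard_normal(N * T)

def demean_within(A, ids):
    """Subtract each group's mean from its rows (this computes M_D @ A)."""
    A = np.atleast_2d(A.T).T                   # ensure a column view
    sums = np.zeros((ids.max() + 1, A.shape[1]))
    np.add.at(sums, ids, A)                    # group sums
    counts = np.bincount(ids).reshape(-1, 1)   # group sizes T_i
    return A - (sums / counts)[ids]

b = np.linalg.lstsq(demean_within(X, ids),
                    demean_within(y, ids), rcond=None)[0]
print(b.ravel())   # close to [1.0, -0.5], with no dummies constructed
```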
Application – Health and Income
German Health Care Usage Data: 7,293 individuals, varying numbers of periods.
Data downloaded from the Journal of Applied Econometrics Archive. This is an unbalanced panel with 7,293 individuals and 27,326 observations altogether. The number of observations ranges from 1 to 7 per family (frequencies: 1=1525, 2=2158, 3=825, 4=926, 5=1051, 6=1000, 7=987). The dependent variable of interest is
DOCVIS = number of visits to the doctor in the observation period
HHNINC = household nominal monthly net income in German marks / 10000 (4 observations with income = 0 were dropped)
HHKIDS = 1 if children under age 16 in the household; 0 otherwise
EDUC = years of schooling
AGE = age in years
We also wish to include a separate family effect (7,293 of them), one for each family. This requires 7,293 dummy variables in addition to the four regressors.
Fixed Effects Estimator (cont.)
$$M_D = \begin{bmatrix} M_D^1 & 0 & \cdots & 0 \\ 0 & M_D^2 & \cdots & 0 \\ \vdots & & \ddots & \\ 0 & 0 & \cdots & M_D^N \end{bmatrix} \quad \text{(the dummy variables are orthogonal)}$$
$$M_D^i = I_{T_i} - d_i(d_i'd_i)^{-1}d_i' = I_{T_i} - (1/T_i)d_id_i' = \begin{bmatrix} 1-\tfrac{1}{T_i} & -\tfrac{1}{T_i} & \cdots & -\tfrac{1}{T_i} \\ -\tfrac{1}{T_i} & 1-\tfrac{1}{T_i} & \cdots & -\tfrac{1}{T_i} \\ \vdots & & \ddots & \\ -\tfrac{1}{T_i} & -\tfrac{1}{T_i} & \cdots & 1-\tfrac{1}{T_i} \end{bmatrix}$$
‘Within’ Transformations
XMD X = Ni=1 XiMDi X i ,
XMD y =  XiM y i ,
N
i=1
4-22/24
i
D


XM y 
X iMDi X i
i
i
D
i k
k,l
T
i
 t=1
(x it,k -x i.,k )(x it,l -x i.,l )
T
i
 t=1
(x it,k -x i.,k )(y it -y i. )
Least Squares Dummy Variable Estimator
- b is obtained by 'within'-groups least squares (group mean deviations).
- Normal equations for a: $D'Xb + D'Da = D'y$, so $a = (D'D)^{-1}D'(y - Xb)$.
- $a_i = \frac{1}{T_i}\sum_{t=1}^{T_i}(y_{it} - x_{it}'b) = \bar{e}_i$ (see the sketch after this list).
Fixed Effects Regression
Time Invariant Variable
f = a variable that is the same in every period (e.g., FEMALE).
$$f'M_Df = \sum_{i=1}^N f_i'M_D^if_i = \sum_{i=1}^N\sum_{t=1}^{T_i}(f_{it}-\bar{f}_{i.})^2$$
but $f_{it} = f_i$, so $f_{it} = \bar{f}_{i.}$ and $(f_{it}-\bar{f}_{i.}) = 0$.
This is a simple case of multicollinearity: $f$ is a linear combination of the columns of $D$, $f = D(f_1, f_2, \ldots, f_N)'$.
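A tiny simulated check of this point: the within transformation maps any time-invariant column to exactly zero.

```python
import numpy as np

# A time-invariant column (e.g. FEMALE) is annihilated by demeaning
# within each individual: M_D f = 0.
ids = np.repeat(np.arange(3), 4)            # 3 individuals, 4 periods each
f = np.array([1.0, 0.0, 1.0])[ids]          # same value in every period
f_within = f - np.array([f[ids == i].mean() for i in range(3)])[ids]
print(f_within)                             # all zeros
```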
In the millennial edition of its World Health Report, in 2000, the World Health Organization published a study that compared the successes of the health care systems of 191 countries. The results notoriously ranked the United States a dismal 37th, between Costa Rica and Slovenia. The study was widely misrepresented, universally misunderstood, and was, in fact, unhelpful in understanding the different outcomes across countries. Nonetheless, the result remains controversial a decade later, as policy makers argue about why the world's most expensive health care system isn't the world's best.
The COMPOSITE Index
Equation
$$\log\text{COMP}_i = \text{Maximum Attainable}_i - \text{Inefficiency}_i = \alpha + \beta_1\log\text{HealthExp}_i + \beta_2\log\text{Educ}_i + \beta_3(\log\text{Educ}_i)^2 - u_i$$
$$i = 1,\ldots,191 \text{ countries}$$
Estimated Model
[Table of coefficient estimates for β₁, β₂, β₃, and α]
Implications of the results: increases in health expenditure and increases in education are both associated with increases in health outcomes. These are the policy levers in the analysis!
WHO Data