Formulas for the Econometrics Part
Ovidijus Stauskas
Many of the formulas used and needed in this course can be found in the Statistics Formulas sheet. However, here are a few additional relevant formulas that we will use only for econometrics.
1. $b_1 = \bar{y}_n - b_2\bar{x}_n$. Least squares estimator of $B_1$, where $\bar{x}_n$ and $\bar{y}_n$ are the sample averages of the observations on the $x$ and $y$ variables.
2. $b_2 = \frac{\sum_{i=1}^{n}(x_i-\bar{x}_n)(y_i-\bar{y}_n)}{\sum_{i=1}^{n}(x_i-\bar{x}_n)^2} = \frac{\frac{1}{n-1}\sum_{i=1}^{n}(x_i-\bar{x}_n)(y_i-\bar{y}_n)}{\frac{1}{n-1}\sum_{i=1}^{n}(x_i-\bar{x}_n)^2}$. Least squares estimator of $B_2$ (sample covariance divided by sample variance).
3. $b_2 = \frac{\sum_{i=1}^{n}(x_i-\bar{x}_n)(y_i-\bar{y}_n)}{\sum_{i=1}^{n}(x_i-\bar{x}_n)^2} = \frac{\frac{1}{n-1}\sum_{i=1}^{n}(x_i-\bar{x}_n)y_i}{\frac{1}{n-1}\sum_{i=1}^{n}(x_i-\bar{x}_n)^2}$. Alternative (and perhaps surprising!) formula for the estimator of $B_2$ for the simple constant and one variable model (a numerical sketch checking formulas 1–4 follows the list).
4. $\hat{\sigma}^2 = \frac{1}{n-2}\sum_{i=1}^{n}e_i^2$, where $e_i = y_i - b_1 - b_2 x_i$. Estimator of the regression equation variance (i.e. estimator of the variance of $u_i$). This is for a simple model with $B_1$ and $B_2$.
5. $\hat{\sigma}^2 = \frac{1}{n-k}\sum_{i=1}^{n}e_i^2$, where $e_i = y_i - b_1 - b_2 x_{2,i} - \ldots - b_k x_{k,i}$. For the model with $B_1, \ldots, B_k$.
6. $s_{b_1}^2 = \widehat{\mathrm{Var}}(b_1) = \hat{\sigma}^2\,\frac{\sum_{i=1}^{n}x_i^2}{n\sum_{i=1}^{n}(x_i-\bar{x}_n)^2}$: variance estimator for $b_1$ from the simple $B_1, B_2$ model.
7. $s_{b_2}^2 = \widehat{\mathrm{Var}}(b_2) = \frac{\hat{\sigma}^2}{\sum_{i=1}^{n}(x_i-\bar{x}_n)^2}$. Variance estimator for $b_2$ from the simple $B_1, B_2$ model (a sketch checking formulas 5–7 and 14 follows the list).
8. TSS (total sum of squares): $\sum_{i=1}^{n}(y_i-\bar{y}_n)^2$.
9. ESS (explained sum of squares): $\sum_{i=1}^{n}(\hat{y}_i-\bar{y}_n)^2$, where $\hat{y}_i = b_1 + b_2 x_{2,i} + \ldots + b_k x_{k,i}$ (obviously, $k = 2$ in a simple constant and one explanatory variable model).
10. RSS (residual sum of squares): $\sum_{i=1}^{n}e_i^2$, where $e_i$ is defined as above, depending on whether we have the simple 2-variable or the $k$-variable model.
11. TSS = ESS + RSS (an important result showing that TSS can be split into two parts).
12. $R^2 = \frac{\mathrm{ESS}}{\mathrm{TSS}} = \frac{\mathrm{TSS}-\mathrm{RSS}}{\mathrm{TSS}} = 1 - \frac{\mathrm{RSS}}{\mathrm{TSS}}$, where the second equality follows straight from point 11.
13. $\bar{R}^2 = 1 - (1-R^2)\frac{n-1}{n-k}$: adjusted $R^2$.
14. $s_{b_j}^2 = \widehat{\mathrm{Var}}(b_j) = \frac{\hat{\sigma}^2}{\sum_{i=1}^{n}(x_{j,i}-\bar{x}_{j,n})^2}\cdot\frac{1}{1-R_j^2}$, where $R_j^2$ is the $R^2$ from the regression where $x_{j,i}$ is the dependent variable and the rest of the variables are used as predictors.
15. $F = \frac{(\mathrm{RSS}_r - \mathrm{RSS}_u)/m}{\mathrm{RSS}_u/(n-k)}$. This is the F statistic for joint hypotheses. Here, $r$ stands for "restricted model" and $u$ stands for "unrestricted model". Also, $m$ is the number of restrictions; $n$ and $k$ stand for the sample size and the number of coefficients in the unrestricted model.
16. $F = \frac{(R_u^2 - R_r^2)/m}{(1-R_u^2)/(n-k)}$: an alternative formula for the F statistic. The letters have the same meaning as in the original formulation.
17. $F = \frac{R_u^2/m}{(1-R_u^2)/(n-k)}$: a special case of the F statistic when the restricted model involves no explanatory variables (predictors) at all (a sketch checking formulas 8–17 follows the list).
18. $E(Y_t) = \frac{\delta}{1-\rho}$: expected value of the AR(1) process, where $\delta$ is an intercept and $\rho$ is the autoregressive coefficient.
19. $\mathrm{Var}(Y_t) = \frac{\sigma^2}{1-\rho^2}$: variance of the AR(1) process, where $\sigma^2$ is the variance of the error term.
20. $\mathrm{Cov}(Y_t, Y_{t-s}) = \rho^s\frac{\sigma^2}{1-\rho^2}$: covariance between two different time points in the AR(1) process, where the "time step" (or difference in time) is $s$ (a sketch checking formulas 18–20 follows the list).
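Numerical sketch for formulas 1–4. This is a minimal illustration, not part of the original sheet: the data-generating values, variable names, and use of Python/NumPy are my own assumptions. The two forms of $b_2$ (formulas 2 and 3) agree because $\sum_{i=1}^{n}(x_i-\bar{x}_n)\bar{y}_n = \bar{y}_n\sum_{i=1}^{n}(x_i-\bar{x}_n) = 0$, so the $\bar{y}_n$ part of the numerator in formula 2 drops out.

```python
# Illustrative sketch only: simulate y_i = B1 + B2*x_i + u_i and evaluate
# formulas 1-4 from the sheet on the simulated sample.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.normal(loc=5.0, scale=2.0, size=n)
y = 2.0 + 0.7 * x + rng.normal(scale=1.5, size=n)    # true B1 = 2, B2 = 0.7

x_bar, y_bar = x.mean(), y.mean()

# Formula 2: sample covariance divided by sample variance
b2 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)

# Formula 3: alternative form with y_i (not y_i - y_bar) in the numerator
b2_alt = np.sum((x - x_bar) * y) / np.sum((x - x_bar) ** 2)

# Formula 1: b1 = y_bar - b2 * x_bar
b1 = y_bar - b2 * x_bar

# Formula 4: regression variance estimator for the simple two-variable model
e = y - b1 - b2 * x
sigma2_hat = np.sum(e ** 2) / (n - 2)

assert np.isclose(b2, b2_alt)                        # the two b2 formulas agree
print(b1, b2, sigma2_hat)
```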
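Numerical sketch for formulas 5–7 and 14. Again an illustration under assumed simulated data: the variance estimators are compared with the diagonal of $\hat{\sigma}^2(X'X)^{-1}$, the matrix form of the same quantities, which is a standard identity rather than a formula from the sheet.

```python
# Illustrative sketch only: check formulas 6 and 7 (simple model) and
# formulas 5 and 14 (k-variable model) against sigma_hat^2 * (X'X)^{-1}.
import numpy as np

rng = np.random.default_rng(1)
n = 500

# Simple constant + one regressor model (formulas 6 and 7)
x = rng.normal(3.0, 1.5, size=n)
y = 1.0 + 0.5 * x + rng.normal(scale=2.0, size=n)
X = np.column_stack([np.ones(n), x])
b = np.linalg.solve(X.T @ X, X.T @ y)                 # OLS coefficients
e = y - X @ b
sigma2_hat = e @ e / (n - 2)                          # formula 4
s2_b1 = sigma2_hat * np.sum(x**2) / (n * np.sum((x - x.mean())**2))  # formula 6
s2_b2 = sigma2_hat / np.sum((x - x.mean())**2)                       # formula 7
cov_b = sigma2_hat * np.linalg.inv(X.T @ X)
assert np.isclose(s2_b1, cov_b[0, 0]) and np.isclose(s2_b2, cov_b[1, 1])

# k-variable model (formulas 5 and 14); j indexes the regressor x2 here
x2 = rng.normal(size=n)
x3 = 0.6 * x2 + rng.normal(size=n)                    # deliberately correlated
y = 1.0 + 0.5 * x2 - 0.3 * x3 + rng.normal(size=n)
X = np.column_stack([np.ones(n), x2, x3])
k = X.shape[1]
b = np.linalg.solve(X.T @ X, X.T @ y)
e = y - X @ b
sigma2_hat = e @ e / (n - k)                          # formula 5

# R_j^2: regress x2 on the remaining regressors (constant and x3)
Z = np.column_stack([np.ones(n), x3])
g = np.linalg.solve(Z.T @ Z, Z.T @ x2)
resid_j = x2 - Z @ g
R2_j = 1.0 - resid_j @ resid_j / np.sum((x2 - x2.mean())**2)

s2_bj = sigma2_hat / np.sum((x2 - x2.mean())**2) / (1.0 - R2_j)      # formula 14
cov_b = sigma2_hat * np.linalg.inv(X.T @ X)
assert np.isclose(s2_bj, cov_b[1, 1])
print(s2_b1, s2_b2, s2_bj)
```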
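Numerical sketch for formulas 8–17. An illustration with simulated data and my own helper function; it verifies TSS = ESS + RSS and computes $R^2$, adjusted $R^2$, and the F statistic both in the RSS form (formula 15) and in the $R^2$ form (formula 16), which agree because $\mathrm{RSS} = \mathrm{TSS}(1-R^2)$ with the same TSS in both models.

```python
# Illustrative sketch only: TSS/ESS/RSS, R^2, adjusted R^2, and the F tests
# of formulas 15-17, computed on a simulated sample.
import numpy as np

rng = np.random.default_rng(2)
n = 300
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
y = 1.0 + 0.8 * x2 + rng.normal(size=n)               # x3 is truly irrelevant

def rss_and_r2(y, X):
    """OLS fit of y on X (X includes the constant); return RSS and R^2."""
    b = np.linalg.solve(X.T @ X, X.T @ y)
    y_hat = X @ b
    rss = np.sum((y - y_hat) ** 2)                    # formula 10
    tss = np.sum((y - y.mean()) ** 2)                 # formula 8
    ess = np.sum((y_hat - y.mean()) ** 2)             # formula 9
    assert np.isclose(tss, ess + rss)                 # formula 11
    return rss, 1.0 - rss / tss                       # formula 12

X_u = np.column_stack([np.ones(n), x2, x3])           # unrestricted model
X_r = np.column_stack([np.ones(n), x2])               # restricted: drops x3
k = X_u.shape[1]
m = 1                                                 # one restriction: coefficient on x3 is 0

rss_u, r2_u = rss_and_r2(y, X_u)
rss_r, r2_r = rss_and_r2(y, X_r)

r2_adj = 1.0 - (1.0 - r2_u) * (n - 1) / (n - k)       # formula 13

F_rss = ((rss_r - rss_u) / m) / (rss_u / (n - k))     # formula 15
F_r2 = ((r2_u - r2_r) / m) / ((1.0 - r2_u) / (n - k)) # formula 16
assert np.isclose(F_rss, F_r2)

# Formula 17: the restricted model keeps only the constant, so R^2_r = 0
F_overall = (r2_u / (k - 1)) / ((1.0 - r2_u) / (n - k))
print(r2_u, r2_adj, F_rss, F_overall)
```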
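Numerical sketch for formulas 18–20. An illustration with an assumed parameterisation of the AR(1) process $Y_t = \delta + \rho Y_{t-1} + \varepsilon_t$: the sample mean, variance, and lag-$s$ autocovariance of a long simulated path are compared with the theoretical values.

```python
# Illustrative sketch only: simulate a stationary AR(1) path and compare its
# sample moments with formulas 18-20.
import numpy as np

rng = np.random.default_rng(3)
delta, rho, sigma = 1.0, 0.6, 0.5                     # assumed parameter values
T, s = 200_000, 3

y = np.empty(T)
y[0] = delta / (1 - rho)                              # start at the stationary mean
eps = rng.normal(scale=sigma, size=T)
for t in range(1, T):
    y[t] = delta + rho * y[t - 1] + eps[t]

mean_theory = delta / (1 - rho)                       # formula 18
var_theory = sigma**2 / (1 - rho**2)                  # formula 19
cov_theory = rho**s * sigma**2 / (1 - rho**2)         # formula 20

cov_sample = np.cov(y[:-s], y[s:])[0, 1]              # sample lag-s autocovariance
print(y.mean(), mean_theory)
print(y.var(ddof=1), var_theory)
print(cov_sample, cov_theory)
```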