MATH3424 Regression Analysis
Assignment 1
1. Using the following summary statistics, where each sum runs over i = 1, . . . , 20,

   n = 20,
   ∑ xi1 = 114,         ∑ xi2 = −136,         ∑ yi = 222,
   ∑ xi1² = 860,        ∑ xi2² = 1228,        ∑ yi² = 2950,
   ∑ xi1 xi2 = −1025,   ∑ xi1 yi = 1537,      ∑ xi2 yi = −1824,
   Sx1 x1 = 210.2,      Sx1 x2 = −249.8,      Sx2 x2 = 303.2,
   Sx1 y = 271.6,       Sx2 y = −314.4,       Syy = 485.8,

   and

   ( 210.2    −249.8 )⁻¹      ( 0.227525   0.187453 )
   ( −249.8    303.2 )     =  ( 0.187453   0.157737 ),

   fit the following regression model of y on x1 and x2:
   yi = β0 + β1 xi1 + β2 xi2 + ei,     ei ∼ N(0, σ²).
(a) Assume that β0 = 2.
i. Find the least squares estimates of the unknown parameters β1 and β2 . Then,
write down the fitted line.
ii. Find the Residual Sum of Squares and the unbiased estimate of the unknown
parameter σ². No need to show that it is unbiased.
(b) Assume that 2β1 = β2 .
i. Find the least squares estimates of the unknown parameters β0 and β1 . Then,
write down the fitted line.
ii. Find the Residual Sum of Squares and the unbiased estimate of the unknown
parameter σ². No need to show that it is unbiased.
iii. Test H0 : β1 = 2 by a t-test at a significance level of α = 0.05. Write down your test
statistic, critical value, and conclusion clearly.
(c) Assume that β0 , β1 and β2 are unknown.
i. Find the least squares estimates of the unknown parameters β0 , β1 and β2 . Then,
write down the fitted line.
ii. Find the Residual Sum of Squares and the unbiased estimate of the unknown parameter σ². No need to show that it is unbiased.
iii. Test the assumption in part (b) that H0 : 2β1 = β2 against the alternative hypothesis H1 : 2β1 ≠ β2 at the significance level of α = 0.05 by a t-test. Write
down the test statistic, the critical value and your conclusion clearly.
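As a sanity check (not required by the assignment), the summary statistics above can be verified with a short NumPy script before starting the hand calculations; the sketch below recomputes the corrected sums of squares and the quoted 2×2 inverse from the raw totals given in the problem.

    # Sanity check of the summary statistics in Problem 1 (not part of the assignment).
    import numpy as np

    n = 20
    sx1, sx2, sy = 114.0, -136.0, 222.0
    sx1x1, sx2x2, syy = 860.0, 1228.0, 2950.0
    sx1x2, sx1y, sx2y = -1025.0, 1537.0, -1824.0

    # Corrected sums of squares, e.g. Sx1x1 = sum(x1^2) - (sum(x1))^2 / n.
    Sx1x1 = sx1x1 - sx1 ** 2 / n
    Sx1x2 = sx1x2 - sx1 * sx2 / n
    Sx2x2 = sx2x2 - sx2 ** 2 / n
    Sx1y = sx1y - sx1 * sy / n
    Sx2y = sx2y - sx2 * sy / n
    Syy = syy - sy ** 2 / n
    print(Sx1x1, Sx1x2, Sx2x2, Sx1y, Sx2y, Syy)   # 210.2, -249.8, 303.2, 271.6, -314.4, 485.8

    # The quoted inverse of the centered cross-product matrix.
    S = np.array([[Sx1x1, Sx1x2], [Sx1x2, Sx2x2]])
    print(np.linalg.inv(S))   # approx [[0.227525, 0.187453], [0.187453, 0.157737]]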
2. Consider a situation in which the regression data set is divided into two parts as follows.
      x          y
      x1         y1
      x2         y2
      ...        ...
      xn1        yn1
      xn1+1      yn1+1
      ...        ...
      xn1+n2     yn1+n2
The model is given by

   yi = β0^(1) + β1 xi + ei     for i = 1, . . . , n1,
      = β0^(2) + β1 xi + ei     for i = n1 + 1, . . . , n1 + n2.
In other words, there are two regression lines with a common slope. Using the centered
model,
   yi = β0^(1)* + β1 (xi − x̄1) + ei     for i = 1, . . . , n1,
      = β0^(2)* + β1 (xi − x̄2) + ei     for i = n1 + 1, . . . , n1 + n2,

where x̄1 = (1/n1) ∑_{i=1}^{n1} xi and x̄2 = (1/n2) ∑_{i=n1+1}^{n1+n2} xi.
Show that the least squares estimate of β1 is given by
   β̂1 = [ ∑_{i=1}^{n1} (xi − x̄1) yi + ∑_{i=n1+1}^{n1+n2} (xi − x̄2) yi ] / [ ∑_{i=1}^{n1} (xi − x̄1)² + ∑_{i=n1+1}^{n1+n2} (xi − x̄2)² ].
Hint: Write the model in matrix form.
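One way to sanity-check the displayed formula (not required by the assignment) is a small simulation that fits the matrix form suggested in the hint, with one indicator column per intercept and a shared slope column; the sample sizes, intercepts, slope, and noise level below are made-up illustrative values.

    # Numerical check of the pooled-slope formula in Problem 2 (illustrative data only).
    import numpy as np

    rng = np.random.default_rng(0)
    n1, n2 = 8, 12
    x = rng.normal(size=n1 + n2)
    group1 = np.arange(n1 + n2) < n1
    y = np.where(group1, 1.0, 3.0) + 2.0 * x + rng.normal(scale=0.5, size=n1 + n2)

    # Center x within each group, as in the centered model.
    c = np.concatenate([x[:n1] - x[:n1].mean(), x[n1:] - x[n1:].mean()])
    beta1_formula = (c * y).sum() / (c ** 2).sum()

    # Least squares fit of the matrix form: two intercept columns plus the common slope.
    X = np.column_stack([group1, ~group1, x]).astype(float)
    beta1_lstsq = np.linalg.lstsq(X, y, rcond=None)[0][2]

    print(beta1_formula, beta1_lstsq)   # the two values should agree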
3. Consider the simple linear regression model, yi = β0 + β1 xi + εi for i = 1, 2, . . . , n. Show
that the least squares slope is given by
   β̂1 = β1 + ∑_{i=1}^{n} di εi,     where di = (xi − x̄)/Sxx.

Also show that

   β̂0 = β0 + ε̄ − x̄ ∑_{i=1}^{n} di εi.
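Both displayed identities are exact algebraic consequences of the least squares formulas, so they can be confirmed numerically on any simulated data set (not required by the assignment); the sample size, coefficients, and design below are arbitrary illustrative choices.

    # Numerical check of the identities in Problem 3 (illustrative data only).
    import numpy as np

    rng = np.random.default_rng(1)
    n, beta0, beta1 = 30, 1.5, -0.7
    x = rng.uniform(0.0, 10.0, size=n)
    eps = rng.normal(size=n)
    y = beta0 + beta1 * x + eps

    xbar = x.mean()
    Sxx = ((x - xbar) ** 2).sum()
    d = (x - xbar) / Sxx

    # Ordinary least squares estimates for the simple linear model.
    b1 = ((x - xbar) * (y - y.mean())).sum() / Sxx
    b0 = y.mean() - b1 * xbar

    print(b1, beta1 + (d * eps).sum())                        # should match up to rounding
    print(b0, beta0 + eps.mean() - xbar * (d * eps).sum())    # should match up to rounding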
4. For the model yi = β0 + β1 xi1 + ei for i = 1, . . . , n, where the ei are iid N(0, σ²), find E(yi − ŷi)
and Var(yi − ŷi).
Hint: Define

   êi = yi − β̂0 − β̂1 xi1
      = yi − ∑_{j=1}^{n} (cj + dj xi1) yj
      = ∑_{j=1}^{n} (δij − (cj + dj xi1)) yj,

where cj = 1/n − (xj1 − x̄1) x̄1 / Sx1 x1,  dj = (xj1 − x̄1)/Sx1 x1,  and δij = 1 if i = j, 0 if i ≠ j.
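A short Monte Carlo run (not required by the assignment) gives empirical values of E(yi − ŷi) and Var(yi − ŷi) against which a derived closed form can be checked; the design points, coefficients, and σ below are arbitrary illustrative values.

    # Monte Carlo look at E(yi - yhat_i) and Var(yi - yhat_i) for Problem 4 (illustrative only).
    import numpy as np

    rng = np.random.default_rng(2)
    n, beta0, beta1, sigma = 15, 2.0, 0.5, 1.0
    x = np.linspace(0.0, 5.0, n)      # fixed design, arbitrary choice
    reps = 20000

    resid = np.empty((reps, n))
    for r in range(reps):
        y = beta0 + beta1 * x + rng.normal(scale=sigma, size=n)
        b1 = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
        b0 = y.mean() - b1 * x.mean()
        resid[r] = y - (b0 + b1 * x)

    # Empirical means should be near 0 for every i; the empirical variances can be
    # compared against whatever closed form you derive.
    print(resid.mean(axis=0))
    print(resid.var(axis=0))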
5. Fit the model of y on x1 and x2, i.e.,
   yi = β0 + β1 xi1 + β2 xi2 + ei,     ei ∼ N(0, σ²),
   and obtain its fitted line by the method of least squares:
   ŷi = β̂0 + β̂1 xi1 + β̂2 xi2 .
Now fit another model of x2 on x1 and ŷi , i.e.,
xi2 = γ0 + γ1 xi1 + γ2 ŷi + ϵi
Write down β̂1 , β̂2 , γ̂1 and γ̂2 in terms of Sx1 x1 , Sx1 x2 , Sx2 x2 , Sx1 y , Sx2 y and Syy . Hence or
otherwise, prove that ϵ̂i = 0 for all i.
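The claim that ϵ̂i = 0 for all i can be checked numerically on simulated data (not required by the assignment); the data-generating values below are illustrative only.

    # Numerical illustration for Problem 5: regressing x2 on x1 and yhat leaves zero residuals
    # (illustrative data only).
    import numpy as np

    rng = np.random.default_rng(3)
    n = 25
    x1, x2 = rng.normal(size=n), rng.normal(size=n)
    y = 1.0 + 2.0 * x1 - 1.0 * x2 + rng.normal(size=n)

    # First regression: y on x1 and x2, giving the fitted values yhat.
    X = np.column_stack([np.ones(n), x1, x2])
    beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
    yhat = X @ beta_hat

    # Second regression: x2 on x1 and yhat.
    Z = np.column_stack([np.ones(n), x1, yhat])
    gamma_hat = np.linalg.lstsq(Z, x2, rcond=None)[0]
    eps_hat = x2 - Z @ gamma_hat

    print(np.max(np.abs(eps_hat)))   # numerically zero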
6. Consider the simple linear regression model, yi = β0 + β1 xi + εi for i = 1, 2, . . . , n. Show
that Cov(êi , β̂1 ) = 0 and Cov(ȳ, β̂1 ) = 0.
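The two zero-covariance results can be checked empirically (not required by the assignment) by holding the design fixed and simulating repeatedly; the sample size, coefficients, σ, and the choice of residual ê1 below are arbitrary illustrative values.

    # Monte Carlo check of the covariances in Problem 6 (illustrative only).
    import numpy as np

    rng = np.random.default_rng(4)
    n, beta0, beta1, sigma = 12, 1.0, 3.0, 2.0
    x = rng.uniform(-1.0, 1.0, size=n)    # fixed design across replications
    reps = 50000

    b1s, ybars, e1s = np.empty(reps), np.empty(reps), np.empty(reps)
    for r in range(reps):
        y = beta0 + beta1 * x + rng.normal(scale=sigma, size=n)
        b1 = ((x - x.mean()) * (y - y.mean())).sum() / ((x - x.mean()) ** 2).sum()
        b0 = y.mean() - b1 * x.mean()
        b1s[r], ybars[r], e1s[r] = b1, y.mean(), y[0] - (b0 + b1 * x[0])

    # Both sample covariances should be close to zero.
    print(np.cov(e1s, b1s)[0, 1], np.cov(ybars, b1s)[0, 1])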