Reading and Comprehension Questions for Chapter 12
1. Any regression model that is linear in the unknown parameters is a linear regression
model.
True False
True
2. The least squares estimator of the model parameters in multiple linear regression is
βˆ = (XX)-1 Xy .
True False
True
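As a quick illustration of the matrix form of this estimator, here is a minimal NumPy sketch on made-up data (the numbers are invented for illustration, not taken from the text):

```python
import numpy as np

# Made-up data: six observations, two regressors plus an intercept column.
X = np.array([[1.0, 2.0, 5.0],
              [1.0, 3.0, 4.0],
              [1.0, 4.0, 6.0],
              [1.0, 5.0, 5.5],
              [1.0, 6.0, 7.0],
              [1.0, 7.0, 6.5]])
y = np.array([10.0, 12.1, 14.3, 15.9, 18.2, 19.8])

# Least squares estimator: beta_hat = (X'X)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)
```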
3. If a multiple linear regression model with three regressors is fit to a sample of 20
observations and the residual or error sum of squares is 32, the estimate of the variance of
the model errors is
a. 16.0
b. 2.0
c. 4.0
d. None of the above.
Answer – b. The estimate of the error variance is $\hat{\sigma}^2 = SS_E/(n-p) = 32/(20-4) = 2.0$.
4. When using the method of least squares to estimate the parameters in multiple linear
regression, we assume that the model errors are normally and independently distributed
with mean zero and constant variance.
True False
False – the normality assumption is not required for parameter estimation, but it is
required for hypothesis tests and confidence intervals.
5. The test for significance of regression in multiple regression involves testing the
hypotheses $H_0: \beta_1 = \beta_2 = \cdots = \beta_k = 0$ versus $H_1$: at least one $\beta_j \neq 0$.
True False
True
6. The ANOVA is used to test for significance of regression in multiple regression.
True False
True
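A minimal sketch of that ANOVA F-test on simulated data (nothing below comes from the text; the coefficients and noise level are invented):

```python
import numpy as np

# Simulated data: n observations, k regressors plus an intercept.
rng = np.random.default_rng(1)
n, k = 20, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([2.0, 1.5, -0.7, 0.3]) + rng.normal(scale=0.5, size=n)

p = k + 1                                   # number of model parameters
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
y_hat = X @ beta_hat
sse = np.sum((y - y_hat) ** 2)              # error sum of squares
ssr = np.sum((y_hat - y.mean()) ** 2)       # regression sum of squares
f0 = (ssr / k) / (sse / (n - p))            # F0 = MSR / MSE
print(f0)   # compare to f_{alpha, k, n-p} to test H0: beta_1 = ... = beta_k = 0
```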
7. The R2 statistic can decrease when a new regressor variable is added to a multiple
linear regression model.
True False
False - The R2 statistic can never decrease with the addition of a new regressor variable to
the model.
8. If $SS_T = 100$, $SS_E = 15$, $n = 20$, and $k = 2$ regressors, the adjusted $R^2$ statistic is
a. 0.8875
b. 0.8324
c. 0.8525
d. None of the above.
Answer – b. With $p = k + 1 = 3$ parameters, $R^2_{Adj} = 1 - \frac{SS_E/(n-p)}{SS_T/(n-1)} = 1 - \frac{15/(20-3)}{100/(20-1)} = 0.8324$.
9. The adjusted R2 statistic can decrease when a new regressor variable is added to a
multiple linear regression model.
True False
True
10. The test statistic for testing the contribution of an individual regressor variable to the
multiple linear regression model is
$T_0 = \hat{\beta}_j / se(\hat{\beta}_j)$
True False
True
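A sketch of that statistic on simulated data, using $se(\hat{\beta}_j) = \sqrt{\hat{\sigma}^2 C_{jj}}$ with $C = (\mathbf{X}'\mathbf{X})^{-1}$ (the data and true coefficients below are invented):

```python
import numpy as np

# Simulated data for illustrating T0 = beta_hat_j / se(beta_hat_j).
rng = np.random.default_rng(2)
n, k = 25, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([1.0, 0.8, 0.0, -1.2]) + rng.normal(scale=0.4, size=n)

p = k + 1
C = np.linalg.inv(X.T @ X)
beta_hat = C @ X.T @ y
sigma2_hat = np.sum((y - X @ beta_hat) ** 2) / (n - p)
se = np.sqrt(sigma2_hat * np.diag(C))
t0 = beta_hat / se
print(t0)   # compare |t0_j| to t_{alpha/2, n-p}
```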
11. When testing the contribution of an individual regressor variable to the model, if we
find that the null hypothesis $H_0: \beta_j = 0$ cannot be rejected, we usually should
a. Remove the variable from the model
b. Do nothing.
c. Add a quadratic term in xj to the model.
Answer – a. Removing a nonsignificant regressor may improve the fit of the model to the
data.
12. The extra sum of squares method is used to test hypotheses about a subset of
parameters in the multiple regression model.
True False
True
13. A 95% confidence interval on the mean response at a specified point in the regressor
variable space is $34 \le \mu_{Y|x_0} \le 36$. The length of this interval is constant for all points in
the regressor variable space so long as the confidence level doesn't change.
True False
False – The standard deviation of the estimated mean response depends on the point, and
that standard deviation determines the length of the CI.
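A sketch showing why the answer is False: the standard error of the estimated mean response, $\sqrt{\hat{\sigma}^2\,\mathbf{x}_0'(\mathbf{X}'\mathbf{X})^{-1}\mathbf{x}_0}$, changes with the point $\mathbf{x}_0$ (simulated data and invented prediction points):

```python
import numpy as np

# Simulated data: intercept plus two regressors on [0, 10].
rng = np.random.default_rng(3)
n, k = 20, 2
X = np.column_stack([np.ones(n), rng.uniform(0, 10, size=(n, k))])
y = X @ np.array([5.0, 1.0, 2.0]) + rng.normal(scale=1.0, size=n)

p = k + 1
C = np.linalg.inv(X.T @ X)
beta_hat = C @ X.T @ y
sigma2_hat = np.sum((y - X @ beta_hat) ** 2) / (n - p)

for x0 in (np.array([1.0, 5.0, 5.0]),      # near the center of the data
           np.array([1.0, 9.5, 0.5])):     # near the edge of the data
    se_mean = np.sqrt(sigma2_hat * x0 @ C @ x0)
    print(x0[1:], se_mean)                 # larger se -> wider CI at that point
```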
14. Standardized residuals have been scaled so that they have unit standard deviation.
True False
False – Studentized residuals have been scaled so that they have unit standard deviation.
15. Cook’s distance is a measure of influence on the regression model for the individual
observation in a sample.
True False
True
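A sketch of these diagnostics on simulated data, with standardized residuals, studentized residuals, and Cook's distance all built from the hat matrix $H = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'$ (the data are invented):

```python
import numpy as np

# Simulated data for residual and influence diagnostics.
rng = np.random.default_rng(4)
n, k = 15, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(scale=0.6, size=n)

p = k + 1
H = X @ np.linalg.solve(X.T @ X, X.T)           # hat matrix
e = y - H @ y                                   # ordinary residuals
h = np.diag(H)                                  # leverages
mse = np.sum(e ** 2) / (n - p)
d = e / np.sqrt(mse)                            # standardized residuals
r = e / np.sqrt(mse * (1 - h))                  # studentized residuals
cooks_d = r ** 2 * h / ((1 - h) * p)            # Cook's distance D_i
print(np.round(cooks_d, 3))
```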
16. A polynomial regression model is a nonlinear regression model.
True False
False – any polynomial regression model is a linear regression model because it is a
linear function of the unknown parameters.
17. Regression models can only be used with continuous regressor variables.
True False
False – Indicator variables can be used to represent categorical regressors.
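A sketch of how a two-level categorical regressor can enter the model through a 0/1 indicator variable (the "machine A / machine B" factor and all numbers are invented for illustration):

```python
import numpy as np

# A categorical regressor with two levels entered as a 0/1 indicator column.
machine = np.array(["A", "B", "A", "B", "B", "A"])
x1 = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0])
y = np.array([10.0, 13.0, 14.0, 17.0, 19.0, 18.0])

indicator = (machine == "B").astype(float)      # 1 if level B, 0 if level A
X = np.column_stack([np.ones(len(y)), x1, indicator])
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)   # last coefficient is the shift in mean response for level B
```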
18. All possible regressions can be used to find the subset regression model that minimizes
the error or residual mean square.
True False
True
19. The $C_p$ statistic is a measure of the bias that remains in a subset regression model
with $p$ parameters because the correct regressors are not all in the current model.
True False
True
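A sketch of the $C_p$ computation, $C_p = SS_E(p)/\hat{\sigma}^2 - n + 2p$ with $\hat{\sigma}^2$ taken from the full model (simulated data; the subset chosen is just for illustration):

```python
import numpy as np

# Simulated data: four candidate regressors, two of which actually matter.
rng = np.random.default_rng(5)
n = 30
X_full = np.column_stack([np.ones(n), rng.normal(size=(n, 4))])
y = X_full @ np.array([1.0, 2.0, 0.0, 0.0, -1.5]) + rng.normal(scale=0.5, size=n)

def sse(X):
    b = np.linalg.solve(X.T @ X, X.T @ y)
    return np.sum((y - X @ b) ** 2)

sigma2_full = sse(X_full) / (n - X_full.shape[1])
X_sub = X_full[:, [0, 1, 4]]                  # subset: intercept, x1, x4
p = X_sub.shape[1]
cp = sse(X_sub) / sigma2_full - n + 2 * p
print(cp)   # Cp near p suggests little bias in the subset model
```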
20. If there are six candidate regressors, then the number of possible regression models
that need to be considered if we are using all possible regressions is:
a. 128
b. 64
c. 96
d. 16.
Answer – b. The number of possible regression equations is $2^6 = 64$.
21. The PRESS statistic is a measure of how well a regression model will predict new
observations.
True False
True
22. Large values of the PRESS statistic indicate that the regression model will be a good
predictor of new observations.
True False
False – small values of the PRESS statistic indicate that the regression model will be a
good predictor of new observations.
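A sketch of the PRESS computation on simulated data, using the standard shortcut $e_{(i)} = e_i/(1-h_{ii})$ so the model does not have to be refit $n$ times (the data are invented):

```python
import numpy as np

# Simulated data for the PRESS statistic.
rng = np.random.default_rng(6)
n, k = 20, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
y = X @ np.array([2.0, 1.0, -0.5, 0.8]) + rng.normal(scale=0.7, size=n)

H = X @ np.linalg.solve(X.T @ X, X.T)           # hat matrix
e = y - H @ y                                   # ordinary residuals
press = np.sum((e / (1 - np.diag(H))) ** 2)     # sum of squared leave-one-out errors
print(press)   # smaller PRESS -> better expected prediction of new observations
```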
23. Stepwise regression is a procedure that will find the subset regression model that
minimizes the residual mean square.
True False
False – while stepwise regression usually finds a good model, there is no guarantee that
it finds a model that satisfies any specific optimality criterion.
24. Forward selection is a variation of stepwise regression that enters variables into the
model one-at-a-time until no further variables can be added that produce a significant
increase in the regression sum of squares.
True False
True
25. Multicollinearity is a condition where there are strong near-linear dependencies
among the regressor variables.
True False
True
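A sketch of how multicollinearity can be detected with variance inflation factors, $VIF_j = 1/(1 - R_j^2)$, where $R_j^2$ comes from regressing $x_j$ on the other regressors (simulated data, with $x_2$ deliberately made nearly collinear with $x_1$):

```python
import numpy as np

# Simulated regressors; x2 is a near-copy of x1, so their VIFs should be large.
rng = np.random.default_rng(7)
n = 40
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.1, size=n)
x3 = rng.normal(size=n)
Z = np.column_stack([x1, x2, x3])

def vif(j):
    # Regress column j on the remaining columns (plus an intercept).
    Xj = np.column_stack([np.ones(n), np.delete(Z, j, axis=1)])
    b = np.linalg.solve(Xj.T @ Xj, Xj.T @ Z[:, j])
    resid = Z[:, j] - Xj @ b
    r2 = 1 - resid @ resid / np.sum((Z[:, j] - Z[:, j].mean()) ** 2)
    return 1 / (1 - r2)

print([round(vif(j), 1) for j in range(Z.shape[1])])
```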