
Chapter 17
WebQuizzing – Ch. 17
Book ISBN-10 0538477490
Book ISBN-13 9780538477499
Author: Gerald Keller
Title: Statistics for Management and Economics
Ed: 9e
# Questions Submitted: 24 Multiple Choice
1. In a multiple regression analysis, if the model provides a poor fit, this indicates that:
A. the sum of squares for error will be large.
B. the standard error of estimate will be large.
C. the coefficient of determination will be close to zero.
D. All of these choices are true.
Analysis:
A. Incorrect. This statement is true, but so are B and C; D is the best choice.
B. Incorrect. This statement is true, but so are A and C; D is the best choice.
C. Incorrect. This statement is true, but so are A and B; D is the best choice.
D. Correct. All of these choices are true.
ANSWER: D Ref: Section 17.1-2
2. In a multiple regression model, the mean of the probability distribution of the error variable
ε is assumed to be:
A. 1.0
B. 0.0
C. k, where k is the number of independent variables included in the model.
D. None of these choices.
Analysis:
A. Incorrect. The mean of the probability distribution of the error variable ε is assumed to be 0.0.
B. Correct. The mean of the probability distribution of the error variable ε is assumed to be 0.0.
C. Incorrect. The mean of the probability distribution of the error variable ε is assumed to be 0.0.
D. Incorrect. The mean of the probability distribution of the error variable ε is assumed to be 0.0.
ANSWER: B Ref: Section 17.1-2
3. The adjusted coefficient of determination is adjusted for the:
A. number of regression parameters including the y-intercept.
B. number of dependent variables and the sample size.
C. number of independent variables and the sample size.
D. coefficient of correlation and the significance level.
Analysis:
A. Incorrect. The adjusted coefficient of determination is adjusted for the number of independent
variables and the sample size.
B. Incorrect. The adjusted coefficient of determination is adjusted for the number of independent
variables and the sample size.
C. Correct. The adjusted coefficient of determination is adjusted for the number of independent
variables and the sample size.
D. Incorrect. The adjusted coefficient of determination is adjusted for the number of independent
variables and the sample size.
ANSWER: C Ref: Section 17.1-2
4. In multiple regression analysis, the ratio MSR/MSE yields the:
A. t-test statistic for testing each individual regression coefficient.
B. F-test statistic for testing the validity of the regression equation.
C. coefficient of determination.
D. adjusted coefficient of determination.
Analysis:
A. Incorrect. In multiple regression analysis, the ratio MSR/MSE yields the F-test statistic for
testing the validity of the regression equation.
B. Correct. In multiple regression analysis, the ratio MSR/MSE yields the F-test statistic for
testing the validity of the regression equation.
C. Incorrect. In multiple regression analysis, the ratio MSR/MSE yields the F-test statistic for
testing the validity of the regression equation.
D. Incorrect. In multiple regression analysis, the ratio MSR/MSE yields the F-test statistic for
testing the validity of the regression equation.
ANSWER: B Ref: Section 17.1-2
5. In a multiple regression analysis involving k independent variables and n data points, the
number of degrees of freedom associated with the sum of squares for error is:
A. k-1
B. n-k
C. n-1
D. n-k-1
Analysis:
A. Incorrect. The number of degrees of freedom associated with the sum of squares for error
is n-k-1
B. Incorrect. The number of degrees of freedom associated with the sum of squares for error
is n-k-1
C. Incorrect. The number of degrees of freedom associated with the sum of squares for error
is n-k-1
D. Correct. The number of degrees of freedom associated with the sum of squares for error
is n-k-1
ANSWER: D Ref: Section 17.1-2
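Questions 4 and 5 can be tied together in a short sketch (hypothetical numbers, not from the text): the F statistic for overall model validity is MSR/MSE, where MSR = SSR/k and MSE = SSE/(n − k − 1), with n − k − 1 the degrees of freedom for error.

```python
# Sketch of the F statistic for testing overall model validity,
# F = MSR/MSE, with MSR = SSR/k and MSE = SSE/(n - k - 1).

def f_statistic(ssr, sse, n, k):
    """F statistic for a regression with k predictors and n data points."""
    msr = ssr / k              # mean square due to regression (df = k)
    mse = sse / (n - k - 1)    # mean square error (df = n - k - 1)
    return msr / mse

# Hypothetical values: SSR = 90, SSE = 30, n = 25 observations, k = 3 predictors.
print(f_statistic(90, 30, 25, 3))  # MSR = 30, MSE = 30/21, so F = 21.0
```

A large F (relative to the F distribution with k and n − k − 1 degrees of freedom) leads to rejecting H0 that all slope coefficients are zero.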
6. To test the validity of a multiple regression model, we test the null hypothesis that the
regression coefficients are all zero by applying the:
A. t-test
B. z-test
C. F-test
D. None of these choices.
Analysis:
A. Incorrect. By applying the F-test.
B. Incorrect. By applying the F-test.
C. Correct. By applying the F-test.
D. Incorrect. By applying the F-test.
ANSWER: C Ref: Section 17.1-2
7. The coefficient of determination ranges from:
A. −1.0 to +1.0.
B. 0.0 to 1.0.
C. 0.0 to k, where k is the number of independent variables in the model.
D. 0.0 to n, where n is the number of observations in the dependent variable.
Analysis:
A. Incorrect. The coefficient of determination ranges from 0.0 to 1.0
B. Correct. The coefficient of determination ranges from 0.0 to 1.0
C. Incorrect. The coefficient of determination ranges from 0.0 to 1.0
D. Incorrect. The coefficient of determination ranges from 0.0 to 1.0
ANSWER: B Ref: Section 17.1-2
8. If all the points for a multiple regression model with two independent variables were right on
the regression plane, then the coefficient of determination would equal:
A. 0
B. 1
C. 2, since there are two independent variables.
D. None of these choices.
Analysis:
A. Incorrect. The coefficient of determination would equal 1
B. Correct. The coefficient of determination would equal 1
C. Incorrect. The coefficient of determination would equal 1
D. Incorrect. The coefficient of determination would equal 1
ANSWER: B Ref: Section 17.1-2
9. In a multiple regression model, the probability distribution of the error variable ε is assumed
to be:
A. normal.
B. non-normal.
C. positively skewed.
D. negatively skewed.
Analysis:
A. Correct. In a multiple regression model, the probability distribution of the error variable ε is
assumed to be normal.
B. Incorrect. In a multiple regression model, the probability distribution of the error variable ε is
assumed to be normal.
C. Incorrect. In a multiple regression model, the probability distribution of the error variable ε is
assumed to be normal.
D. Incorrect. In a multiple regression model, the probability distribution of the error variable ε is
assumed to be normal.
ANSWER: A Ref: Section 17.1-2
10. For a multiple regression model, the total variation in y can be expressed as:
A. SSR + SSE.
B. SSR – SSE.
C. SSE – SSR.
D. SSR / SSE.
Analysis:
A. Correct. For a multiple regression model, the total variation in y can be expressed
as SSR + SSE.
B. Incorrect. For a multiple regression model, the total variation in y can be expressed
as SSR + SSE.
C. Incorrect. For a multiple regression model, the total variation in y can be expressed
as SSR + SSE.
D. Incorrect. For a multiple regression model, the total variation in y can be expressed
as SSR + SSE.
ANSWER: A Ref: Section 17.1-2
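The decomposition in question 10 can be verified numerically. A minimal sketch with a hypothetical simple-regression data set: for a least-squares fit, the total variation SST splits exactly into SSR + SSE, and R² = SSR/SST.

```python
# Sketch: SST = SSR + SSE for a least-squares fit, and R^2 = SSR/SST.
# Data are hypothetical; the fit is an ordinary simple regression.

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 5.0, 6.0, 9.0]

n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b1 = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
     sum((x - xbar) ** 2 for x in xs)              # least-squares slope
b0 = ybar - b1 * xbar                              # intercept
yhat = [b0 + b1 * x for x in xs]                   # fitted values

sst = sum((y - ybar) ** 2 for y in ys)                    # total variation
ssr = sum((yh - ybar) ** 2 for yh in yhat)                # explained variation
sse = sum((y - yh) ** 2 for y, yh in zip(ys, yhat))       # unexplained variation

print(sst, ssr, sse)   # 25.0, ~24.2, ~0.8: SST = SSR + SSE
print(ssr / sst)       # R^2 = ~0.968
```

Note the identity SST = SSR + SSE holds only for least-squares fitted values; for an arbitrary set of predictions the cross-product term does not vanish.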
11. In testing the validity of a multiple regression model in which there are four independent
variables, the null hypothesis is:
A. H0: β1 = β2 = β3 = β4 = 1
B. H0: β0 = β1 = β2 = β3 = β4
C. H0: β1 = β2 = β3 = β4 = 0
D. H0: β0 = β1 = β2 = β3 = β4 = 0
Analysis:
A. Incorrect. In testing the validity of a multiple regression model in which there are four
independent variables, the null hypothesis is H0: β1 = β2 = β3 = β4 = 0.
B. Incorrect. In testing the validity of a multiple regression model in which there are four
independent variables, the null hypothesis is H0: β1 = β2 = β3 = β4 = 0.
C. Correct. In testing the validity of a multiple regression model in which there are four
independent variables, the null hypothesis is H0: β1 = β2 = β3 = β4 = 0.
D. Incorrect. In testing the validity of a multiple regression model in which there are four
independent variables, the null hypothesis is H0: β1 = β2 = β3 = β4 = 0.
ANSWER: C Ref: Section 17.1-2
12. Which of the following statements regarding multicollinearity is not true?
A. It exists in virtually all multiple regression models.
B. It is also called collinearity and intercorrelation.
C. It is a condition that exists when the independent variables are highly correlated with the
dependent variable.
D. All of these choices are true.
Analysis:
A. Incorrect. This statement is true; multicollinearity exists in virtually all multiple
regression models.
B. Incorrect. This statement is true; multicollinearity is also called collinearity and
intercorrelation.
C. Correct. This statement is not true; multicollinearity is a condition that exists when the
independent variables are highly correlated with one another, not with the dependent variable.
D. Incorrect. Not all of these choices are true, since choice C is false.
ANSWER: C Ref: Section 17.3
13. When the independent variables are correlated with one another in a multiple regression
analysis, this condition is called:
A. heteroscedasticity.
B. homoscedasticity.
C. multicollinearity.
D. None of these choices.
Analysis:
A. Incorrect. The condition is called multicollinearity.
B. Incorrect. The condition is called multicollinearity.
C. Correct. The condition is called multicollinearity.
D. Incorrect. The condition is called multicollinearity.
ANSWER: C Ref: Section 17.3
14. If a group of independent variables are not significant individually but are significant as a
group at a specified level of significance, this is most likely due to:
A. heteroscedasticity.
B. an error in the analysis.
C. multicollinearity.
D. None of these choices.
Analysis:
A. Incorrect. This is most likely due to multicollinearity.
B. Incorrect. This is most likely due to multicollinearity.
C. Correct. This is most likely due to multicollinearity.
D. Incorrect. This is most likely due to multicollinearity.
ANSWER: C Ref: Section 17.3
15. If multicollinearity exists among the independent variables included in a multiple regression
model, then:
A. the regression coefficients will be difficult to interpret.
B. the standard errors of the regression coefficients for the correlated independent variables
will increase.
C. one or more of the coefficients may have the wrong sign.
D. All of these choices are true.
Analysis:
A. Incorrect. This statement is true, but so are B and C; D is the best choice.
B. Incorrect. This statement is true, but so are A and C; D is the best choice.
C. Incorrect. This statement is true, but so are A and B; D is the best choice.
D. Correct. All of these choices are true.
ANSWER: D Ref: Section 17.3
16. The problem of multicollinearity arises when the:
A. dependent variables are highly correlated with one another.
B. independent variables are highly correlated with one another.
C. independent variables are highly correlated with the dependent variable.
D. None of these choices.
Analysis:
A. Incorrect. The problem of multicollinearity arises when the independent variables are highly
correlated with one another.
B. Correct. The problem of multicollinearity arises when the independent variables are highly
correlated with one another.
C. Incorrect. The problem of multicollinearity arises when the independent variables are highly
correlated with one another.
D. Incorrect. The problem of multicollinearity arises when the independent variables are highly
correlated with one another.
ANSWER: B Ref: Section 17.3
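Questions 12 through 16 all turn on detecting correlation among the independent variables. A minimal sketch with two hypothetical, nearly redundant predictors: their sample correlation r and, for the two-predictor case, the variance inflation factor VIF = 1 / (1 − r²), which grows without bound as the predictors approach perfect correlation.

```python
# Sketch: a simple multicollinearity check between two hypothetical
# predictors via their correlation r and VIF = 1 / (1 - r^2)
# (the two-predictor special case of the variance inflation factor).

def corr(a, b):
    n = len(a)
    am, bm = sum(a) / n, sum(b) / n
    cov = sum((x - am) * (y - bm) for x, y in zip(a, b))
    sa = sum((x - am) ** 2 for x in a) ** 0.5
    sb = sum((y - bm) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

x1 = [10, 12, 14, 16, 18]    # hypothetical predictor
x2 = [11, 13, 15, 17, 18]    # a nearly redundant second predictor

r = corr(x1, x2)
vif = 1 / (1 - r ** 2)
print(round(r, 3), round(vif, 1))   # r near 1 gives a large VIF: 0.994 82.0
```

A large VIF is one sign of the symptom in question 14: individually insignificant t-tests alongside a significant overall F-test, because the correlated predictors' standard errors are inflated.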
17. If the Durbin-Watson statistic has a value close to 0, which assumption is violated?
A. Normality of the errors.
B. Independence of errors.
C. Homoscedasticity.
D. None of these choices.
Analysis:
A. Incorrect. The independence of errors assumption is violated.
B. Correct. The independence of errors assumption is violated.
C. Incorrect. The independence of errors assumption is violated.
D. Incorrect. The independence of errors assumption is violated.
ANSWER: B Ref: Section 17.4
18. If the Durbin-Watson statistic has a value close to 4, which assumption is violated?
A. Normality of the errors
B. Independence of errors
C. Homoscedasticity
D. None of these choices.
Analysis:
A. Incorrect. The independence of errors assumption is violated.
B. Correct. The independence of errors assumption is violated.
C. Incorrect. The independence of errors assumption is violated.
D. Incorrect. The independence of errors assumption is violated.
ANSWER: B Ref: Section 17.4
19. The range of the values of the Durbin-Watson statistic d is:
A. −4 ≤ d ≤ 4
B. −2 ≤ d ≤ 2
C. 0 ≤ d ≤ 4
D. 0 ≤ d ≤ 2
Analysis:
A. Incorrect. The range of the values of the Durbin-Watson statistic d is 0 ≤ d ≤ 4.
B. Incorrect. The range of the values of the Durbin-Watson statistic d is 0 ≤ d ≤ 4.
C. Correct. The range of the values of the Durbin-Watson statistic d is 0 ≤ d ≤ 4.
D. Incorrect. The range of the values of the Durbin-Watson statistic d is 0 ≤ d ≤ 4.
ANSWER: C Ref: Section 17.4
20. If the Durbin-Watson statistic d has values smaller than 2, this indicates:
A. a positive first-order autocorrelation.
B. a negative first-order autocorrelation.
C. no first-order autocorrelation at all.
D. None of these choices.
Analysis:
A. Correct. If the Durbin-Watson statistic d has values smaller than 2, this indicates a positive
first-order autocorrelation.
B. Incorrect. If the Durbin-Watson statistic d has values smaller than 2, this indicates a positive
first-order autocorrelation.
C. Incorrect. If the Durbin-Watson statistic d has values smaller than 2, this indicates a positive
first-order autocorrelation.
D. Incorrect. If the Durbin-Watson statistic d has values smaller than 2, this indicates a positive
first-order autocorrelation.
ANSWER: A Ref: Section 17.4
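Questions 17 through 20 can be made concrete by computing the statistic directly. A minimal sketch on a hypothetical residual series: d = Σ(eₜ − eₜ₋₁)² / Σeₜ², which lands near 2 with no first-order autocorrelation, near 0 with positive autocorrelation, and near 4 with negative autocorrelation.

```python
# Sketch: the Durbin-Watson statistic computed from a residual series,
# d = sum((e_t - e_{t-1})^2) / sum(e_t^2).

def durbin_watson(e):
    num = sum((e[t] - e[t - 1]) ** 2 for t in range(1, len(e)))
    den = sum(r ** 2 for r in e)
    return num / den

# Hypothetical residuals that drift slowly (adjacent values are similar),
# the signature of positive first-order autocorrelation.
trend = [1.0, 2.0, 3.0, 2.0, 1.0, -1.0, -2.0, -3.0]
print(durbin_watson(trend))   # well below 2, consistent with question 20's answer A
```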
21. The standardized residual is defined as:
A. residual divided by the standard error of estimate.
B. residual multiplied by the square root of the standard error of estimate.
C. residual divided by the square of the standard error of estimate.
D. residual multiplied by the standard error of estimate.
Analysis:
A. Correct. The standardized residual is defined as the residual divided by the standard error of
estimate.
B. Incorrect. The standardized residual is defined as the residual divided by the standard error of
estimate.
C. Incorrect. The standardized residual is defined as the residual divided by the standard error of
estimate.
D. Incorrect. The standardized residual is defined as the residual divided by the standard error of
estimate.
ANSWER: A Ref: Section 17.3
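The definition in question 21 is easy to sketch with hypothetical residuals: each residual is divided by the standard error of estimate sₑ = √(SSE / (n − k − 1)), and standardized residuals beyond roughly ±2 are usually flagged for a closer look.

```python
# Sketch: standardized residuals = residual / standard error of estimate,
# where s_e = sqrt(SSE / (n - k - 1)). Residuals and k are hypothetical.

residuals = [0.5, -1.2, 0.3, 2.6, -0.9, -1.3]
n, k = len(residuals), 1

sse = sum(e ** 2 for e in residuals)             # sum of squares for error
s_e = (sse / (n - k - 1)) ** 0.5                 # standard error of estimate
standardized = [e / s_e for e in residuals]

print([round(z, 2) for z in standardized])
```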
22. The least squares method requires that the variance σ² of the error variable ε is a constant
no matter what the value of x is. When this requirement is violated, the condition is called:
A. non-independence of ε.
B. homoscedasticity.
C. heteroscedasticity.
D. influential observation.
Analysis:
A. Incorrect. The condition is called heteroscedasticity.
B. Incorrect. The condition is called heteroscedasticity.
C. Correct. The condition is called heteroscedasticity.
D. Incorrect. The condition is called heteroscedasticity.
ANSWER: C Ref: Section 17.3
23. If the plot of the residuals is fan shaped, which assumption of regression analysis (if any) is
violated?
A. Normality
B. Homoscedasticity
C. Independence of errors
D. No assumptions are violated.
Analysis:
A. Incorrect. The assumption of homoscedasticity is violated.
B. Correct. The assumption of homoscedasticity is violated.
C. Incorrect. The assumption of homoscedasticity is violated.
D. Incorrect. The assumption of homoscedasticity is violated.
ANSWER: B Ref: Section 17.3
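The fan shape in question 23 can be checked numerically rather than visually. A crude sketch on hypothetical residuals: compare the average squared residual in the lower and upper halves of the x range; spread that grows with x violates the constant-variance (homoscedasticity) assumption.

```python
# Sketch: a crude heteroscedasticity check on hypothetical residuals --
# compare residual spread in the lower and upper halves of the x range.
# A fan shape means the spread grows with x.

xs        = [1, 2, 3, 4, 5, 6, 7, 8]
residuals = [0.1, -0.2, 0.3, -0.5, 0.9, -1.1, 1.6, -1.8]   # widening fan

half = len(xs) // 2
lo_spread = sum(e ** 2 for e in residuals[:half]) / half   # spread at small x
hi_spread = sum(e ** 2 for e in residuals[half:]) / half   # spread at large x

print(lo_spread, hi_spread)   # much larger spread at high x suggests heteroscedasticity
```

Formal tests (e.g. plotting residuals against predicted values, as the textbook does) are the usual diagnostic; this split-half comparison is only an illustration.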
24. When the variance σ² of the error variable ε is a constant no matter what the value of x is,
this condition is called:
A. homocausality.
B. heteroscedasticity.
C. homoscedasticity.
D. heterocausality.
Analysis:
A. Incorrect. This condition is called homoscedasticity.
B. Incorrect. This condition is called homoscedasticity.
C. Correct. This condition is called homoscedasticity.
D. Incorrect. This condition is called homoscedasticity.
ANSWER: C Ref: Section 17.3