Kuwait University
College of Business Administration
Dept. of Economics

Econ. 403: Econometrics
Prof. Mohammed I. El-Sakka

Syllabus

COURSE REQUIREMENTS:
Prior knowledge of statistics and mathematics for economists is crucial. Basic knowledge of Excel is necessary.

COURSE CONTENTS

Introduction
  I.1. What Is Econometrics?
  I.2. Why a Separate Discipline?
  I.3. Methodology of Econometrics
    o Statement of Theory or Hypothesis
    o Specification of the Mathematical Model of Consumption
    o Specification of the Econometric Model of Consumption
    o Obtaining Data
    o Estimation of the Econometric Model
    o Hypothesis Testing
    o Forecasting or Prediction
    o Use of the Model for Control or Policy Purposes
    o Choosing among Competing Models
  I.4. Types of Econometrics
  I.5. Mathematical and Statistical Prerequisites
  I.6. The Role of the Computer

Part ONE. SINGLE-EQUATION REGRESSION MODELS

Chapter 1. The Nature of Regression Analysis
  Historical Origin of the Term Regression
  The Modern Interpretation of Regression
  Examples
  Statistical versus Deterministic Relationships
  Regression versus Causation
  Regression versus Correlation
  Terminology and Notation
  The Nature and Sources of Data for Economic Analysis
  Types of Data
  The Sources of Data
  The Accuracy of Data
  A Note on the Measurement Scales of Variables

Chapter 2. Two-Variable Regression Analysis: Some Basic Ideas
  2.1. A Hypothetical Example
  2.2. The Concept of Population Regression Function (PRF)
  2.3. The Meaning of the Term Linear
    o Linearity in the Variables
    o Linearity in the Parameters
  2.4. Stochastic Specification of PRF
  2.5. The Significance of the Stochastic Disturbance Term
  2.6. The Sample Regression Function (SRF)
  2.7. Illustrative Examples

Chapter 3. Two-Variable Regression Model: The Problem of Estimation
  3.1. The Method of Ordinary Least Squares
  3.2. The Classical Linear Regression Model: The Assumptions Underlying the Method of Least Squares
    o A Word about These Assumptions
  3.3. Precision or Standard Errors of Least-Squares Estimates
  3.4. Properties of Least-Squares Estimators: The Gauss–Markov Theorem
  3.5. The Coefficient of Determination r2: A Measure of “Goodness of Fit”
  3.6. A Numerical Example
  3.7. Illustrative Examples

Chapter 4. Classical Normal Linear Regression Model (CNLRM)
  4.1. The Probability Distribution of Disturbances ui
  4.2. The Normality Assumption for ui
    o Why the Normality Assumption?
  4.3. Properties of OLS Estimators under the Normality Assumption
  4.4. The Method of Maximum Likelihood (ML)

Chapter 5. Two-Variable Regression: Interval Estimation and Hypothesis Testing
  5.1. Statistical Prerequisites
  5.2. Interval Estimation: Some Basic Ideas
  5.3. Confidence Intervals for Regression Coefficients β1 and β2
    o Confidence Interval for β2
    o Confidence Interval for β1 and β2 Simultaneously
  5.4. Confidence Interval for σ2
  5.5. Hypothesis Testing: General Comments
  5.6. Hypothesis Testing: The Confidence-Interval Approach
    o Two-Sided or Two-Tail Test
    o One-Sided or One-Tail Test
  5.7. Hypothesis Testing: The Test-of-Significance Approach
    o Testing the Significance of Regression Coefficients: The t Test
    o Testing the Significance of σ2: The χ2 Test
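For quick reference, the interval-estimation and t-test results listed under Sections 5.3 and 5.7 take the following standard form in the two-variable model Yi = β1 + β2Xi + ui, with n − 2 degrees of freedom:

```latex
% t statistic for H0: beta_2 = beta_2^*  and the 100(1 - alpha)% confidence interval for beta_2
\[
t = \frac{\hat{\beta}_2 - \beta_2^{*}}{\operatorname{se}(\hat{\beta}_2)} \sim t_{n-2},
\qquad
\hat{\beta}_2 \;\pm\; t_{\alpha/2,\,n-2}\,\operatorname{se}(\hat{\beta}_2).
\]
```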
  5.8. Hypothesis Testing: Some Practical Aspects
    o The Meaning of “Accepting” or “Rejecting” a Hypothesis
    o The “Zero” Null Hypothesis and the “2-t” Rule of Thumb
    o Forming the Null and Alternative Hypotheses
    o Choosing α, the Level of Significance
    o The Exact Level of Significance: The p Value
    o Statistical Significance versus Practical Significance
    o The Choice between Confidence-Interval and Test-of-Significance Approaches to Hypothesis Testing
  5.9. Regression Analysis and Analysis of Variance
  5.10. Application of Regression Analysis: The Problem of Prediction
    o Mean Prediction
    o Individual Prediction
  5.11. Reporting the Results of Regression Analysis
  5.12. Evaluating the Results of Regression Analysis
    o Normality Tests
    o Other Tests of Model Adequacy

Chapter 6. Extensions of the Two-Variable Linear Regression Model
  6.1. Regression through the Origin
    o r2 for Regression-through-Origin Model
  6.2. Scaling and Units of Measurement
    o A Word about Interpretation
  6.3. Regression on Standardized Variables
  6.4. Functional Forms of Regression Models
  6.5. How to Measure Elasticity: The Log-Linear Model
  6.6. Semilog Models: Log–Lin and Lin–Log Models
    o How to Measure the Growth Rate: The Log–Lin Model
    o The Lin–Log Model
  6.7. Reciprocal Models
    o Log Hyperbola or Logarithmic Reciprocal Model
  6.8. Choice of Functional Form
  6.9. A Note on the Nature of the Stochastic Error Term: Additive versus Multiplicative Stochastic Error Term

Chapter 7. Multiple Regression Analysis: The Problem of Estimation
  7.1. The Three-Variable Model: Notation and Assumptions
  7.2. Interpretation of Multiple Regression Equation
  7.3. The Meaning of Partial Regression Coefficients
  7.4. OLS and ML Estimation of the Partial Regression Coefficients
    o OLS Estimators
    o Variances and Standard Errors of OLS Estimators
    o Properties of OLS Estimators
    o Maximum Likelihood Estimators
  7.5. The Multiple Coefficient of Determination R2 and the Multiple Coefficient of Correlation R
  7.6. An Illustrative Example
    o Regression on Standardized Variables
    o Impact on the Dependent Variable of a Unit Change in More than One Regressor
  7.7. Simple Regression in the Context of Multiple Regression: Introduction to Specification Bias
  7.8. R2 and the Adjusted R2
    o Comparing Two R2 Values
    o Allocating R2 among Regressors
    o The “Game” of Maximizing R̄2
  7.9. The Cobb–Douglas Production Function: More on Functional Form
  7.10. Polynomial Regression Models
  7.11. Partial Correlation Coefficients
    o Explanation of Simple and Partial Correlation Coefficients
    o Interpretation of Simple and Partial Correlation Coefficients
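As a minimal, purely illustrative sketch of the estimation steps listed in Chapter 7, the following fits a three-variable regression to simulated data. It uses Python with statsmodels only as a stand-in for the course software (EViews and Excel, listed below under COMPUTER SOFTWARE); every variable name and number here is hypothetical.

```python
# Illustrative only: OLS estimation of the three-variable model of Chapter 7
# on simulated data (all names and numbers are hypothetical).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100
x2 = rng.normal(10, 2, n)           # first regressor
x3 = rng.normal(5, 1, n)            # second regressor
u = rng.normal(0, 1, n)             # stochastic disturbance term
y = 2.0 + 0.8 * x2 - 0.5 * x3 + u   # assumed "population" relationship plus error

X = sm.add_constant(np.column_stack([x2, x3]))   # adds the intercept column
results = sm.OLS(y, X).fit()

print(results.params)                            # OLS estimates of beta1, beta2, beta3
print(results.bse)                               # their standard errors
print(results.rsquared, results.rsquared_adj)    # R2 and adjusted R2
```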
Chapter 8. Multiple Regression Analysis: The Problem of Inference
  The Normality Assumption Once Again
  Hypothesis Testing in Multiple Regression: General Comments
  Hypothesis Testing about Individual Regression Coefficients
  Testing the Overall Significance of the Sample Regression
  The Analysis of Variance Approach to Testing the Overall Significance of an Observed Multiple Regression: The F Test
  Testing the Overall Significance of a Multiple Regression: The F Test
  An Important Relationship between R2 and F
  Testing the Overall Significance of a Multiple Regression in Terms of R2
  The “Incremental” or “Marginal” Contribution of an Explanatory Variable
  Testing the Equality of Two Regression Coefficients
  Restricted Least Squares: Testing Linear Equality Restrictions
  The t-Test Approach
  The F-Test Approach: Restricted Least Squares
  General F Testing
  Testing for Structural or Parameter Stability of Regression Models: The Chow Test
  Prediction with Multiple Regression
  The Troika of Hypothesis Tests: The Likelihood Ratio (LR), Wald (W), and Lagrange Multiplier (LM) Tests
  Testing the Functional Form of Regression: Choosing between Linear and Log–Linear Regression Models

Chapter 9. Dummy Variable Regression Models
  9.1. The Nature of Dummy Variables
  9.2. ANOVA Models
    o Caution in the Use of Dummy Variables
  9.3. ANOVA Models with Two Qualitative Variables
  9.4. Regression with a Mixture of Quantitative and Qualitative Regressors: The ANCOVA Models
  9.5. The Dummy Variable Alternative to the Chow Test
  9.6. Interaction Effects Using Dummy Variables
  9.7. The Use of Dummy Variables in Seasonal Analysis
  9.8. Piecewise Linear Regression
  9.9. Panel Data Regression Models
  9.10. Some Technical Aspects of the Dummy Variable Technique
    o The Interpretation of Dummy Variables in Semilogarithmic Regressions
    o Dummy Variables and Heteroscedasticity
    o Dummy Variables and Autocorrelation
    o What Happens If the Dependent Variable Is a Dummy Variable?
  9.11. Topics for Further Study
  9.12. A Concluding Example

Chapter 10. Multicollinearity: What Happens If the Regressors Are Correlated?
  10.1. The Nature of Multicollinearity
  10.2. Estimation in the Presence of Perfect Multicollinearity
  10.3. Estimation in the Presence of “High” but “Imperfect” Multicollinearity
  10.4. Multicollinearity: Much Ado about Nothing? Theoretical Consequences of Multicollinearity
  10.5. Practical Consequences of Multicollinearity
    o Large Variances and Covariances of OLS Estimators
    o Wider Confidence Intervals
    o “Insignificant” t Ratios
    o A High R2 but Few Significant t Ratios
    o Sensitivity of OLS Estimators and Their Standard Errors to Small Changes in Data
    o Consequences of Micronumerosity
  10.6. An Illustrative Example
  10.7. Detection of Multicollinearity
  10.8. Remedial Measures
    o Do Nothing
    o Rule-of-Thumb Procedures
  10.9. Is Multicollinearity Necessarily Bad? Maybe Not, If the Objective Is Prediction Only
  10.10. An Extended Example: The Longley Data

Chapter 11. Heteroscedasticity: What Happens If the Error Variance Is Nonconstant?
  11.1. The Nature of Heteroscedasticity
  11.2. OLS Estimation in the Presence of Heteroscedasticity
  11.3. The Method of Generalized Least Squares (GLS)
    o Difference between OLS and GLS
  11.4. Consequences of Using OLS in the Presence of Heteroscedasticity
    o OLS Estimation Allowing for Heteroscedasticity
    o OLS Estimation Disregarding Heteroscedasticity
    o A Technical Note
  11.5. Detection of Heteroscedasticity
    o Informal Methods
    o Formal Methods
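To make the “Formal Methods” entry of Section 11.5 concrete, the sketch below applies one such method, the Breusch–Pagan test, to simulated heteroscedastic data; again, Python with statsmodels stands in for the course software, and all numbers are hypothetical.

```python
# Illustrative only: detecting heteroscedasticity with the Breusch-Pagan test
# (one of the formal methods of Section 11.5) on simulated data.
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.diagnostic import het_breuschpagan

rng = np.random.default_rng(1)
n = 200
x = rng.uniform(1, 10, n)            # hypothetical regressor
u = rng.normal(0, 0.5 * x)           # error spread grows with x: heteroscedastic
y = 1.0 + 2.0 * x + u

X = sm.add_constant(x)
res = sm.OLS(y, X).fit()

lm_stat, lm_pvalue, f_stat, f_pvalue = het_breuschpagan(res.resid, X)
print(lm_stat, lm_pvalue)            # a small p-value points to heteroscedasticity
```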
  11.6. Remedial Measures
    o When σi² Is Known: The Method of Weighted Least Squares
    o When σi² Is Not Known
  11.7. Concluding Examples
  11.8. A Caution about Overreacting to Heteroscedasticity

Chapter 12. Autocorrelation: What Happens If the Error Terms Are Correlated?
  12.1. The Nature of the Problem
  12.2. OLS Estimation in the Presence of Autocorrelation
  12.3. The BLUE Estimator in the Presence of Autocorrelation
  12.4. Consequences of Using OLS in the Presence of Autocorrelation
    o OLS Estimation Allowing for Autocorrelation
    o OLS Estimation Disregarding Autocorrelation
  12.5. Relationship between Wages and Productivity in the Business Sector of the United States, 1960–2005
  12.6. Detecting Autocorrelation
    o Graphical Method
    o The Runs Test
    o Durbin–Watson d Test
    o A General Test of Autocorrelation: The Breusch–Godfrey (BG) Test
    o Why So Many Tests of Autocorrelation?
  12.7. What to Do When You Find Autocorrelation: Remedial Measures
  12.8. Model Mis-Specification versus Pure Autocorrelation
  12.9. Correcting for (Pure) Autocorrelation: The Method of Generalized Least Squares (GLS)
    o When ρ Is Known
    o When ρ Is Not Known
  12.10. The Newey–West Method of Correcting the OLS Standard Errors
  12.11. OLS versus FGLS and HAC
  12.12. Additional Aspects of Autocorrelation
    o Dummy Variables and Autocorrelation
    o ARCH and GARCH Models
    o Coexistence of Autocorrelation and Heteroscedasticity
  12.13. A Concluding Example

Chapter 13. Econometric Modeling: Model Specification and Diagnostic Testing
  13.1. Model Selection Criteria
  13.2. Types of Specification Errors
  13.3. Consequences of Model Specification Errors
    o Underfitting a Model (Omitting a Relevant Variable)
    o Inclusion of an Irrelevant Variable (Overfitting a Model)
  13.4. Tests of Specification Errors
    o Detecting the Presence of Unnecessary Variables (Overfitting a Model)
    o Tests for Omitted Variables and Incorrect Functional Form
  13.5. Errors of Measurement
    o Errors of Measurement in the Dependent Variable Y
    o Errors of Measurement in the Explanatory Variable X
  13.6. Incorrect Specification of the Stochastic Error Term
  13.7. Nested versus Non-Nested Models
  13.8. Tests of Non-Nested Hypotheses
    o The Discrimination Approach
    o The Discerning Approach
  13.9. Model Selection Criteria
    o The R2 Criterion
    o Adjusted R2
    o Akaike's Information Criterion (AIC)
    o Schwarz's Information Criterion (SIC)
    o Mallows's Cp Criterion
    o A Word of Caution about Model Selection Criteria
    o Forecast Chi-Square (χ2)
  13.10. Additional Topics in Econometric Modeling
    o Outliers, Leverage, and Influence
    o Recursive Least Squares
    o Chow's Prediction Failure Test
    o Missing Data
  13.11. Concluding Examples
    o A Model of Hourly Wage Determination
    o Real Consumption Function for the United States, 1947–2000
  13.12. Non-Normal Errors and Stochastic Regressors
    o What Happens If the Error Term Is Not Normally Distributed?
    o Stochastic Explanatory Variables
  13.13. A Word to the Practitioner

EVALUATION SYSTEM
  1st mid-term (16/3/2014)                      20
  2nd mid-term (27/4/2014)                      20
  8 homework assignments (7 to be selected)     15
  Final exam (comprehensive)                    45
  _____________________________________________
  Total                                        100

GRADING
  A    95+
  A-   90-94.5
  B+   87-89.5
  B    83.5-86.5
  B-   80-83
  C+   77-79.5
  C    73.5-76.5
  C-   70-73
  D+   65-69.5
  D    60-64.5
  F    < 60
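The sketch below is a purely illustrative reading of how the assessment weights above combine into a course total. It assumes each component is marked out of 100, that the seven selected homework scores are the best seven, and that each letter range is read from its lower bound; the actual computation is defined by the two tables above.

```python
# Purely illustrative: combining the assessment weights above with hypothetical
# scores. Assumes each component is marked out of 100, the best seven homework
# marks count toward the homework component, and letter ranges are read from
# their lower bounds (e.g. 83.5 up to but below 87 is a B).

def course_total(mid1, mid2, homework, final):
    best_seven = sorted(homework, reverse=True)[:7]
    hw_avg = sum(best_seven) / len(best_seven)
    return 0.20 * mid1 + 0.20 * mid2 + 0.15 * hw_avg + 0.45 * final

def letter_grade(total):
    cutoffs = [(95, "A"), (90, "A-"), (87, "B+"), (83.5, "B"), (80, "B-"),
               (77, "C+"), (73.5, "C"), (70, "C-"), (65, "D+"), (60, "D")]
    for lower_bound, grade in cutoffs:
        if total >= lower_bound:
            return grade
    return "F"

# Hypothetical scores:
total = course_total(82, 88, [90, 85, 70, 95, 100, 80, 75, 60], 84)
print(round(total, 1), letter_grade(total))
```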
REFERENCES:
Gujarati, Damodar N., “Basic Econometrics”, 5th Edition (Required)

COMPUTER SOFTWARE:
EViews, Excel

OFFICE HOURS
Sunday, Tuesday, and Thursday, 12:00–1:00 pm. Students are served on a first-come, first-served basis. If more than one student is seeking help, each student will be allocated a maximum of ten minutes. If a student needs additional time, an appointment will be arranged for a later time so that other waiting students can be assisted. Telephone calls are allowed only in emergencies.

WEBSITE
There is a website for this course on my home page: http://www.cba.edu.kw/elsakka

GENERAL INFORMATION
Absenteeism: University regulations governing absenteeism apply to all students. A first warning is issued after 3 hours of absence, a second warning after an additional 3 hours, and a failure notice for any absence beyond six hours. Note, however, that students may also lose marks as a result of absenteeism.

Tardiness: Students are expected to be in the lecture hall on time. A student who arrives late will be warned the first time and barred from the lecture thereafter.

Make-up policy: Students who are unable to attend an exam because of illness or other extenuating circumstances (appropriate documentation is required to verify the circumstances) may request a make-up exam.

Plagiarism and cheating are strictly prohibited by university policy as well as by academic ethics.

This course makes heavy use of the Internet as a teaching aid. Students should have basic skills in Web searching and electronic mail. Those who want to receive Internet assignments and answers by e-mail should register their e-mail address on my home page, http://www.cba.edu.kw/elsakka. For e-mail messages, you can use either of the following addresses: elsakka@cba.edu.kw or elsakka2006@gmail.com