Course: STA 342 – Statistical Inference II (3 Credits – Compulsory)
Course Duration: Three hours per week for 15 weeks (45 hours), as taught in the 2011/2012 session.

Lecturer: ADEJUMO, Adebowale Olusola, B.Sc., M.Sc. Statistics (Ilorin), Ph.D. Statistics (Dr. rer. nat.) (Munich).
E-mail: aodejumo@unilorin.edu.ng, ao123adejumo@yahoo.co.uk
Department of Statistics, Faculty of Science, University of Ilorin, Ilorin, Kwara State, Nigeria.
Office Location: F4, Department of Statistics, Statistics Building, Faculty of Science.
Consultation Hours: 3:00-4:30 pm, Tuesdays and Thursdays.

Course Content: Methods of estimation: minimax, the Bayes method, the method of maximum likelihood and the method of moments. Optimum properties of estimators: unbiasedness, consistency, efficiency, sufficiency of a statistic and the factorization theorem. The Rao-Blackwell theorem. Testing of hypotheses. Discussion of optimality properties of tests. Methods of finding test functions. 30h (T); 45h (P); C; PR: STA 341.

Course Description: This course is subdivided into three major sections. The first section treats game theory in detail by looking at the following terms: parameter space, action space, loss function, pure game, random variable, range space, probability distribution function, decision rule, decision space, risk function, statistical game, Bayes risk, Bayes decision rule, optimist, pessimist, and dominant and inadmissible decision rules. The second section is on estimation: point and interval estimation, unbiasedness, efficiency, UMVUE or UMRUE, consistency, completeness, sufficiency, the Rao-Blackwell theorem, the Lehmann-Scheffé uniqueness theorem, the method of moments, the method of maximum likelihood estimation and the Bayes method of estimation. The third section is on hypothesis testing: the best test, the most powerful test, the uniformly most powerful test and the likelihood ratio test.

Course Justification: The course is designed to introduce students in the Department of Statistics to the application of statistics in decision making. It aims to develop in students the ability to apply their knowledge and skills to the solution of theoretical and practical problems in Statistics, and to provide students with a knowledge and skills base from which they can proceed to further studies in specialized areas of statistics or in multidisciplinary areas involving Statistics.

Course Objectives: The general objective of this course is for students to know how to use statistical tools in making decisions; in addition, it aims to instil in students a sense of enthusiasm for Statistics and an appreciation of its application in different areas, and to involve them in an intellectually stimulating and satisfying experience of learning and studying. At the end of the course, students will know: different games and how to calculate their loss and risk functions; the concepts of optimality criteria; how to form or obtain an efficient estimator; different estimation methods; the concepts of hypothesis testing; how to obtain different test functions and how to calculate the errors committed by each; and the concepts of the monotone likelihood ratio and the likelihood ratio test.

Course Requirements: This is a compulsory course for students in the Department of Statistics. Students are expected to participate in all the course activities and to have a minimum of 75% attendance in order to sit the final examination. They will also be expected to treat the study questions and assignments. Students are also expected to have e-mail accounts.

Methods of Grading:
1. Class assignments/tests: 30%
2. Comprehensive final examination: 70%
Total: 100%
Course Delivery Strategies: The lectures will be delivered face-to-face, with theoretical material (lecture notes) provided during lectures and worked examples used to demonstrate the theory. Students will be encouraged and required to read around the topics and to follow current issues in the media and on the internet. Web interaction will be employed by requiring each student to have a Yahoo e-mail address so that they can participate in the Yahoo discussion group that has been created for the course (ao123adejumo@yahoo.co.uk). Additional materials and links will be provided on the board. The delivery strategies will also be supported by tutorial sessions and reviews of the study questions.

Reading List:
Mood, A. M., Graybill, F. A., and Boes, D. C. (1963). Introduction to the Theory of Statistics. McGraw-Hill, New York.
Roussas, G. G. (1973). A First Course in Mathematical Statistics. Addison-Wesley, Massachusetts.
Bickel, P. J. and Doksum, K. A. (1973). Mathematical Statistics. Holden-Day, San Francisco.

LECTURE CONTENTS

Week 1: Game Theory
Objective: The student will be able to explain what a game is, how to determine the risk function table for any given game, and which decision rule to take in order to attain certain optimality criteria.
Description: The course outline will be introduced, with emphasis on the objectives, the delivery strategies, and the importance of and expectations for the study of the course. Different games and definitions of terms, with a practical example to illustrate what a game is all about.
Study Questions:
1. What are games?
2. Distinguish between a pure game and a statistical game.
3. Distinguish between a loss function and a risk function.
4. Distinguish between an optimist action and a pessimist action.
5. Distinguish between a decision rule and a decision space.
6. Distinguish between a range space and a sample space.
Reading List:
Mood, A. M., Graybill, F. A., and Boes, D. C. (1963). Introduction to the Theory of Statistics. McGraw-Hill, New York.
Roussas, G. G. (1973). A First Course in Mathematical Statistics. Addison-Wesley, Massachusetts.
Bickel, P. J. and Doksum, K. A. (1973). Mathematical Statistics. Holden-Day, San Francisco.

Week 2: Optimality Criteria
Objective: The student will be able to explain what a game is, how to determine the risk function table for any given game, and which decision rule to take in order to attain certain optimality criteria.
Description: Optimality criteria, prior distribution, Bayes risk and Bayes decision rule, with a practical example to illustrate (a short computational sketch of these quantities appears just before the Week 3 assignment below).
Study Questions:
1. What is a Bayes risk?
2. What is a Bayes decision rule?
3. Distinguish between a Bayes risk and a Bayes decision rule.
Reading List:
Mood, A. M., Graybill, F. A., and Boes, D. C. (1963). Introduction to the Theory of Statistics. McGraw-Hill, New York.
Roussas, G. G. (1973). A First Course in Mathematical Statistics. Addison-Wesley, Massachusetts.
Bickel, P. J. and Doksum, K. A. (1973). Mathematical Statistics. Holden-Day, San Francisco.

Week 3: Estimation
Objective: The objective is for the student to know some of the properties of a good estimator.
Description: Point estimators and interval estimators; properties: unbiasedness, bias, mean square error and efficiency; regularity conditions of an estimator.
Study Questions:
1. What is an estimator?
2. What is unbiasedness?
3. Show that the sample mean is an unbiased estimator for the population mean μ.
4. What are the regularity conditions of an estimator?
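To make the Week 1 and Week 2 quantities concrete before the Week 3 assignment, the short Python sketch below computes the risk function and the Bayes risk of a few decision rules for a hypothetical two-state statistical game. It is an illustration only, not part of the course material: the loss table, the sample size, the decision rules, the prior and the use of scipy are all assumptions chosen for the example.

# A minimal sketch (hypothetical numbers): risk functions and the Bayes
# decision rule for a statistical game with two states of nature.
from scipy.stats import binom

thetas = [0.5, 0.6]                      # possible states of nature (hypothetical)
n = 10                                   # sample size: X ~ Binomial(n, theta)
loss = {(0.5, 0.5): -10, (0.5, 0.6): 20,     # loss[(true theta, action)], hypothetical values
        (0.6, 0.5): 25,  (0.6, 0.6): -8}

# Decision rules map the observed count x to an action (a chosen value of theta).
rules = {
    "d1": lambda x: 0.5,                     # always act as if theta = 0.5
    "d2": lambda x: 0.6,                     # always act as if theta = 0.6
    "d3": lambda x: 0.6 if x >= 6 else 0.5,  # follow the majority of the sample
}

def risk(theta, rule):
    # Risk R(theta, d) = E_theta[ L(theta, d(X)) ].
    return sum(loss[(theta, rule(x))] * binom.pmf(x, n, theta) for x in range(n + 1))

prior = {0.5: 0.7, 0.6: 0.3}             # hypothetical prior over the states of nature

for name, rule in rules.items():
    risks = {th: risk(th, rule) for th in thetas}
    bayes_risk = sum(prior[th] * risks[th] for th in thetas)
    print(name, {th: round(r, 2) for th, r in risks.items()}, "Bayes risk:", round(bayes_risk, 2))

# The Bayes decision rule is the rule with the smallest Bayes risk; the
# pessimist (minimax) rule minimizes the maximum risk over the two states.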
Assignment: The University of Ilorin authority has to decide whether the proportion, θ, of students who are in support of the NEW DRESS CODE is 0.50 or 0.45. Let X be the number of those in support among a random sample of thirteen students interviewed. Let the loss function (in millions of naira) be given by
L(0.50, 0.50) = -40, L(0.50, 0.45) = 65, L(0.45, 0.50) = 73, L(0.45, 0.45) = -35.
The decision rules of interest to the authority are as follows:
d1(X) = 0.50 no matter the outcome of the interview;
d2(X) = 0.45 no matter the outcome of the interview;
d3(X) = 0.50 if x ≥ 6, and 0.45 if x < 6;
d4(X) = 0.45 if x ≥ 6, and 0.50 if x < 6.
Obtain the (i) pessimist action, (ii) optimist action, (iii) Bayes action, (iv) range space of X, (v) probability distribution of X, (vi) risk function, (vii) pessimist decision rule, (viii) optimist decision rule, (ix) inadmissible decision rule, and (x) Bayes decision rule, using the prior π(0.50) = 0.65 and π(0.45) = 0.35.
Reading List:
Mood, A. M., Graybill, F. A., and Boes, D. C. (1963). Introduction to the Theory of Statistics. McGraw-Hill, New York.
Roussas, G. G. (1973). A First Course in Mathematical Statistics. Addison-Wesley, Massachusetts.
Bickel, P. J. and Doksum, K. A. (1973). Mathematical Statistics. Holden-Day, San Francisco.

Week 4: Efficient Estimator
Objective: The objective is for the student to know how to obtain and test for an efficient estimator.
Description: The efficient estimator: definition, Fisher information, the additivity property of Fisher information, the sufficient estimator, the Cramér-Rao inequality theorem, and applications.
Study Questions:
1. State the Cramér-Rao inequality theorem.
2. State the factorization theorem.
3. Define the Fisher information, I(θ), in a single observation of a random variable X, given that the p.d.f. (p.m.f.) of X is f(x | θ), θ ∈ Θ ⊂ R.
4. Prove that the Fisher information, In(θ), in a simple random sample X1, X2, …, Xn, n ≥ 1, is nI(θ).
5. Given a random sample X1, X2, …, Xn from a Bernoulli(θ) process, show that the sample mean is an efficient estimator for m(θ) = θ, given that the loss function is the squared error loss function. (A numerical check of this fact is sketched after Week 5 below.)
Reading List:
Mood, A. M., Graybill, F. A., and Boes, D. C. (1963). Introduction to the Theory of Statistics. McGraw-Hill, New York.
Roussas, G. G. (1973). A First Course in Mathematical Statistics. Addison-Wesley, Massachusetts.
Bickel, P. J. and Doksum, K. A. (1973). Mathematical Statistics. Holden-Day, San Francisco.

Week 5: Uniformly Minimum Variance Unbiased Estimator (UMVUE)
Objective: The objective is for the student to know how to determine UMVUE or UMRUE estimators.
Description: UMVUE or UMRUE, the Rao-Blackwell theorem, and applications.
Study Questions:
1. Let X1, X2, …, Xn, n > 1, be a random sample from a Normal N(0, σ²) process. If the loss function is the squared error loss function, (i) find an estimator of σ² and determine whether or not it is efficient for σ², and (ii) find the Fisher information for σ².
2. What is the Cramér-Rao inequality and what is it used for?
3. Let X1, …, X25 be i.i.d. from a N(μ, σ²) process. Consider the estimator (1/25)(X1 + X2 + … + X25) for μ. Does it attain the Cramér-Rao lower bound? Comment on the estimator.
Reading List:
Mood, A. M., Graybill, F. A., and Boes, D. C. (1963). Introduction to the Theory of Statistics. McGraw-Hill, New York.
Roussas, G. G. (1973). A First Course in Mathematical Statistics. Addison-Wesley, Massachusetts.
Bickel, P. J. and Doksum, K. A. (1973). Mathematical Statistics. Holden-Day, San Francisco.
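The efficiency claim in Week 4 study question 5 can be checked numerically. The small Python sketch below is illustrative only (the values θ = 0.3, n = 50, the number of replications and the use of numpy are assumptions made for the example): for a Bernoulli(θ) sample, the Fisher information in one observation is I(θ) = 1/(θ(1 - θ)), so the Cramér-Rao lower bound for unbiased estimators of θ from n observations is θ(1 - θ)/n, and the sample mean, being unbiased with variance θ(1 - θ)/n, attains it.

# A small sketch: checking numerically that the sample mean of a
# Bernoulli(theta) sample attains the Cramer-Rao lower bound.
import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 0.3, 50, 100_000        # illustrative values, not from the course

# Fisher information in one observation: I(theta) = 1 / (theta * (1 - theta)),
# so the CRLB for unbiased estimators of theta based on n observations is
# 1 / (n * I(theta)) = theta * (1 - theta) / n.
crlb = theta * (1 - theta) / n

samples = rng.binomial(1, theta, size=(reps, n))   # reps independent Bernoulli samples
xbar = samples.mean(axis=1)                        # the sample mean of each sample

print("empirical mean of the sample mean    :", xbar.mean())   # close to theta (unbiased)
print("empirical variance of the sample mean:", xbar.var())    # close to the CRLB
print("Cramer-Rao lower bound               :", crlb)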
Week 6: Complete, UMVUE and Consistent Estimators
Objective: The objective is for the student to know more properties of a good estimator and how to find UMVUE estimators.
Description: Completeness, the Lehmann-Scheffé uniqueness theorem and its applications, and consistency.
Study Questions:
1. What is a complete estimator?
2. State the Lehmann-Scheffé theorem.
3. What is a consistent estimator?
Reading List:
Mood, A. M., Graybill, F. A., and Boes, D. C. (1963). Introduction to the Theory of Statistics. McGraw-Hill, New York.
Roussas, G. G. (1973). A First Course in Mathematical Statistics. Addison-Wesley, Massachusetts.
Bickel, P. J. and Doksum, K. A. (1973). Mathematical Statistics. Holden-Day, San Francisco.

Week 7: Some Methods of Finding Estimators
Objective: The objective is for the student to know different means of finding estimators.
Description: The method of moments, the method of maximum likelihood estimation, and the Bayes method of point estimation.
Study Questions:
1. Obtain the rth moment of a random variable that has a Gamma distribution.
2. Obtain the rth moment of a random variable that has a Beta distribution.
3. Let X1, X2, …, Xn, n > 1, be a random sample from a Normal N(0, σ²) process. If the loss function is the squared error loss function, find the maximum likelihood estimator of σ².
4. Let X1, …, X25 be i.i.d. from a N(μ, σ²) process. Obtain the maximum likelihood estimator for μ.
5. Given a random sample X1, X2, …, Xn, n ≥ 1, from a Bernoulli(θ) process, and assuming that θ has the Uniform(0, 1) distribution and that the loss function is the squared error loss function, find the Bayes estimator of θ.
Reading List:
Mood, A. M., Graybill, F. A., and Boes, D. C. (1963). Introduction to the Theory of Statistics. McGraw-Hill, New York.
Roussas, G. G. (1973). A First Course in Mathematical Statistics. Addison-Wesley, Massachusetts.
Bickel, P. J. and Doksum, K. A. (1973). Mathematical Statistics. Holden-Day, San Francisco.

Week 8: Hypothesis Testing
Objective: To know different test functions and how to calculate the errors committed by each.
Description: Introduction to hypotheses, types of hypotheses, types of errors, and randomized and non-randomized tests.
Study Questions: Define each of the following terms:
1. Randomized test
2. Non-randomized test
3. Simple hypothesis
4. Composite hypothesis
5. Power of a test
Reading List:
Mood, A. M., Graybill, F. A., and Boes, D. C. (1963). Introduction to the Theory of Statistics. McGraw-Hill, New York.
Roussas, G. G. (1973). A First Course in Mathematical Statistics. Addison-Wesley, Massachusetts.
Bickel, P. J. and Doksum, K. A. (1973). Mathematical Statistics. Holden-Day, San Francisco.

Week 9: Best Test
Objective: To know how to determine the best test function and its total error.
Description: How to obtain a best test, its errors, and applications.
Study Questions: Given the random sample X1, X2, …, X15 from a Bernoulli(θ) process and the hypotheses H0: θ = 1/4 against H1: θ = 1/2,
(i) find the test Φ that minimizes the total probability of error;
(ii) compute the total error committed by the test Φ in (i) above.
Reading List:
Mood, A. M., Graybill, F. A., and Boes, D. C. (1963). Introduction to the Theory of Statistics. McGraw-Hill, New York.
Roussas, G. G. (1973). A First Course in Mathematical Statistics. Addison-Wesley, Massachusetts.
Bickel, P. J. and Doksum, K. A. (1973). Mathematical Statistics. Holden-Day, San Francisco.

Week 10: Most Powerful Test
Objective: To know the most powerful test function and how to calculate its errors.
Description: How to obtain a most powerful test, its errors, and applications (a computational sketch of the construction used in this week's questions follows the assignment below).
Study Questions: Given the random sample X1, X2, …, X10 from a Bernoulli(θ) process and the hypotheses H0: θ = 1/4 against H1: θ = 1/2,
(i) determine the most powerful test Φ of size α = 0.05;
(ii) compute the total error committed by the test Φ in (i) above.
Assignment:
1) Given X1, X2, …, X6, a simple random sample with known probability density function f(x | θ) = (θ³/2) x² exp(-θx) for x > 0, θ > 0:
(i) obtain the maximum likelihood estimator of θ;
(ii) if the observations are xi = 2.3, 1.4, 7.0, 3.7, 2.1 and 4.3, compute the maximum likelihood estimate of θ.
2) Given the random sample X1, X2, …, X10 from a Poisson(θ) process and the hypotheses H0: θ = 2 against H1: θ = 2.5,
a) (i) find the test Φ that minimizes the total probability of error; (ii) compute the total error committed by the test Φ in a(i) above;
b) (i) determine the most powerful test Φ of size α = 0.05; (ii) compute the total error committed by the test Φ in b(i) above.
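As a companion to the Week 9 and Week 10 study questions (a Bernoulli(θ) sample with H0: θ = 1/4 against H1: θ = 1/2), the Python sketch below carries out the Neyman-Pearson construction on the sufficient statistic T = X1 + … + Xn, which is Binomial(n, θ). Part (a) builds the test that minimizes the total probability of error and part (b) the most powerful test of size α = 0.05, randomized at the boundary point. It is an illustrative sketch, not the official course solution; it uses scipy and, for brevity, n = 15 in both parts (changing n to 10 reproduces the Week 10 setting).

# A sketch of the Neyman-Pearson construction for H0: theta = 1/4 against
# H1: theta = 1/2 based on T = X1 + ... + Xn ~ Binomial(n, theta).
from scipy.stats import binom

n, th0, th1 = 15, 0.25, 0.5     # n = 15 as in Week 9; set n = 10 for the Week 10 question

# (a) Test minimizing the total probability of error alpha + beta:
#     reject H0 exactly when the likelihood under H1 exceeds the likelihood under H0.
reject = [t for t in range(n + 1) if binom.pmf(t, n, th1) > binom.pmf(t, n, th0)]
alpha_a = sum(binom.pmf(t, n, th0) for t in reject)                          # P(reject H0 | H0)
beta_a = sum(binom.pmf(t, n, th1) for t in range(n + 1) if t not in reject)  # P(accept H0 | H1)
print("(a) reject H0 for T in", reject,
      "; alpha =", round(alpha_a, 4), ", beta =", round(beta_a, 4),
      ", total error =", round(alpha_a + beta_a, 4))

# (b) Most powerful test of size alpha = 0.05: the likelihood ratio is increasing
#     in T, so reject for large T and randomize at the boundary value T = c so
#     that the size is exactly 0.05.
alpha = 0.05
c = min(t for t in range(n + 1) if binom.sf(t, n, th0) <= alpha)   # P(T > c | theta0) <= alpha
gamma = (alpha - binom.sf(c, n, th0)) / binom.pmf(c, n, th0)       # randomization probability at T = c
power = binom.sf(c, n, th1) + gamma * binom.pmf(c, n, th1)         # P(reject H0 | theta1)
print("(b) reject if T >", c, "; reject with probability", round(gamma, 3), "if T =", c,
      "; power at theta1 =", round(power, 4), "; total error =", round(alpha + 1 - power, 4))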
Reading List:
Mood, A. M., Graybill, F. A., and Boes, D. C. (1963). Introduction to the Theory of Statistics. McGraw-Hill, New York.
Roussas, G. G. (1973). A First Course in Mathematical Statistics. Addison-Wesley, Massachusetts.
Bickel, P. J. and Doksum, K. A. (1973). Mathematical Statistics. Holden-Day, San Francisco.

Week 11: Uniformly Most Powerful Test
Objective: To know how to determine the UMP test function and how to calculate its errors.
Description: The monotone likelihood ratio, the one-parameter exponential family, how to obtain a uniformly most powerful test, its errors, and applications.
Study Questions:
a) Define the concept of a one-parameter exponential family.
b) Given a random sample X1, X2, …, Xn, n ≥ 1, from a Bernoulli(θ) process and θ0 < θ1, show that the family of Bernoulli(θ) p.m.f.'s is a one-parameter exponential family of p.m.f.'s.
c) Given a random sample X1, X2, …, X15 from a Bernoulli(θ) process, find the size α = 0.05 uniformly most powerful test of H0: θ ≤ 0.4 against H1: θ > 0.4. What is the power of your test when θ = 0.3?
Reading List:
Mood, A. M., Graybill, F. A., and Boes, D. C. (1963). Introduction to the Theory of Statistics. McGraw-Hill, New York.
Roussas, G. G. (1973). A First Course in Mathematical Statistics. Addison-Wesley, Massachusetts.
Bickel, P. J. and Doksum, K. A. (1973). Mathematical Statistics. Holden-Day, San Francisco.

Week 12: Likelihood Ratio Test when σ² is Known
Objective: To know how to determine the LR test function for the population mean μ when the variance σ² is known.
Description: The likelihood ratio test: how to obtain the test, its errors, and applications when the population variance σ² is known.
Study Questions:
a) Define the monotone likelihood ratio (MLR).
b) Let X1, X2, …, X25 be a random sample from a N(μ, σ²) process with σ² known. Find the likelihood ratio (LR) size α = 0.05 test of H0: μ = μ0 against H1: μ ≠ μ0.
c) Define the likelihood ratio test.
Reading List:
Mood, A. M., Graybill, F. A., and Boes, D. C. (1963). Introduction to the Theory of Statistics. McGraw-Hill, New York.
Roussas, G. G. (1973). A First Course in Mathematical Statistics. Addison-Wesley, Massachusetts.
Bickel, P. J. and Doksum, K. A. (1973). Mathematical Statistics. Holden-Day, San Francisco.

Week 13: Likelihood Ratio Test when σ² is Unknown
Objective: To know how to determine the LR test function for the population mean μ when the variance σ² is unknown.
Description: The likelihood ratio test: how to obtain the test, its errors, and applications when the population variance σ² is unknown (a computational sketch covering both the Week 12 and the Week 13 cases follows this week's reading list).
Study Questions:
a) Define the concept of the likelihood ratio (LR).
b) Let X1, X2, …, X25 be a random sample from a N(μ, σ²) process with μ and σ² unknown. Find the likelihood ratio (LR) size α = 0.05 test of H0: μ = 0 against H1: μ ≠ 0.
Reading List:
Mood, A. M., Graybill, F. A., and Boes, D. C. (1963). Introduction to the Theory of Statistics. McGraw-Hill, New York.
Roussas, G. G. (1973). A First Course in Mathematical Statistics. Addison-Wesley, Massachusetts.
Bickel, P. J. and Doksum, K. A. (1973). Mathematical Statistics. Holden-Day, San Francisco.
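To illustrate the Week 12 and Week 13 tests, the Python sketch below computes the size α = 0.05 likelihood ratio test of H0: μ = μ0 against H1: μ ≠ μ0 for a normal sample of size 25, first with σ² known (the LR test reduces to comparing |√n (x̄ - μ0)/σ| with the standard normal critical value) and then with σ² unknown (the LR statistic is a monotone function of the one-sample t statistic, so the test is the usual t-test). The simulated data, the value σ = 2 and the use of numpy and scipy are assumptions made purely for this illustration.

# A sketch of the size-0.05 likelihood ratio test of H0: mu = mu0 against
# H1: mu != mu0 for a normal sample, in the two cases treated in Weeks 12-13.
import numpy as np
from scipy.stats import norm, t

rng = np.random.default_rng(1)
mu0, alpha = 0.0, 0.05
x = rng.normal(loc=0.4, scale=2.0, size=25)      # illustrative sample of size 25
n, xbar = len(x), x.mean()

# Case 1 (Week 12): sigma^2 known.  The LR test rejects H0 when
# |sqrt(n) * (xbar - mu0) / sigma| exceeds the normal critical value z_{alpha/2}.
sigma = 2.0
z = np.sqrt(n) * (xbar - mu0) / sigma
print("known variance :  |Z| =", round(abs(z), 3),
      ", critical value =", round(norm.ppf(1 - alpha / 2), 3))

# Case 2 (Week 13): sigma^2 unknown.  Maximizing the likelihood over sigma^2
# under H0 and under the full model turns the LR test into the one-sample t-test.
s = x.std(ddof=1)
t_stat = np.sqrt(n) * (xbar - mu0) / s
print("unknown variance: |T| =", round(abs(t_stat), 3),
      ", critical value =", round(t.ppf(1 - alpha / 2, df=n - 1), 3))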
Week 14: Class Test
Description: The students will be assessed on the whole course for 1 hour.

Week 15: Revision/Tutorial Exercises
Description: The solution to the test will be considered, together with other tutorial questions.
Study Questions:
a) Given a random sample X1, X2, …, Xn, n ≥ 1, from a Bernoulli(θ) process, find the UMVUE of θ.
b) Given a random sample X1, X2, …, Xn, n ≥ 1, from a Bernoulli(θ) process, find the efficient estimator for m(θ) = 1 - θ.
c) Let X1, X2, …, Xn, n > 1, be a random sample from a Normal N(0, σ²) process. If the loss function is the squared error loss function, (i) find the maximum likelihood estimator of σ²; (ii) determine whether or not your estimator in (i) is efficient for σ²; (iii) find the Fisher information for σ².
d) Given a random sample X1, X2, …, Xn, n ≥ 1, from a Poisson(θ) process and θ0 < θ1, show that the family of Poisson(θ) p.m.f.'s has a monotone likelihood ratio in Tn = X1 + X2 + … + Xn.
e) Given a random sample X1, X2, …, X5 from a Poisson(θ) process, find the size α = 0.05 uniformly most powerful test of H0: θ ≤ 1.4 against H1: θ > 1.4. What is the power of your test when θ = 2? (A computational sketch of this construction is given below.)
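The Python sketch below illustrates the construction behind exercises d) and e), under the stated assumptions and using scipy; it is a sketch, not a model answer. Because the Poisson family has a monotone likelihood ratio in T = X1 + … + X5, which is Poisson(5θ), the UMP size-α test of H0: θ ≤ 1.4 rejects for large T, with randomization at the boundary point chosen so that the size, attained at θ = 1.4, is exactly 0.05.

# A sketch of the UMP size-0.05 test of H0: theta <= 1.4 against H1: theta > 1.4
# for a Poisson(theta) sample of size 5, using T = sum of the observations.
from scipy.stats import poisson

n, theta0, alpha = 5, 1.4, 0.05
lam0 = n * theta0                       # T ~ Poisson(7) at the boundary of H0

# Smallest c with P(T > c | theta = 1.4) <= alpha, then randomize at T = c
# so that the size of the test is exactly alpha.
c = 0
while poisson.sf(c, lam0) > alpha:
    c += 1
gamma = (alpha - poisson.sf(c, lam0)) / poisson.pmf(c, lam0)

theta1 = 2.0
lam1 = n * theta1
power = poisson.sf(c, lam1) + gamma * poisson.pmf(c, lam1)   # P(reject H0 | theta = 2)
print("reject H0 if T >", c, "; reject with probability", round(gamma, 3), "if T =", c)
print("power at theta = 2:", round(power, 4))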