Point Estimation

Example (a variant of Problem 62, Ch. 5)
Manufacture of a certain component requires three different machining operations. The total time for manufacturing one such component is known to have a normal distribution, but the mean µ and variance σ² of that distribution are unknown. Suppose we manufacture 10 components and record the operation times as follows:

  component   1     2     3     4     5     6     7     8     9     10
  time        63.8  60.5  65.3  65.7  61.9  68.2  68.1  64.8  65.8  65.4

What can we say about the population mean µ and the population variance σ²?

Example (a variant of Problem 64, Ch. 5)
Suppose the waiting time for a certain bus in the morning is uniformly distributed on [0, θ], where θ is unknown. We record 10 waiting times as follows:

  observation   1    2    3    4    5    6    7    8    9    10
  time          7.6  1.8  4.8  3.9  7.1  6.1  3.6  0.1  6.5  3.5

What can we say about the parameter θ?

Definition
A point estimate of a parameter θ is a single number that can be regarded as a sensible value for θ. A point estimate is obtained by selecting a suitable statistic and computing its value from the given sample data. The selected statistic is called the point estimator of θ.

e.g.
X̄ = (X₁ + X₂ + ··· + X₁₀)/10 is a point estimator of µ in the normal distribution example. The largest observation, max(X₁, …, X₁₀), is a point estimator of θ in the uniform distribution example.

Problem: when there is more than one point estimator of a parameter θ, which one should we use? There are a few criteria for selecting the best point estimator: unbiasedness, minimum variance, and mean square error.

Definition
A point estimator θ̂ is said to be an unbiased estimator of θ if E(θ̂) = θ for every possible value of θ. If θ̂ is not unbiased, the difference E(θ̂) − θ is called the bias of θ̂.

Principle of Unbiased Estimation
When choosing among several different estimators of θ, select one that is unbiased.

Proposition
Let X₁, X₂, …, Xₙ be a random sample from a distribution with mean µ and variance σ². Then the estimators

  µ̂ = X̄ = (1/n) ∑ᵢ Xᵢ    and    σ̂² = S² = (1/(n − 1)) ∑ᵢ (Xᵢ − X̄)²

are unbiased estimators of µ and σ², respectively.
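The point estimates in the proposition can be computed directly from the sample data in the two examples above. A minimal Python sketch (the variable names are mine):

```python
# Operation times from the machining example (normal model).
times = [63.8, 60.5, 65.3, 65.7, 61.9, 68.2, 68.1, 64.8, 65.8, 65.4]
n = len(times)

mu_hat = sum(times) / n  # sample mean X-bar, unbiased for µ
# Sample variance S² with divisor n - 1, unbiased for σ².
s2_hat = sum((x - mu_hat) ** 2 for x in times) / (n - 1)

# Waiting times from the bus example (uniform model on [0, θ]).
waits = [7.6, 1.8, 4.8, 3.9, 7.1, 6.1, 3.6, 0.1, 6.5, 3.5]
theta_hat = max(waits)  # largest observation as a point estimate of θ

print(mu_hat, s2_hat, theta_hat)  # 64.95, about 5.84, 7.6
```

Note that dividing the squared deviations by n − 1 rather than n is exactly what makes S² unbiased; the divisor-n version systematically underestimates σ².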
If in addition the distribution is continuous and symmetric, then the sample median X̃ and any trimmed mean are also unbiased estimators of µ.

Principle of Minimum Variance Unbiased Estimation
Among all estimators of θ that are unbiased, choose the one that has minimum variance. The resulting θ̂ is called the minimum variance unbiased estimator (MVUE) of θ.

Theorem
Let X₁, X₂, …, Xₙ be a random sample from a normal distribution with mean µ and variance σ². Then the estimator µ̂ = X̄ is the MVUE for µ.

Definition
Let θ̂ be a point estimator of a parameter θ. Then the quantity E[(θ̂ − θ)²] is called the mean square error (MSE) of θ̂.

Proposition
MSE = E[(θ̂ − θ)²] = V(θ̂) + [E(θ̂) − θ]²

Definition
The standard error of an estimator θ̂ is its standard deviation σ_θ̂ = √V(θ̂). If the standard error itself involves unknown parameters whose values can be estimated, substituting these estimates into σ_θ̂ yields the estimated standard error (estimated standard deviation) of the estimator. The estimated standard error can be denoted either by σ̂_θ̂ or by s_θ̂.
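The decomposition MSE = V(θ̂) + bias² can be checked by simulation. A sketch using the uniform example's estimator θ̂ = max(X₁, …, Xₙ), with an assumed true value θ = 8 chosen purely for illustration (this estimator is biased: E(θ̂) = nθ/(n + 1), so its bias is −θ/(n + 1)):

```python
import random

random.seed(1)
theta = 8.0   # true parameter, assumed for illustration
n = 10        # sample size, as in the waiting-time example
reps = 100_000

# Repeatedly draw a sample of size n from Uniform[0, θ] and
# record the estimate theta_hat = max of the sample.
estimates = [max(random.uniform(0, theta) for _ in range(n))
             for _ in range(reps)]

mean_est = sum(estimates) / reps
bias = mean_est - theta  # approximates E(theta_hat) - theta = -theta/(n+1)
var = sum((e - mean_est) ** 2 for e in estimates) / reps
mse = sum((e - theta) ** 2 for e in estimates) / reps

# The two sides of MSE = V(theta_hat) + bias² agree (up to rounding):
print(round(mse, 4), round(var + bias ** 2, 4))
```

The agreement here is exact rather than approximate: expanding (θ̂ − θ)² = (θ̂ − E θ̂)² + 2(θ̂ − E θ̂)(E θ̂ − θ) + (E θ̂ − θ)² and averaging kills the cross term, which is the same algebra behind the proposition.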