Distribution for Linear Combinations
Corollary
E (X1 − X2 ) = E (X1 ) − E (X2 ) and, if X1 and X2 are independent,
V (X1 − X2 ) = V (X1 ) + V (X2 ).
Proposition
If X1 , X2 , . . . , Xn are independent, normally distributed rv’s (with
possibly different means and/or variances), then any linear
combination of the Xi s also has a normal distribution. In
particular, the difference X1 − X2 between two independent,
normally distributed variables is itself normally distributed.
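The proposition can be checked empirically. The sketch below simulates the difference X1 − X2 of two independent normal rv's (the parameters µ1 = 5, σ1 = 2, µ2 = 3, σ2 = 1.5 are illustrative assumptions, not from the text) and compares the sample mean and variance with E(X1) − E(X2) and V(X1) + V(X2):

```python
import random
import statistics

# Illustrative parameters (assumptions, not from the text).
mu1, sigma1 = 5.0, 2.0
mu2, sigma2 = 3.0, 1.5

random.seed(0)

# Simulate X1 - X2 for two independent normal rv's.
diffs = [random.gauss(mu1, sigma1) - random.gauss(mu2, sigma2)
         for _ in range(100_000)]

# By the corollary: E(X1 - X2) = mu1 - mu2 = 2.0 and, by independence,
# V(X1 - X2) = sigma1^2 + sigma2^2 = 4 + 2.25 = 6.25.
print(statistics.mean(diffs))      # close to 2.0
print(statistics.variance(diffs))  # close to 6.25
```

A histogram of `diffs` would also look normal, as the proposition asserts for any linear combination.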
Point Estimation
Example (a variant of Problem 62, Ch5)
Manufacture of a certain component requires three different
machining operations. The total time for manufacturing one such
component is known to have a normal distribution. However, the
mean µ and variance σ² of the normal distribution are unknown.
Suppose we run an experiment in which we manufacture 10 components
and record the operation times, obtaining the following sample:

component    1      2      3      4      5      6      7      8      9      10
time        63.8   60.5   65.3   65.7   61.9   68.2   68.1   64.8   65.8   65.4
What can we say about the population mean µ and the population
variance σ²?
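One natural answer can be computed directly from the data. The sketch below computes the sample mean and sample variance of the ten recorded times as point estimates of µ and σ²:

```python
# The ten recorded machining times from the example above.
times = [63.8, 60.5, 65.3, 65.7, 61.9, 68.2, 68.1, 64.8, 65.8, 65.4]

n = len(times)
xbar = sum(times) / n                               # sample mean, estimates mu
s2 = sum((x - xbar) ** 2 for x in times) / (n - 1)  # sample variance, estimates sigma^2

print(round(xbar, 2))  # 64.95
print(round(s2, 4))    # 5.8383
```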
Point Estimation
Example (a variant of Problem 64, Ch5)
Suppose the waiting time for a certain bus in the morning is
uniformly distributed on [0, θ], where θ is unknown. If we record 10
waiting times as follows:

observation   1     2     3     4     5     6     7     8     9     10
time         7.6   1.8   4.8   3.9   7.1   6.1   3.6   0.1   6.5   3.5
What can we say about the parameter θ?
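A natural estimate here is the largest recorded waiting time, since every waiting time lies in [0, θ]. A minimal sketch:

```python
# The ten recorded waiting times from the example above.
waits = [7.6, 1.8, 4.8, 3.9, 7.1, 6.1, 3.6, 0.1, 6.5, 3.5]

# Every waiting time is at most theta, so the sample maximum
# is a natural point estimate of theta.
theta_hat = max(waits)
print(theta_hat)  # 7.6
```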
Point Estimation
Definition
A point estimate of a parameter θ is a single number that can be
regarded as a sensible value for θ. A point estimate is obtained by
selecting a suitable statistic and computing its value from the
given sample data. The selected statistic is called the point
estimator of θ.
e.g. the sample mean X̄ = Σᵢ₌₁¹⁰ Xᵢ / 10 is a point estimator for µ in
the normal distribution example.
The largest observation, max(X1 , . . . , X10 ), is a point estimator for θ
in the uniform distribution example.
Point Estimation
Problem: when there is more than one point estimator for a
parameter θ, which one of them should we use?
There are a few criteria for selecting the best point estimator:
unbiasedness,
minimum variance,
and mean square error.
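These criteria can be illustrated by simulation. The sketch below (the normal parameters µ = 10, σ = 3 and sample size n = 25 are illustrative assumptions) compares two estimators of µ, the sample mean and the sample median: both center on µ, but the sample mean has the smaller variance, so the minimum-variance criterion prefers it:

```python
import random
import statistics

# Illustrative parameters (assumptions, not from the text).
random.seed(1)
mu, sigma, n, reps = 10.0, 3.0, 25, 20_000

means, medians = [], []
for _ in range(reps):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(statistics.mean(sample))
    medians.append(statistics.median(sample))

# Both estimators average out to mu (unbiasedness for a symmetric
# distribution), but the sample mean varies less around it.
print(statistics.mean(means), statistics.variance(means))
print(statistics.mean(medians), statistics.variance(medians))
```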
Point Estimation
Definition
A point estimator θ̂ is said to be an unbiased estimator of θ if
E (θ̂) = θ for every possible value of θ. If θ̂ is not unbiased, the
difference E (θ̂) − θ is called the bias of θ̂.
Principle of Unbiased Estimation
When choosing among several different estimators of θ, select one
that is unbiased.
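The uniform-distribution example shows why bias matters: the sample maximum of n draws from Uniform[0, θ] has expectation nθ/(n + 1), so it systematically underestimates θ, while rescaling by (n + 1)/n gives an unbiased estimator. A simulation sketch (θ = 8 and n = 10 are illustrative assumptions):

```python
import random
import statistics

# Illustrative parameters (assumptions, not from the text).
random.seed(2)
theta, n, reps = 8.0, 10, 50_000

# Sample maximum of n draws from Uniform[0, theta], repeated many times.
maxes = [max(random.uniform(0, theta) for _ in range(n))
         for _ in range(reps)]

# E(max) = n * theta / (n + 1) = 10/11 * 8, about 7.27: biased low.
print(statistics.mean(maxes))
# Rescaling by (n + 1)/n removes the bias: close to theta = 8.
print(statistics.mean(maxes) * (n + 1) / n)
```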
Point Estimation
Proposition
Let X1 , X2 , . . . , Xn be a random sample from a distribution with
mean µ and variance σ². Then the estimators

µ̂ = X̄ = Σᵢ₌₁ⁿ Xᵢ / n   and   σ̂² = S² = Σᵢ₌₁ⁿ (Xᵢ − X̄)² / (n − 1)

are unbiased estimators of µ and σ², respectively.
If in addition the distribution is continuous and symmetric, then the
sample median X̃ and any trimmed mean are also unbiased
estimators of µ.
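The role of the n − 1 divisor in S² can be checked by simulation: dividing by n instead yields an estimator whose expectation is (n − 1)σ²/n, i.e. biased low. A sketch with illustrative parameters (µ = 0, σ² = 4, n = 5):

```python
import random
import statistics

# Illustrative parameters (assumptions, not from the text).
random.seed(3)
mu, sigma2, n, reps = 0.0, 4.0, 5, 50_000

s2_unbiased, s2_biased = [], []
for _ in range(reps):
    xs = [random.gauss(mu, sigma2 ** 0.5) for _ in range(n)]
    xbar = sum(xs) / n
    ss = sum((x - xbar) ** 2 for x in xs)
    s2_unbiased.append(ss / (n - 1))  # S^2: divide by n - 1
    s2_biased.append(ss / n)          # divide by n: expectation (n-1)/n * sigma^2

print(statistics.mean(s2_unbiased))  # close to 4.0
print(statistics.mean(s2_biased))    # close to 3.2
```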