Sampling Distributions & Point Estimation

Questions
• What is a sampling distribution?
• What is the standard error?
• What is the principle of maximum
likelihood?
• What is bias (in the statistical sense)?
• What is a confidence interval?
• What is the central limit theorem?
• Why is the number 1.96 a big deal?
Population
• Population & Sample Space
• Population vs. sample
• Population parameter, sample statistic
Parameter Estimation
We use statistics to estimate parameters,
e.g., effectiveness of pilot training,
effectiveness of psychotherapy.
$\bar{X} \rightarrow \mu$
$SD \rightarrow \sigma$
Sampling Distribution (1)
• A sampling distribution is a distribution of a
statistic over all possible samples.
• To get a sampling distribution,
– 1. Take a sample of size N (a given number like
5, 10, or 1000) from a population
– 2. Compute the statistic (e.g., the mean) and
record it.
– 3. Repeat steps 1 and 2 many times (in principle, infinitely many for large populations).
– 4. Plot the resulting sampling distribution, a distribution of a statistic over repeated samples (a simulation sketch follows below).
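A minimal Python sketch of steps 1 through 4 (the function name simulate_sampling_distribution and the die-face population are ours, not part of the slides):

import random
import statistics

def simulate_sampling_distribution(population, n, reps=10_000):
    """Approximate the sampling distribution of the mean by repeated sampling."""
    means = []
    for _ in range(reps):
        sample = random.choices(population, k=n)   # step 1: sample of size N (with replacement)
        means.append(statistics.mean(sample))      # step 2: compute the statistic and record it
    return means                                   # steps 3-4: the recorded means form the (approximate) sampling distribution

population = [1, 2, 3, 4, 5, 6]                    # the die-face population used on the next slides
means = simulate_sampling_distribution(population, n=2)
print(statistics.mean(means))                      # close to the population mean, 3.5
print(statistics.stdev(means))                     # an estimate of the standard error of the mean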
Suppose
• Population has 6 elements: 1, 2, 3, 4, 5,
6 (like numbers on dice)
• We want to find the sampling
distribution of the mean for N=2
• If we sample with replacement, what
can happen?
All 36 equally likely samples (1st roll by 2nd roll) and the mean M of each:

2nd roll:      1    2    3    4    5    6
1st roll 1:  1.0  1.5  2.0  2.5  3.0  3.5
1st roll 2:  1.5  2.0  2.5  3.0  3.5  4.0
1st roll 3:  2.0  2.5  3.0  3.5  4.0  4.5
1st roll 4:  2.5  3.0  3.5  4.0  4.5  5.0
1st roll 5:  3.0  3.5  4.0  4.5  5.0  5.5
1st roll 6:  3.5  4.0  4.5  5.0  5.5  6.0
Histogram
[Figure: "Possible Outcomes" histogram showing the frequency of each of the 36 possible sample means]
Sampling distribution for the mean of 2 dice. The population mean is (1 + 2 + 3 + 4 + 5 + 6)/6 = 21/6 = 3.5. There is only 1 way to get a mean of 1, but 6 ways to get a mean of 3.5.
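A small sketch, ours rather than the slides', that enumerates the 36 equally likely samples of size N = 2 and tallies the exact sampling distribution shown in the histogram:

from collections import Counter
from itertools import product

population = [1, 2, 3, 4, 5, 6]
counts = Counter((a + b) / 2 for a, b in product(population, repeat=2))

for mean_value in sorted(counts):
    ways = counts[mean_value]
    print(f"M = {mean_value}: {ways} way(s), probability {ways / 36:.3f}")
# 1 way to get M = 1.0, 6 ways to get M = 3.5, matching the histogram.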
Sampling Distribution (2)
• The sampling distribution shows the relation
between the probability of a statistic and the
statistic’s value for all possible samples of
size N drawn from a population.
[Figure: Hypothetical Distribution of Sample Means, f(M) plotted against the mean value]
Sampling Distribution Mean
and SD
• The Mean of the sampling distribution is
defined the same way as any other
distribution (expected value).
• The SD of the sampling distribution is the
Standard Error. Important and useful.
• Variance of sampling distribution is the
expected value of the squared difference – a
mean square.
• Review: $\sigma_G^2 = E(G - \mu_G)^2$
Review
• What is a sampling distribution?
• What is the standard error of a statistic?
Statistics as Estimators
• We use sample data to compute statistics.
• The statistics estimate population values, e.g.,
– $\bar{X} \rightarrow \mu$ (the sample mean estimates the population mean)
• An estimator is a method for producing a best
guess about a population value.
• An estimate is a specific value provided by an
estimator.
• We want good estimates. What is a good
estimator? What properties should it have?
Maximum Likelihood (1)
• Likelihood is a conditional probability.
• $L = p(x = \text{value} \mid \theta)$
• L is the probability (say) that x has some value given that the parameter theta has some value. For example, L1 is the probability of observing heights of 68 and 70 inches [the data] given adult males [theta]; L2 is the probability of 68 and 70 inches given adult females.
• Theta (θ) could be continuous or discrete.
Maximum Likelihood (2)
• Suppose we know the function (e.g.,
binomial, normal) but not the value of
theta.
• Maximum likelihood principle says
take the estimate of theta that makes the
likelihood of the data maximum.
• MLP says: Choose the value of theta
that makes this maximum:
$L(x_1, x_2, \ldots, x_N \mid \theta)$
Maximum Likelihood (3)
• Suppose we have 2 hypothesized values for the proportion of male grad students at USF: .50 and .40. We randomly sample 15 students and find that 9 are male.
• Calculate the likelihood for each using the binomial:
$L(x = 9;\; p = .50, N = 15) = \binom{15}{9}(.50)^9(.50)^6 = .153$
$L(x = 9;\; p = .40, N = 15) = \binom{15}{9}(.40)^9(.60)^6 = .061$
• The .50 estimate is better because the data are
more likely.
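A quick check of these two likelihoods using only the Python standard library (the helper name binomial_likelihood is ours):

from math import comb

def binomial_likelihood(x, n, p):
    """Probability of x successes in n trials when the success probability is p."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

print(round(binomial_likelihood(9, 15, 0.50), 3))  # 0.153
print(round(binomial_likelihood(9, 15, 0.40), 3))  # 0.061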
Likelihood Function
The binomial distribution computes probabilities.
[Figure: Likelihood (y-axis, 0 to 0.25) of the observed result plotted against Theta (p value) on the x-axis, from 0 to 1]
Maximum Likelihood (4)
• In the example, the best (maximum likelihood) estimate would be 9/15 = .60.
• There is a general class of estimators, called maximum likelihood estimators, that find the value of theta that maximizes the likelihood of a sample result.
• ML is one principle of 'goodness' of an estimator.
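A sketch, again ours, that scans a grid of theta values and confirms the likelihood of 9 males in 15 peaks near 9/15 = .60:

from math import comb

def binomial_likelihood(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

grid = [i / 100 for i in range(1, 100)]            # theta = .01, .02, ..., .99
best_theta = max(grid, key=lambda p: binomial_likelihood(9, 15, p))
print(best_theta)                                   # 0.6, i.e., 9/15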
More Goodness (1)
• Bias. If E(statistic)=parameter, the
estimator is unbiased. If it’s unbiased,
the mean of the sampling distribution
equals the parameter. The sample mean has this property: $E(\bar{X}) = \mu$. The sample variance is biased.
More Goodness (2)
• Efficiency – size of the sampling variance.
• Relative Efficiency. Relative efficiency is the
ratio
of two sampling variances.
$\dfrac{\sigma_H^2}{\sigma_G^2} = \text{efficiency of } G \text{ relative to } H$
• More efficient statistics have smaller
sampling variances, smaller standard error,
and are preferred because if both are
unbiased, one is closer than the other to the
parameter on average.
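A simulation sketch (ours, not from the slides) comparing the sampling variances of two estimators of the center of a normal population, the sample mean and the sample median:

import random
import statistics

random.seed(1)
N, reps = 25, 20_000
means, medians = [], []
for _ in range(reps):
    sample = [random.gauss(100, 15) for _ in range(N)]
    means.append(statistics.mean(sample))
    medians.append(statistics.median(sample))

print(statistics.variance(medians) / statistics.variance(means))
# Roughly 1.5 for normal data: the mean has the smaller sampling variance,
# so it is the more efficient estimator here.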
Goodness (3)
• Sometimes we trade off bias and
efficiency. A biased estimator is
sometimes preferred if it is more
efficient, especially if the magnitude of
bias is known.
• Resistance. Indicates minimal
influence of outliers. Median is more
resistant than the mean.
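A tiny illustration (ours) of resistance: one wild outlier drags the mean but barely moves the median.

import statistics

scores = [48, 50, 51, 52, 49, 50, 53, 47]
with_outlier = scores + [250]

print(statistics.mean(scores), statistics.median(scores))              # 50.0 and 50.0
print(statistics.mean(with_outlier), statistics.median(with_outlier))  # mean jumps to about 72.2; median stays at 50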
Sampling Distribution of the
Mean
• Unbiased: $E(\bar{X}) = \mu$
• Variance of the sampling distribution of means based on N observations: $V_M = \sigma_M^2 = \dfrac{\sigma^2}{N}$
• Standard Error of the Mean: $\sigma_M = \dfrac{\sigma}{\sqrt{N}}$
• Law of large numbers: Large samples
produce sample estimates very close to
the parameter.
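A simulation sketch (ours) checking that the SD of the simulated sampling distribution of the mean is close to sigma divided by the square root of N:

import random
import statistics

random.seed(2)
sigma, N, reps = 14, 49, 20_000
means = [statistics.mean(random.gauss(100, sigma) for _ in range(N)) for _ in range(reps)]

print(statistics.stdev(means))   # close to the theoretical standard error ...
print(sigma / N**0.5)            # ... which is 14 / sqrt(49) = 2.0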
Unbiased Estimate of
Variance
• It can be shown that: $E(S^2) = \left(\dfrac{N-1}{N}\right)\sigma^2$
• The sample variance is too small by a factor of (N − 1)/N.
• We fix this with: $s^2 = \left(\dfrac{N}{N-1}\right)S^2 = \dfrac{\sum(X - \bar{X})^2}{N-1}$
• Although the corrected variance ($s^2$) is unbiased, the SD ($s$) is still biased, but most inferential work is based on the variance, not the SD.
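A simulation sketch (ours) of the bias: dividing SS by N underestimates sigma squared by roughly the factor (N − 1)/N, while dividing by N − 1 does not.

import random
import statistics

random.seed(3)
N, reps = 5, 50_000                        # population is normal with sigma^2 = 100
biased, unbiased = [], []
for _ in range(reps):
    x = [random.gauss(0, 10) for _ in range(N)]
    m = statistics.mean(x)
    ss = sum((xi - m) ** 2 for xi in x)
    biased.append(ss / N)                  # S^2: SS divided by N
    unbiased.append(ss / (N - 1))          # s^2: SS divided by N - 1

print(statistics.mean(biased))             # about 80, i.e., (N - 1)/N times 100
print(statistics.mean(unbiased))           # about 100, i.e., sigma^2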
Review
• What is the principle of maximum
likelihood?
• Define
– Bias
– Efficiency
– Resistance
• Is the sample variance (SS divided by
N) a biased estimator?
Interval Estimation
• Use the standard error of the mean to create a
bracket or confidence interval to show where
good estimates of the mean are.
• The sampling distribution of the mean is
nice* when N>20. Therefore:
$p(\bar{X} - 3\sigma_M \leq \mu \leq \bar{X} + 3\sigma_M) \geq .95$
• Suppose M = 100, SD = 14, N = 49. Then $SD_M$ = 14/7 = 2. Bracket = 100 − 6 = 94 to 100 + 6 = 106, i.e., 94 to 106. The probability statement refers to the sample bracket (which varies from sample to sample), not to mu.
* Unimodal and symmetric
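The bracket arithmetic as a short sketch (ours):

m, sd, n = 100, 14, 49
sem = sd / n**0.5                 # 14 / 7 = 2.0
print(m - 3 * sem, m + 3 * sem)   # 94.0 106.0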
Review
• What is a confidence interval?
• Suppose M = 50, SD = 10, and N =100.
What is the confidence interval?
SEM = 10/sqrt(100) = 10/10 = 1
CI (lower) = M-3SEM = 50-3 = 47
CI (upper) = M+3SEM = 50+3 = 53
CI = 47 to 53
Central Limit Theorem
– 1. Sampling distribution of means becomes
normal as N increases, regardless of shape of
original distribution.
– 2. Binomial becomes normal as N increases.
– 3. Applies to other statistics as well (e.g.,
variance)
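A simulation sketch (ours) of point 1: sample means from a skewed (exponential) population look increasingly symmetric and normal as N grows.

import random
import statistics

random.seed(4)
for N in (1, 5, 30):
    means = [statistics.mean(random.expovariate(1.0) for _ in range(N))
             for _ in range(5_000)]
    # Crude symmetry check: in a skewed distribution the mean sits far from the median.
    print(N, round(statistics.mean(means), 2), round(statistics.median(means), 2))
# As N grows, the mean and median of the simulated sampling distribution converge
# (toward 1.0), one sign that the distribution of means is becoming normal.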
Properties of the Normal
• If a distribution is normal, the sampling
distribution of the mean is normal
regardless of N.
• If a distribution is normal, the sampling
distributions of the mean and variance
are independent.
Confidence Intervals for the
Mean
• Over samples of size N, the probability is .95
for
$\mu - 1.96\sigma_M \leq \bar{X} \leq \mu + 1.96\sigma_M$
• Similarly for sample values of the mean, the
probability is .95 that
X  1.96 M    X  1.96 M
• The population mean is likely to be within 2
standard errors of the sample mean.
• Can use the Normal to create any size
confidence interval (85, 99, etc.)
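A sketch (ours) of where 1.96 comes from and how to build other interval sizes; it uses statistics.NormalDist from the Python standard library (3.8+) and the earlier M = 100, SD = 14, N = 49 example:

from statistics import NormalDist

z95 = NormalDist().inv_cdf(0.975)        # about 1.96: cuts off 2.5% in each tail
z99 = NormalDist().inv_cdf(0.995)        # about 2.58 for a 99% interval

m, sd, n = 100, 14, 49
sem = sd / n**0.5                        # 2.0
print(m - z95 * sem, m + z95 * sem)      # about 96.08 to 103.92 (95% CI)
print(m - z99 * sem, m + z99 * sem)      # about 94.85 to 105.15 (99% CI)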
Size of the Confidence
Interval
• The size of the confidence interval depends
on desired certainty (e.g., 95 vs 99 pct) and
the size of std error of mean ( M ).
• Std err of mean is controlled by population
SD and sample size. Can control sample size.
$\sigma_M = \dfrac{\sigma}{\sqrt{N}}$
• Suppose SD = 10. If N = 25, then SEM = 2 and the CI width is about 8. If N = 100, then SEM = 1 and the CI width is about 4. The CI shrinks as N increases, but because of the square root, the change in width gets smaller as N gets large: less bang for the buck as N gets big (see the sketch below).
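A short sketch (ours) of the shrinking 95% CI width when SD = 10: halving the width requires quadrupling N.

sd = 10
for n in (25, 100, 400, 1600):
    sem = sd / n**0.5
    print(n, sem, round(2 * 1.96 * sem, 2))   # N, SEM, approximate 95% CI width
# 25 -> 7.84, 100 -> 3.92, 400 -> 1.96, 1600 -> 0.98: each halving of the width costs 4x the N.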
Review
• What is the central limit theorem?
• Why is the number 1.96 a big deal?
• Assume that scores on a curiosity scale are
normally distributed. If the sample mean is
50 based on 100 people and the population
SD is 10, find an approx 99 pct CI for the
population mean.
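One possible way to work the last item, as a hedged sketch (ours), using a z value of about 2.58 for 99 percent:

m, sd, n, z99 = 50, 10, 100, 2.58
sem = sd / n**0.5                      # 10 / 10 = 1.0
print(m - z99 * sem, m + z99 * sem)    # approximately 47.4 to 52.6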