Week 1

Parameter, Statistic and Random Samples
• A parameter is a number that describes the population. It is a fixed
number, but in practice we do not know its value.
• A statistic is a function of the sample data, i.e., it is a quantity whose value can be calculated from the sample data. It is a random variable with its own distribution function. Statistics are used to make inferences about unknown population parameters.
• The random variables X1, X2, …, Xn are said to form a (simple) random sample of size n if the Xi's are independent random variables and each Xi has the same probability distribution. We say that the Xi's are i.i.d.
Example – Sample Mean and Variance
• Suppose X1, X2, …, Xn is a random sample of size n from a population with mean μ and variance σ².
• The sample mean is defined as
  $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i.$
• The sample variance is defined as
  $S^2 = \frac{1}{n-1}\sum_{i=1}^{n} \left(X_i - \bar{X}\right)^2.$
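As a quick numerical sketch (an addition to the slides; the data are made up for illustration), both statistics can be computed directly in Python, where NumPy's ddof=1 option matches the n − 1 divisor used above.

import numpy as np

# Hypothetical sample, used only to illustrate the formulas above.
x = np.array([4.1, 5.3, 3.8, 6.0, 5.2])
n = len(x)

xbar = x.sum() / n                       # sample mean: (1/n) * sum of X_i
s2 = ((x - xbar) ** 2).sum() / (n - 1)   # sample variance with the n - 1 divisor

# The built-in routines agree: np.var with ddof=1 also divides by n - 1.
assert np.isclose(xbar, np.mean(x))
assert np.isclose(s2, np.var(x, ddof=1))
print(xbar, s2)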
Goals of Statistics
• Estimate the unknown parameters μ and σ².
• Measure the errors of these estimates.
• Test whether the sample gives evidence that the parameters are (or are not) equal to a certain value.
Sampling Distribution of a Statistic
• The sampling distribution of a statistic is the distribution of values taken by the statistic in all possible samples of the same size from the same population (see the simulation sketch after this list).
• The distribution of a statistic is NOT the same as the distribution of the population that generated the sample.
• The form of the theoretical sampling distribution of a statistic will
depend upon the distribution of the observable random variables in
the sample.
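A minimal simulation sketch (an addition; the exponential population, sample size, and replicate count are arbitrary choices): drawing many samples of the same size and recording the value of the statistic in each one approximates its sampling distribution, whose shape and spread differ from those of the population itself.

import numpy as np

rng = np.random.default_rng(0)

n = 10          # sample size (illustrative)
reps = 20_000   # number of repeated samples

# Right-skewed exponential population with mean 1 and variance 1.
samples = rng.exponential(scale=1.0, size=(reps, n))

# Value of the statistic (the sample mean) in each repeated sample.
xbars = samples.mean(axis=1)

# The sampling distribution of the mean is far more concentrated than the
# population: its variance is close to 1/n, not 1.
print("mean of X-bar:", xbars.mean())
print("variance of X-bar:", xbars.var(ddof=1), "vs population variance 1.0")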
Sampling from a Normal Population
• Often we assume the random sample X1, X2, …, Xn is from a normal population with unknown mean μ and variance σ².
• Suppose we are interested in estimating μ and testing whether it is equal to a certain value. For this we need to know the probability distribution of the estimator of μ.
Claim
• Suppose X1, X2, …, Xn are i.i.d. normal random variables with unknown mean μ and variance σ². Then
  $\bar{X} \sim N\!\left(\mu, \frac{\sigma^2}{n}\right).$
• Proof:
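A sketch via moment generating functions (filled in here; recall that the MGF of a N(μ, σ²) variable is $\exp(\mu t + \sigma^2 t^2/2)$):

$m_{\bar{X}}(t) = E\!\left[e^{t\bar{X}}\right] = \prod_{i=1}^{n} E\!\left[e^{(t/n)X_i}\right] = \prod_{i=1}^{n} \exp\!\left(\mu\frac{t}{n} + \frac{\sigma^2 t^2}{2n^2}\right) = \exp\!\left(\mu t + \frac{\sigma^2}{n}\cdot\frac{t^2}{2}\right),$

where the factorization uses independence. This is the MGF of a $N(\mu, \sigma^2/n)$ distribution, so $\bar{X} \sim N(\mu, \sigma^2/n)$.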
Recall - The Chi Square distribution
• If Z ~ N(0,1), then X = Z² has a Chi-Square distribution with parameter 1, i.e., $X \sim \chi^2_{(1)}$.
• This can be proved using the change-of-variable theorem for univariate random variables.
• The moment generating function of X is
  $m_X(t) = \left(\frac{1}{1-2t}\right)^{1/2}.$
• If $X_1 \sim \chi^2_{(v_1)}, X_2 \sim \chi^2_{(v_2)}, \ldots, X_k \sim \chi^2_{(v_k)}$, all independent, then
  $T = \sum_{i=1}^{k} X_i \sim \chi^2_{\left(\sum_{i=1}^{k} v_i\right)}.$
• Proof…
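A sketch via moment generating functions (filled in here, using the fact that a $\chi^2_{(v)}$ variable has MGF $(1-2t)^{-v/2}$, generalizing the one-degree-of-freedom case above):

$m_T(t) = \prod_{i=1}^{k} m_{X_i}(t) = \prod_{i=1}^{k} (1-2t)^{-v_i/2} = (1-2t)^{-\frac{1}{2}\sum_{i=1}^{k} v_i},$

which is the MGF of a chi-square distribution with $\sum_{i=1}^{k} v_i$ degrees of freedom.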
Claim
• Suppose X1, X2, …, Xn are i.i.d. normal random variables with mean μ and variance σ². Then $Z_i = \frac{X_i - \mu}{\sigma}$, for i = 1, 2, …, n, are independent standard normal variables, and
  $\sum_{i=1}^{n} Z_i^2 = \sum_{i=1}^{n} \left(\frac{X_i - \mu}{\sigma}\right)^2 \sim \chi^2_{(n)}.$
• Proof: …
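A brief sketch (filled in here; it follows from the two chi-square facts recalled above): standardizing a normal variable gives $Z_i = (X_i - \mu)/\sigma \sim N(0,1)$, and the $Z_i$ are independent because the $X_i$ are. Hence each $Z_i^2 \sim \chi^2_{(1)}$, and the sum of n independent $\chi^2_{(1)}$ variables is $\chi^2_{(n)}$ by the additivity property.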
t distribution
• Suppose Z ~ N(0,1) is independent of $X \sim \chi^2_{(v)}$. Then
  $T = \frac{Z}{\sqrt{X/v}} \sim t_{(v)}.$
• Proof:
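The density of T can be derived by conditioning on X and applying a change of variables; as a lighter-weight check, here is a small Monte Carlo sketch (an addition, with an arbitrarily chosen v) that compares simulated quantiles of $Z/\sqrt{X/v}$ with the t quantiles from SciPy.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
v = 5                 # degrees of freedom (illustrative choice)
reps = 200_000

z = rng.standard_normal(reps)         # Z ~ N(0, 1)
x = rng.chisquare(df=v, size=reps)    # X ~ chi-square(v), independent of Z
t_sim = z / np.sqrt(x / v)            # T = Z / sqrt(X / v)

# Simulated quantiles should be close to the theoretical t(v) quantiles.
for q in (0.05, 0.50, 0.95):
    print(q, np.quantile(t_sim, q), stats.t.ppf(q, df=v))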
Claim
• Suppose X1, X2, …, Xn are i.i.d. normal random variables with mean μ and variance σ². Then
  $\frac{\bar{X} - \mu}{S/\sqrt{n}} \sim t_{(n-1)}.$
• Proof:
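A sketch (filled in here), using two standard facts about normal samples that are not proved on these slides: $\bar{X}$ and $S^2$ are independent, and $(n-1)S^2/\sigma^2 \sim \chi^2_{(n-1)}$. Write

$\frac{\bar{X}-\mu}{S/\sqrt{n}} = \frac{(\bar{X}-\mu)\big/(\sigma/\sqrt{n})}{\sqrt{\frac{(n-1)S^2/\sigma^2}{n-1}}}.$

The numerator is N(0,1), the quantity inside the square root is an independent $\chi^2_{(n-1)}$ variable divided by its degrees of freedom, so the ratio has a $t_{(n-1)}$ distribution by the definition of the t distribution above.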
F distribution
• Suppose $X \sim \chi^2_{(n)}$ is independent of $Y \sim \chi^2_{(m)}$. Then
  $\frac{X/n}{Y/m} \sim F_{(n,m)}.$
Properties of the F distribution
• The F distribution is right-skewed.
• Fm,n  
1
Fn,m 
i.e. PFn ,m 
 1

1
1

 a   P
   P Fm,n   
F

a

  n ,m  a 
• Percentiles of the F distribution can be found using Table 7 on page 796.
• Example…
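As an alternative to the table lookup, a short sketch (an addition, with arbitrarily chosen degrees of freedom) that computes F percentiles with SciPy and checks the reciprocal property numerically:

from scipy import stats

n, m = 5, 10     # numerator / denominator degrees of freedom (illustrative)
p = 0.95

# 95th percentile a of F(n, m), i.e. P(F(n, m) <= a) = 0.95.
a = stats.f.ppf(p, dfn=n, dfd=m)

# Reciprocal property: P(F(n, m) <= a) = P(F(m, n) >= 1/a),
# so 1/a is the 5th percentile of F(m, n).
b = stats.f.ppf(1 - p, dfn=m, dfd=n)

print(a, 1 / a, b)               # 1/a and b should agree
assert abs(1 / a - b) < 1e-8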
Recall - The Central Limit Theorem
• Let X1, X2, … be a sequence of i.i.d. random variables with mean E(Xi) = μ < ∞ and variance Var(Xi) = σ² < ∞. Let $S_n = \sum_{i=1}^{n} X_i$. Then
  $Z_n = \frac{S_n - n\mu}{\sigma\sqrt{n}}$
  converges in distribution to Z ~ N(0,1).
• Also,
  $Z_n = \frac{\bar{X}_n - \mu}{\sigma/\sqrt{n}}$
  converges in distribution to Z ~ N(0,1).
• Example…
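A small simulation sketch (an addition; the exponential population and the sample sizes are arbitrary choices) showing the standardized mean of skewed data approaching the standard normal as n grows:

import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
mu, sigma = 1.0, 1.0     # mean and standard deviation of the Exponential(1) population
reps = 50_000

for n in (2, 10, 50, 200):
    x = rng.exponential(scale=1.0, size=(reps, n))
    z_n = (x.mean(axis=1) - mu) / (sigma / np.sqrt(n))
    # Compare the simulated P(Z_n <= 1) with the standard normal value Phi(1).
    print(n, round((z_n <= 1).mean(), 3), round(stats.norm.cdf(1.0), 3))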