LOYOLA COLLEGE (AUTONOMOUS), CHENNAI – 600 034

B.Sc. DEGREE EXAMINATION – STATISTICS
FIFTH SEMESTER – APRIL 2008
ST 5500 - ESTIMATION THEORY
Date : 28-04-08          Dept. No.                              Max. : 100 Marks
Time : 1:00 - 4:00       NO 24
PART-A
Answer ALL the questions:
(10x2=20)
1. Define ‘bias’ of an estimator.
2. When do you say an estimator is consistent?
3. Define a sufficient statistic.
4. What do you mean by bounded completeness?
5. Describe the method of moments in estimation.
6. State the invariance property of the maximum likelihood estimator.
7. Define a loss function and give an example.
8. Explain ‘prior distribution’ and ‘posterior distribution’.
9. Explain least squares estimation.
10. Mention any two properties of the least squares estimator.
PART-B
Answer any FIVE questions:
(5x8=40)
11. If Tₙ is asymptotically unbiased with variance approaching 0 as n → ∞, then show that Tₙ is consistent.
12. Show that Σxᵢ(Σxᵢ − 1) / [n(n − 1)] is an unbiased estimate of θ², based on a random sample x₁, x₂, ..., xₙ drawn from B(1, θ), 0 < θ < 1.
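Note: the unbiasedness can also be checked numerically. Writing T = Σxᵢ, T follows B(n, θ) and E[T(T − 1)] = n(n − 1)θ², so averaging T(T − 1)/[n(n − 1)] over many simulated samples should come out close to θ². A minimal Python sketch (simulated data; the parameter values are arbitrary, chosen only for illustration):

import numpy as np

rng = np.random.default_rng(0)
theta, n, reps = 0.4, 20, 200_000          # arbitrary illustrative values

# T = sum of the n Bernoulli(theta) observations, one value per simulated sample
T = rng.binomial(n, theta, size=reps)

# the statistic T(T - 1) / [n(n - 1)] from question 12, computed for each sample
estimates = T * (T - 1) / (n * (n - 1))

print(estimates.mean(), theta**2)          # the two numbers should be close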
13. Let x₁, x₂, ..., xₙ be a random sample of size n from a population. Examine if T(x) = x₁ is complete.
14. State and prove the Rao-Blackwell theorem.
15. Estimate α and β by the method of moments in the case of Pearson's Type III distribution with p.d.f. f(x; α, β) = (β^α / Γ(α)) x^(α−1) e^(−βx), 0 ≤ x < ∞.
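Note: with the density in this form the population mean is α/β and the variance is α/β², so equating them to the sample mean x̄ and the sample variance m₂ gives the moment estimators α̂ = x̄²/m₂ and β̂ = x̄/m₂. A minimal Python sketch (simulated data; the parameter values are arbitrary, chosen only for illustration):

import numpy as np

rng = np.random.default_rng(1)
alpha_true, beta_true = 3.0, 2.0           # arbitrary illustrative values

# numpy's gamma generator uses shape and scale = 1/rate (the rate is beta here)
x = rng.gamma(shape=alpha_true, scale=1.0 / beta_true, size=10_000)

xbar = x.mean()                            # sample mean
m2 = x.var()                               # sample second central moment

alpha_hat = xbar**2 / m2                   # method-of-moments estimate of alpha
beta_hat = xbar / m2                       # method-of-moments estimate of beta
print(alpha_hat, beta_hat)                 # should be close to 3.0 and 2.0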
16. State and establish the Bhattacharyya inequality.
17. Describe the method of modified minimum chi-square.
18. Write a note on Bayes estimation.
PART-C
Answer any TWO questions:
(2x20=40)
19. a) Let x₁, x₂, x₃ be a random sample of size 3 from a population with mean value θ and variance σ². T₁, T₂, T₃ are estimators used to estimate the mean value θ, where
T₁ = x₁ + x₂ − x₃;  T₂ = 2x₁ + 3x₃ − 4x₂  and  T₃ = (x₁ + x₂ + λx₃)/3.
i) Are T₁ and T₂ unbiased estimators?
ii) Find the value of λ such that T₃ is an unbiased estimator.
iii) With this value of λ, is T₃ a consistent estimator?
iv) Which is the best estimator?
b) If x₁, x₂, ..., xₙ are random observations on a Bernoulli variate X taking the value 1 with probability p and the value 0 with probability (1 − p), show that (Σxᵢ/n)(1 − Σxᵢ/n) is a consistent estimator of p(1 − p).
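Note: the convergence asserted here is easy to see by simulation; with p̂ = Σxᵢ/n, the values of p̂(1 − p̂) should settle near p(1 − p) as n grows. A minimal Python sketch (simulated data; the value of p is arbitrary, chosen only for illustration):

import numpy as np

rng = np.random.default_rng(2)
p = 0.3                                    # true parameter; p(1 - p) = 0.21

for n in (10, 100, 10_000, 1_000_000):
    x = rng.binomial(1, p, size=n)         # n Bernoulli(p) observations
    p_hat = x.mean()                       # (sum of x_i) / n
    print(n, p_hat * (1 - p_hat))          # should approach p(1 - p) = 0.21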
20. a) State and prove the Cramér-Rao inequality.
f ( x; )  [ {1  ( x   ) 2 }]1 ,    x  
b) Given the probability density function
   
Show that the Cramer-Rao-lower bound of variance of an unbiased estimator of  is 2/n, where n
is the size of the random sample from this distribution.
[12+8]
21. a) State and prove the Lehmann-Scheffé theorem.
b) Obtain the MLE of θ in f(x, θ) = (1 + θ)x^θ, 0 < x < 1, based on an independent sample of size n. Examine whether this estimate is sufficient for θ.
[12+8]
22. a) Show that a necessary and sufficient condition for the linear parametric function λ′β to be linearly estimable is that rank(A) equals the rank of the matrix A augmented with the row λ′, where β′ = (β₁, β₂, ..., βₚ) and λ′ = (λ₁, λ₂, ..., λₚ).
b) Describe the Gauss-Markov model.
[12+8]
--------