Internat. J. Math. & Math. Sci., Vol. 13 (1990), 121-128

ESTIMATION OF THE SCALE PARAMETER OF GAMMA MODEL IN THE PRESENCE OF OUTLIER OBSERVATIONS

M.E. GHITANY

Department of Mathematics, University of Saskatchewan, Saskatoon, Saskatchewan, Canada S7N 0W0
and
Department of Mathematics, University of Western Australia, Nedlands, WA 6009, Australia

(Received August 15, 1988)

ABSTRACT. This paper considers the Bayesian point estimation of the scale parameter of a two-parameter gamma life-testing model in the presence of several outlier observations in the data. The Bayesian analysis is carried out under the assumption of a squared error loss function and a fixed or random shape parameter.

KEY WORDS AND PHRASES. Scale parameter, gamma distribution, squared error loss function, noninformative prior, incomplete gamma function, modified Bessel function of the third kind, Bayes estimate.

1980 AMS SUBJECT CLASSIFICATION CODES. 62C10.

1. INTRODUCTION.

Bayesian estimation problems that take into account a possible extension of the parameter set of a basic probability model to include contamination components, called outliers, have been considered for some time. An excellent review of the Bayesian approaches to outliers can be found in Barnett and Lewis [1]. In particular, the published literature dealing with the Bayesian point estimation of the scale parameter of the most well known life-testing models, in the presence of several outlier observations in the data, is limited.

Sinha [2] seems to be the first to consider a fuller Bayesian analysis of the exponential distribution when a single outlier observation may be present in the data. Employing three possible families of prior distributions (inverted gamma, uniform, noninformative), he provided the Bayes estimator of the scale parameter under each of these priors. Lingappaiah [3] generalizes the corresponding estimation problem of Sinha [2] by considering the generalized gamma distribution when several outliers may be present in the data. Accordingly, he derived the Bayes estimator of the scale parameter for each of the exponential, gamma and Weibull distributions. However, assuming a fixed shape parameter, he employed only one particular member of the inverted gamma prior family for the scale parameter. It should be mentioned here that in both of the above papers the authors employed a beta prior family for each parameter causing the contamination in the data. Only Sinha [2] briefly discussed the robustness of the posterior distribution of the scale parameter when the parameter causing the contamination is fixed. Also, Lingappaiah [4] studies the effect of a single outlier observation on the Bayesian estimation of the parameters of the normal life-testing model.

In this paper, we consider the following point estimation problem. Suppose that in an observed sample (x_1, x_2, \ldots, x_n), n - k observations are from the gamma family with p.d.f.

    f(x; \sigma) = \frac{x^{\mu-1} \exp[-x/\sigma]}{\Gamma(\mu)\, \sigma^{\mu}}, \qquad x > 0, \quad \mu, \sigma > 0,    (1.1)

where μ is the shape parameter, while the remaining k (k < n) observations are from the different families with p.d.f.'s f(x; σ/α_r), r = 1, 2, ..., k. Assume that the parameters α_r, r = 1, 2, ..., k, are fixed and that the set of k outliers is equally likely to be any set of k observations from the sample. Under a squared error loss function, the main problem here is to find a closed form expression for the Bayes estimator of the scale parameter σ when the shape parameter μ is either a fixed real number or a random variable, respectively.
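As a concrete illustration of this sampling model, the following Python sketch simulates one such contaminated sample; the particular values of n, k, μ, σ and the α_r below are hypothetical and chosen only for demonstration.

```python
# Purely illustrative sketch of the outlier model of Section 1: n - k
# observations from the gamma p.d.f. (1.1) with scale sigma, and k outliers
# from the rescaled densities f(x; sigma / alpha_r), r = 1, ..., k.
import numpy as np

rng = np.random.default_rng(0)

n, k = 10, 2            # sample size and number of outliers (hypothetical)
mu, sigma = 2.0, 5.0    # shape and scale of the regular observations (hypothetical)
alpha = [3.0, 4.0]      # contamination constants alpha_1, ..., alpha_k (hypothetical)

good = rng.gamma(shape=mu, scale=sigma, size=n - k)
outliers = np.array([rng.gamma(shape=mu, scale=sigma / a) for a in alpha])

# The positions of the k outliers within the sample are unknown in the model,
# so the combined sample is shuffled before any analysis.
x = rng.permutation(np.concatenate([good, outliers]))
print(x)
```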
For a fixed shape parameter μ, the following prior families of distributions for σ are considered:

(i) Inverted gamma family

    g(\sigma) = \frac{a^{b}}{\Gamma(b)}\, \sigma^{-(b+1)} \exp[-a/\sigma], \qquad \sigma > 0, \ a, b > 0,    (1.2)

(this is a natural conjugate prior for σ (Raiffa [5]));

(ii) Uniform family

    g(\sigma) = (a_2 - a_1)^{-1}, \qquad a_1 < \sigma < a_2;    (1.3)

(iii) Exponential family

    g(\sigma) = \lambda^{-1} \exp[-\sigma/\lambda], \qquad \sigma > 0, \ \lambda > 0;    (1.4)

(iv) Noninformative family

    g(\sigma) \propto \sigma^{-c}, \qquad \sigma > 0, \ c > 0.    (1.5)

For a random shape parameter μ, we consider a mixture of a discrete prior on μ and a conditional inverted gamma prior on σ, as follows:

    \Pr\{\mu = \mu_j\} = p_j, \qquad j = 1, 2, \ldots, m, \ \mu_j > 0,    (1.6)

    g(\sigma \mid \mu = \mu_j) = \frac{\beta_j^{\nu_j}}{\Gamma(\nu_j)}\, \sigma^{-(\nu_j+1)} \exp[-\beta_j/\sigma], \qquad \sigma > 0, \ \beta_j, \nu_j > 0,    (1.6a)

(this is a natural conjugate prior for σ (Raiffa [5])).

Note that all the above priors were considered, partially or totally, in the work of Sinha [2], Lingappaiah [3,4], Bhattacharya [6], Canavos and Tsokos [7], and Lwin and Singh [8]. In the subsequent sections we derive the posterior mean, i.e. the Bayes estimator, and the posterior variance of the scale parameter σ under each of the above priors, respectively.

2. ESTIMATION OF σ WHEN μ IS FIXED.

To work out the Bayes estimator of σ, note that the likelihood function can be expressed as

    L(\sigma) = \sum_{i}\ \prod_{j \notin \{i_1, \ldots, i_k\}} f(x_j; \sigma) \prod_{r=1}^{k} f(x_{i_r}; \sigma/\alpha_r),

where the summation denotes the k-fold sum over the components of the vector i = (i_1, i_2, \ldots, i_k) such that 1 \le i_1 < i_2 < \cdots < i_k \le n. This likelihood function must satisfy the following relation:

    L(\sigma) \propto \sigma^{-n\mu} \sum_{i} \exp[-w_i(\alpha)/\sigma],    (2.1)

where \alpha = (\alpha_1, \alpha_2, \ldots, \alpha_k) and

    w_i(\alpha) = n\bar{x} - \sum_{r=1}^{k} (1 - \alpha_r)\, x_{i_r},    (2.2)

\bar{x} being the sample mean.

Now, the likelihood function and the prior density of σ are used to obtain the posterior density. That is, according to Bayes' theorem, the posterior density of σ is given by

    \pi(\sigma) = \frac{L(\sigma)\, g(\sigma)}{\int_{\Omega} L(\sigma)\, g(\sigma)\, d\sigma},    (2.3)

where Ω is the parameter space containing σ. Consequently, we can find the posterior mean and variance of σ.

Respectively considering the priors (1.2), (1.3), (1.4) and (1.5), the posterior densities of σ are as follows:

(i) \pi(\sigma) = A^{-1}(n\mu, \alpha, a, b) \sum_{i} \exp[-\{a + w_i(\alpha)\}/\sigma]\, \sigma^{-(n\mu+b+1)},    (2.4)

where

    A(n\mu, \alpha, a, b) = \Gamma(n\mu + b) \sum_{i} [a + w_i(\alpha)]^{-(n\mu+b)};    (2.4a)

(ii) \pi(\sigma) = B^{-1}(n\mu, \alpha, a_1, a_2) \sum_{i} \exp[-w_i(\alpha)/\sigma]\, \sigma^{-n\mu}, \qquad a_1 < \sigma < a_2,    (2.5)

where

    B(n\mu, \alpha, a_1, a_2) = \sum_{i} [w_i(\alpha)]^{-(n\mu-1)}\, \gamma^{*}(n\mu - 1, w_i(\alpha)), \qquad n\mu > 1,    (2.5a)

for which

    \gamma^{*}(\tau, z) = \gamma(\tau, z/a_1) - \gamma(\tau, z/a_2),    (2.5b)

and γ(·,·) denotes the well-known incomplete gamma function (Andrews [9]);

(iii) \pi(\sigma) = C^{-1}(n\mu, \alpha, \lambda) \sum_{i} \exp[-\{\sigma/\lambda + w_i(\alpha)/\sigma\}]\, \sigma^{-n\mu},    (2.6)

where

    C(n\mu, \alpha, \lambda) = 2\lambda^{-(n\mu-1)/2} \sum_{i} [w_i(\alpha)]^{-(n\mu-1)/2}\, K_{n\mu-1}\big(2[w_i(\alpha)/\lambda]^{1/2}\big),    (2.6a)

and K_ν(·) is the modified Bessel function of the third kind of order ν (Andrews [9]);

(iv) \pi(\sigma) = D^{-1}(n\mu, \alpha, c) \sum_{i} \sigma^{-(n\mu+c)} \exp[-w_i(\alpha)/\sigma],    (2.7)

where

    D(n\mu, \alpha, c) = \Gamma(n\mu + c - 1) \sum_{i} [w_i(\alpha)]^{-(n\mu+c-1)}, \qquad n\mu + c > 1.    (2.7a)

From the above posterior densities, the posterior means of σ, respectively, are:

(i) \sigma^{*} = \sum_{i} A_i^{*}(\alpha)\, \frac{a + w_i(\alpha)}{n\mu + b - 1}, \qquad n\mu + b > 1,    (2.8)

where

    A_i^{*}(\alpha) = A^{-1}(n\mu, \alpha, a, b)\, \Gamma(n\mu + b)\, [a + w_i(\alpha)]^{-(n\mu+b)};    (2.8a)

(ii) \sigma^{*} = B^{-1}(n\mu, \alpha, a_1, a_2)\, B(n\mu - 1, \alpha, a_1, a_2), \qquad n\mu > 2;    (2.9)

(iii) \sigma^{*} = C^{-1}(n\mu, \alpha, \lambda)\, C(n\mu - 1, \alpha, \lambda);    (2.10)

(iv) \sigma^{*} = \sum_{i} D_i^{*}(\alpha)\, \frac{w_i(\alpha)}{n\mu + c - 2}, \qquad n\mu + c > 2,    (2.11)

where

    D_i^{*}(\alpha) = D^{-1}(n\mu, \alpha, c)\, \Gamma(n\mu + c - 1)\, [w_i(\alpha)]^{-(n\mu+c-1)};    (2.11a)

and the posterior variances of σ are:

(i) \mathrm{Var}(\sigma) = \sum_{i} A_i^{*}(\alpha)\, \frac{[a + w_i(\alpha)]^{2}}{(n\mu + b - 1)(n\mu + b - 2)} - \sigma^{*2}, \qquad n\mu + b > 2;    (2.12)

(ii) \mathrm{Var}(\sigma) = \frac{B(n\mu, \alpha, a_1, a_2)\, B(n\mu - 2, \alpha, a_1, a_2) - B^{2}(n\mu - 1, \alpha, a_1, a_2)}{B^{2}(n\mu, \alpha, a_1, a_2)}, \qquad n\mu > 3;    (2.13)

(iii) \mathrm{Var}(\sigma) = \frac{C(n\mu, \alpha, \lambda)\, C(n\mu - 2, \alpha, \lambda) - C^{2}(n\mu - 1, \alpha, \lambda)}{C^{2}(n\mu, \alpha, \lambda)};    (2.14)

(iv) \mathrm{Var}(\sigma) = \sum_{i} D_i^{*}(\alpha)\, \frac{[w_i(\alpha)]^{2}}{(n\mu + c - 2)(n\mu + c - 3)} - \sigma^{*2}, \qquad n\mu + c > 3.    (2.15)
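Because each estimator above is a finite sum over the possible outlier-position vectors i, it can be evaluated directly. The following Python sketch illustrates this for the inverted gamma prior (1.2), computing the posterior mean (2.8) and variance (2.12); the data and the constants a, b, μ and α_r are hypothetical, and for simplicity the sketch pairs each α_r with the selected positions in increasing index order.

```python
# Purely illustrative evaluation of (2.8) and (2.12) under the inverted gamma
# prior (1.2); all numerical values are hypothetical.
from itertools import combinations
import numpy as np

def bayes_estimate_inverted_gamma(x, alpha, mu, a, b):
    """Posterior mean and variance of sigma, eqs. (2.8) and (2.12)."""
    x = np.asarray(x, dtype=float)
    n, k = len(x), len(alpha)
    nxbar = x.sum()
    # w_i(alpha) of (2.2), one value for each choice of k outlier positions;
    # alpha_r is paired with the chosen positions in increasing index order.
    w = np.array([nxbar - sum((1.0 - a_r) * x[pos] for a_r, pos in zip(alpha, idx))
                  for idx in combinations(range(n), k)])
    # Weights A_i*(alpha) of (2.8a); the Gamma(n*mu + b) factors cancel on normalizing.
    weights = (a + w) ** (-(n * mu + b))
    weights /= weights.sum()
    post_mean = np.sum(weights * (a + w)) / (n * mu + b - 1.0)                  # (2.8)
    post_var = (np.sum(weights * (a + w) ** 2)
                / ((n * mu + b - 1.0) * (n * mu + b - 2.0)) - post_mean ** 2)   # (2.12)
    return post_mean, post_var

# Example call with made-up numbers.
x = [4.1, 7.3, 2.8, 9.0, 1.5, 3.3, 6.2, 0.9, 5.4, 2.2]
print(bayes_estimate_inverted_gamma(x, alpha=[3.0, 4.0], mu=2.0, a=1.0, b=2.0))
```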
3. ESTIMATION OF σ WHEN μ IS A RANDOM VARIABLE.

When both μ and σ are random variables having the mixture prior distributions (1.6) and (1.6a), the likelihood function in this case must satisfy the following relation:

    L(\mu, \sigma) \propto \Big(\prod_{l=1}^{n} x_l\Big)^{\mu-1} \Big(\prod_{r=1}^{k} \alpha_r\Big)^{\mu} \Gamma^{-n}(\mu)\, \sigma^{-n\mu} \sum_{i} \exp[-w_i(\alpha)/\sigma].    (3.1)

This implies that the marginal posterior probability distribution of μ is given by

    p_j^{*}(\alpha) = \frac{p_j'(\alpha)}{\sum_{s=1}^{m} p_s'(\alpha)}, \qquad j = 1, 2, \ldots, m,    (3.2)

where

    p_j'(\alpha) = p_j\, \frac{\big(\prod_{l=1}^{n} x_l\big)^{\mu_j-1} \big(\prod_{r=1}^{k} \alpha_r\big)^{\mu_j} \beta_j^{\nu_j}}{\Gamma^{n}(\mu_j)\, \Gamma(\nu_j)}\, Q(\alpha, \mu_j, \nu_j).    (3.3)

Also, the conditional posterior density of σ given μ = μ_j is given by

    h(\sigma \mid \mu = \mu_j) = Q^{-1}(\alpha, \mu_j, \nu_j) \sum_{i} \exp[-\{w_i(\alpha) + \beta_j\}/\sigma]\, \sigma^{-(n\mu_j+\nu_j+1)},    (3.4)

where

    Q(\alpha, \mu_j, \nu_j) = \Gamma(n\mu_j + \nu_j) \sum_{i} [w_i(\alpha) + \beta_j]^{-(n\mu_j+\nu_j)}.    (3.4a)

Now, the product of (3.2) and (3.4) yields the joint posterior distribution of μ and σ. The (unconditional) posterior mean of σ is of the form

    \sigma^{*} = \sum_{i} \sum_{j=1}^{m} Q_{i,j}^{*}(\alpha)\, \frac{w_i(\alpha) + \beta_j}{n\mu_j + \nu_j - 1},    (3.5)

where

    Q_{i,j}^{*}(\alpha) = p_j^{*}(\alpha)\, Q^{-1}(\alpha, \mu_j, \nu_j)\, \Gamma(n\mu_j + \nu_j)\, [w_i(\alpha) + \beta_j]^{-(n\mu_j+\nu_j)},    (3.5a)

provided nμ_j + ν_j > 1 for all j = 1, 2, ..., m; and the posterior variance is

    \mathrm{Var}(\sigma) = \sum_{i} \sum_{j=1}^{m} Q_{i,j}^{*}(\alpha)\, \frac{[w_i(\alpha) + \beta_j]^{2}}{(n\mu_j + \nu_j - 1)(n\mu_j + \nu_j - 2)} - \sigma^{*2},    (3.6)

provided nμ_j + ν_j > 2 for all j = 1, 2, ..., m.

4. HOMOGENEOUS CASE WHEN μ IS FIXED.

In the case of a simple random sample from a complete life test (no outliers), the posterior mean and variance of σ can be derived from those in Section 2 by letting α_r = 1 for all r = 1, 2, ..., k. Respectively considering the priors (1.2), (1.3), (1.4) and (1.5), the posterior means of σ in the homogeneous case are:

(i) \sigma^{*} = \frac{a + n\bar{x}}{n\mu + b - 1}, \qquad n\mu + b > 1,    (4.1)

where \bar{x} is the sample mean;

(ii) \sigma^{*} = n\bar{x}\, \frac{\gamma^{*}(n\mu - 2, n\bar{x})}{\gamma^{*}(n\mu - 1, n\bar{x})}, \qquad n\mu > 2;    (4.2)

(iii) \sigma^{*} = [n\bar{x}\lambda]^{1/2}\, \frac{K_{n\mu-2}(2[n\bar{x}/\lambda]^{1/2})}{K_{n\mu-1}(2[n\bar{x}/\lambda]^{1/2})};    (4.3)

(iv) \sigma^{*} = \frac{n\bar{x}}{n\mu + c - 2}, \qquad n\mu + c > 2;    (4.4)

and the posterior variances are:

(i) \mathrm{Var}(\sigma) = \frac{(a + n\bar{x})^{2}}{(n\mu + b - 1)^{2}(n\mu + b - 2)}, \qquad n\mu + b > 2;    (4.5)

(ii) \mathrm{Var}(\sigma) = (n\bar{x})^{2}\, \frac{\gamma^{*}(n\mu - 3, n\bar{x})\, \gamma^{*}(n\mu - 1, n\bar{x}) - \gamma^{*2}(n\mu - 2, n\bar{x})}{\gamma^{*2}(n\mu - 1, n\bar{x})}, \qquad n\mu > 3;    (4.6)

(iii) \mathrm{Var}(\sigma) = n\bar{x}\lambda\, \frac{K_{n\mu-3}(2[n\bar{x}/\lambda]^{1/2})\, K_{n\mu-1}(2[n\bar{x}/\lambda]^{1/2}) - K_{n\mu-2}^{2}(2[n\bar{x}/\lambda]^{1/2})}{K_{n\mu-1}^{2}(2[n\bar{x}/\lambda]^{1/2})};    (4.7)

(iv) \mathrm{Var}(\sigma) = \frac{(n\bar{x})^{2}}{(n\mu + c - 2)^{2}(n\mu + c - 3)}, \qquad n\mu + c > 3.    (4.8)

The above results agree with Bhattacharya [6], Canavos and Tsokos [7], and Lwin and Singh [8].

5. HOMOGENEOUS CASE WHEN μ IS A RANDOM VARIABLE.

When μ and σ have the mixture prior distribution given by (1.6) and (1.6a), the posterior mean of σ in the homogeneous case is

    \sigma^{*} = \sum_{j=1}^{m} p_j^{*}\, \frac{n\bar{x} + \beta_j}{n\mu_j + \nu_j - 1}, \qquad n\mu_j + \nu_j > 1,    (5.1)

where

    p_j^{*} = \frac{p_j'}{\sum_{s=1}^{m} p_s'}, \qquad j = 1, 2, \ldots, m,    (5.1a)

with

    p_j' = p_j\, \frac{\big(\prod_{l=1}^{n} x_l\big)^{\mu_j-1} \beta_j^{\nu_j}\, \Gamma(n\mu_j + \nu_j)}{\Gamma^{n}(\mu_j)\, \Gamma(\nu_j)\, (n\bar{x} + \beta_j)^{n\mu_j+\nu_j}},    (5.1b)

and the posterior variance of σ takes the form

    \mathrm{Var}(\sigma) = \sum_{j=1}^{m} p_j^{*}\, \frac{(n\bar{x} + \beta_j)^{2}}{(n\mu_j + \nu_j - 1)(n\mu_j + \nu_j - 2)} - \sigma^{*2}, \qquad n\mu_j + \nu_j > 2.    (5.2)

These results agree with Lwin and Singh [8], and Martz and Waller [10].
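As a numerical illustration of this homogeneous case, the posterior weights of (5.1a)-(5.1b) and the estimators (5.1) and (5.2) can be evaluated directly, preferably on the log scale so that the gamma function factors do not overflow. The Python sketch below does this; the data and the prior constants p_j, μ_j, ν_j and β_j are hypothetical.

```python
# Purely illustrative evaluation of (5.1), (5.1a), (5.1b) and (5.2) for the
# homogeneous case with a random shape parameter; all values are hypothetical.
import math
import numpy as np

def bayes_estimate_random_shape(x, p, mu, nu, beta):
    """Posterior mean and variance of sigma under the mixture prior (1.6)/(1.6a)."""
    x = np.asarray(x, dtype=float)
    n, nxbar = len(x), x.sum()
    log_prod_x = np.log(x).sum()
    # log p_j' as in (5.1b); the log scale avoids overflow of the gamma terms.
    logw = []
    for pj, muj, nuj, bj in zip(p, mu, nu, beta):
        logw.append(math.log(pj)
                    + (muj - 1.0) * log_prod_x
                    + nuj * math.log(bj)
                    + math.lgamma(n * muj + nuj)
                    - n * math.lgamma(muj)
                    - math.lgamma(nuj)
                    - (n * muj + nuj) * math.log(nxbar + bj))
    logw = np.array(logw)
    pstar = np.exp(logw - logw.max())
    pstar /= pstar.sum()                                                    # (5.1a)
    mu, nu, beta = np.asarray(mu), np.asarray(nu), np.asarray(beta)
    post_mean = np.sum(pstar * (nxbar + beta) / (n * mu + nu - 1.0))        # (5.1)
    post_var = (np.sum(pstar * (nxbar + beta) ** 2
                       / ((n * mu + nu - 1.0) * (n * mu + nu - 2.0)))
                - post_mean ** 2)                                           # (5.2)
    return post_mean, post_var

# Example call with made-up numbers.
x = [4.1, 7.3, 2.8, 9.0, 1.5, 3.3, 6.2, 0.9, 5.4, 2.2]
print(bayes_estimate_random_shape(x, p=[0.5, 0.5], mu=[1.0, 2.0],
                                  nu=[2.0, 2.0], beta=[1.0, 1.0]))
```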
ACKNOWLEDGEMENT. I would like to express my thanks to L. Ringuette and the referee for their valuable comments and suggestions for improving the presentation of this paper.

REFERENCES

1. BARNETT, V. and LEWIS, T., Outliers in Statistical Data, Second Edition, John Wiley & Sons, 1978.
2. SINHA, S.K., Life Testing and Reliability Estimation for Non-Homogeneous Data: a Bayesian Approach, Comm. Statist. A-Theory Methods 2(3) (1973), 235-243.
3. LINGAPPAIAH, G.S., Effect of Outliers on the Estimation of Parameters, Metrika 23 (1976), 27-30.
4. LINGAPPAIAH, G.S., Problem of Estimation when the Outliers are Present, Trabajos Estadist. Investigacion Oper. 30(3) (1979), 71-80.
5. RAIFFA, H., Decision Analysis, Addison-Wesley, Reading, MA, 1968.
6. BHATTACHARYA, S.K., Bayesian Approach to Life Testing and Reliability Estimation, J. Amer. Statist. Assoc. 62 (1967), 48-62.
7. CANAVOS, G.C. and TSOKOS, C.P., A Study of an Ordinary and Empirical Bayes Approach to Reliability Estimation in the Gamma Life Testing Model, Proceedings 1971 Annual Reliability and Maintainability Symposium (1971), 343-349.
8. LWIN, T. and SINGH, N., Bayesian Analysis of the Gamma Model in Reliability Estimation, IEEE Trans. Reliability R-23(5) (1974), 314-319.
9. ANDREWS, L.C., Special Functions for Engineers and Applied Mathematicians, Macmillan Publishing Company, 1985.
10. MARTZ, H.F. and WALLER, R.A., Bayesian Reliability Analysis, John Wiley & Sons, 1982.