TRANSFORMATION OF A FUNCTION OF A RANDOM VARIABLE — UNIVARIATE TRANSFORMATIONS

TRANSFORMATION OF RANDOM VARIABLES
• If X is an rv with cdf F(x), then Y = g(X) is also an rv.
• If we write y = g(x), the function g defines a mapping from the original sample space of X, S, to a new sample space Ω, the sample space of the rv Y: g: S → Ω.

TRANSFORMATION OF RANDOM VARIABLES
• Let y = g(x) define a one-to-one transformation. That is, the equation y = g(x) can be solved uniquely: x = g^{-1}(y).
• Ex: Y = X − 1, so X = Y + 1 (one-to-one).
• Ex: Y = X², so X = ±√Y (not one-to-one).
• When the transformation is not one-to-one, find disjoint partitions of S on which the transformation is one-to-one.

TRANSFORMATION OF RANDOM VARIABLES
• If X is a discrete rv, then S is countable. The sample space of Y = g(X) is Ω = {y : y = g(x), x ∈ S}, which is also countable. The pmf of Y is
  f_Y(y) = P(Y = y) = \sum_{x \in g^{-1}(y)} P(X = x) = \sum_{x \in g^{-1}(y)} f(x)

Example
• Let X ~ GEO(p). That is, f(x) = p(1 − p)^{x−1} for x = 1, 2, 3, ...
• Find the pmf of Y = X − 1.
• Solution: X = Y + 1, so f_Y(y) = f_X(y + 1) = p(1 − p)^y for y = 0, 1, 2, ...
• This is the pmf of the number of failures before the first success.
• Recall: X ~ GEO(p) is the pmf of the number of Bernoulli trials required to get the first success.

Example
• Let X be an rv with pmf
  p(x) = \begin{cases} 1/5, & x = -2 \\ 1/6, & x = -1 \\ 1/5, & x = 0 \\ 1/15, & x = 1 \\ 11/30, & x = 2 \end{cases}
• Let Y = X². Then S = {−2, −1, 0, 1, 2}, Ω = {0, 1, 4}, and
  p(y) = \begin{cases} 1/5, & y = 0 \\ 7/30, & y = 1 \\ 17/30, & y = 4 \end{cases}

FUNCTIONS OF A CONTINUOUS RANDOM VARIABLE
• Let X be an rv of the continuous type with pdf f. Let y = g(x) be differentiable for all x, with g′(x) ≠ 0. Then Y = g(X) is also an rv of the continuous type, with pdf
  h(y) = \begin{cases} f\left(g^{-1}(y)\right) \left| \frac{d}{dy} g^{-1}(y) \right|, & y \in \Omega \\ 0, & \text{otherwise} \end{cases}

FUNCTIONS OF A CONTINUOUS RANDOM VARIABLE
• Example: Let X have the density f(x) = 1 for 0 < x < 1 and 0 otherwise. Let Y = e^X. Then X = g^{-1}(Y) = log Y and dx = (1/y) dy, so
  h(y) = 1 · (1/y) for 0 < log y < 1, that is,
  h(y) = \begin{cases} 1/y, & 1 < y < e \\ 0, & \text{otherwise} \end{cases}

FUNCTIONS OF A CONTINUOUS RANDOM VARIABLE
• Example: Let X have the density
  f(x) = \frac{1}{\sqrt{2\pi}} e^{-x^2/2}, \quad -\infty < x < \infty
  Let Y = X². Find the pdf of Y.

CDF METHOD
• Example: Let F(x) = 1 − e^{−2x} for x > 0. Consider Y = e^X. What is the pdf of Y?
• Solution:
  F_Y(y) = P(Y \le y) = P(e^X \le y) = P(X \le \ln y) = F_X(\ln y) = 1 - y^{-2} \quad \text{for } y > 1
  f_Y(y) = \frac{d}{dy} F_Y(y) = 2 y^{-3} \quad \text{for } y > 1

CDF METHOD
• Example: Consider a continuous rv X, and Y = X². Find the pdf of Y.
• Solution:
  F_Y(y) = P(X^2 \le y) = P(-\sqrt{y} \le X \le \sqrt{y}) = F_X(\sqrt{y}) - F_X(-\sqrt{y})
  f_Y(y) = f_X(\sqrt{y}) \frac{d}{dy}(\sqrt{y}) - f_X(-\sqrt{y}) \frac{d}{dy}(-\sqrt{y}) = \frac{1}{2\sqrt{y}} \left[ f_X(\sqrt{y}) + f_X(-\sqrt{y}) \right]

TRANSFORMATION OF A FUNCTION OF TWO OR MORE RANDOM VARIABLES — BIVARIATE TRANSFORMATIONS

DISCRETE CASE
• Let (X1, X2) be a bivariate random vector with a known joint pmf. Consider a new bivariate random vector (U, V) defined by U = g1(X1, X2) and V = g2(X1, X2), where g1(X1, X2) and g2(X1, X2) are some functions of X1 and X2.
• Then the joint pmf of (U, V) is
  f_{U,V}(u, v) = P(U = u, V = v) = \sum_{(x_1, x_2) \in A_{u,v}} f_{X_1, X_2}(x_1, x_2)
  where A_{u,v} = {(x1, x2) : g1(x1, x2) = u, g2(x1, x2) = v}.

EXAMPLE
• Let X1 and X2 be independent Poisson random variables with parameters λ1 and λ2. Find the distribution of U = X1 + X2. A numerical sketch of this convolution is given below.
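A minimal numerical sketch (not from the slides) of the discrete-case formula applied to the Poisson example: summing the joint pmf over the partition {(x1, x2) : x1 + x2 = u} should reproduce the Poisson(λ1 + λ2) pmf. The parameter values λ1 = 2.0 and λ2 = 3.0 are arbitrary choices for illustration.

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    """Poisson pmf: f(k) = e^{-lam} * lam^k / k!"""
    return exp(-lam) * lam**k / factorial(k)

lam1, lam2 = 2.0, 3.0  # arbitrary illustrative parameters

for u in range(10):
    # f_U(u) = sum of f_{X1,X2}(x1, x2) over A_u = {(x1, x2): x1 + x2 = u},
    # using independence: f_{X1,X2}(x1, x2) = f_{X1}(x1) * f_{X2}(x2)
    conv = sum(poisson_pmf(x1, lam1) * poisson_pmf(u - x1, lam2)
               for x1 in range(u + 1))
    direct = poisson_pmf(u, lam1 + lam2)
    print(f"u={u}: convolution={conv:.6f}  Poisson(lam1+lam2)={direct:.6f}")
```

The two columns agree term by term, which is consistent with the claim that the sum of independent Poissons is again Poisson with parameter λ1 + λ2.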
CONTINUOUS CASE
• Let X = (X1, X2, …, Xn) have a continuous joint distribution with joint pdf f, and consider the joint pdf of new random variables Y1, Y2, …, Yk defined as
  Y_1 = g_1(X_1, X_2, \ldots, X_n), \quad Y_2 = g_2(X_1, X_2, \ldots, X_n), \quad \ldots, \quad Y_k = g_k(X_1, X_2, \ldots, X_n) \qquad (*)
  that is, Y = T(X).
• If the transformation T is one-to-one and onto, there is no problem determining the inverse transformation: we can invert the equations in (*) and obtain
  x_1 = g_1^{-1}(y_1, y_2, \ldots, y_k), \quad x_2 = g_2^{-1}(y_1, y_2, \ldots, y_k), \quad \ldots, \quad x_n = g_n^{-1}(y_1, y_2, \ldots, y_k) \qquad (**)
• Assume the partial derivatives \partial g_i^{-1} / \partial y_j exist at every point (y_1, y_2, \ldots, y_k), with k = n. Under these assumptions we have the determinant
  J = \det \begin{pmatrix} \partial g_1^{-1}/\partial y_1 & \cdots & \partial g_1^{-1}/\partial y_n \\ \vdots & & \vdots \\ \partial g_n^{-1}/\partial y_1 & \cdots & \partial g_n^{-1}/\partial y_n \end{pmatrix}
  called the Jacobian of the transformation specified by (**). The joint pdf of Y1, Y2, …, Yk can then be obtained by the change-of-variable technique for multiple variables.
• As a result, the new pdf is
  f_{Y_1, \ldots, Y_n}(y_1, \ldots, y_n) = \begin{cases} f_{X_1, \ldots, X_n}\left(g_1^{-1}, g_2^{-1}, \ldots, g_n^{-1}\right) |J|, & (y_1, \ldots, y_n) \in g(\text{support of } X) \\ 0, & \text{otherwise} \end{cases}

Example
• Recall that I claimed: Let X1, X2, …, Xn be independent rvs with Xi ~ Gamma(αi, β). Then
  \sum_{i=1}^{n} X_i \sim \mathrm{Gamma}\!\left(\sum_{i=1}^{n} \alpha_i, \; \beta\right)
• Prove this for n = 2 (for simplicity).

M.G.F. METHOD
• If X1, X2, …, Xn are independent random variables with MGFs M_{X_i}(t), then the MGF of Y = \sum_{i=1}^{n} X_i is
  M_Y(t) = M_{X_1}(t) \cdots M_{X_n}(t)

Example
• Recall that I claimed: Let Xi ~ Bin(ni, p) be independent. Then
  \sum_{i=1}^{k} X_i \sim \mathrm{Bin}(n_1 + n_2 + \cdots + n_k, \; p)
• Let's prove this.

Example
• Recall that I claimed: Let X1, X2, …, Xn be independent rvs with Xi ~ Gamma(αi, β). Then
  \sum_{i=1}^{n} X_i \sim \mathrm{Gamma}\!\left(\sum_{i=1}^{n} \alpha_i, \; \beta\right)
• We proved this with the transformation technique for n = 2.
• Now prove this for general n.

MORE EXAMPLES ON TRANSFORMATIONS
• Example 1: Recall that I claimed: If X ~ N(μ, σ²), then
  Z = \frac{X - \mu}{\sigma} \sim N(0, 1)
  Let's prove this.

Example 2
• Recall that I claimed: Let X be an rv with X ~ N(0, 1). Then X² ~ χ²(1). Let's prove this.

Example 3
• Let X ~ N(μ, σ²) and Y = exp(X). Find the pdf of Y.

Example 4
• Recall that I claimed: If X and Y have independent N(0, 1) distributions, then Z = X/Y has a Cauchy distribution with θ = 0 and σ = 1. Recall the pdf of the Cauchy distribution:
  f(x) = \frac{1}{\pi \sigma \left[ 1 + \left( \frac{x - \theta}{\sigma} \right)^2 \right]}, \quad \sigma > 0
• Let's prove this claim.

Example 5
• See Examples 6.3.12 and 6.3.13 in Bain and Engelhardt (pages 207 & 208 in the 2nd edition). These illustrate two different transformations of the same pair of variables:
• Example 6.3.12: X1, X2 ~ Exp(1), with Y1 = X1 and Y2 = X1 + X2.
• Example 6.3.13: X1, X2 ~ Exp(1), with Y1 = X1 − X2 and Y2 = X1 + X2.

Example 6
• Let X1 and X2 be independent with N(μ1, σ1²) and N(μ2, σ2²) distributions, respectively. Find the pdf of Y = X1 − X2. A Monte Carlo sketch of the expected result is given below.
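A minimal Monte Carlo sketch (not from the slides) for Example 6: by the standard result for linear combinations of independent normals, Y = X1 − X2 should again be normal with mean μ1 − μ2 and variance σ1² + σ2², which is what the transformation (or MGF) argument derives. The parameter values, sample size, and seed below are arbitrary choices for illustration.

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(0)
mu1, sigma1 = 1.0, 2.0    # arbitrary illustrative parameters
mu2, sigma2 = -0.5, 1.5
n = 200_000

x1 = rng.normal(mu1, sigma1, n)
x2 = rng.normal(mu2, sigma2, n)
y = x1 - x2

# Moments of Y versus the claimed N(mu1 - mu2, sigma1^2 + sigma2^2)
print("sample mean:", y.mean(), " theoretical:", mu1 - mu2)
print("sample var: ", y.var(),  " theoretical:", sigma1**2 + sigma2**2)

# Compare the empirical cdf with the normal cdf at a few points,
# as a rough check of the whole distribution rather than just moments.
mu_y, sd_y = mu1 - mu2, sqrt(sigma1**2 + sigma2**2)
for t in (-2.0, 0.0, 2.0, 4.0):
    emp = (y <= t).mean()
    theo = 0.5 * (1 + erf((t - mu_y) / (sd_y * sqrt(2))))
    print(f"P(Y <= {t}): empirical={emp:.4f}, normal cdf={theo:.4f}")
```

The empirical moments and cdf values should match the normal benchmarks closely for a sample this large; the formal proof is the transformation or MGF argument outlined in the slides above.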