TRANSFORMATION OF RANDOM VARIABLES

TRANSFORMATION OF FUNCTION OF A RANDOM VARIABLE
UNIVARIATE TRANSFORMATIONS
TRANSFORMATION OF RANDOM VARIABLES
• If X is an rv with cdf F(x), then Y=g(X) is
also an rv.
• If we write y=g(x), the function g(x) defines a mapping from the original sample space of X, S, to a new sample space, S_Y, the sample space of the rv Y:
g(x): S → S_Y
TRANSFORMATION OF RANDOM
VARIABLES
• Let y=g(x) define a 1-to-1 transformation. That is, the equation y=g(x) can be solved uniquely: x = g⁻¹(y).
• Ex: Y = X − 1 ⇒ X = Y + 1, which is 1-to-1.
• Ex: Y = X² ⇒ X = ±√Y, which is not 1-to-1.
• When the transformation is not 1-to-1, find disjoint partitions of S on which the transformation is 1-to-1.
TRANSFORMATION OF RANDOM
VARIABLES
If X is a discrete r.v., then S is countable. The sample space for Y=g(X) is S_Y = {y : y = g(x), x ∈ S}, which is also countable. The pmf of Y is
f_Y(y) = P(Y = y) = Σ_{x ∈ g⁻¹(y)} P(X = x) = Σ_{x ∈ g⁻¹(y)} f(x)
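In words, the formula just adds up f(x) over every x that g maps to the same y. A minimal Python sketch of that bookkeeping (the function name and the dict representation of the pmf are my own, not from the slides):

from collections import defaultdict

def pmf_of_transformed(pmf_x, g):
    """Given the pmf of a discrete rv X as a dict {x: P(X=x)} and a function g,
    return the pmf of Y = g(X) as a dict {y: P(Y=y)}."""
    pmf_y = defaultdict(float)
    for x, p in pmf_x.items():
        pmf_y[g(x)] += p          # sum f(x) over all x with g(x) = y
    return dict(pmf_y)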
Example
• Let X~GEO(p). That is, f(x) = p(1−p)^(x−1) for x = 1, 2, 3, ...
• Find the p.m.f. of Y=X-1
• Solution: X=Y+1
f_Y(y) = f_X(y+1) = p(1−p)^y for y = 0, 1, 2, ...
• This is the p.m.f. of the number of failures before the first success.
• Recall: X~GEO(p) means X is the number of Bernoulli trials required to get the first success.
Example
• Let X be an rv with pmf
p(x) = 1/5,   x = −2
       1/6,   x = −1
       1/5,   x = 0
       1/15,  x = 1
       11/30, x = 2
Let Y = X².
S = {−2, −1, 0, 1, 2}
p(y) = 1/5,   y = 0
       7/30,  y = 1
       17/30, y = 4
S_Y = {0, 1, 4}
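As a usage example, the hypothetical pmf_of_transformed sketch from the discrete-case slide above reproduces this result:

pmf_x = {-2: 1/5, -1: 1/6, 0: 1/5, 1: 1/15, 2: 11/30}
print(pmf_of_transformed(pmf_x, lambda x: x**2))
# {4: 0.566..., 1: 0.233..., 0: 0.2}  -> 17/30, 7/30 and 1/5, matching the slide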
FUNCTIONS OF CONTINUOUS
RANDOM VARIABLE
• Let X be an rv of the continuous type with pdf f. Let y=g(x) be differentiable for all x, with g′(x) ≠ 0 for all x. Then, Y=g(X) is also an rv of the continuous type with pdf given by
h(y) = f(g⁻¹(y)) · |d g⁻¹(y)/dy|   for y ∈ S_Y
h(y) = 0   otherwise
FUNCTIONS OF CONTINUOUS
RANDOM VARIABLE
• Example: Let X have the density
f(x) = 1 for 0 < x < 1, and 0 otherwise.
Let Y = e^X. Then X = g⁻¹(Y) = log Y and dx = (1/y) dy, so
h(y) = 1 · (1/y) for 0 < log y < 1,
that is,
h(y) = 1/y for 1 < y < e, and 0 otherwise.
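A quick Monte Carlo sanity check of this density (a sketch added here, assuming numpy is available; not part of the original slides):

import numpy as np

rng = np.random.default_rng(0)
y = np.exp(rng.uniform(0.0, 1.0, size=1_000_000))   # Y = e^X with X ~ Uniform(0,1)

# Compare an empirical interval probability with the integral of 1/y over that interval.
a, b = 1.5, 2.5
print(((y > a) & (y < b)).mean())   # empirical P(a < Y < b)
print(np.log(b) - np.log(a))        # integral of 1/y from a to b, about 0.51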
FUNCTIONS OF CONTINUOUS
RANDOM VARIABLE
• Example: Let X have the density
f(x) = (1/√(2π)) e^(−x²/2),   −∞ < x < ∞.
Let Y = X². Find the pdf of Y.
THE PROBABILITY INTEGRAL
TRANSFORMATION
• Let X have continuous cdf F_X(x) and define the rv Y as Y = F_X(X). Then, Y is uniformly distributed on (0,1), that is,
P(Y ≤ y) = y, 0 < y < 1.
• This is very commonly used, especially in random-number generation procedures.
Example 1
• Generate random numbers from X~
Exp(1/λ) if you only have numbers from
Uniform(0,1).
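A minimal sketch of this inverse-cdf idea, assuming Exp(1/λ) denotes the exponential distribution with mean 1/λ, i.e. cdf F(x) = 1 − e^(−λx) (the function name is my own):

import numpy as np

def exp_from_uniform(u, lam):
    """Map U ~ Uniform(0,1) to X = F^{-1}(U) = -log(1 - U)/lam, so X ~ Exp(mean 1/lam)."""
    return -np.log(1.0 - u) / lam

rng = np.random.default_rng(0)
x = exp_from_uniform(rng.uniform(size=100_000), lam=2.0)
print(x.mean())   # should be close to 1/lam = 0.5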
Example 2
• Generate random numbers from the distribution of X(1) = min(X1, X2, …, Xn), where Xi ~ Exp(1/λ), if you only have numbers from Uniform(0,1).
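A sketch under the usual assumption that the Xi are independent: then P(X(1) > x) = [e^(−λx)]^n = e^(−nλx), so X(1) ~ Exp(1/(nλ)) and a single uniform number per draw is enough (names below are my own):

import numpy as np

def min_exp_from_uniform(u, lam, n):
    """X(1), the min of n independent Exp(mean 1/lam) rvs, is Exp(mean 1/(n*lam));
    invert its cdf F(x) = 1 - exp(-n*lam*x)."""
    return -np.log(1.0 - u) / (n * lam)

rng = np.random.default_rng(0)
x1 = min_exp_from_uniform(rng.uniform(size=100_000), lam=2.0, n=5)
print(x1.mean())   # should be close to 1/(n*lam) = 0.1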
Example 3
• Generate random numbers from the
following distribution:
CDF method
• Example: Let F(x) = 1 − e^(−2x) for x ≥ 0.
Consider Y = e^X. What is the p.d.f. of Y?
• Solution:
F_Y(y) = P(Y ≤ y) = P(e^X ≤ y) = P(X ≤ ln y) = F_X(ln y) = 1 − y^(−2) for y ≥ 1
f_Y(y) = (d/dy) F_Y(y) = 2 y^(−3) for y ≥ 1
CDF method
• Example: Consider a continuous r.v. X,
and Y=X². Find p.d.f. of Y.
• Solution:
F_Y(y) = P(X² ≤ y) = P(−√y ≤ X ≤ √y) = F_X(√y) − F_X(−√y)
f_Y(y) = f_X(√y) (d/dy)(√y) − f_X(−√y) (d/dy)(−√y)
       = (1/(2√y)) [f_X(√y) + f_X(−√y)]
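As a note added here (not on the original slide), this answers the earlier exercise with X ~ N(0,1): since f_X(√y) = f_X(−√y) = (1/√(2π)) e^(−y/2), we get f_Y(y) = (1/√(2πy)) e^(−y/2) for y > 0, which is the χ²(1) density.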
TRANSFORMATION OF FUNCTION OF TWO OR MORE RANDOM VARIABLES
BIVARIATE TRANSFORMATIONS
DISCRETE CASE
• Let (X1, X2) be a bivariate random vector with a known probability distribution. Consider a new bivariate random vector (U, V) defined by U = g1(X1, X2) and V = g2(X1, X2), where g1(X1, X2) and g2(X1, X2) are some functions of X1 and X2.
DISCRETE CASE
• If B is any subset of ℝ², then (U,V) ∈ B iff (X1,X2) ∈ A, where
A_{U,V} = {(x1, x2) : (g1(x1, x2), g2(x1, x2)) ∈ B}
• Then, Pr((U,V) ∈ B) = Pr((X1,X2) ∈ A), and the probability distribution of (U,V) is completely determined by the probability distribution of (X1,X2). The joint pmf of (U,V) is
f_{U,V}(u, v) = Pr(U = u, V = v) = Pr((X1, X2) ∈ A_{U,V}) = Σ_{(x1,x2) ∈ A_{U,V}} f_{X1,X2}(x1, x2)
EXAMPLE
• Let X1 and X2 be independent Poisson random variables with parameters λ1 and λ2. Find the distribution of U = X1 + X2.
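A sketch of the standard solution, added here since the slide leaves it as an exercise: using the discrete formula above with the preimage {(x1, u − x1) : x1 = 0, 1, …, u},
f_U(u) = Σ_{x1=0}^{u} e^(−λ1) λ1^(x1)/x1! · e^(−λ2) λ2^(u−x1)/(u−x1)!
       = e^(−(λ1+λ2))/u! · Σ_{x1=0}^{u} C(u, x1) λ1^(x1) λ2^(u−x1)
       = e^(−(λ1+λ2)) (λ1+λ2)^u / u!,   u = 0, 1, 2, …
so U ~ Poisson(λ1 + λ2).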
CONTINUOUS CASE
• Let X = (X1, X2, …, Xn) have a continuous joint distribution with joint pdf f, and consider the joint pdf of new random variables Y1, Y2, …, Yk defined as
(*)   Y1 = g1(X1, X2, …, Xn)
      Y2 = g2(X1, X2, …, Xn)
      ⋮
      Yk = gk(X1, X2, …, Xn)
The transformation T maps the vector X to the vector Y.
CONTINUOUS CASE
• If the transformation T is one-to-one and onto, then there is no problem in determining the inverse transformation. If A ⊆ ℝⁿ and B ⊆ ℝᵏ with k = n, then T: A → B and T⁻¹(B) = A. It follows that there is a one-to-one correspondence between the points (y1, y2, …, yk) in B and the points (x1, x2, …, xn) in A. Therefore, for (y1, y2, …, yk) ∈ B we can invert the equations in (*) and obtain the new equations:
CONTINUOUS CASE
1
x1  g1  y1 , y2 , , yk  

x2  g 21  y1 , y2 , , yk  
* *


1
xn  g n  y1 , y2 , , yk  n 

1
g i / y i
• Assuming that the partial derivatives
exist at every point (y1, y2,…,yk=n)B. Under
these assumptions, we have the following
determinant J
22
CONTINUOUS CASE
J = det [ ∂g1⁻¹/∂y1  ⋯  ∂g1⁻¹/∂yn
             ⋮               ⋮
          ∂gn⁻¹/∂y1  ⋯  ∂gn⁻¹/∂yn ]
which is called the Jacobian of the transformation specified by (**). Then, the joint pdf of Y1, Y2, …, Yk can be obtained by using the change-of-variable technique for multiple variables.
CONTINUOUS CASE
• As a result, the joint pdf g of (Y1, Y2, …, Yn) is defined as follows:
g(y1, y2, …, yn) = f_{X1,…,Xn}(g1⁻¹(y1, …, yn), g2⁻¹(y1, …, yn), …, gn⁻¹(y1, …, yn)) |J|   for (y1, y2, …, yn) ∈ B
g(y1, y2, …, yn) = 0   otherwise
Example
• Recall that I claimed: Let X1, X2, …, Xn be independent rvs with Xi ~ Gamma(αi, β). Then,
Σ_{i=1}^n Xi ~ Gamma(Σ_{i=1}^n αi, β)
• Prove this for n = 2 (for simplicity).
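A sketch of the n = 2 proof with the Jacobian technique above, assuming the Gamma(α, β) density f(x) = x^(α−1) e^(−x/β) / (Γ(α) β^α), x > 0 (the parameter symbols on the original slide were not legible; the argument is identical for a rate parameterization):
Let U = X1 + X2 and V = X1/(X1 + X2), so x1 = uv, x2 = u(1 − v) and |J| = u. Then, for u > 0 and 0 < v < 1,
f_{U,V}(u, v) = [(uv)^(α1−1) (u(1−v))^(α2−1) e^(−u/β) / (Γ(α1) Γ(α2) β^(α1+α2))] · u
            = [u^(α1+α2−1) e^(−u/β) / (Γ(α1+α2) β^(α1+α2))] · [Γ(α1+α2)/(Γ(α1)Γ(α2)) · v^(α1−1) (1−v)^(α2−1)]
so U ~ Gamma(α1+α2, β) (and, as a by-product, V ~ Beta(α1, α2), independent of U).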
M.G.F. Method
• If X1, X2, …, Xn are independent random variables with MGFs M_{Xi}(t), then the MGF of
Y = Σ_{i=1}^n Xi is M_Y(t) = M_{X1}(t) ⋯ M_{Xn}(t).
Example
• Recall that I claimed: Let X1, X2, …, Xn be independent rvs with Xi ~ Gamma(αi, β). Then,
Σ_{i=1}^n Xi ~ Gamma(Σ_{i=1}^n αi, β)
• We proved this with the transformation technique for n = 2.
• Now, prove this for general n.
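A sketch with the MGF method, under the same Gamma(α, β) parameterization assumed above, for which M_{Xi}(t) = (1 − βt)^(−αi) for t < 1/β:
M_Y(t) = Π_{i=1}^n (1 − βt)^(−αi) = (1 − βt)^(−Σαi),
which is the MGF of Gamma(Σ_{i=1}^n αi, β); since MGFs determine distributions uniquely, Σ Xi ~ Gamma(Σ αi, β).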
Example
• Recall that I claimed:
Let Xi ~ Bin(ni, p) be independent. Then,
Σ_{i=1}^k Xi ~ Bin(n1 + n2 + ⋯ + nk, p)
• Let's prove this.
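A sketch with the MGF method: M_{Xi}(t) = (1 − p + p e^t)^(ni), so
M_{ΣXi}(t) = Π_{i=1}^k (1 − p + p e^t)^(ni) = (1 − p + p e^t)^(n1+⋯+nk),
which is the MGF of Bin(n1 + ⋯ + nk, p). Note that the common success probability p is essential here.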
More Examples on
Transformations
• Example 1:
• Recall the relationship:
If Z = (X − μ)/σ ~ N(0,1), then X ~ N(μ, σ²).
• Let's prove this.
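A one-line sketch with the univariate change-of-variable formula: X = μ + σZ, so g⁻¹(x) = (x − μ)/σ, |d g⁻¹(x)/dx| = 1/σ, and
f_X(x) = f_Z((x − μ)/σ) · (1/σ) = (1/(σ√(2π))) e^(−(x−μ)²/(2σ²)),
which is the N(μ, σ²) density.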
Example 2
• Recall that I claimed:
Let X be an rv with X ~ N(0, 1). Then,
X² ~ χ²(1)
• Let's prove this.
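One route, added here as a sketch: by the Y = X² formula from the CDF-method slide, f_{X²}(y) = (1/√(2πy)) e^(−y/2) for y > 0. Alternatively, by the MGF method, E[e^(tX²)] = ∫ (1/√(2π)) e^(−x²(1−2t)/2) dx = (1 − 2t)^(−1/2) for t < 1/2, which is the MGF of χ²(1). Either way, X² ~ χ²(1).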
Example 3
Recall that I claimed:
• If X and Y have independent N(0,1) distributions, then Z = X/Y has a Cauchy distribution with θ = 0 and σ = 1.
Recall the p.d.f. of the Cauchy distribution:
f(x) = 1 / { πσ [1 + ((x − θ)/σ)²] },   σ > 0
Let's prove this claim.
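A sketch with the bivariate technique, added here: let Z = X/Y and W = Y, so x = zw, y = w and |J| = |w|. Then
f_{Z,W}(z, w) = (1/(2π)) e^(−w²(1+z²)/2) |w|,
and integrating out w gives
f_Z(z) = (1/(2π)) ∫ |w| e^(−w²(1+z²)/2) dw = (1/(2π)) · 2/(1+z²) = 1/(π(1+z²)),
the Cauchy density with θ = 0 and σ = 1.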
Example 4
• See Examples 6.3.12 and 6.3.13 in Bain
and Engelhardt (pages 207 & 208 in 2nd
edition). This is an example of two
different transformations:
• In Example 6.3.12: X1 & X2 ~ Exp(1), with Y1 = X1 and Y2 = X1 + X2.
• In Example 6.3.13: X1 & X2 ~ Exp(1), with Y1 = X1 − X2 and Y2 = X1 + X2.
Example 5
• Let X1 and X2 be independent with N(μ1, σ1²) and N(μ2, σ2²) distributions, respectively. Find the p.d.f. of Y = X1 − X2.
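A sketch with the MGF method, added here: M_Y(t) = E[e^(t(X1−X2))] = M_{X1}(t) M_{X2}(−t) = exp(μ1 t + σ1² t²/2) · exp(−μ2 t + σ2² t²/2) = exp((μ1 − μ2)t + (σ1² + σ2²)t²/2), so Y ~ N(μ1 − μ2, σ1² + σ2²), i.e.
f_Y(y) = (1/√(2π(σ1²+σ2²))) exp(−(y − (μ1 − μ2))² / (2(σ1² + σ2²))).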
Example 6
• Let X ~ N(μ, σ²) and Y = exp(X). Find the p.d.f. of Y.
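A sketch with the CDF method, added here: for y > 0, F_Y(y) = P(e^X ≤ y) = F_X(ln y), so
f_Y(y) = f_X(ln y) · (1/y) = (1/(y σ √(2π))) exp(−(ln y − μ)²/(2σ²)),
the lognormal density.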