Lecture 8

8. One Function of Two Random Variables
Given two random variables X and Y and a function g(x,y),
we form a new random variable Z as
Z = g(X, Y).   (8-1)
Given the joint p.d.f f_XY(x, y), how does one obtain f_Z(z), the p.d.f of Z? Problems of this type are of interest from a practical standpoint. For example, a receiver output signal usually consists of the desired signal buried in noise, and the above formulation in that case reduces to Z = X + Y.
It is important to know the statistics of the incoming signal
for proper receiver design. In this context, we shall analyze
problems of the following type:
X Y
X Y
max( X , Y )
min( X , Y )
Z  g ( X ,Y )
X 2 Y 2
XY
(8-2)
X /Y
tan 1 ( X / Y )
Referring back to (8-1), to start with
F_Z(z) = P\{Z(\xi) \le z\} = P\{g(X,Y) \le z\} = P\{(X,Y) \in D_z\} = \iint_{(x,y) \in D_z} f_{XY}(x,y)\,dx\,dy,   (8-3)
where D_z in the xy plane represents the region such that g(x, y) ≤ z is satisfied. Note that D_z need not be simply connected (Fig. 8.1). From (8-3), to determine F_Z(z) it is enough to find the region D_z for every z, and then evaluate the integral there.
We shall illustrate this method through various examples.
[Fig. 8.1: a region D_z in the xy plane; D_z may consist of several disjoint pieces.]
Example 8.1: Z = X + Y. Find f_Z(z).
Solution:
F_Z(z) = P\{X + Y \le z\} = \int_{y=-\infty}^{\infty} \int_{x=-\infty}^{z-y} f_{XY}(x,y)\,dx\,dy,   (8-4)
since the region D_z of the xy plane where x + y ≤ z is the shaded area in Fig. 8.2 to the left of the line x + y = z. Integrating over the horizontal strip along the x-axis first (inner integral), followed by sliding that strip along the y-axis from −∞ to +∞ (outer integral), we cover the entire shaded area.
[Fig. 8.2: the half-plane x ≤ z − y, to the left of the line x + y = z.]
We can find f_Z(z) by differentiating F_Z(z) directly. In this context, it is useful to recall the differentiation rule in (7-15)-(7-16) due to Leibnitz. Suppose
H(z) = \int_{a(z)}^{b(z)} h(x,z)\,dx.   (8-5)
Then
\frac{dH(z)}{dz} = \frac{db(z)}{dz}\,h\big(b(z),z\big) - \frac{da(z)}{dz}\,h\big(a(z),z\big) + \int_{a(z)}^{b(z)} \frac{\partial h(x,z)}{\partial z}\,dx.   (8-6)
Using (8-6) in (8-4) we get
f_Z(z) = \int_{-\infty}^{\infty} \left( \frac{\partial}{\partial z} \int_{-\infty}^{z-y} f_{XY}(x,y)\,dx \right) dy = \int_{-\infty}^{\infty} \left( f_{XY}(z-y,\,y) - 0 + \int_{-\infty}^{z-y} \frac{\partial f_{XY}(x,y)}{\partial z}\,dx \right) dy = \int_{-\infty}^{\infty} f_{XY}(z-y,\,y)\,dy,   (8-7)
where the last inner integral vanishes since f_XY(x, y) does not depend on z.
Alternatively, the integration in (8-4) can be carried out first along the y-axis and then along the x-axis, as in Fig. 8.3.
In that case
F_Z(z) = \int_{x=-\infty}^{\infty} \int_{y=-\infty}^{z-x} f_{XY}(x,y)\,dy\,dx,   (8-8)
and differentiation of (8-8) gives
f_Z(z) = \frac{dF_Z(z)}{dz} = \int_{x=-\infty}^{\infty} \left( \frac{\partial}{\partial z} \int_{y=-\infty}^{z-x} f_{XY}(x,y)\,dy \right) dx = \int_{x=-\infty}^{\infty} f_{XY}(x,\,z-x)\,dx.   (8-9)
[Fig. 8.3: the same region x + y ≤ z, covered by vertical strips.]
If X and Y are independent, then
f_{XY}(x,y) = f_X(x)\,f_Y(y)   (8-10)
and inserting (8-10) into (8-7) and (8-9), we get
f_Z(z) = \int_{y=-\infty}^{\infty} f_X(z-y)\,f_Y(y)\,dy = \int_{x=-\infty}^{\infty} f_X(x)\,f_Y(z-x)\,dx.   (8-11)
The above integral is the standard convolution of the functions f_X(z) and f_Y(z), expressed in two different ways. We thus reach the following conclusion: if two r.vs are independent, then the density of their sum equals the convolution of their density functions.
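To see the convolution statement (8-11) in action, here is a minimal numerical sketch (not part of the lecture); it assumes NumPy and SciPy are available, and the chosen densities, sample size, and grid are purely illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
X = rng.exponential(scale=1.0, size=200_000)   # f_X: exponential with mean 1
Y = rng.uniform(0.0, 1.0, size=200_000)        # f_Y: uniform on (0, 1)
Z = X + Y

# Evaluate the convolution integral (8-11), f_Z(z) = ∫ f_X(z - y) f_Y(y) dy,
# on a grid; f_Y = 1 on (0, 1), so the integral runs over y in (0, 1).
z = np.linspace(0.0, 6.0, 400)
y = np.linspace(0.0, 1.0, 2001)
f_conv = np.trapz(stats.expon.pdf(z[:, None] - y[None, :]), y, axis=1)

# Compare with a histogram estimate of the density of Z.
hist, edges = np.histogram(Z, bins=60, range=(0.0, 6.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(np.interp(centers, z, f_conv) - hist)))  # small, up to sampling noise
```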
As a special case, suppose that f_X(x) = 0 for x < 0 and f_Y(y) = 0 for y < 0; then we can make use of Fig. 8.4 to determine the new limits for D_z.
[Fig. 8.4: the triangular region 0 ≤ x ≤ z − y, 0 ≤ y ≤ z, bounded by the line x + y = z with intercepts (z, 0) and (0, z).]
In that case
F_Z(z) = \int_{y=0}^{z} \int_{x=0}^{z-y} f_{XY}(x,y)\,dx\,dy
or
f_Z(z) = \int_{y=0}^{z} \left( \frac{\partial}{\partial z} \int_{x=0}^{z-y} f_{XY}(x,y)\,dx \right) dy = \begin{cases} \int_{0}^{z} f_{XY}(z-y,\,y)\,dy, & z > 0, \\ 0, & z \le 0. \end{cases}   (8-12)
On the other hand, by considering vertical strips first in
Fig. 8.4, we get
F_Z(z) = \int_{x=0}^{z} \int_{y=0}^{z-x} f_{XY}(x,y)\,dy\,dx
or
f_Z(z) = \int_{x=0}^{z} f_{XY}(x,\,z-x)\,dx = \begin{cases} \int_{0}^{z} f_X(x)\,f_Y(z-x)\,dx, & z > 0, \\ 0, & z \le 0, \end{cases}   (8-13)
if X and Y are independent random variables.
Example 8.2: Suppose X and Y are independent exponential r.vs with common parameter λ, and let Z = X + Y. Determine f_Z(z).
Solution: We have
f_X(x) = \lambda e^{-\lambda x}\,U(x), \qquad f_Y(y) = \lambda e^{-\lambda y}\,U(y),   (8-14)
and we can make use of (8-13) to obtain the p.d.f of Z = X + Y:
f_Z(z) = \int_{0}^{z} \lambda e^{-\lambda x}\,\lambda e^{-\lambda(z-x)}\,dx = \lambda^2 e^{-\lambda z} \int_{0}^{z} dx = z\,\lambda^2 e^{-\lambda z}\,U(z).   (8-15)
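As a quick sanity check (an added sketch, not part of the lecture), (8-15) is the Gamma(2, 1/λ) density; assuming NumPy and SciPy, with an arbitrary λ:

```python
import numpy as np
from scipy import stats

lam = 2.0
rng = np.random.default_rng(1)
Z = rng.exponential(1 / lam, 500_000) + rng.exponential(1 / lam, 500_000)

# Compare (8-15) with SciPy's Gamma(2, scale=1/λ) density, then test the samples.
z = np.linspace(0.01, 5.0, 200)
print(np.allclose(lam**2 * z * np.exp(-lam * z),
                  stats.gamma.pdf(z, a=2, scale=1 / lam)))          # True
print(stats.kstest(Z, stats.gamma(a=2, scale=1 / lam).cdf).pvalue)  # not small
```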
As the next example shows, care should be taken in using
the convolution formula for r.vs with finite range.
Example 8.3: X and Y are independent uniform r.vs in the
common interval (0, 1). Determine f_Z(z), where Z = X + Y.
Solution: Clearly, since Z = X + Y, we have 0 ≤ z ≤ 2 here, and as Fig. 8.5 shows, there are two cases of z for which the shaded areas are quite different in shape; they should be considered separately.
[Fig. 8.5: the region x + y ≤ z inside the unit square, for (a) 0 ≤ z < 1 and (b) 1 ≤ z < 2.]
For 0 ≤ z < 1,
F_Z(z) = \int_{y=0}^{z} \int_{x=0}^{z-y} 1\,dx\,dy = \int_{y=0}^{z} (z-y)\,dy = \frac{z^2}{2}, \qquad 0 \le z < 1.   (8-16)
For 1 ≤ z < 2, notice that it is easy to deal with the unshaded region. In that case
F_Z(z) = 1 - P\{Z > z\} = 1 - \int_{y=z-1}^{1} \int_{x=z-y}^{1} 1\,dx\,dy = 1 - \int_{y=z-1}^{1} (1-z+y)\,dy = 1 - \frac{(2-z)^2}{2}, \qquad 1 \le z < 2.   (8-17)
Using (8-16) - (8-17), we obtain
f_Z(z) = \frac{dF_Z(z)}{dz} = \begin{cases} z, & 0 \le z < 1, \\ 2-z, & 1 \le z < 2. \end{cases}   (8-18)
By direct convolution of f_X(x) and f_Y(y), we obtain the same result as above. In fact, for 0 ≤ z < 1 (Fig. 8.6(a)),
f_Z(z) = \int f_X(z-x)\,f_Y(x)\,dx = \int_{0}^{z} 1\,dx = z,   (8-19)
and for 1 ≤ z < 2 (Fig. 8.6(b)),
f_Z(z) = \int_{z-1}^{1} 1\,dx = 2-z.   (8-20)
Fig. 8.6(c) shows f_Z(z), which agrees with the convolution of two rectangular waveforms as well.
[Fig. 8.6: the overlap of f_X(z − x) and f_Y(x) for (a) 0 ≤ z < 1 and (b) 1 ≤ z < 2; (c) the resulting triangular density f_Z(z) on (0, 2).]
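A short simulation sketch (added here for illustration, assuming NumPy; the sample size is arbitrary) reproduces the triangular density (8-18):

```python
import numpy as np

rng = np.random.default_rng(2)
Z = rng.uniform(size=1_000_000) + rng.uniform(size=1_000_000)

hist, edges = np.histogram(Z, bins=50, range=(0.0, 2.0), density=True)
z = 0.5 * (edges[:-1] + edges[1:])
f_triangle = np.where(z < 1, z, 2 - z)     # the triangular density (8-18)
print(np.max(np.abs(hist - f_triangle)))   # ≈ 0 up to sampling noise
```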
Example 8.3: Let Z = X − Y. Determine its p.d.f f_Z(z).
Solution: From (8-3) and Fig. 8.7,
F_Z(z) = P\{X - Y \le z\} = \int_{y=-\infty}^{\infty} \int_{x=-\infty}^{z+y} f_{XY}(x,y)\,dx\,dy,
and hence
f_Z(z) = \frac{dF_Z(z)}{dz} = \int_{y=-\infty}^{\infty} \left( \frac{\partial}{\partial z} \int_{x=-\infty}^{z+y} f_{XY}(x,y)\,dx \right) dy = \int_{-\infty}^{\infty} f_{XY}(z+y,\,y)\,dy.   (8-21)
If X and Y are independent, then the above formula reduces to
f_Z(z) = \int_{-\infty}^{\infty} f_X(z+y)\,f_Y(y)\,dy = f_X(-z) \ast f_Y(z),   (8-22)
which represents the convolution of f_X(−z) with f_Y(z).
[Fig. 8.7: the region x − y ≤ z, bounded by the line x − y = z.]
As a special case, suppose f_X(x) = 0 for x < 0 and f_Y(y) = 0 for y < 0.
In this case, Z can be negative as well as positive, and that gives rise to two situations that should be analyzed separately, since the regions of integration for z ≥ 0 and z < 0 are quite different. For z ≥ 0, from Fig. 8.8(a),
F_Z(z) = \int_{y=0}^{\infty} \int_{x=0}^{z+y} f_{XY}(x,y)\,dx\,dy,
and for z < 0, from Fig. 8.8(b),
F_Z(z) = \int_{y=-z}^{\infty} \int_{x=0}^{z+y} f_{XY}(x,y)\,dx\,dy.
After differentiation, this gives
f_Z(z) = \begin{cases} \int_{0}^{\infty} f_{XY}(z+y,\,y)\,dy, & z \ge 0, \\ \int_{-z}^{\infty} f_{XY}(z+y,\,y)\,dy, & z < 0. \end{cases}   (8-23)
[Fig. 8.8: the region 0 ≤ x ≤ z + y in the first quadrant, for (a) z ≥ 0 and (b) z < 0.]
Example 8.4: Given Z = X/Y, obtain its density function.
Solution: We have
F_Z(z) = P\{X/Y \le z\}.   (8-24)
The inequality X/Y ≤ z can be rewritten as X ≤ Yz if Y > 0, and X ≥ Yz if Y < 0. Hence the event {X/Y ≤ z} in (8-24) needs to be conditioned on the event A = {Y > 0} and its complement \bar{A}. Since A ∪ \bar{A} = S, by the partition theorem we have
\{X/Y \le z\} = \{X/Y \le z\} \cap (A \cup \bar{A}) = \{X/Y \le z,\, A\} \cup \{X/Y \le z,\, \bar{A}\},
and hence, by the mutually exclusive property of the latter two events,
P\{X/Y \le z\} = P\{X/Y \le z,\, Y > 0\} + P\{X/Y \le z,\, Y < 0\} = P\{X \le Yz,\, Y > 0\} + P\{X \ge Yz,\, Y < 0\}.   (8-25)
Fig. 8.9(a) shows the area corresponding to the first term in (8-25), and Fig. 8.9(b) shows that corresponding to the second term.
[Fig. 8.9: (a) the region x ≤ yz with y > 0; (b) the region x ≥ yz with y < 0.]
Integrating over these two regions, we get
F_Z(z) = \int_{y=0}^{\infty} \int_{x=-\infty}^{yz} f_{XY}(x,y)\,dx\,dy + \int_{y=-\infty}^{0} \int_{x=yz}^{\infty} f_{XY}(x,y)\,dx\,dy.   (8-26)
Differentiation with respect to z gives
f_Z(z) = \int_{0}^{\infty} y\,f_{XY}(yz,\,y)\,dy + \int_{-\infty}^{0} (-y)\,f_{XY}(yz,\,y)\,dy = \int_{-\infty}^{\infty} |y|\,f_{XY}(yz,\,y)\,dy, \qquad -\infty < z < \infty.   (8-27)
Note that if X and Y are nonnegative random variables, then
the area of integration reduces to that shown in Fig. 8.10.
[Fig. 8.10: the region 0 ≤ x ≤ yz, y ≥ 0.]
This gives
F_Z(z) = \int_{y=0}^{\infty} \int_{x=0}^{yz} f_{XY}(x,y)\,dx\,dy
or
f_Z(z) = \begin{cases} \int_{0}^{\infty} y\,f_{XY}(yz,\,y)\,dy, & z > 0, \\ 0, & \text{otherwise}. \end{cases}   (8-28)
Example 8.5: X and Y are jointly normal random variables
with zero mean so that
f_{XY}(x,y) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-r^2}}\, e^{ -\frac{1}{2(1-r^2)} \left( \frac{x^2}{\sigma_1^2} - \frac{2rxy}{\sigma_1\sigma_2} + \frac{y^2}{\sigma_2^2} \right) }.   (8-29)
Show that the ratio Z = X/Y has a Cauchy density function centered at rσ1/σ2.
Solution: Inserting (8-29) into (8-27) and using the fact that f_XY(−x, −y) = f_XY(x, y), we obtain
f_Z(z) = \frac{2}{2\pi\sigma_1\sigma_2\sqrt{1-r^2}} \int_{0}^{\infty} y\, e^{-y^2/2\sigma_0^2(z)}\,dy = \frac{\sigma_0^2(z)}{\pi\sigma_1\sigma_2\sqrt{1-r^2}},
where
\frac{1}{\sigma_0^2(z)} = \frac{1}{1-r^2} \left( \frac{z^2}{\sigma_1^2} - \frac{2rz}{\sigma_1\sigma_2} + \frac{1}{\sigma_2^2} \right).
Thus
f_Z(z) = \frac{\sigma_1\sigma_2\sqrt{1-r^2}/\pi}{\sigma_2^2\,(z - r\sigma_1/\sigma_2)^2 + \sigma_1^2(1-r^2)},   (8-30)
which represents a Cauchy r.v centered at rσ1/σ2. Integrating (8-30) from −∞ to z, we obtain the corresponding distribution function to be
F_Z(z) = \frac{1}{2} + \frac{1}{\pi} \arctan\!\left( \frac{\sigma_2 z - r\sigma_1}{\sigma_1\sqrt{1-r^2}} \right).   (8-31)
Example 8.6: Z = X^2 + Y^2. Obtain f_Z(z).
Solution: We have
F_Z(z) = P\{X^2 + Y^2 \le z\} = \iint_{x^2+y^2 \le z} f_{XY}(x,y)\,dx\,dy.   (8-32)
But X^2 + Y^2 ≤ z represents the area of a circle with radius √z, and hence from Fig. 8.11,
F_Z(z) = \int_{y=-\sqrt{z}}^{\sqrt{z}} \int_{x=-\sqrt{z-y^2}}^{\sqrt{z-y^2}} f_{XY}(x,y)\,dx\,dy.   (8-33)
This gives after repeated differentiation
f_Z(z) = \int_{y=-\sqrt{z}}^{\sqrt{z}} \frac{1}{2\sqrt{z-y^2}} \left[ f_{XY}\!\left(\sqrt{z-y^2},\,y\right) + f_{XY}\!\left(-\sqrt{z-y^2},\,y\right) \right] dy.   (8-34)
As an illustration, consider the next example.
[Fig. 8.11: the disc x^2 + y^2 ≤ z of radius √z centered at the origin.]
Example 8.7: X and Y are independent normal r.vs with zero mean and common variance σ^2. Determine f_Z(z) for Z = X^2 + Y^2.
Solution: Direct substitution of (8-29) with r = 0, σ1 = σ2 = σ into (8-34) gives
f_Z(z) = \int_{y=-\sqrt{z}}^{\sqrt{z}} \frac{1}{2\sqrt{z-y^2}} \cdot \frac{2}{2\pi\sigma^2}\, e^{-(z-y^2+y^2)/2\sigma^2}\,dy = \frac{e^{-z/2\sigma^2}}{2\pi\sigma^2} \int_{-\sqrt{z}}^{\sqrt{z}} \frac{dy}{\sqrt{z-y^2}}
       = \frac{e^{-z/2\sigma^2}}{2\pi\sigma^2} \int_{-\pi/2}^{\pi/2} \frac{\sqrt{z}\cos\theta}{\sqrt{z}\cos\theta}\,d\theta = \frac{1}{2\sigma^2}\, e^{-z/2\sigma^2}\,U(z),   (8-35)
where we have used the substitution y = √z sin θ. From (8-35) we have the following result: if X and Y are independent zero mean Gaussian r.vs with common variance σ^2, then X^2 + Y^2 is an exponential r.v with parameter 2σ^2.
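A brief numerical check of this statement (an added sketch with illustrative values, assuming NumPy and SciPy): the exponential in (8-35) has mean 2σ².

```python
import numpy as np
from scipy import stats

sigma = 1.5
rng = np.random.default_rng(4)
Z = rng.normal(0, sigma, 1_000_000)**2 + rng.normal(0, sigma, 1_000_000)**2

# (8-35): exponential density (1/2σ²) e^{-z/2σ²}, i.e. mean 2σ².
print(Z.mean(), 2 * sigma**2)                                       # both ≈ 4.5
print(stats.kstest(Z, stats.expon(scale=2 * sigma**2).cdf).pvalue)  # not small
```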
Example 8.8: Let Z = \sqrt{X^2 + Y^2}. Find f_Z(z).
Solution: From Fig. 8.11, the present case corresponds to a circle with radius z. Thus
F_Z(z) = \int_{y=-z}^{z} \int_{x=-\sqrt{z^2-y^2}}^{\sqrt{z^2-y^2}} f_{XY}(x,y)\,dx\,dy,
and by repeated differentiation, we obtain
f_Z(z) = \int_{y=-z}^{z} \frac{z}{\sqrt{z^2-y^2}} \left[ f_{XY}\!\left(\sqrt{z^2-y^2},\,y\right) + f_{XY}\!\left(-\sqrt{z^2-y^2},\,y\right) \right] dy.   (8-36)
Now suppose X and Y are independent Gaussian as in
Example 8.7. In that case, (8-36) simplifies to
f_Z(z) = 2\int_{0}^{z} \frac{z}{\sqrt{z^2-y^2}}\,\frac{1}{\pi\sigma^2}\, e^{-(z^2-y^2+y^2)/2\sigma^2}\,dy = \frac{2z\,e^{-z^2/2\sigma^2}}{\pi\sigma^2} \int_{0}^{z} \frac{dy}{\sqrt{z^2-y^2}}
       = \frac{2z\,e^{-z^2/2\sigma^2}}{\pi\sigma^2} \int_{0}^{\pi/2} \frac{z\cos\theta}{z\cos\theta}\,d\theta = \frac{z}{\sigma^2}\, e^{-z^2/2\sigma^2}\,U(z),   (8-37)
which represents a Rayleigh distribution. Thus, if W = X + iY, where X and Y are real, independent normal r.vs with zero mean and equal variance, then the r.v |W| = \sqrt{X^2 + Y^2} has a Rayleigh density. W is said to be a complex Gaussian r.v with zero mean, whose real and imaginary parts are independent r.vs. From (8-37), we have seen that its magnitude has a Rayleigh distribution.
What about its phase
\theta = \tan^{-1}\!\left( \frac{X}{Y} \right)?   (8-38)
Clearly, the principal value of θ lies in the interval (−π/2, π/2). If we let U = tan θ = X/Y, then from Example 8.5, U has a Cauchy distribution with (see (8-30) with σ1 = σ2, r = 0)
f_U(u) = \frac{1/\pi}{u^2 + 1}, \qquad -\infty < u < \infty.   (8-39)
As a result
f_\theta(\theta) = \frac{1}{|d\theta/du|}\, f_U(\tan\theta) = \frac{1/\pi}{(1/\sec^2\theta)(\tan^2\theta + 1)} = \begin{cases} 1/\pi, & -\pi/2 < \theta < \pi/2, \\ 0, & \text{otherwise}. \end{cases}   (8-40)
To summarize, the magnitude and phase of a zero mean complex Gaussian r.v have Rayleigh and uniform distributions respectively. Interestingly, as we will show later, these two derived r.vs are also independent of each other!
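A simulation sketch of this summary (added for illustration; NumPy/SciPy assumed, parameters arbitrary): the magnitude should pass a Rayleigh test per (8-37) and the principal-value phase a uniform test on (−π/2, π/2) per (8-40).

```python
import numpy as np
from scipy import stats

sigma = 2.0
rng = np.random.default_rng(5)
X = rng.normal(0, sigma, 500_000)
Y = rng.normal(0, sigma, 500_000)

mag = np.hypot(X, Y)        # sqrt(X^2 + Y^2), Rayleigh per (8-37)
phase = np.arctan(X / Y)    # principal value in (-pi/2, pi/2), uniform per (8-40)

print(stats.kstest(mag, stats.rayleigh(scale=sigma).cdf).pvalue)
print(stats.kstest(phase, stats.uniform(loc=-np.pi / 2, scale=np.pi).cdf).pvalue)
```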
Let us reconsider Example 8.8 where X and Y have nonzero means μ_X and μ_Y respectively. Then Z = \sqrt{X^2 + Y^2} is said to be a Rician r.v. Such a scenario arises in a fading multipath situation where there is a dominant constant component (mean) in addition to a zero mean Gaussian r.v. The constant component may be the line-of-sight signal and the zero mean Gaussian r.v part could be due to random multipath components adding up incoherently (see the diagram below). The envelope of such a signal is said to have a Rician p.d.f.
Example 8.9: Redo Example 8.8, where X and Y have nonzero means μ_X and μ_Y respectively.
[Diagram: a constant line-of-sight signal plus multipath/Gaussian noise gives a Rician output.]
Solution: Since
f_{XY}(x,y) = \frac{1}{2\pi\sigma^2}\, e^{-[(x-\mu_X)^2 + (y-\mu_Y)^2]/2\sigma^2},
substituting this into (8-36) and letting
x = z\cos\theta, \quad y = z\sin\theta, \quad \mu = \sqrt{\mu_X^2 + \mu_Y^2}, \quad \mu_X = \mu\cos\phi, \quad \mu_Y = \mu\sin\phi,
we get the Rician probability density function to be
f_Z(z) = \frac{z\,e^{-(z^2+\mu^2)/2\sigma^2}}{2\pi\sigma^2} \int_{-\pi/2}^{\pi/2} \left( e^{z\mu\cos(\theta-\phi)/\sigma^2} + e^{-z\mu\cos(\theta+\phi)/\sigma^2} \right) d\theta
       = \frac{z\,e^{-(z^2+\mu^2)/2\sigma^2}}{2\pi\sigma^2} \left( \int_{-\pi/2}^{\pi/2} e^{z\mu\cos(\theta-\phi)/\sigma^2}\,d\theta + \int_{\pi/2}^{3\pi/2} e^{z\mu\cos(\theta-\phi)/\sigma^2}\,d\theta \right)
       = \frac{z\,e^{-(z^2+\mu^2)/2\sigma^2}}{\sigma^2}\, I_0\!\left( \frac{z\mu}{\sigma^2} \right),   (8-41)
where
I_0(\eta) = \frac{1}{2\pi} \int_{0}^{2\pi} e^{\eta\cos(\theta-\phi)}\,d\theta = \frac{1}{\pi} \int_{0}^{\pi} e^{\eta\cos\theta}\,d\theta   (8-42)
is the modified Bessel function of the first kind and zeroth
order.
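For a numerical cross-check of (8-41) (an added sketch, not from the lecture), note that SciPy's rice distribution uses the shape b = μ/σ together with scale σ, which matches (8-41); the means and σ below are illustrative.

```python
import numpy as np
from scipy import stats

muX, muY, sigma = 1.0, 2.0, 0.7
rng = np.random.default_rng(6)
Z = np.hypot(rng.normal(muX, sigma, 500_000), rng.normal(muY, sigma, 500_000))

mu = np.hypot(muX, muY)   # mu = sqrt(muX^2 + muY^2), as in (8-41)
print(stats.kstest(Z, stats.rice(b=mu / sigma, scale=sigma).cdf).pvalue)  # not small
```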
Example 8.10: Z = max(X, Y), W = min(X, Y). Determine f_Z(z).
Solution: The functions max and min are nonlinear
operators and represent special cases of the more general order statistics. In general, given any n-tuple X_1, X_2, ..., X_n, we can arrange them in increasing order of magnitude such that
X_{(1)} \le X_{(2)} \le \cdots \le X_{(n)},   (8-43)
where X_{(1)} = min{X_1, X_2, ..., X_n}, X_{(2)} is the second smallest value among X_1, X_2, ..., X_n, and finally X_{(n)} = max{X_1, X_2, ..., X_n}. If X_1, X_2, ..., X_n represent r.vs, the function X_{(k)} that takes on the value x_{(k)} in each possible sequence (x_1, x_2, ..., x_n) is known as the k-th order statistic. (X_{(1)}, X_{(2)}, ..., X_{(n)}) represents the set of order statistics among n random variables. In this context
R = X_{(n)} - X_{(1)}   (8-44)
represents the range, and when n = 2, we have the max and min statistics.
Returning to the problem, since
Z = \max(X, Y) = \begin{cases} X, & X > Y, \\ Y, & X \le Y, \end{cases}   (8-45)
we have (see also (8-25))
F_Z(z) = P\{\max(X,Y) \le z\} = P\{(X \le z,\, X > Y) \cup (Y \le z,\, X \le Y)\} = P\{X \le z,\, X > Y\} + P\{Y \le z,\, X \le Y\},
since (X > Y) and (X ≤ Y) are mutually exclusive sets that form a partition. Figs. 8.12(a)-(b) show the regions satisfying the corresponding inequalities in each term above.
xz
x y
X z
x
X Y
(a ) P( X  z, X  Y )
x y
X Y
yz

x
( z, z )

x
Y z
(b) P(Y  z, X  Y )
Fig. 8.12
(c )
26
PILLAI
Fig. 8.12 (c) represents the total region, and from there
F_Z(z) = P\{X \le z,\, Y \le z\} = F_{XY}(z, z).   (8-46)
If X and Y are independent, then
F_Z(z) = F_X(z)\,F_Y(z)
and hence
f_Z(z) = F_X(z)\,f_Y(z) + f_X(z)\,F_Y(z).   (8-47)
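A minimal check of (8-46)-(8-47) for independent r.vs (an added sketch; NumPy/SciPy assumed, the two normal laws below are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
X = rng.normal(0.0, 1.0, 500_000)
Y = rng.normal(1.0, 2.0, 500_000)
Z = np.maximum(X, Y)

# (8-46): F_Z(z) = F_X(z) F_Y(z) for independent X and Y.
F_Z = lambda z: stats.norm.cdf(z, 0.0, 1.0) * stats.norm.cdf(z, 1.0, 2.0)
print(stats.kstest(Z, F_Z).pvalue)   # not small
```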
Similarly,
W = \min(X, Y) = \begin{cases} Y, & X > Y, \\ X, & X \le Y. \end{cases}   (8-48)
Thus
F_W(w) = P\{\min(X,Y) \le w\} = P\{(Y \le w,\, X > Y) \cup (X \le w,\, X \le Y)\}.
Once again, the shaded areas in Fig. 8.13 (a)-(b) show the
regions satisfying the above inequalities and Fig 8.13 (c)
shows the overall region.
[Fig. 8.13: (a) the region {Y ≤ w, X > Y}; (b) the region {X ≤ w, X ≤ Y}; (c) the combined region, the complement of the quadrant {X > w, Y > w} with corner at (w, w).]
From Fig. 8.13 (c),
F_W(w) = 1 - P\{W > w\} = 1 - P\{X > w,\, Y > w\} = F_X(w) + F_Y(w) - F_{XY}(w, w),   (8-49)
where we have made use of (7-5) and (7-12) with x_2 = y_2 = \infty and x_1 = y_1 = w.
Example 8.11: Let X and Y be independent exponential r.vs
with common parameter λ. Define W = min(X, Y). Find f_W(w).
Solution: From (8-49)
F_W(w) = F_X(w) + F_Y(w) - F_X(w)\,F_Y(w)
and hence
f_W(w) = f_X(w) + f_Y(w) - f_X(w)\,F_Y(w) - F_X(w)\,f_Y(w).
But f_X(w) = f_Y(w) = \lambda e^{-\lambda w} and F_X(w) = F_Y(w) = 1 - e^{-\lambda w}, so that
f_W(w) = 2\lambda e^{-\lambda w} - 2\lambda(1 - e^{-\lambda w})\,e^{-\lambda w} = 2\lambda e^{-2\lambda w}\,U(w).   (8-50)
Thus min(X, Y) is also exponential with parameter 2λ.
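A quick simulation sketch of (8-50) (added here; NumPy/SciPy assumed, λ arbitrary):

```python
import numpy as np
from scipy import stats

lam = 1.3
rng = np.random.default_rng(8)
W = np.minimum(rng.exponential(1 / lam, 500_000),
               rng.exponential(1 / lam, 500_000))

# (8-50): W is exponential with parameter 2*lam, i.e. mean 1/(2*lam).
print(stats.kstest(W, stats.expon(scale=1 / (2 * lam)).cdf).pvalue)  # not small
```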
Example 8.12: Suppose X and Y are as given in the above example. Define Z = min(X, Y)/max(X, Y). Determine f_Z(z).
Solution: Although min(·)/max(·) represents a complicated function, by partitioning the whole space as before, it is possible to simplify this function. In fact
Z = \begin{cases} X/Y, & X \le Y, \\ Y/X, & X > Y. \end{cases}   (8-51)
As before, this gives
F_Z(z) = P\{Z \le z\} = P\{X/Y \le z,\, X \le Y\} + P\{Y/X \le z,\, X > Y\} = P\{X \le Yz,\, X \le Y\} + P\{Y \le Xz,\, X > Y\}.   (8-52)
Since X and Y are both positive random variables in this case, we have 0 < z < 1. The shaded regions in Figs. 8.14(a)-(b) represent the two terms in the above sum.
[Fig. 8.14: (a) the region {x ≤ yz, x ≤ y}; (b) the region {y ≤ xz, y ≤ x}, both wedges bounded by the line x = y.]
From Fig. 8.14
F_Z(z) = \int_{y=0}^{\infty} \int_{x=0}^{yz} f_{XY}(x,y)\,dx\,dy + \int_{x=0}^{\infty} \int_{y=0}^{xz} f_{XY}(x,y)\,dy\,dx.   (8-53)
Hence
f_Z(z) = \int_{0}^{\infty} y\,f_{XY}(yz,\,y)\,dy + \int_{0}^{\infty} x\,f_{XY}(x,\,xz)\,dx = \int_{0}^{\infty} y \left[ f_{XY}(yz,\,y) + f_{XY}(y,\,yz) \right] dy
       = \int_{0}^{\infty} y \cdot 2\lambda^2 e^{-\lambda(1+z)y}\,dy = \frac{2}{(1+z)^2} \int_{0}^{\infty} u\,e^{-u}\,du = \begin{cases} \dfrac{2}{(1+z)^2}, & 0 < z < 1, \\ 0, & \text{otherwise}. \end{cases}   (8-54)
[Fig. 8.15: the density f_Z(z) = 2/(1+z)^2 on (0, 1), decreasing from 2 at z = 0 to 1/2 at z = 1.]
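A simulation sketch of (8-54) (added for illustration, assuming NumPy; note the result does not depend on λ, so unit-rate exponentials suffice):

```python
import numpy as np

rng = np.random.default_rng(9)
X = rng.exponential(size=1_000_000)
Y = rng.exponential(size=1_000_000)
Z = np.minimum(X, Y) / np.maximum(X, Y)

hist, edges = np.histogram(Z, bins=50, range=(0.0, 1.0), density=True)
z = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - 2 / (1 + z)**2)))   # ≈ 0 up to sampling noise, per (8-54)
```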
Example 8.13 (Discrete Case): Let X and Y be independent Poisson random variables with parameters λ1 and λ2 respectively. Let Z = X + Y. Determine the p.m.f of Z.
Solution: Since X and Y both take integer values 0, 1, 2, ..., the same is true for Z. For any n = 0, 1, 2, ..., the event X + Y = n gives only a finite number of options for X and Y. In fact, if X = 0, then Y must be n; if X = 1, then Y must be n − 1, etc. Thus the event {X + Y = n} is the union of (n + 1) mutually exclusive events A_k given by
A_k = \{X = k,\ Y = n-k\}, \qquad k = 0, 1, 2, \ldots, n.   (8-55)
As a result
P(Z = n) = P(X + Y = n) = P\!\left( \bigcup_{k=0}^{n} \{X = k,\, Y = n-k\} \right) = \sum_{k=0}^{n} P(X = k,\, Y = n-k).   (8-56)
If X and Y are also independent, then
P\{X = k,\, Y = n-k\} = P(X = k)\,P(Y = n-k)
and hence
P(Z = n) = \sum_{k=0}^{n} P(X = k,\, Y = n-k) = \sum_{k=0}^{n} e^{-\lambda_1} \frac{\lambda_1^k}{k!}\, e^{-\lambda_2} \frac{\lambda_2^{n-k}}{(n-k)!}
         = \frac{e^{-(\lambda_1+\lambda_2)}}{n!} \sum_{k=0}^{n} \frac{n!}{k!\,(n-k)!}\, \lambda_1^k \lambda_2^{n-k} = e^{-(\lambda_1+\lambda_2)}\, \frac{(\lambda_1+\lambda_2)^n}{n!}, \qquad n = 0, 1, 2, \ldots   (8-57)
Thus Z represents a Poisson random variable with parameter λ1 + λ2, indicating that the sum of independent Poisson random variables is also a Poisson random variable whose parameter is the sum of the parameters of the original random variables.
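As a final cross-check of (8-57) (an added sketch; NumPy/SciPy assumed, λ1 and λ2 arbitrary), the empirical p.m.f of the simulated sum matches Poisson(λ1 + λ2):

```python
import numpy as np
from scipy import stats

lam1, lam2 = 2.0, 3.5
rng = np.random.default_rng(10)
Z = rng.poisson(lam1, 1_000_000) + rng.poisson(lam2, 1_000_000)

n = np.arange(25)
empirical = np.bincount(Z, minlength=n.size)[:n.size] / Z.size
print(np.max(np.abs(empirical - stats.poisson.pmf(n, lam1 + lam2))))  # small, per (8-57)
```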
As the last example illustrates, the above procedure for determining the p.m.f of functions of discrete random variables is somewhat tedious. As we shall see in Lecture 10, the joint characteristic function can be used in this context to solve problems of this type in an easier fashion.