3. Several Random Variables

3.1 Two Random Variables
3.2 Conditional Probability--Revisited
3.3 Statistical Independence
3.4 Correlation between Random Variables
3.5 Density Function of the Sum of Two Random Variables
3.6 Probability Density Function of a Function of Two Random Variables
3.7 The Characteristic Function
Concepts

- Two Dimensional Random Variables
- Probability in Two Dimensions, Conditional Probability--Revisited
- Statistical Independence
- Two Dimensional Statistics, Correlation between Random Variables
- Density Function of the Linear Combination of Two Random Variables
- Multi-input Electrical Circuits
- Simulating Convolution Integrals
Notes and figures are based on or taken from materials in the course textbook: Probabilistic Methods of Signal and System Analysis (3rd ed.) by George R. Cooper and Clare D. McGillem; Oxford Press, 1999. ISBN: 0-19-512354-9.
B.J. Bazuin, Spring 2015, ECE 3800
Joint Probability Distribution Function (PDF)
Probability Distribution Function: the probability of the event that the observed random variable X is less than or equal to the allowed value x and that the observed random variable Y is less than or equal to the allowed value y.

F(x, y) = \Pr[X \le x, Y \le y]

The defined function can be discrete or continuous along the x- and y-axes. Constraints on the probability distribution function are:

1. 0 \le F(x, y) \le 1, for -\infty < x < \infty and -\infty < y < \infty
2. F(-\infty, y) = F(x, -\infty) = F(-\infty, -\infty) = 0
3. F(\infty, \infty) = 1
4. F(x, y) is non-decreasing as either x or y increases
5. F(x, \infty) = F_X(x) and F(\infty, y) = F_Y(y)
Analogies:

- a 2-dimensional probability
- moving from scalars to vectors (2 or more elements)
- Calc 3 as compared to Calc 1 & 2
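
As a quick illustration of the distribution-function properties above, the following Python sketch (an illustration only, not from the textbook) tabulates F(x, y) numerically for an assumed joint pdf, the uniform density on the unit square, and spot-checks properties 1, 3, 4, and 5.

import numpy as np

n = 400
x = np.linspace(0.0, 1.0, n)
y = np.linspace(0.0, 1.0, n)
dx = x[1] - x[0]
dy = y[1] - y[0]

# Assumed example pdf: f(x, y) = 1 on the unit square (uniform), 0 elsewhere.
f = np.ones((n, n))                                   # f[i, j] ~ f(x[i], y[j])

# F(x, y): cumulative double "integral" of f from the lower-left corner.
F = np.cumsum(np.cumsum(f, axis=0), axis=1) * dx * dy

i = np.searchsorted(x, 0.3)
print("F(1, 1) ~", F[-1, -1])                         # property 3: ~1
print("F(0.3, 1) ~ F_X(0.3) =", F[i, -1])             # property 5: ~0.3
print("0 <= F <= 1:", (F.min() >= 0.0) and (F.max() <= 1.01))         # property 1
print("non-decreasing in x:", bool(np.all(np.diff(F, axis=0) >= 0)))  # property 4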
Joint Probability Density Function (pdf)
The derivative of the probability distribution function is the density function
f(x, y) = \frac{\partial^2 F(x, y)}{\partial x \, \partial y}

Properties of the pdf include:

1. f(x, y) \ge 0, for -\infty < x < \infty and -\infty < y < \infty
2. \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x, y)\,dx\,dy = 1
   Note: the "volume" of the 2-D density function is one.
3. F(x, y) = \int_{-\infty}^{y}\int_{-\infty}^{x} f(u, v)\,du\,dv
4. f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy and f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx
5. \Pr[x_1 < X \le x_2, y_1 < Y \le y_2] = \int_{y_1}^{y_2}\int_{x_1}^{x_2} f(x, y)\,dx\,dy
Expected Values
E[g(X, Y)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} g(x, y)\,f(x, y)\,dx\,dy

All expected values may be computed using the joint pdf.

Correlation

Correlation is defined as a new expected value:

E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,y\,f(x, y)\,dx\,dy

It is a value describing how correlated two random variables are with each other.
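
This expected-value integral can be evaluated numerically. The sketch below (not from the textbook; the pdf and g are assumed examples) uses scipy's dblquad to compute E[g(X, Y)] for g(x, y) = x*y under a uniform pdf on the unit square.

from scipy import integrate

def f(x, y):
    # assumed example joint pdf: uniform on the unit square
    return 1.0 if (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0) else 0.0

def g(x, y):
    return x * y                     # g(X, Y) = X*Y gives the correlation E[XY]

# dblquad integrates func(y, x) with x as the outer variable; limits are 0..1 each.
val, err = integrate.dblquad(lambda y, x: g(x, y) * f(x, y), 0.0, 1.0, 0.0, 1.0)
print("E[XY] ~", val)                # 0.25 for this uniform pdf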
Uniform Density Example
The uniform density function in two dimensions can be defined as:
f_{X,Y}(x, y) = \frac{1}{(x_2 - x_1)(y_2 - y_1)}, for x_1 \le x \le x_2 and y_1 \le y \le y_2
             = 0, else

Determine the density in y:

f_Y(y) = \int_{x_1}^{x_2} f_{X,Y}(x, y)\,dx

f_Y(y) = \int_{x_1}^{x_2} \frac{1}{(x_2 - x_1)(y_2 - y_1)}\,dx = \frac{x\,\Big|_{x_1}^{x_2}}{(x_2 - x_1)(y_2 - y_1)}

f_Y(y) = \frac{x_2 - x_1}{(x_2 - x_1)(y_2 - y_1)} = \frac{1}{y_2 - y_1}, for y_1 \le y \le y_2

Similarly,

f_X(x) = \frac{1}{x_2 - x_1}, for x_1 \le x \le x_2
Correlation

E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,y\,f(x, y)\,dx\,dy

E[XY] = \int_{y_1}^{y_2}\int_{x_1}^{x_2} \frac{x\,y}{(x_2 - x_1)(y_2 - y_1)}\,dx\,dy

E[XY] = \int_{y_1}^{y_2} \frac{y}{(x_2 - x_1)(y_2 - y_1)}\,\frac{x^2}{2}\Big|_{x_1}^{x_2}\,dy = \int_{y_1}^{y_2} \frac{y\,(x_2^2 - x_1^2)}{2\,(x_2 - x_1)(y_2 - y_1)}\,dy

E[XY] = \frac{x_2^2 - x_1^2}{2\,(x_2 - x_1)(y_2 - y_1)}\,\frac{y^2}{2}\Big|_{y_1}^{y_2} = \frac{(x_2^2 - x_1^2)(y_2^2 - y_1^2)}{4\,(x_2 - x_1)(y_2 - y_1)}

E[XY] = \frac{(x_2 + x_1)}{2}\,\frac{(y_2 + y_1)}{2} = \frac{1}{4}\,(x_2 + x_1)(y_2 + y_1)
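
A quick Monte Carlo check of this result (not from the textbook; the rectangle limits are assumed example values):

import numpy as np

rng = np.random.default_rng(0)
x1, x2, y1, y2 = 1.0, 3.0, -2.0, 4.0                  # assumed example limits
X = rng.uniform(x1, x2, size=1_000_000)               # independent uniforms
Y = rng.uniform(y1, y2, size=1_000_000)

print("Monte Carlo E[XY]:", np.mean(X * Y))
print("(x1+x2)(y1+y2)/4 :", (x1 + x2) * (y1 + y2) / 4.0)   # = 2.0 here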
Exercise 3-1.1
Parts (a) and (b) would appear to have the same result. It would seem that the "random errors" should be independent, so the 2nd and 3rd solution values make sense. Where the first solution value came from, I do not know.
Exercise 3-1.2
f_{X,Y}(x, y) = A \exp(-(2x + 3y)), for 0 \le x and 0 \le y
             = 0, else

Determine A:

1 = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x, y)\,dx\,dy = \int_{0}^{\infty}\int_{0}^{\infty} A \exp(-(2x + 3y))\,dx\,dy

1 = A \int_{0}^{\infty} \exp(-3y) \int_{0}^{\infty} \exp(-2x)\,dx\,dy = A \int_{0}^{\infty} \exp(-3y) \left[\frac{-\exp(-2x)}{2}\right]_{0}^{\infty} dy

1 = A \int_{0}^{\infty} \frac{1}{2}\exp(-3y)\,dy = A\,\frac{1}{2}\left[\frac{-\exp(-3y)}{3}\right]_{0}^{\infty} = A\,\frac{1}{2}\,\frac{1}{3}

A = 6

Determine the Distribution Function:

F(x, y) = \int_{0}^{y}\int_{0}^{x} 6 \exp(-(2u + 3v))\,du\,dv

F(x, y) = 6 \int_{0}^{y} \exp(-3v)\left[\int_{0}^{x} \exp(-2u)\,du\right] dv

F(x, y) = 6 \int_{0}^{y} \exp(-3v)\left[\frac{1}{2} - \frac{\exp(-2x)}{2}\right] dv

F(x, y) = 6\left[\frac{1}{2} - \frac{\exp(-2x)}{2}\right]\left[\frac{1}{3} - \frac{\exp(-3y)}{3}\right] = \left(1 - \exp(-2x)\right)\left(1 - \exp(-3y)\right)

Then,

F\left(x = \frac{1}{2},\; y = \frac{1}{4}\right) = \left(1 - \exp(-1)\right)\left(1 - \exp\left(-\frac{3}{4}\right)\right)

Finally, determine the correlation
E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,y\,f(x, y)\,dx\,dy

E[XY] = \int_{0}^{\infty}\int_{0}^{\infty} x\,y\,6\exp(-(2x + 3y))\,dx\,dy

Given \int x \exp(-a x)\,dx = \frac{-\exp(-a x)}{a^2}\,(a x + 1):

E[XY] = 6\left[\frac{-\exp(-2x)}{4}\,(2x + 1)\right]_{0}^{\infty}\left[\frac{-\exp(-3y)}{9}\,(3y + 1)\right]_{0}^{\infty}

E[XY] = 6\left[\frac{1}{4}\,(2 \cdot 0 + 1)\right]\left[\frac{1}{9}\,(3 \cdot 0 + 1)\right]

E[XY] = 6\,\frac{1}{4}\,\frac{1}{9} = \frac{1}{6}
Conditional Probability (Again, with multiple r.v.)
Using the Probability Distribution Function (PDF), define
F_X(x \mid Y \le y) = \Pr[X \le x \mid M] = \frac{F(x, y)}{F_Y(y)}, where the conditioning event is M = \{Y \le y\} and \Pr[M] = F_Y(y).

Another way:

F_X(x \mid y_1 < Y \le y_2) = \frac{F(x, y_2) - F(x, y_1)}{F_Y(y_2) - F_Y(y_1)}

Leading to,

f_X(x \mid Y = y) = \frac{f(x, y)}{f_Y(y)} and f_Y(y \mid X = x) = \frac{f(x, y)}{f_X(x)}
These are different from the probability of a continuous distribution taking on a single value in X or Y, since for continuous random variables \Pr[X = x] = 0 and \Pr[Y = y] = 0.
An engineering derivation follows:
F_X(x \mid Y = y) = \lim_{\Delta y \to 0} \frac{F(x, y + \Delta y) - F(x, y)}{F_Y(y + \Delta y) - F_Y(y)}
                 = \lim_{\Delta y \to 0} \frac{\left[F(x, y + \Delta y) - F(x, y)\right]/\Delta y}{\left[F_Y(y + \Delta y) - F_Y(y)\right]/\Delta y}
                 = \frac{\partial F(x, y)/\partial y}{\partial F_Y(y)/\partial y}

F_X(x \mid Y = y) = \frac{\partial F(x, y)/\partial y}{f_Y(y)} = \frac{\int_{-\infty}^{x} f(u, y)\,du}{f_Y(y)}

[Note: Eq. (3-9) is in error; the integral is to x, not infinity.]

Then

\frac{\partial}{\partial x} F_X(x \mid Y = y) = \frac{\partial^2 F(x, y)/\partial y\,\partial x}{f_Y(y)} = \frac{f(x, y)}{f_Y(y)}
The corresponding conditional density function is
f(x \mid Y = y) = \frac{f(x, y)}{f_Y(y)}

and similarly it can be shown that

f(y \mid X = x) = \frac{f(x, y)}{f_X(x)}

[Note: Eq. (3-13) is in error; the denominator is a function of x.]

From these equations, it can be seen that

f(x, y) = f(x \mid Y = y)\,f_Y(y) = f(y \mid X = x)\,f_X(x)

Applying the total probability concept to the joint density defines the x and y marginal densities:

f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy and f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx

Then, from the conditional density relationship with the joint density,

f(x, y) = f(x \mid Y = y)\,f_Y(y) = f(y \mid X = x)\,f_X(x)

we can replace the joint density functions in the total probability equations to define the marginal densities of x and y in terms of the conditional densities:

f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx = \int_{-\infty}^{\infty} f(y \mid X = x)\,f_X(x)\,dx

or

f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy = \int_{-\infty}^{\infty} f(x \mid Y = y)\,f_Y(y)\,dy
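A minimal numerical sketch of the conditional density relationship (illustration only; the grid extent, spacing, and the conditioning value y0 are assumptions): form f(x | Y = y0) = f(x, y0)/f_Y(y0) for the Exercise 3-1.2 pdf and confirm that it integrates to one.

import numpy as np

x = np.linspace(0.0, 10.0, 20001)
dx = x[1] - x[0]
y0 = 0.5                                     # assumed conditioning value Y = y0

f_xy = 6.0 * np.exp(-(2.0 * x + 3.0 * y0))   # joint pdf along the slice Y = y0
f_Y = 3.0 * np.exp(-3.0 * y0)                # marginal f_Y(y0), as given in these notes

f_cond = f_xy / f_Y                          # f(x | Y = y0)
print("integral of f(x | Y = y0):", np.sum(f_cond) * dx)               # ~1
print("equals f_X(x)?", bool(np.allclose(f_cond, 2.0 * np.exp(-2.0 * x))))
# True here because this particular joint pdf factors (X and Y are independent,
# as shown in the Statistical Independence section below).
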
To derive the multiple variable Bayes Theorem, use
f(x, y) = f(x \mid Y = y)\,f_Y(y) = f(y \mid X = x)\,f_X(x)

resulting in

f(x \mid Y = y) = \frac{f(y \mid X = x)\,f_X(x)}{f_Y(y)}

or

f(y \mid X = x) = \frac{f(x \mid Y = y)\,f_Y(y)}{f_X(x)}

Note: the joint probability density function completely specifies:
- both marginal density functions and
- both conditional density functions.

Warning:
The example on pp. 127 and 128 does not make sense without information from pp. 136-137. In particular, Eq. 3-17 implies that you know how to form f_Y(y) when this concept has not been introduced. It should be revisited after discussing the pdf of "the sum of two random variables."
Statistical Independence
f(x, y) = f_X(x) \cdot f_Y(y)

We have already seen this in the two examples presented.

Joint Uniform Densities

f_{X,Y}(x, y) = \frac{1}{(x_2 - x_1)(y_2 - y_1)}, for x_1 \le x \le x_2 and y_1 \le y \le y_2

Where we proved

f_Y(y) = \frac{1}{y_2 - y_1}, for y_1 \le y \le y_2

and

f_X(x) = \frac{1}{x_2 - x_1}, for x_1 \le x \le x_2

Therefore

f_{X,Y}(x, y) = f_X(x)\,f_Y(y) = \left(\frac{1}{x_2 - x_1}\right)\left(\frac{1}{y_2 - y_1}\right), for x_1 \le x \le x_2 and y_1 \le y \le y_2

Exercise 3-1.2

f_{X,Y}(x, y) = A \exp(-(2x + 3y)), for 0 \le x and 0 \le y

Where it turns out that

f_{X,Y}(x, y) = 2\exp(-2x) \cdot 3\exp(-3y), for 0 \le x and 0 \le y

As

f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx = \int_{0}^{\infty} 2\exp(-2x) \cdot 3\exp(-3y)\,dx

f_Y(y) = 3\exp(-3y)\int_{0}^{\infty} 2\exp(-2x)\,dx = 3\exp(-3y) \cdot 2\left[\frac{-\exp(-2x)}{2}\right]_{0}^{\infty}

f_Y(y) = 3\exp(-3y)

and

f_X(x) = 2\exp(-2x)
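
A short numerical check (illustration only) that this joint pdf factors into the product of its marginals on a grid, i.e., that X and Y are statistically independent:

import numpy as np

x = np.linspace(0.0, 5.0, 101)
y = np.linspace(0.0, 5.0, 101)
X, Y = np.meshgrid(x, y, indexing="ij")

f_joint = 6.0 * np.exp(-(2.0 * X + 3.0 * Y))
f_X = 2.0 * np.exp(-2.0 * X)
f_Y = 3.0 * np.exp(-3.0 * Y)

# True on every grid point -> the joint pdf factors, so X and Y are independent.
print(bool(np.allclose(f_joint, f_X * f_Y)))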
Definition of correlation for X and Y independent:

E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,y\,f(x, y)\,dx\,dy = \int_{-\infty}^{\infty} x\,f_X(x)\,dx \int_{-\infty}^{\infty} y\,f_Y(y)\,dy

E[XY] = E[X]\,E[Y] = \bar{X}\,\bar{Y}

As another consequence, the conditional density is simplified as

f(x \mid Y = y) = \frac{f(y \mid X = x)\,f_X(x)}{f_Y(y)} = \frac{f_Y(y)\,f_X(x)}{f_Y(y)} = f_X(x)

and similarly

f(y \mid X = x) = \frac{f(x \mid Y = y)\,f_Y(y)}{f_X(x)} = \frac{f_X(x)\,f_Y(y)}{f_X(x)} = f_Y(y)

[Note: the first equation on p. 131 is in error; the denominator is the density function.]

In general, if independence can be established, or even assumed, the computations to be performed become much easier!
Example p. 126 revisited for independence and correlation.
f_{X,Y}(x, y) = \frac{6}{5}\left(1 - x^2 y\right), for 0 \le x \le 1 and 0 \le y \le 1

Where the marginal densities are

f_Y(y) = \int_{0}^{1} \frac{6}{5}\left(1 - x^2 y\right) dx = \frac{6}{5}\left[x - \frac{x^3}{3}\,y\right]_{0}^{1}

f_Y(y) = \frac{6}{5}\left(1 - \frac{y}{3}\right)

and

f_X(x) = \int_{0}^{1} \frac{6}{5}\left(1 - x^2 y\right) dy = \frac{6}{5}\left[y - x^2\,\frac{y^2}{2}\right]_{0}^{1}

f_X(x) = \frac{6}{5}\left(1 - \frac{x^2}{2}\right)

Note that f(x, y) \ne f_X(x)\,f_Y(y); therefore the variables are not independent!

From computations:

E[Y] = \frac{7}{15}, \quad E[Y^2] = \frac{3}{10}, \quad E[X] = \frac{9}{20}, \quad E[X^2] = \frac{7}{25}

Then the correlation value is

E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,y\,f(x, y)\,dx\,dy

E[XY] = \frac{6}{5}\int_{0}^{1}\int_{0}^{1} x\,y\left(1 - x^2 y\right) dy\,dx = \frac{6}{5}\int_{0}^{1} x\left[\frac{y^2}{2} - x^2\,\frac{y^3}{3}\right]_{0}^{1} dx

E[XY] = \frac{6}{5}\int_{0}^{1}\left(\frac{x}{2} - \frac{x^3}{3}\right) dx = \frac{6}{5}\left[\frac{x^2}{4} - \frac{x^4}{12}\right]_{0}^{1}

E[XY] = \frac{6}{5}\left(\frac{1}{4} - \frac{1}{12}\right) = \frac{1}{5}

And again E[XY] \ne \bar{X}\,\bar{Y}; equality would hold for independent random variables.
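
A numerical cross-check of these moments (illustration only, using scipy): E[Y] = 7/15, E[X] = 9/20, E[XY] = 1/5, and E[XY] differs from E[X]E[Y].

from scipy import integrate

f = lambda y, x: 1.2 * (1.0 - x**2 * y)       # (6/5)(1 - x^2 y); dblquad wants f(y, x)

ex, _ = integrate.dblquad(lambda y, x: x * f(y, x), 0.0, 1.0, 0.0, 1.0)
ey, _ = integrate.dblquad(lambda y, x: y * f(y, x), 0.0, 1.0, 0.0, 1.0)
exy, _ = integrate.dblquad(lambda y, x: x * y * f(y, x), 0.0, 1.0, 0.0, 1.0)

print("E[X]     =", ex)          # 0.45   = 9/20
print("E[Y]     =", ey)          # 0.4667 = 7/15
print("E[XY]    =", exy)         # 0.2    = 1/5
print("E[X]E[Y] =", ex * ey)     # 0.21 -> not equal to E[XY], so X and Y are dependent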
Example 3-3.1
f_{X,Y}(x, y) = k \exp(-(x + y - 1)), for 0 \le x < \infty and 1 \le y < \infty

There is no a, unless the problem was supposed to be stated as

f_{X,Y}(x, y) = k \exp(-(x + y - a)), for 0 \le x < \infty and 1 \le y < \infty

but then k and a are not necessarily computed separately as 1 and 1.

Overall, the correct pdf is

f_{X,Y}(x, y) = \exp(-(x + y - 1)), for 0 \le x < \infty and 1 \le y < \infty

Correlation

E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,y\,f(x, y)\,dx\,dy

E[XY] = \int_{1}^{\infty}\int_{0}^{\infty} x\,y\,\exp(-(x + y - 1))\,dx\,dy

Given \int x \exp(-a x)\,dx = \frac{-\exp(-a x)}{a^2}\,(a x + 1):

E[XY] = \exp(1)\int_{1}^{\infty} y \exp(-y)\left[\frac{\exp(-0)}{1^2}\,(0 + 1)\right] dy = \exp(1)\int_{1}^{\infty} y \exp(-y) \cdot 1\,dy

E[XY] = \exp(1)\left[\frac{\exp(-1)}{1^2}\,(1 + 1)\right] = \exp(1) \cdot \exp(-1) \cdot 2 = 2
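A numerical check of Example 3-3.1 (illustration only): with k = 1 the pdf integrates to one over the stated region, and the correlation E[XY] evaluates to 2.

import numpy as np
from scipy import integrate

f = lambda y, x: np.exp(-(x + y - 1.0))       # dblquad integrand order is f(y, x)

vol, _ = integrate.dblquad(f, 0.0, np.inf, 1.0, np.inf)     # x: 0..inf, y: 1..inf
exy, _ = integrate.dblquad(lambda y, x: x * y * f(y, x), 0.0, np.inf, 1.0, np.inf)

print("total probability:", vol)   # ~1.0, consistent with k = 1
print("E[XY]            :", exy)   # ~2.0
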
Exercise 3-3.2
Assume X and Y are independent with

f_X(x) = 0.5\exp(-|x - 1|), for -\infty < x < \infty
f_Y(y) = 0.5\exp(-|y - 1|), for -\infty < y < \infty

Find \Pr[XY > 0], that is, the probability that the product of the random variables is positive.

\Pr[XY > 0] = \Pr[X > 0]\,\Pr[Y > 0] + \Pr[X < 0]\,\Pr[Y < 0]

\Pr[XY > 0] = \left(1 - F_X(0)\right)\left(1 - F_Y(0)\right) + F_X(0)\,F_Y(0)

Find the distributions for X and Y based on the ranges defined by the absolute value:

F_X(x) = \int_{-\infty}^{x} 0.5\exp(-|u - 1|)\,du

For -\infty < x \le 1:

F_X(x) = \int_{-\infty}^{x} 0.5\exp(u - 1)\,du = 0.5\exp(x - 1)

For 1 \le x < \infty:

F_X(x) = 0.5 + \int_{1}^{x} 0.5\exp(1 - u)\,du = 0.5 + 0.5\left(1 - \exp(1 - x)\right)

Then

F_X(0) = F_Y(0) = 0.5\exp(-1) = 0.1839

\Pr[XY > 0] = \left(1 - F_X(0)\right)\left(1 - F_Y(0)\right) + F_X(0)\,F_Y(0) = 0.6998
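
A Monte Carlo sanity check of this result (illustration only): the given densities are Laplace densities with location 1 and scale 1, so samples can be drawn directly with numpy.

import numpy as np

rng = np.random.default_rng(1)
N = 2_000_000
X = rng.laplace(loc=1.0, scale=1.0, size=N)   # density 0.5*exp(-|x - 1|)
Y = rng.laplace(loc=1.0, scale=1.0, size=N)

print("Monte Carlo Pr[XY > 0]:", np.mean(X * Y > 0))
print("analytic value        :", (1 - 0.5 * np.exp(-1)) ** 2 + (0.5 * np.exp(-1)) ** 2)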
Correlation and Covariance between Random Variables
The definition of correlation was given as

E[XY] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,y\,f(x, y)\,dx\,dy

But most of the time we are not interested in the product of the mean values (which is what the correlation reduces to when X and Y are independent), but in what remains when the means are removed prior to the computation. The expected value computed with the random variable means extracted is defined as the covariance:

E\left[(X - E[X])(Y - E[Y])\right] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x - \bar{X})(y - \bar{Y})\,f(x, y)\,dx\,dy

This gives rise to another factor when the random variable standard deviations are used to normalize the covariance computation:

\rho = E\left[\left(\frac{X - \bar{X}}{\sigma_X}\right)\left(\frac{Y - \bar{Y}}{\sigma_Y}\right)\right] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\left(\frac{x - \bar{X}}{\sigma_X}\right)\left(\frac{y - \bar{Y}}{\sigma_Y}\right) f(x, y)\,dx\,dy

This equation defines the correlation coefficient, or normalized covariance; the modified random variables are called the standardized variables and have zero mean and unit variance.
An alternate expression for the correlation coefficient is derived by performing the multiplication:

\rho = E\left[\left(\frac{X - \bar{X}}{\sigma_X}\right)\left(\frac{Y - \bar{Y}}{\sigma_Y}\right)\right] = E\left[\frac{XY - \bar{X}\,Y - \bar{Y}\,X + \bar{X}\,\bar{Y}}{\sigma_X\,\sigma_Y}\right]

\rho = \frac{1}{\sigma_X\,\sigma_Y}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty}\left(x\,y - \bar{X}\,y - \bar{Y}\,x + \bar{X}\,\bar{Y}\right) f(x, y)\,dx\,dy

\rho = \frac{1}{\sigma_X\,\sigma_Y}\left[\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,y\,f(x, y)\,dx\,dy - \bar{X}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} y\,f(x, y)\,dx\,dy - \bar{Y}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,f(x, y)\,dx\,dy + \bar{X}\,\bar{Y}\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f(x, y)\,dx\,dy\right]
\rho = \frac{1}{\sigma_X\,\sigma_Y}\left[\int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x\,y\,f(x, y)\,dx\,dy - \bar{X}\int_{-\infty}^{\infty} y\,f_Y(y)\,dy - \bar{Y}\int_{-\infty}^{\infty} x\,f_X(x)\,dx + \bar{X}\,\bar{Y}\right]

\rho = \frac{1}{\sigma_X\,\sigma_Y}\left[E[XY] - \bar{X}\,\bar{Y} - \bar{Y}\,\bar{X} + \bar{X}\,\bar{Y}\right]

\rho = \frac{E[XY] - \bar{X}\,\bar{Y}}{\sigma_X\,\sigma_Y}
An alternate method to "skip the integrals":

\rho = E\left[\left(\frac{X - \bar{X}}{\sigma_X}\right)\left(\frac{Y - \bar{Y}}{\sigma_Y}\right)\right] = E\left[\frac{XY - \bar{X}\,Y - \bar{Y}\,X + \bar{X}\,\bar{Y}}{\sigma_X\,\sigma_Y}\right]

The expected value is a linear operator … constants remain constants and sums are sums …

\rho = \frac{E[XY] - \bar{X}\,E[Y] - \bar{Y}\,E[X] + \bar{X}\,\bar{Y}}{\sigma_X\,\sigma_Y}

\rho = \frac{E[XY] - \bar{X}\,\bar{Y} - \bar{Y}\,\bar{X} + \bar{X}\,\bar{Y}}{\sigma_X\,\sigma_Y}

\rho = \frac{E[XY] - \bar{X}\,\bar{Y}}{\sigma_X\,\sigma_Y}

Simplification for random variables that are inherently zero mean with unit variance:

\rho = E[XY]

For either X or Y a zero-mean variable,

\rho = \frac{E[XY]}{\sigma_X\,\sigma_Y}

and for independent random variables,

\rho = \frac{E[XY] - \bar{X}\,\bar{Y}}{\sigma_X\,\sigma_Y} = \frac{\bar{X}\,\bar{Y} - \bar{X}\,\bar{Y}}{\sigma_X\,\sigma_Y} = 0
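The correlation-coefficient formula above can be applied directly to sample data. The sketch below (illustration only; the data are an assumed correlated pair) estimates rho from the formula and compares it with numpy's built-in estimate.

import numpy as np

rng = np.random.default_rng(2)
N = 500_000
X = rng.normal(0.0, 2.0, size=N)
Y = 0.6 * X + rng.normal(0.0, 1.0, size=N)   # assumed correlated pair of samples

rho_formula = (np.mean(X * Y) - X.mean() * Y.mean()) / (X.std() * Y.std())
rho_numpy = np.corrcoef(X, Y)[0, 1]

print("rho from (E[XY] - Xbar*Ybar)/(sigmaX*sigmaY):", rho_formula)
print("rho from np.corrcoef:                        ", rho_numpy)
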
Standardized Variables

Standardized variables have zero mean and unit variance. This is similar to using the normal (unit) density/distribution for a Gaussian: the standardized or normalized random variable must have zero mean and unit variance. Define

\xi = \frac{X - \bar{X}}{\sigma_X} and \eta = \frac{Y - \bar{Y}}{\sigma_Y}

Note that now

E[\xi] = E\left[\frac{X - \bar{X}}{\sigma_X}\right] = 0 and E[\eta] = E\left[\frac{Y - \bar{Y}}{\sigma_Y}\right] = 0

and

E[\xi^2] = E\left[\left(\frac{X - \bar{X}}{\sigma_X}\right)^2\right] = 1 and E[\eta^2] = E\left[\left(\frac{Y - \bar{Y}}{\sigma_Y}\right)^2\right] = 1

while

E[\xi\,\eta] = E\left[\left(\frac{X - \bar{X}}{\sigma_X}\right)\left(\frac{Y - \bar{Y}}{\sigma_Y}\right)\right] = \frac{E[XY] - \bar{X}\,\bar{Y}}{\sigma_X\,\sigma_Y} = \rho

Remember: this is generalized for all random variables, not just Gaussian/normal random variables.

There are also computations based on the sum and difference of these standardized random variables:

E[\xi \pm \eta] = E[\xi] \pm E[\eta] = 0 \pm 0 = 0

E\left[(\xi \pm \eta)^2\right] = E\left[\xi^2 \pm 2\,\xi\,\eta + \eta^2\right] = E[\xi^2] \pm 2\,E[\xi\,\eta] + E[\eta^2]

E\left[(\xi \pm \eta)^2\right] = 1 \pm 2\rho + 1 = 2\,(1 \pm \rho)

and the variance is the same value:

\mathrm{Var}(\xi \pm \eta) = E\left[(\xi \pm \eta)^2\right] - \left(E[\xi \pm \eta]\right)^2 = 2\,(1 \pm \rho)
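
A simulation sketch of this last result (illustration only; the data are an assumed correlated pair): after standardizing, E[(xi ± eta)^2] should match 2(1 ± rho).

import numpy as np

rng = np.random.default_rng(3)
N = 1_000_000
X = rng.normal(1.0, 3.0, size=N)
Y = 0.5 * X + rng.normal(-2.0, 1.0, size=N)  # assumed correlated pair of samples

xi = (X - X.mean()) / X.std()                # standardized variables
eta = (Y - Y.mean()) / Y.std()
rho = np.mean(xi * eta)

print("E[(xi + eta)^2] =", np.mean((xi + eta) ** 2), "vs 2(1 + rho) =", 2 * (1 + rho))
print("E[(xi - eta)^2] =", np.mean((xi - eta) ** 2), "vs 2(1 - rho) =", 2 * (1 - rho))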