Random Variables
ECE460
Spring, 2012
Combinatorics
Notation:
– Population size: $n$
– Subpopulation (sample) size: $r$
– Ordered sample: $(a_1, a_2, a_3) \neq (a_3, a_1, a_2)$
How many samples of size r can be
formed from a population of size n?
1. Sampling with replacement and with ordering
2. Sampling without replacement and with ordering
How many samples of size r can be formed
from a population of size n?
3. Sampling without replacement and without ordering
4. Sampling with replacement and without ordering
(The counting formulas for all four cases are sketched in the code below.)
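A minimal Python sketch of the four standard counting formulas, assuming Python 3.8+ for math.perm/math.comb; the values n = 5 and r = 3 are arbitrary illustrative choices:

```python
import math

n, r = 5, 3  # arbitrary illustrative population and sample sizes

# 1. With replacement, with ordering: n^r
print(n ** r)                    # 125

# 2. Without replacement, with ordering: n! / (n - r)!
print(math.perm(n, r))           # 60

# 3. Without replacement, without ordering: C(n, r)
print(math.comb(n, r))           # 10

# 4. With replacement, without ordering: C(n + r - 1, r)
print(math.comb(n + r - 1, r))   # 35
```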
Bernoulli Trials
Independent trials that each result in a success with
probability p and a failure with probability 1 − p.
Conditional Probabilities
Given two events, E1 & E2, defined on the same probability
space with corresponding probabilities P(E1) & P(E2):
$$P(E_1 \mid E_2) = \begin{cases} \dfrac{P(E_1 \cap E_2)}{P(E_2)}, & P(E_2) \neq 0 \\ 0, & \text{otherwise} \end{cases}$$
Example 4.5
An information source produces 0 and 1 with probabilities 0.3 and 0.7,
respectively. The output of the source is transmitted via a channel that
has a probability of error (turning a 1 into a 0 or a 0 into a 1) of 0.2.
1. What is the probability that at the output a 1 is observed?
2. What is the probability that a 1 was the output of the source if at
the output of the channel a 1 is observed?
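A quick numerical check of Example 4.5 (a sketch in Python; the variable names are illustrative) using the total probability theorem and Bayes' rule:

```python
p0, p1 = 0.3, 0.7    # source probabilities of producing 0 and 1
pe = 0.2             # channel error (crossover) probability

# Total probability: a 1 is observed if a source 1 passes or a source 0 is flipped
p_out1 = p1 * (1 - pe) + p0 * pe
print(p_out1)                          # 0.62

# Bayes' rule: probability the source produced a 1 given a 1 was observed
p_src1_given_out1 = p1 * (1 - pe) / p_out1
print(p_src1_given_out1)               # ~0.903
```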
Random Variables
Working with sets has its limitations:
– Selecting events in Ω
– Assigning P[ ]
– Verification of 3 Axioms
1. $P(E) \ge 0$
2. $P(\Omega) = 1$
3. $P(E \cup F) = P(E) + P(F)$ if $E \cap F = \emptyset$
Would like to leave set theory and move into more
advanced and widely used mathematics
(e.g., integration, derivatives, limits, …)
Random Variables:
– Map outcomes in Ω to the real line ℝ
– Subsets of the real line of the form $(-\infty, x]$ are
called Borel sets. A collection of Borel sets is
called a Borel σ-field.
– A function that associates events in Ω with the Borel
sets is called a Random Variable.
Cumulative Distribution Function (CDF)
CDF of a random variable X is defined as:
$$F_X(x) = P(\{\omega \in \Omega : X(\omega) \le x\})$$
or
$$F_X(x) = P(X \le x)$$
Properties:
1. $0 \le F_X(x) \le 1$
2. $F_X(x)$ is nondecreasing
3. $\lim_{x \to -\infty} F_X(x) = 0$ and $\lim_{x \to +\infty} F_X(x) = 1$
4. $F_X(x)$ is continuous from the right
5. $P(a < X \le b) = F_X(b) - F_X(a)$
6. $P(X = a) = F_X(a) - F_X(a^-)$
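A small Monte Carlo sketch of property 5, assuming numpy; the Uniform(0, 1) distribution and the interval (0.2, 0.7] are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=100_000)   # samples of X ~ Uniform(0, 1)

a, b = 0.2, 0.7                           # arbitrary interval of interest

# Empirical estimate of P(a < X <= b)
p_hat = np.mean((x > a) & (x <= b))

# For Uniform(0, 1), F_X(x) = x on [0, 1], so F_X(b) - F_X(a) = b - a
print(p_hat, b - a)                       # both close to 0.5
```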
Probability Density Function (PDF)
PDF of a random variable X is defined as:
$$f_X(x) = \frac{d}{dx} F_X(x)$$
Properties:
1. $f_X(x) \ge 0$
2. $\int_{-\infty}^{\infty} f_X(x)\, dx = 1$
3. $\int_a^b f_X(x)\, dx = P(a < X \le b)$
4. In general, $P(X \in A) = \int_A f_X(x)\, dx$
5. $F_X(x) = \int_{-\infty}^{x} f_X(u)\, du$
For discrete random variables, this is known as the
Probability Mass Function (PMF):
$$p_i = P(X = x_i)$$
Example 4.6
A coin is flipped three times and the random variable X denotes the
total number of heads that show up. The probability of a head in one
flip of this coin is denoted by p.
1. What values can the random variable X take?
2. What is the PMF of the random variable X?
3. Derive and plot the CDF of X.
4. What is the probability that X exceeds 1?
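A sketch of how the PMF and CDF of Example 4.6 might be tabulated numerically (assuming X counts heads in 3 independent flips; p = 0.5 is an illustrative value, since the slide leaves p general):

```python
from math import comb

p = 0.5   # illustrative head probability; the slide leaves p general
n = 3

# PMF: P(X = k) = C(3, k) p^k (1 - p)^(3 - k)
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}

# CDF: F_X(k) = sum of PMF values up to k (a right-continuous staircase)
cdf = {k: sum(pmf[j] for j in range(k + 1)) for k in range(n + 1)}

print(pmf)           # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125} for p = 0.5
print(1 - cdf[1])    # P(X > 1) = 0.5 for p = 0.5
```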
Uniform Random Variable
This is a continuous random variable taking values between
a and b, with equal probabilities over intervals of equal
length.
Bernoulli Random Variable
Only two outcomes (e.g., Heads or Tails)
Probability Mass Function:
$$P(X = x) = \begin{cases} 1 - p, & x = 0 \\ p, & x = 1 \\ 0, & \text{otherwise} \end{cases}$$
Leads to the binomial law
(sampling w/o replacement & w/o ordering):
$$P(k \text{ successes in } n \text{ trials}) = b(k; n, p) = \binom{n}{k} p^k (1 - p)^{n - k}$$
where p = probability of a success.
Example: 10 independent, binary pulses per second arrive at a
receiver. The error probability (that is, a zero received as a one
or vice versa) is 0.001. What is the probability of at least one
error/second?
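A one-line numerical check of this example (a sketch; the complement of "no errors in 10 pulses"):

```python
n, p_err = 10, 0.001   # pulses per second and per-pulse error probability

# P(at least one error) = 1 - P(no errors in n independent pulses)
p_at_least_one = 1 - (1 - p_err) ** n
print(p_at_least_one)  # ~0.00996
```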
Binomial Law Example
Five missiles are fired against an aircraft carrier in the ocean. It
takes at least two direct hits to sink the carrier. All five missiles
are on the correct trajectory but must get through the “point defense”
guns of the carrier. It is known that the point defense guns can
destroy a missile with probability P = 0.9. What is the probability
that the carrier will still be afloat after the encounter?
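One way to check this numerically (a sketch, assuming each missile independently survives the point defense with probability 1 − P; the carrier stays afloat with fewer than two hits):

```python
from math import comb

p_destroy = 0.9           # probability the point defense destroys a missile
p_hit = 1 - p_destroy     # probability a single missile gets through
n = 5

# Carrier stays afloat if 0 or 1 missiles get through (fewer than two hits)
p_afloat = sum(comb(n, k) * p_hit**k * p_destroy**(n - k) for k in range(2))
print(p_afloat)           # ~0.9185
```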
Uniform Distribution
Probability density function (pdf):
$$f_X(x) = \begin{cases} \dfrac{1}{b - a}, & a \le x \le b \\ 0, & \text{otherwise} \end{cases}$$
Cumulative distribution function:
$$F_X(x) = \begin{cases} 0, & x < a \\ \dfrac{x - a}{b - a}, & a \le x \le b \\ 1, & x > b \end{cases}$$
A resistance r is a random variable uniformly distributed between 900
and 1100 ohms. Find the probability that r is between 950 and 1050
ohms.
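For a uniform random variable the answer is just the fraction of the support covered; a minimal sketch:

```python
a, b = 900.0, 1100.0     # support of the uniform resistance, in ohms
lo, hi = 950.0, 1050.0   # interval of interest, in ohms

# P(lo < r < hi) = (hi - lo) / (b - a) for a uniform RV on [a, b]
p = (hi - lo) / (b - a)
print(p)                 # 0.5
```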
Gaussian (Normal) Random Variable
A continuous random variable described by the density
function:
$$f_X(x) = \frac{1}{\sqrt{2\pi}\, \sigma}\, e^{-\frac{(x - m)^2}{2\sigma^2}}$$
Notation:
$$X \sim N(m, \sigma^2)$$
Properties:
1. If $Y = aX + b$ where a and b are scalars,
then $Y \sim N(am + b, a^2 \sigma^2)$
2. $\dfrac{X - m}{\sigma} \sim N(0, 1)$
3. $F_X(a) = \int_{-\infty}^{a} f_X(x)\, dx$
Special Case: CDF for N(0,1)
The CDF for the normalized Gaussian random variable
with m = 0 and σ = 1 is:
$$\Phi(x) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{x} e^{-t^2 / 2}\, dt$$
Pre-normalized Gaussian:
$$F_X(x) = \Phi\!\left( \frac{x - m}{\sigma} \right)$$
Another related function, the Q-function (closely related to the
complementary error function), used for finding P(X > x), is:
$$Q(x) = \frac{1}{\sqrt{2\pi}} \int_{x}^{\infty} e^{-t^2 / 2}\, dt = 1 - \Phi(x), \qquad x \ge 0$$
Properties:
$Q(-x) = 1 - Q(x)$
$Q(0) = \tfrac{1}{2}$
$Q(\infty) = 0$
Gaussian Example
A random variable is N(1000; 2500). Find the probability that x
is between 900 and 1050.
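A numerical sketch of this example using only the standard library (N(1000, 2500) means m = 1000 and σ = 50; Φ is built from math.erf):

```python
from math import erf, sqrt

def phi(z):
    # Standard normal CDF via the error function: Phi(z) = (1 + erf(z / sqrt(2))) / 2
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

m, var = 1000.0, 2500.0
sigma = sqrt(var)        # 50

# P(900 < X < 1050) = Phi((1050 - m) / sigma) - Phi((900 - m) / sigma)
p = phi((1050 - m) / sigma) - phi((900 - m) / sigma)
print(p)                 # ~0.8186
```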
Complementary Error Function
The Central Limit Theorem
Let $X_1, X_2, \ldots, X_n$ be a set of random variables with the following
properties:
1. The Xk with k = 1, 2, …, n are statistically independent
2. The Xk all have the same probability density function
3. Both the mean and the variance exist for each Xk
Let Y be a new random variable defined as
$$Y = \sum_{k=1}^{n} X_k$$
Then, according to the central limit theorem, the normalized
random variable
$$Z = \frac{Y - E[Y]}{\sigma_Y}$$
approaches a Gaussian random variable with zero mean and
unit variance as the number of random variables $X_1, X_2, \ldots, X_n$
increases without limit.
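A quick simulation sketch of the theorem, assuming numpy; Uniform(0, 1) terms and n = 30 are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
n, trials = 30, 50_000                 # terms per sum, and number of repetitions

# Sum of n i.i.d. Uniform(0, 1) variables (any common pdf with finite mean and variance works)
y = rng.uniform(0.0, 1.0, size=(trials, n)).sum(axis=1)

# Normalize using the empirical mean and standard deviation of Y
z = (y - y.mean()) / y.std()

# The normalized sums should look approximately N(0, 1)
print(z.mean(), z.var())               # ~0, ~1
print(np.mean(z <= 1.0))               # ~0.841, i.e. Phi(1)
```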
Functions of Random Variables
Let X be a r.v. with known CDF and PDF. If g(X) is a
function of the r.v. X then
Y  gX


FY
 y 
P    : g  X     y
fY
 y 
 xi 
 g x
 i
i

fX
Example:
Given the function Y  a X  b where a > 0 and b are
constants and X is a r.v. with F X  x  .
Find FY  y  an d f Y  y  .
20
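A simulation sketch of this example's key relation, $F_Y(y) = F_X\big((y - b)/a\big)$ for $a > 0$ (assuming numpy; the choices $X \sim N(0, 1)$, $a = 2$, $b = 3$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
a, b = 2.0, 3.0                          # illustrative constants with a > 0

x = rng.normal(0.0, 1.0, size=200_000)   # a concrete choice of X with known F_X
y = a * x + b                            # Y = aX + b

# For a > 0: F_Y(y) = F_X((y - b) / a). Check empirically at one point, y0 = 4:
y0 = 4.0
print(np.mean(y <= y0))                  # empirical F_Y(y0)
print(np.mean(x <= (y0 - b) / a))        # empirical F_X((y0 - b) / a); should match
```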
Statistical Averages
The expected value of a function g(X) of the random variable X is defined as
$$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\, dx$$
Special cases: the mean ($g(x) = x$) and the variance ($g(x) = (x - m_X)^2$).
Characteristic function:
$$\Psi_X(\nu) = \int_{-\infty}^{\infty} f_X(x)\, e^{j\nu x}\, dx$$
Special case: the moments, $E[X^n] = \dfrac{1}{j^n} \dfrac{d^n}{d\nu^n} \Psi_X(\nu) \Big|_{\nu = 0}$.
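A sketch of the expectation integral evaluated numerically on a grid (assuming numpy; the choices $X \sim N(0, 1)$ and $g(x) = x^2$ are illustrative, so the result should be the variance, 1):

```python
import numpy as np

# E[g(X)] = integral of g(x) f_X(x) dx, approximated by a Riemann sum on a fine grid
x = np.linspace(-8.0, 8.0, 200_001)
dx = x[1] - x[0]

f_X = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # standard normal pdf
g = x**2                                       # illustrative choice of g

print(np.sum(g * f_X) * dx)                    # ~1.0, the variance of N(0, 1)
```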
Multiple Random Variables
The joint CDF of X and Y is
$$F_{X,Y}(x, y) = P(\{\omega \in \Omega : X(\omega) \le x,\; Y(\omega) \le y\}) = P(X \le x,\, Y \le y)$$
and its joint PDF is
$$f_{X,Y}(x, y) = \frac{\partial^2}{\partial x\, \partial y} F_{X,Y}(x, y)$$
Properties:
1. $F_X(x) = F_{X,Y}(x, \infty)$,  $F_Y(y) = F_{X,Y}(\infty, y)$
2. $f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dy$,  $f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dx$
3. $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(x, y)\, dx\, dy = 1$
4. $P((X, Y) \in A) = \iint_{(x, y) \in A} f_{X,Y}(u, v)\, du\, dv$
5. $F_{X,Y}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(u, v)\, dv\, du$
Expected Values
Given g(X,Y) as a function of X and Y, the expected value is
$$E[g(X, Y)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y)\, f_{X,Y}(x, y)\, dx\, dy$$
Special Cases:
• Correlation of X & Y (i.e., $g(X, Y) = XY$):
$$R_{XY} = E[XY] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x\, y\, f_{X,Y}(x, y)\, dx\, dy$$
• Covariance of X & Y (i.e., $g(X, Y) = (X - m_X)(Y - m_Y)$):
$$\mathrm{COV}(X, Y) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} (x - m_X)(y - m_Y)\, f_{X,Y}(x, y)\, dx\, dy$$
Conditional PDF:
$$f_{Y|X}(y \mid x) = \begin{cases} \dfrac{f_{X,Y}(x, y)}{f_X(x)}, & f_X(x) \neq 0 \\ 0, & \text{otherwise} \end{cases}$$
X and Y are statistically independent if:
$$f_{X,Y}(x, y) = f_X(x)\, f_Y(y)$$
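A small numpy sketch of estimating correlation and covariance from samples (the linear-plus-noise pair below is purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative jointly distributed pair: Y depends linearly on X plus independent noise
x = rng.normal(0.0, 1.0, size=100_000)
y = 0.5 * x + rng.normal(0.0, 1.0, size=100_000)

r_xy = np.mean(x * y)                                  # estimate of E[XY]
cov_xy = np.mean((x - x.mean()) * (y - y.mean()))      # estimate of COV(X, Y)
print(r_xy, cov_xy)                                    # both ~0.5 here (zero-mean case)
```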
Example
Two random variables X and Y are distributed according to
$$f_{X,Y}(x, y) = \begin{cases} K e^{-x - y}, & x \ge y \ge 0 \\ 0, & \text{otherwise} \end{cases}$$
1. Find the value of the constant K.
2. Find the marginal density functions of X and Y .
3. Are X and Y independent?
Example
4. Find $f_{X|Y}(x \mid y)$.
5. Find $E[X \mid Y = y]$.
6. Find $\mathrm{COV}(X, Y)$ and $\rho_{X,Y}$.
Jointly Gaussian R.V.’s
Definition: X and Y are jointly Gaussian if
$$f_{X,Y}(x, y) = \frac{1}{2\pi \sigma_1 \sigma_2 \sqrt{1 - \rho^2}} \exp\!\left\{ -\frac{1}{2(1 - \rho^2)} \left[ \frac{(x - m_1)^2}{\sigma_1^2} + \frac{(y - m_2)^2}{\sigma_2^2} - \frac{2\rho (x - m_1)(y - m_2)}{\sigma_1 \sigma_2} \right] \right\}$$
where ρ is the correlation coefficient between X and Y. If ρ = 0,
then
$$f_{X,Y}(x, y) = f_X(x)\, f_Y(y)$$
If jointly Gaussian, then the following are also Gaussian:
$$f_X(x), \quad f_Y(y), \quad f(x \mid y), \quad f(y \mid x)$$
n - Jointly Gaussian R.V.’s
Definition:
$\mathbf{X} = (X_1, X_2, \ldots, X_n)^T$ is jointly Gaussian if
$$f_{\mathbf{X}}(\mathbf{x}) = \frac{1}{(2\pi)^{n/2} \left| C \right|^{1/2}} \exp\!\left( -\frac{1}{2} (\mathbf{x} - \mathbf{m})^T C^{-1} (\mathbf{x} - \mathbf{m}) \right)$$
where
$$\mathbf{m} = E[\mathbf{X}] = \begin{bmatrix} E[X_1] \\ \vdots \\ E[X_n] \end{bmatrix} = \text{mean vector of } \mathbf{X}$$
and
$$C = E\!\left[ (\mathbf{X} - \mathbf{m})(\mathbf{X} - \mathbf{m})^T \right] = \begin{bmatrix} \mathrm{Var}(X_1) & \mathrm{Cov}(X_1, X_2) & \cdots & \mathrm{Cov}(X_1, X_n) \\ \mathrm{Cov}(X_2, X_1) & \mathrm{Var}(X_2) & \cdots & \mathrm{Cov}(X_2, X_n) \\ \vdots & & \ddots & \vdots \\ \mathrm{Cov}(X_n, X_1) & \cdots & & \mathrm{Var}(X_n) \end{bmatrix} = \text{covariance matrix of } \mathbf{X}$$
Jointly Gaussian Properties
1. Any subset of $(X_1, X_2, \ldots, X_n)$ is a vector of jointly Gaussian
R.V.'s.
2. Jointly Gaussian R.V.'s are completely characterized by
$\mathbf{m}$ and $C$.
3. A collection of R.V.'s $(X_1, X_2, \ldots, X_n)$ is uncorrelated iff
$C$ is diagonal. In general, independence implies noncorrelation.
For jointly Gaussian R.V.'s,
noncorrelation $\Leftrightarrow$ independence
4. A collection of uncorrelated R.V.'s, each of which is Gaussian,
may not be jointly Gaussian.
5. If $\mathbf{X}$ is jointly Gaussian, then
$$\mathbf{Y} = A\mathbf{X} + \mathbf{b}$$
is also jointly Gaussian with:
$$\mathbf{m}_Y = E[\mathbf{Y}] = A\, E[\mathbf{X}] + \mathbf{b} = A\, \mathbf{m}_X + \mathbf{b}$$
$$C_Y = E\!\left[ (\mathbf{Y} - \mathbf{m}_Y)(\mathbf{Y} - \mathbf{m}_Y)^T \right] = E\!\left[ A (\mathbf{X} - \mathbf{m}_X)(\mathbf{X} - \mathbf{m}_X)^T A^T \right] = A\, C_X A^T$$
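A simulation sketch of property 5, assuming numpy; the mean vector, covariance matrix, A, and b below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(4)

# Illustrative mean vector and covariance matrix for a jointly Gaussian X
m_x = np.array([1.0, -2.0])
c_x = np.array([[2.0, 0.6],
                [0.6, 1.0]])

# Arbitrary linear transformation Y = A X + b
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
b = np.array([5.0, -1.0])

x = rng.multivariate_normal(m_x, c_x, size=200_000)   # rows are samples of X
y = x @ A.T + b                                        # apply Y = A X + b to each sample

# Empirical mean and covariance of Y vs. the formulas A m_X + b and A C_X A^T
print(y.mean(axis=0), A @ m_x + b)
print(np.cov(y, rowvar=False))
print(A @ c_x @ A.T)
```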
Example
Let A be a binary random variable that takes the values of +1
and -1 with equal probabilities. Let $X \sim N(0, \sigma^2)$. A and X are
statistically independent. Let Y = A X.
1. Find the pdf $f_Y(y)$.
2. Find the covariance $\mathrm{cov}(X, Y)$.
3. Find the covariance $\mathrm{cov}(X^2, Y^2)$.
4. Are X and Y jointly Gaussian?
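A simulation sketch one might use to sanity-check this example (assuming numpy; σ = 1 is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(5)
n, sigma = 500_000, 1.0                  # sample size and illustrative sigma

a = rng.choice([-1.0, 1.0], size=n)      # A = +1 or -1 with equal probability
x = rng.normal(0.0, sigma, size=n)       # X ~ N(0, sigma^2), independent of A
y = a * x                                # Y = A X

# Empirical mean/variance of Y, and empirical covariances of (X, Y) and (X^2, Y^2)
print(y.mean(), y.var())
print(np.mean(x * y) - x.mean() * y.mean())
print(np.mean(x**2 * y**2) - np.mean(x**2) * np.mean(y**2))
```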