Probability Theory and Random Processes
Communication Systems, 5ed., S. Haykin and M.
Moher, John Wiley & Sons, Inc., 2006.
Probability
• Probability theory is based on phenomena that can be
modeled by an experiment with an outcome that is
subject to chance.
• Definition: A random experiment is repeated n times
(n trials) and the event A is observed m times (m
occurrences). The probability of A is the relative
frequency of occurrence m/n as n becomes large.
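The relative-frequency definition can be illustrated numerically; a minimal sketch (the fair-coin experiment, the seed, and the variable names are illustrative choices, not from the text):

```python
import random

random.seed(1)

# Repeat a random experiment n times and count the m occurrences of event A
# (here A = "heads" on a fair coin, an illustrative choice).
n = 100_000
m = sum(random.random() < 0.5 for _ in range(n))
p_estimate = m / n  # relative frequency m/n approaches P[A] = 0.5 as n grows
```

As n increases, the estimate concentrates around the true probability, which is the content of the definition.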
Probability Based on Set Theory
• Definition: An experiment has K possible outcomes,
where the kth outcome is represented by the sample point sk.
The set of all outcomes forms the sample space S. The
probability measure P satisfies the
• Axioms:
– 0 ≤ P[A] ≤ 1
– P[S] = 1
– If A and B are two mutually exclusive events (the two events
cannot occur in the same experiment), P[AUB]=P [A] + P[B],
otherwise P[AUB] = P[A] + P[B] – P[A∩B]
– The complement is P[Ā] = 1 – P[A]
– If A1, A2, …, Am are mutually exclusive events that together
cover the entire sample space, then P[A1] + P[A2] + … + P[Am] = 1
Venn Diagrams
[Figure: disjoint regions A and B inside the sample space S. Events A and B are mutually exclusive events in the sample space S; a sample point sk can come from A, from B, or from neither.]
[Figure: overlapping regions A and B inside the sample space S. Events A and B are not mutually exclusive events in the sample space S; a sample point sk can come from both A and B.]
Conditional Probability
• Definition: An experiment involves a pair of events A
and B where the probability of one is conditioned on
the occurrence of the other. Example: P[A|B] is the
probability of event A given the occurrence of event
B
• In terms of sets and subsets
– P[A|B] = P[A∩B] / P[B]
– P[A∩B] = P[A|B]P[B] = P[B|A]P[A]
• Definition: If events A and B are independent, then
the conditional probability is simply the elementary
probability, e.g. P[A|B] = P[A], P[B|A] = P[B].
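These relations can be checked exactly on a small sample space with equally likely outcomes; a sketch (the die-roll events A and B are hypothetical examples, not from the text):

```python
from fractions import Fraction

# Die-roll sample space with equally likely outcomes; events as sets.
S = {1, 2, 3, 4, 5, 6}
A = {2, 4, 6}   # roll is even
B = {4, 5, 6}   # roll is greater than 3

def P(event):
    # Probability as |event| / |S| for equally likely outcomes.
    return Fraction(len(event), len(S))

p_A_given_B = P(A & B) / P(B)   # P[A|B] = P[A∩B] / P[B]
p_B_given_A = P(A & B) / P(A)   # P[B|A] = P[A∩B] / P[A]
# Product rule: P[A∩B] = P[A|B]P[B] = P[B|A]P[A]
```

Exact rational arithmetic makes the product rule an identity rather than an approximation.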
Random Variables
• Definition: A random variable is a function that assigns a
numerical value to each outcome of a random experiment;
X(s) denotes the value assigned to the outcome s.
• When the sample space is a number line, x = s.
• Definition: The cumulative distribution function (cdf)
assigns a probability value for the occurrence of x
within a specified range such that FX(x) = P[X ≤ x].
• Properties:
– 0 ≤ FX(x) ≤ 1
– FX(x1) ≤ FX(x2), if x1 ≤ x2
Random Variables
• Definition: The probability density function (pdf) is
an alternative description of the probability of the
random variable X: fX(x) = d/dx FX(x)
• P[x1 < X ≤ x2] = P[X ≤ x2] − P[X ≤ x1]
= FX(x2) − FX(x1)
= ∫_{x1}^{x2} fX(x) dx
Example Distributions
• Uniform distribution
– pdf:
fX(x) = 0 for x < a; 1/(b − a) for a ≤ x ≤ b; 0 for x > b
– cdf:
FX(x) = 0 for x < a; (x − a)/(b − a) for a ≤ x ≤ b; 1 for x > b
Several Random Variables
• CDF:
FX,Y(x, y) = P[X ≤ x, Y ≤ y]
• Marginal cdf:
FX(x) = ∫_{−∞}^{x} ∫_{−∞}^{∞} fX,Y(u, v) dv du
FY(y) = ∫_{−∞}^{∞} ∫_{−∞}^{y} fX,Y(u, v) dv du
• PDF:
fX,Y(x, y) = ∂²FX,Y(x, y) / ∂x ∂y
∫_{−∞}^{∞} ∫_{−∞}^{∞} fX,Y(u, v) du dv = 1
• Marginal pdf:
fX(x) = ∫_{−∞}^{∞} fX,Y(x, v) dv
fY(y) = ∫_{−∞}^{∞} fX,Y(u, y) du
• Conditional pdf:
fY(y | x) = fX,Y(x, y) / fX(x)
Statistical Averages
• Expected value:
μX = E[X] = ∫_{−∞}^{∞} x fX(x) dx
• Function of a random variable Y = g(X):
E[Y] = ∫_{−∞}^{∞} y fY(y) dy = E[g(X)] = ∫_{−∞}^{∞} g(x) fX(x) dx
• Text Example 5.4
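The expected-value formulas lend themselves to a Monte Carlo check: the sample mean of g(X) estimates E[g(X)]. A sketch (uniform X on [0, 1] and g(x) = x² are illustrative choices):

```python
import random

random.seed(42)

# E[g(X)] = ∫ g(x) fX(x) dx, estimated by averaging g over samples of X.
g = lambda x: x ** 2
xs = [random.random() for _ in range(200_000)]   # X uniform on [0, 1]
e_gx = sum(g(x) for x in xs) / len(xs)
# Analytic value for comparison: the integral of x² over [0, 1] is 1/3
```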
Statistical Averages
• nth moments:
E[X^n] = ∫_{−∞}^{∞} x^n fX(x) dx
E[X²] = ∫_{−∞}^{∞} x² fX(x) dx    (mean-square value of X)
• Central moments:
E[(X − μX)^n] = ∫_{−∞}^{∞} (x − μX)^n fX(x) dx
E[(X − μX)²] = σX² = ∫_{−∞}^{∞} (x − μX)² fX(x) dx    (variance of X)
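The identity σX² = E[X²] − μX², relating the variance to the first two raw moments, can be verified on samples; a sketch (the Gaussian parameters are arbitrary choices):

```python
import random

random.seed(7)

# Samples with known mean 3 and variance 4 (standard deviation 2).
xs = [random.gauss(3.0, 2.0) for _ in range(200_000)]
n = len(xs)
mean = sum(xs) / n                       # estimate of the mean μX
mean_sq = sum(x * x for x in xs) / n     # estimate of E[X²], the mean-square value
var_from_moments = mean_sq - mean ** 2   # σX² = E[X²] − μX²
```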
Joint Moments
• Correlation:
E[X^i Y^k] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x^i y^k fX,Y(x, y) dx dy
Expected value of the product
- Also seen as a weighted inner product
• Covariance:
cov[XY] = E[(X − E[X])(Y − E[Y])] = E[XY] − μX μY
- Correlation of the central moments
• Correlation coefficient:
ρ = cov[XY] / (σX σY)
cov[XY] = 0: uncorrelated; |ρ| → 1: strongly correlated
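The covariance identity and the correlation coefficient can be estimated from paired samples; a sketch (the model Y = X + noise is an illustrative choice, giving a true ρ of 1/√1.25 ≈ 0.894):

```python
import random

random.seed(3)

n = 100_000
X = [random.gauss(0, 1) for _ in range(n)]
Y = [x + 0.5 * random.gauss(0, 1) for x in X]   # Y strongly correlated with X

def mean(v):
    return sum(v) / len(v)

mx, my = mean(X), mean(Y)
cov = mean([x * y for x, y in zip(X, Y)]) - mx * my   # cov[XY] = E[XY] − μXμY
sx = (mean([x * x for x in X]) - mx ** 2) ** 0.5
sy = (mean([y * y for y in Y]) - my ** 2) ** 0.5
rho = cov / (sx * sy)   # correlation coefficient, always in [−1, 1]
```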
Random Processes
• Definition: a random process is described as a time-varying random variable X(t)
• Mean of the random process:
μX(t) = E[X(t)] = ∫_{−∞}^{∞} x fX(t)(x) dx
• Definition: a random process is first-order stationary if its pdf does not vary with time:
fX(t1)(x) = fX(t2)(x)
so the mean and variance are constant: μX(t) = μX, σX²(t) = σX²
• Definition: the autocorrelation is the expected value of the product of two random variables at different times
RX(t1, t2) = E[X(t1) X(t2)]
RX(t1, t2) = RX(t1 − t2)    (stationary to second order)
Random Processes
• Definition: the autocorrelation is the expected value of the product of two random variables at different times
RX(t1, t2) = E[X(t1) X(t2)]
RX(t1, t2) = RX(t1 − t2)    (stationary to second order)
• Definition: the autocovariance of a stationary random process is
CX(t1, t2) = E[(X(t1) − μX)(X(t2) − μX)]
= RX(t1 − t2) − μX²
Properties of Autocorrelation
• Definition: the autocorrelation of a stationary process depends only on the time difference:
RX(τ) = E[X(t + τ) X(t)]
• Mean-square value:
RX(0) = E[X²(t)]
• Autocorrelation is an even function:
RX(τ) = RX(−τ)
• Autocorrelation has its maximum at zero:
|RX(τ)| ≤ RX(0)
Example
• Sinusoidal signal with random phase
X(t) = A cos(2πfc t + Θ)
fΘ(θ) = 1/(2π) for −π ≤ θ ≤ π, and 0 otherwise
• Autocorrelation
RX(τ) = E[X(t + τ) X(t)] = (A²/2) cos(2πfc τ)
As X(t) is compared to itself at another time, the correlation reveals the periodic behavior of the process.
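The closed form RX(τ) = (A²/2) cos(2πfc τ) can be reproduced by averaging over the random phase; a sketch (the amplitude, frequency, time, and lag values are arbitrary choices):

```python
import math
import random

random.seed(0)

A, fc = 2.0, 1.0
t, tau = 0.3, 0.15        # arbitrary reference time and lag
trials = 200_000
acc = 0.0
for _ in range(trials):
    theta = random.uniform(-math.pi, math.pi)   # uniform random phase
    acc += (A * math.cos(2 * math.pi * fc * (t + tau) + theta)
            * A * math.cos(2 * math.pi * fc * t + theta))
r_est = acc / trials                             # estimate of E[X(t+τ)X(t)]
r_analytic = (A ** 2 / 2) * math.cos(2 * math.pi * fc * tau)
```

Note that r_est does not depend on the choice of t, only on τ, which is what makes this process wide-sense stationary.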
Cross-correlation
• Two random processes X(t) and Y(t) have the cross-correlation
RXY(t, u) = E[X(t) Y(u)]
• For wide-sense stationary processes, with τ = t − u:
RX(t, t − τ) = RX(τ), RY(u, u − τ) = RY(τ)
RXY(t, u) = RXY(τ)
Example
• Output of an LTI system when the input is an RP
• Text 5.7
Power Spectral Density
• Definition: the Fourier transform of the autocorrelation function is called the power spectral density
SX(f) = ∫_{−∞}^{∞} RX(τ) e^{−j2πfτ} dτ
RX(τ) = ∫_{−∞}^{∞} SX(f) e^{j2πfτ} df
• Consider the units of X(t): Volts or Amperes
• Autocorrelation is the projection of X(t) onto itself
• Resulting units of Watts (normalized to 1 Ohm)
Properties of PSD
• Zero-frequency value of the PSD:
SX(0) = ∫_{−∞}^{∞} RX(τ) dτ
• Mean-square value:
E[X²(t)] = ∫_{−∞}^{∞} SX(f) df
Which theorem does this property resemble?
• PSD is non-negative: SX(f) ≥ 0
• PSD of a real-valued RP is even: SX(−f) = SX(f)
Example
• Text Example 5.12
– Mixing of a random process with a sinusoidal process
Y(t) = X(t) cos(2πfc t + Θ)
X(t) is a wide-sense stationary RP (to make it easier); Θ is uniformly distributed, but not time-varying
– Autocorrelation
RY(τ) = E[Y(t + τ) Y(t)] = (1/2) RX(τ) cos(2πfc τ)
– PSD
SY(f) = (1/4)[SX(f − fc) + SX(f + fc)]
PSD of LTI System
• Start with what you know and work the math
Y(t) = h(t) * X(t)
SY(f) = ∫_{−∞}^{∞} RY(τ) e^{−j2πfτ} dτ
= ∫_{−∞}^{∞} E[Y(t + τ) Y(t)] e^{−j2πfτ} dτ
= ∫_{−∞}^{∞} E[(h * X)(t + τ) (h * X)(t)] e^{−j2πfτ} dτ
= ∫_{−∞}^{∞} E[ ∫_{−∞}^{∞} h(τ1) X(t + τ − τ1) dτ1 ∫_{−∞}^{∞} h(τ2) X(t − τ2) dτ2 ] e^{−j2πfτ} dτ
= ∫∫∫ h(τ1) h(τ2) E[X(t + τ − τ1) X(t − τ2)] e^{−j2πfτ} dτ1 dτ2 dτ
PSD of LTI System
• The PSD reduces to
SY(f) = ∫∫∫ h(τ1) h(τ2) RX(τ − τ1 + τ2) e^{−j2πfτ} dτ1 dτ2 dτ
• Change of variables: τ0 = τ − τ1 + τ2, i.e. τ = τ0 + τ1 − τ2
SY(f) = ∫∫∫ h(τ1) h(τ2) RX(τ0) e^{−j2πf(τ0 + τ1 − τ2)} dτ1 dτ2 dτ0
SY(f) = |H(f)|² SX(f)
The system shapes the power spectrum of the input, as expected from a filtering operation.
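The relation SY(f) = |H(f)|² SX(f) can be observed by filtering white noise and averaging periodograms; a sketch (the FIR filter taps and the sizes are arbitrary choices; assumes NumPy is available):

```python
import numpy as np

rng = np.random.default_rng(1)

N, trials = 1024, 400
h = np.array([0.25, 0.5, 0.25])            # simple FIR smoothing filter
H = np.fft.fft(h, N)                       # H(f) on the FFT frequency grid
sy_est = np.zeros(N)
for _ in range(trials):
    x = rng.normal(0.0, 1.0, N)            # white input: SX(f) = 1
    y = np.convolve(x, h, mode="same")     # Y = h * X
    sy_est += np.abs(np.fft.fft(y)) ** 2 / N
sy_est /= trials                           # averaged periodogram of Y
sy_theory = np.abs(H) ** 2                 # |H(f)|² SX(f) with SX = 1
```

The averaged output periodogram follows the squared magnitude response of the filter, up to estimation noise and small edge effects.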
Gaussian Process
• The Gaussian probability density function for a single variable is
fY(y) = (1/(√(2π) σY)) exp(−(y − μY)²/(2σY²))
• When the distribution has zero mean and unit variance
fY(y) = (1/√(2π)) exp(−y²/2)
• The random variable Y is said to be normally distributed as N(0, 1)
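The N(0, 1) density can be checked by direct numerical integration: its total area is 1, and about 68% of the mass lies within one standard deviation. A sketch (the midpoint rule and step counts are arbitrary choices):

```python
import math

# Standard normal density f(y) = (1/√(2π)) exp(−y²/2).
def f(y):
    return math.exp(-y * y / 2) / math.sqrt(2 * math.pi)

def integrate(lo, hi, steps=20_000):
    # Simple midpoint-rule quadrature.
    dy = (hi - lo) / steps
    return sum(f(lo + (i + 0.5) * dy) for i in range(steps)) * dy

total = integrate(-10, 10)        # ≈ 1 (the tails beyond ±10 are negligible)
within_1sigma = integrate(-1, 1)  # ≈ 0.6827
```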
Properties of a Gaussian Process
• The output of an LTI system is Gaussian if the input is Gaussian
• The joint pdf is completely determined by the set of
means and autocovariance functions of the samples of
the Gaussian process
• If a Gaussian process is wide-sense stationary, then it is
also strictly stationary
• If the samples of a Gaussian process are uncorrelated,
then they are statistically independent
Noise
• Shot noise
• Thermal noise
• White noise
• Narrowband noise