Chapter 6: Correlation Functions

6-1  Introduction
6-2  Example: Autocorrelation Function of a Binary Process
6-3  Properties of Autocorrelation Functions
6-4  Measurement of Autocorrelation Functions
6-5  Examples of Autocorrelation Functions
6-6  Crosscorrelation Functions
6-7  Properties of Crosscorrelation Functions
6-8  Examples and Applications of Crosscorrelation Functions
6-9  Correlation Matrices for Sampled Functions
Concepts

Autocorrelation
  o of a Binary/Bipolar Process
  o of a Sinusoidal Process
Linear Transformations
Properties of Autocorrelation Functions
Measurement of Autocorrelation Functions
An Anthology of Signals
Crosscorrelation
Properties of Crosscorrelation
Examples and Applications of Crosscorrelation
Correlation Matrices for Sampled Functions
Notes and figures are based on or taken from materials in the course textbook: Probabilistic Methods of Signal and System
Analysis (3rd ed.) by George R. Cooper and Clare D. McGillem; Oxford Press, 1999. ISBN: 0-19-512354-9.
B.J. Bazuin, Spring 2015
ECE 3800
Chapter 6: Correlation Functions
We have already seen correlation functions a number of times, in defining how similar one signal is to another, or a signal to itself, as a fixed sequence of random variables (Chaps. 3 and 4).
From probability, the correlation coefficient was defined, with $-1 \le \rho \le 1$:

$$ \rho = \frac{E[XY] - \bar{X}\,\bar{Y}}{\sigma_X\,\sigma_Y} $$

From sampled statistics, Pearson's r was defined, with $-1 \le r \le 1$:

$$ r = \frac{\displaystyle\sum_{i=1}^{n} (y_i - \bar{y})(x_i - \bar{x})}{\sqrt{\left[\displaystyle\sum_{i=1}^{n} (x_i - \bar{x})^2\right]\left[\displaystyle\sum_{i=1}^{n} (y_i - \bar{y})^2\right]}} $$
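As a quick numerical sanity check, Pearson's r can be computed term-by-term from the definition above and compared against a library routine. A minimal sketch using numpy; the noisy linear model for the paired samples is only an illustration, not from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic paired samples: y is a noisy linear function of x (illustrative model).
x = rng.normal(0.0, 1.0, 5000)
y = 2.0 * x + rng.normal(0.0, 1.0, 5000)

# Pearson's r computed directly from the definition.
num = np.sum((y - y.mean()) * (x - x.mean()))
den = np.sqrt(np.sum((x - x.mean()) ** 2) * np.sum((y - y.mean()) ** 2))
r = num / den

print(r)                         # close to 2/sqrt(5) ~ 0.894 for this model
print(np.corrcoef(x, y)[0, 1])   # library value agrees with the definition
```

The hand computation and `np.corrcoef` agree to machine precision, and the estimate stays within the required $-1 \le r \le 1$ bound.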
In defining a wide-sense stationary process, the probabilistic expectation of a random process in time was defined in order to determine stationarity:

$$ E[X(t_1)\,X(t_2)] = E[X(t)\,X(t+\tau)] = E[X(0)\,X(\tau)] $$

In fact, for ergodic processes the probabilistic operations have equivalent time-average statistics, such that

$$ \overline{X^n} = \int_{-\infty}^{\infty} x^n\,f(x)\,dx = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} X^n(t)\,dt $$
From this basis, we are interested in observing signals and systems that are stationary and ergodic, and that have sample statistics that can be readily collected and related to known probability distributions. This allows the mathematical derivation of system design concepts, the simulation of possible outcomes, and the collection of statistics on a prototype system to validate the theoretical performance!
So, on with additional properties and mathematical descriptions for wide-sense stationary (WSS), ergodic random processes.
The Autocorrelation Function
For a sample function defined by samples in time of a random process, how alike are the different samples?

Define:

$$ X_1 = X(t_1) \quad \text{and} \quad X_2 = X(t_2) $$

The autocorrelation is defined as:

$$ R_{XX}(t_1,t_2) = E[X_1 X_2] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1\,x_2\,f(x_1,x_2)\,dx_1\,dx_2 $$

The above function is valid for all processes, stationary and non-stationary.

For WSS processes:

$$ R_{XX}(t_1,t_2) = E[X(t)\,X(t+\tau)] = R_{XX}(\tau) $$

If the process is ergodic, the time average is equivalent to the probabilistic expectation, or

$$ \mathcal{R}_{XX}(\tau) = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} x(t)\,x(t+\tau)\,dt = \langle x(t)\,x(t+\tau)\rangle $$

and

$$ \mathcal{R}_{XX}(\tau) = R_{XX}(\tau) $$

As a note, for things you have been computing, the "zeroth lag of the autocorrelation" is

$$ R_{XX}(t_1,t_1) = R_{XX}(0) = E[X_1 X_1] = E[X_1^2] = \int_{-\infty}^{\infty} x_1^2\,f(x_1)\,dx_1 = \bar{X}^2 + \sigma_X^2 $$

$$ \mathcal{R}_{XX}(0) = \lim_{T\to\infty}\frac{1}{2T}\int_{-T}^{T} x(t)^2\,dt = \langle x(t)^2\rangle $$
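For a finite sampled record, the limiting time average above is approximated by a finite sum over lags. A minimal numpy sketch (the helper name `autocorr_est` and the white-sample test signal are my choices, not the text's):

```python
import numpy as np

rng = np.random.default_rng(1)

# Finite-record estimate of the time-average autocorrelation:
# <x(t) x(t+tau)> ~ (1/(N-k)) * sum_n x[n] x[n+k], for lag tau = k*dt.
def autocorr_est(x, max_lag):
    return np.array([np.mean(x[:len(x) - k] * x[k:]) for k in range(max_lag + 1)])

x = rng.normal(0.0, 2.0, 200_000)   # zero-mean samples with E[X^2] = 4
R = autocorr_est(x, 5)

print(R[0])    # zeroth lag ~ mean-square value, about 4
print(R[1:])   # remaining lags ~ 0 for uncorrelated samples
```

The zeroth lag recovers the mean-square value, exactly as the equations above predict.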
Demonstrating the similarity definition for the autocorrelation
Let

$$ Y(t) = X(t) - a\,X(t+\tau) $$

We wish to find the value of $a$ that minimizes the mean squared value of Y.

Derivation:

$$ \min_a \; E[Y(t)^2] $$

$$ E[Y(t)^2] = E\left[\big(X(t) - a\,X(t+\tau)\big)^2\right] $$

$$ E[Y(t)^2] = E[X(t)^2] - 2a\,E[X(t)\,X(t+\tau)] + a^2\,E[X(t+\tau)^2] $$

Take the derivative with respect to $a$, set the result equal to zero, and determine the value:

$$ 0 = E\left[-2\,X(t)\,X(t+\tau) + 2a\,X(t+\tau)^2\right] $$

$$ E[X(t)\,X(t+\tau)] = a\,E[X(t+\tau)^2] $$

By stationarity, $E[X(t+\tau)^2] = E[X(t)^2]$, so

$$ a\,E[X(t)^2] = R_{XX}(\tau) $$

Therefore,

$$ a = \frac{R_{XX}(\tau)}{E[X(t)^2]} \quad \text{and} \quad Y(t) = X(t) - \frac{R_{XX}(\tau)}{E[X(t)^2]}\,X(t+\tau) $$

where

$$ R_{XX}(0) = E[X(t)^2] = \bar{X}^2 + \sigma_X^2 $$

Text Note: Equation (6-5) gives $a = R_{XX}(\tau)/\sigma_X^2$ if a zero mean value is assumed for X.
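The claim that $a = R_{XX}(\tau)/E[X(t)^2]$ minimizes the mean squared error can be checked numerically. A sketch using an AR(1) sequence as a stand-in correlated WSS process (the AR(1) model, lag, and grid scan are my illustrative choices, not from the text):

```python
import numpy as np

rng = np.random.default_rng(2)

# AR(1) sequence x[n] = rho*x[n-1] + w[n]: a simple correlated WSS process.
rho, N = 0.8, 200_000
w = rng.normal(0.0, 1.0, N)
x = np.zeros(N)
for n in range(1, N):
    x[n] = rho * x[n - 1] + w[n]

k = 3                                # lag, tau = 3 samples
R_tau = np.mean(x[:-k] * x[k:])      # estimate of R_XX(tau)
msq = np.mean(x * x)                 # estimate of E[X(t)^2]
a_opt = R_tau / msq                  # predicted minimizing coefficient

# Brute-force scan: the empirical MSE of x[n] - a*x[n+k] is smallest near a_opt.
a_grid = np.linspace(a_opt - 0.5, a_opt + 0.5, 201)
mse = np.array([np.mean((x[:-k] - a * x[k:]) ** 2) for a in a_grid])
a_best = a_grid[int(np.argmin(mse))]
print(a_opt, a_best)   # agree within the grid spacing; ~rho**3 = 0.512 here
```

For an AR(1) process the normalized lag-k autocorrelation is $\rho^k$, so the scanned minimizer lands at about $0.8^3$, matching the derivation.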
Interpretation of the autocorrelation …
The statistical (or probabilistic) similarity of future (or past) samples of a random sample
function to itself for an ergodic random process.
How similar is a time shifted version of a time function to itself?
Exercise 6-1.1
A random process has a sample function of the form

$$ X(t) = \begin{cases} A, & 0 \le t \le 1 \\ 0, & \text{else} \end{cases} $$

where A is a random variable that is uniformly distributed from 0 to 10.
Find the autocorrelation of the process.
f a  
1
,
10
for 0  a  10
Using
R XX t1 , t 2   EX t1   X t 2 
 
R XX t1 , t 2   E A 2 ,
R XX t1 , t 2  
10
1
 10  a
2
 da,
for 0  t1 , t 2  1
for 0  t1 , t 2  1
0
10
a3
R XX t1 , t 2  
,
30 0
R XX t1 , t 2  
1000 100

,
30
3
for 0  t1 , t 2  1
for 0  t1 , t 2  1
Exercise 6-1.2
Define a random variable Z(t) as

$$ Z(t) = X(t) - X(t+\tau_1) $$

where X(t) is a sample function from a stationary random process whose autocorrelation function is

$$ R_{XX}(\tau) = \exp(-\tau^2) $$

Write an expression for the autocorrelation function of Z(t).

$$ R_{ZZ}(t_1,t_2) = E[Z(t_1)\,Z(t_2)] $$

$$ R_{ZZ}(t_1,t_2) = E\left[\big(X(t_1) - X(t_1+\tau_1)\big)\big(X(t_2) - X(t_2+\tau_1)\big)\right] $$

$$ R_{ZZ}(t_1,t_2) = E[X(t_1)X(t_2)] - E[X(t_1)X(t_2+\tau_1)] - E[X(t_1+\tau_1)X(t_2)] + E[X(t_1+\tau_1)X(t_2+\tau_1)] $$

Note: based on the autocorrelation definition, $\tau = t_2 - t_1$.

$$ R_{ZZ}(t_1,t_2) = R_{XX}(t_2-t_1) - R_{XX}(t_2+\tau_1-t_1) - R_{XX}(t_2-t_1-\tau_1) + R_{XX}(t_2-t_1) $$

$$ R_{ZZ}(t_1,t_2) = 2\,R_{XX}(\tau) - R_{XX}(\tau+\tau_1) - R_{XX}(\tau-\tau_1) $$

$$ R_{ZZ}(t_1,t_2) = 2\exp(-\tau^2) - \exp(-(\tau+\tau_1)^2) - \exp(-(\tau-\tau_1)^2) $$

Note that for this example the autocorrelation is strictly a function of tau and not of t1 or t2. As a result, we expect that the random process is stationary.
Helpful in identifying a stationary random process: a stationary random process will have an autocorrelation that depends on the time difference, not on the absolute times.
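The closed-form R_ZZ can be coded up directly to confirm the two hallmarks noted above: it depends only on $\tau$, and it is even in $\tau$ (numpy sketch; $\tau_1 = 1$ is an arbitrary illustrative choice):

```python
import numpy as np

# R_ZZ(tau) = 2 exp(-tau^2) - exp(-(tau + tau1)^2) - exp(-(tau - tau1)^2)
def R_zz(tau, tau1):
    return (2.0 * np.exp(-tau ** 2)
            - np.exp(-(tau + tau1) ** 2)
            - np.exp(-(tau - tau1) ** 2))

tau = np.linspace(-4.0, 4.0, 401)
R = R_zz(tau, tau1=1.0)

print(np.allclose(R, R[::-1]))   # True: even in tau, as a WSS result must be
print(R_zz(0.0, 1.0))            # E[Z^2] = 2 - 2e^{-1} ~ 1.264
```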
Autocorrelation of a Binary Process
Binary communications signals are discrete, non-deterministic signals with plenty of random variables involved in their generation.
Nominally, a binary bit lasts for a defined period of time, t_a, but the sequence begins at a random delay (uniformly distributed across one bit interval, 0 <= t_0 < t_a). The amplitude may be a fixed constant or a random variable.
[Figure: a bipolar bit stream bit(0) through bit(4), with amplitudes +A and -A, bit period t_a, and random starting offset t_0.]
The bit values are independent, identically distributed with probability p that the amplitude is A and q = 1 - p that the amplitude is -A. The delay density is

$$ f(t_0) = \frac{1}{t_a}, \quad \text{for } 0 \le t_0 \le t_a $$

Determine the autocorrelation of the bipolar binary sequence, assuming p = 0.5.

$$ x(t) = \sum_{k=-\infty}^{\infty} a_k\,\mathrm{rect}\!\left(\frac{t - t_0 - \tfrac{t_a}{2} - k\,t_a}{t_a}\right) $$
R XX t1 , t 2   EX t1   X t 2 
For samples more than one period apart, t1  t 2  t a , we must consider


E a k  a j  p  A  p  A  p  A  1  p    A  1  p    A  p  A  1  p    A  1  p    A



E a k  a j  A 2  p 2 2  p  1  p   1  p 



2


E a k  a j  A 2  4  p 2 4  p  1
For p=0.5




E a k  a j  A 2  4  p 2 4  p  1  0
For samples within one period, $|t_1 - t_2| < t_a$,

$$ E[a_k a_k] = p\,(A)^2 + (1-p)(-A)^2 = A^2 $$

$$ E[a_k a_{k+1}] = A^2\left(4p^2 - 4p + 1\right) = 0 \quad (p = 0.5) $$

There are two regions to consider: the overlap with the sampled bit and the overlap with the adjacent bit. Averaging over the uniformly distributed delay, the overlapping area should be triangular. Therefore,

$$ R_{XX}(\tau) = \frac{1}{t_a}\int_{-t_a/2}^{t_a/2+\tau} E[a_k a_k]\,dt + \frac{1}{t_a}\int_{t_a/2+\tau}^{t_a/2} E[a_k a_{k-1}]\,dt, \quad \text{for } -t_a \le \tau \le 0 $$

$$ R_{XX}(\tau) = \frac{1}{t_a}\int_{-t_a/2+\tau}^{t_a/2} E[a_k a_k]\,dt + \frac{1}{t_a}\int_{-t_a/2}^{-t_a/2+\tau} E[a_k a_{k+1}]\,dt, \quad \text{for } 0 \le \tau \le t_a $$

or, since the adjacent-bit terms are zero for p = 0.5,

$$ R_{XX}(\tau) = A^2\,\frac{1}{t_a}\int_{-t_a/2}^{t_a/2+\tau} 1\,dt, \quad \text{for } -t_a \le \tau \le 0 $$

$$ R_{XX}(\tau) = A^2\,\frac{1}{t_a}\int_{-t_a/2+\tau}^{t_a/2} 1\,dt, \quad \text{for } 0 \le \tau \le t_a $$

Therefore,

$$ R_{XX}(\tau) = \begin{cases} A^2\,\dfrac{t_a+\tau}{t_a}, & \text{for } -t_a \le \tau \le 0 \\[6pt] A^2\,\dfrac{t_a-\tau}{t_a}, & \text{for } 0 \le \tau \le t_a \end{cases} $$

or, recognizing the structure,

$$ R_{XX}(\tau) = A^2\left(1 - \frac{|\tau|}{t_a}\right), \quad \text{for } -t_a \le \tau \le t_a $$

This is simply a triangular function with maximum A^2, extending for a full bit period in both time directions.
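The triangular result can be confirmed by simulating a long bipolar waveform and estimating its time-average autocorrelation. A numpy sketch (the sampling step `dt` and record length are my choices):

```python
import numpy as np

rng = np.random.default_rng(4)

# One long bipolar sample function: bits of +/-A with p = 0.5, period t_a,
# and a random starting offset t_0 uniform over one bit interval.
A, t_a, dt = 1.0, 1.0, 0.02
spb = int(t_a / dt)                              # samples per bit
bits = rng.choice([-A, A], size=20_000)
x = np.repeat(bits, spb)[rng.integers(0, spb):]  # apply the random delay t_0

# Time-average autocorrelation estimate over lags out to 2*t_a.
lags = np.arange(0, 2 * spb)
R_est = np.array([np.mean(x[:len(x) - k] * x[k:]) for k in lags])
tau = lags * dt
R_theory = np.where(tau < t_a, A ** 2 * (1.0 - tau / t_a), 0.0)
print(np.max(np.abs(R_est - R_theory)))          # small estimation error
```

The estimate tracks the triangle $A^2(1 - |\tau|/t_a)$ and is near zero beyond one bit period.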
For unequal bit probability,

$$ R_{XX}(\tau) = \begin{cases} A^2\left(\dfrac{t_a - |\tau|}{t_a} + \left(4p^2 - 4p + 1\right)\dfrac{|\tau|}{t_a}\right), & \text{for } -t_a \le \tau \le t_a \\[6pt] A^2\left(4p^2 - 4p + 1\right), & \text{for } t_a \le |\tau| \end{cases} $$
When there are more of one bit than the other, there is always a positive correlation between bits (the curve is at its minimum for p = 0.5); it peaks at A^2 at τ = 0.
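A Monte Carlo check of the residual correlation $A^2(4p^2 - 4p + 1) = A^2(2p-1)^2$ between independent, widely separated bits (numpy sketch; A = 2 and p = 0.7 are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(5)

# Widely separated bits are independent draws: +A with probability p, -A otherwise.
A, p, N = 2.0, 0.7, 1_000_000
a_k = np.where(rng.random(N) < p, A, -A)
a_j = np.where(rng.random(N) < p, A, -A)

print(np.mean(a_k * a_j))                   # Monte Carlo estimate
print(A ** 2 * (4 * p ** 2 - 4 * p + 1))    # closed form: 0.64 for these values
```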
Note that if the amplitude is a random variable, the expected values of the bit products must be further evaluated, such as

$$ E[a_k a_k] = \sigma^2 + \mu^2 \quad \text{and} \quad E[a_k a_{k+1}] = \mu^2 $$
In general, the autocorrelation of communications signal waveforms is important, particularly
when we discuss the power spectral density later in the textbook.
Examples of discrete waveforms used for communications, signal processing, controls, etc.
(a) Unipolar RZ & NRZ, (b) Polar RZ & NRZ , (c) Bipolar NRZ , (d) Split-phase Manchester,
(e) Polar quaternary NRZ.
From Chap. 11: A. Bruce Carlson, P.B. Crilly, Communication Systems,
5th ed., McGraw-Hill, 2010. ISBN: 978-0-07-338040-7
In general, a periodic bipolar "pulse" that is shorter in duration than the pulse period will have the autocorrelation function

$$ R_{XX}(\tau) = A^2\,\frac{t_w}{t_p}\left(1 - \frac{|\tau|}{t_w}\right), \quad \text{for } -t_w \le \tau \le t_w $$

for a pulse of width t_w existing in a time period t_p, assuming that positive and negative levels are equally likely.
Sinusoidal Signal
Another classic is a sinusoidal signal with a random phase, uniformly distributed from 0 to 2π.
xt   A  sin2    f  t   
R XX t1 , t 2   EX t1   X t 2 
R XX t1 , t 2   EA  sin2    f  t1     A  sin2    f  t 2   
R XX t1 , t 2   A 2  E sin 2    f  t1     sin 2    f  t 2   
1
1

R XX t1 , t 2   A 2  E   cos2    f  t 2  t1    cos2    f  t 2  t1   2   
2
2

R XX t1 , t 2  
A2
A2
 cos2    f  t 2  t1  
 Ecos2    f  t 2  t1   2   
2
2
R XX t1 , t 2  
A2
 cos2    f  t 2  t1 
2
R XX   
A2
 cos2    f   
2
Of note is that the phase need only be uniformly distributed over 0 to π for the expectation of the double-angle term to vanish!
Also of note: if the amplitude is an independent random variable, then

$$ R_{XX}(\tau) = \frac{E[A^2]}{2}\cos(2\pi f \tau) $$
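An ensemble-average check of the sinusoid result (numpy sketch; the times t1 and t2 and the values of A and f are arbitrary illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(6)

# x(t) = A sin(2 pi f t + theta), theta ~ Uniform(0, 2 pi).
# Averaging over the phase ensemble should give (A^2/2) cos(2 pi f (t2 - t1)).
A, f = 2.0, 5.0
t1, t2 = 0.13, 0.31                     # only the difference t2 - t1 matters
theta = rng.uniform(0.0, 2.0 * np.pi, 1_000_000)

x1 = A * np.sin(2 * np.pi * f * t1 + theta)
x2 = A * np.sin(2 * np.pi * f * t2 + theta)

R_mc = np.mean(x1 * x2)                                    # ensemble average
R_th = (A ** 2 / 2) * np.cos(2 * np.pi * f * (t2 - t1))    # closed form
print(R_mc, R_th)
```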
Properties of Autocorrelation Functions
1)

$$ R_{XX}(0) = E[X^2] = \overline{X^2} \quad \text{or} \quad \mathcal{R}_{XX}(0) = \langle x(t)^2\rangle $$

The mean squared value of the random process can be obtained by observing the zeroth lag of the autocorrelation function.
2)

$$ R_{XX}(\tau) = R_{XX}(-\tau) $$

The autocorrelation function is an even function of $\tau$. Only positive (or negative) lags need to be computed for an ergodic, WSS random process.
Note that:

$$ R_{XX}(t_1,t_2) = E[X(t_1)\,X(t_2)] $$

Let $t_1 = t_1$ and $t_2 = t_1 + \tau$; then

$$ R_{XX}(t_1, t_1+\tau) = E[X(t_1)\,X(t_1+\tau)] = R_{XX}(+\tau) $$

Let $t_1 = t_2 - \tau$ and $t_2 = t_2$; then

$$ R_{XX}(t_2-\tau, t_2) = E[X(t_2-\tau)\,X(t_2)] = E[X(t_2)\,X(t_2-\tau)] = R_{XX}(-\tau) $$

But both of the above equations describe exactly the same correlation in time! Therefore, $R_{XX}(\tau) = R_{XX}(-\tau)$.
3)

$$ |R_{XX}(\tau)| \le R_{XX}(0) $$

The autocorrelation function has its maximum magnitude at $\tau = 0$. For periodic functions, other lags may equal the zeroth lag, but never exceed it.

$$ 0 \le E\left[(X_1 \pm X_2)^2\right] = E[X_1^2] \pm 2\,E[X_1 X_2] + E[X_2^2] $$

$$ 0 \le R_{XX}(0) \pm 2\,R_{XX}(\tau) + R_{XX}(0) $$

For the worst case (whichever sign makes the middle term negative),

$$ 0 \le 2\,R_{XX}(0) - 2\,|R_{XX}(\tau)| $$

$$ |R_{XX}(\tau)| \le R_{XX}(0) $$
4)
If X has a DC component, then R_XX has a constant term.

$$ X(t) = \bar{X} + N(t), \quad \text{with } N(t) \text{ zero mean} $$

$$ R_{XX}(\tau) = E[X(t)\,X(t+\tau)] = E\left[\big(\bar{X} + N(t)\big)\big(\bar{X} + N(t+\tau)\big)\right] $$

$$ R_{XX}(\tau) = E\left[\bar{X}^2 + \bar{X}\,N(t+\tau) + N(t)\,\bar{X} + N(t)\,N(t+\tau)\right] $$

$$ R_{XX}(\tau) = \bar{X}^2 + 2\,\bar{X}\,\bar{N} + R_{NN}(\tau) = \bar{X}^2 + R_{NN}(\tau) $$

Note that the mean value can be computed from the constant term of the autocorrelation function!
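A quick numeric illustration of this property with white noise plus a DC level (numpy sketch; the values X̄ = 3 and σ = 1 are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(7)

# X(t) = Xbar + N(t) with N zero-mean white: R_XX(tau) = Xbar^2 + R_NN(tau),
# so the autocorrelation settles at the constant Xbar^2 once R_NN dies out.
Xbar, sigma, N = 3.0, 1.0, 1_000_000
x = Xbar + rng.normal(0.0, sigma, N)

R0 = np.mean(x * x)             # lag 0: Xbar^2 + sigma^2 = 10
R5 = np.mean(x[:-5] * x[5:])    # lag 5: Xbar^2 = 9 (white samples decorrelate)
print(R0, R5)
```

Reading the constant floor of the estimated autocorrelation recovers $\bar{X}^2$, i.e. the mean up to sign.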
5)
If X has a periodic component, then R_XX will also have a periodic component of the same period.
Think of:

$$ X(t) = A\cos(\omega t + \theta), \quad 0 \le \theta \le 2\pi $$

where A and ω are known constants and θ is a uniform random variable.

$$ R_{XX}(\tau) = E[X(t)\,X(t+\tau)] = E[A\cos(\omega t + \theta)\cdot A\cos(\omega t + \omega\tau + \theta)] $$

$$ R_{XX}(\tau) = A^2\,E\left[\frac{1}{2}\cos(2\omega t + \omega\tau + 2\theta) + \frac{1}{2}\cos(\omega\tau)\right] $$

$$ R_{XX}(\tau) = \frac{A^2}{2}\cos(\omega\tau) + \frac{A^2}{2}\,E\left[\cos(2\omega t + \omega\tau + 2\theta)\right] $$

$$ R_{XX}(\tau) = \frac{A^2}{2}\cos(\omega\tau) + \frac{A^2}{2}\int_0^{2\pi}\cos(2\omega t + \omega\tau + 2\theta)\,\frac{1}{2\pi}\,d\theta $$

$$ R_{XX}(\tau) = \frac{A^2}{2}\cos(\omega\tau) + 0 = \frac{A^2}{2}\cos(\omega\tau) $$
For a more general case where

$$ X(t) = A\cos(\omega t + \theta) + N(t), \quad 0 \le \theta \le 2\pi $$

where A and ω are known constants, θ is a uniform random variable, and N is a random process with θ and N independent:

$$ R_{XX}(\tau) = \frac{A^2}{2}\cos(\omega\tau) + R_{NN}(\tau) $$
5b)
For signals that are the sum of independent random processes, the autocorrelation is the sum of the individual autocorrelation functions (plus a constant involving the means).

$$ W(t) = X(t) + Y(t) $$

$$ R_{WW}(\tau) = E[W(t)\,W(t+\tau)] = E\left[\big(X(t)+Y(t)\big)\big(X(t+\tau)+Y(t+\tau)\big)\right] $$

$$ R_{WW}(\tau) = E[X(t)X(t+\tau)] + E[X(t)Y(t+\tau)] + E[Y(t)X(t+\tau)] + E[Y(t)Y(t+\tau)] $$

$$ R_{WW}(\tau) = R_{XX}(\tau) + E[X(t)]\,E[Y(t+\tau)] + E[Y(t)]\,E[X(t+\tau)] + R_{YY}(\tau) $$

$$ R_{WW}(\tau) = R_{XX}(\tau) + R_{YY}(\tau) + 2\,\bar{X}\,\bar{Y} $$

For non-zero mean functions (let w, x, y be zero mean and W, X, Y have means):

$$ R_{WW}(\tau) = R_{ww}(\tau) + \bar{W}^2 = R_{xx}(\tau) + \bar{X}^2 + R_{yy}(\tau) + \bar{Y}^2 + 2\,\bar{X}\,\bar{Y} $$

$$ R_{WW}(\tau) = R_{ww}(\tau) + \bar{W}^2 = R_{xx}(\tau) + R_{yy}(\tau) + \left(\bar{X} + \bar{Y}\right)^2 $$

Since $\bar{W} = \bar{X} + \bar{Y}$,

$$ R_{ww}(\tau) = R_{xx}(\tau) + R_{yy}(\tau) $$
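Property 5b can be checked on white sequences with non-zero means (numpy sketch; the means, variances, and lag are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(8)

# Independent processes with non-zero means: Xbar = 1, Ybar = 2.
N, k = 1_000_000, 2
x = 1.0 + rng.normal(0.0, 1.0, N)
y = 2.0 + rng.normal(0.0, 0.5, N)
w = x + y                        # W(t) = X(t) + Y(t)

Rw = np.mean(w[:-k] * w[k:])     # R_WW at lag k
Rx = np.mean(x[:-k] * x[k:])
Ry = np.mean(y[:-k] * y[k:])
print(Rw, Rx + Ry + 2 * 1.0 * 2.0)   # R_WW = R_XX + R_YY + 2*Xbar*Ybar
```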
6)
If X is ergodic and zero mean and has no periodic component, then

$$ \lim_{|\tau|\to\infty} R_{XX}(\tau) = 0 $$

No formal proof is provided. The logical argument is that, eventually, new samples of a non-deterministic random process become uncorrelated; once the samples are uncorrelated, the autocorrelation goes to zero.
7)
Autocorrelation functions cannot have an arbitrary shape. One way of specifying the permissible shapes is in terms of the Fourier transform of the autocorrelation function. That is, if

$$ \mathcal{F}\{R_{XX}(\tau)\} = \int_{-\infty}^{\infty} R_{XX}(\tau)\,\exp(-j\omega\tau)\,d\tau $$

then the restriction states that

$$ \mathcal{F}\{R_{XX}(\tau)\} \ge 0, \quad \text{for all } \omega $$

Facts affecting this concept: (a) the autocorrelation is a real, even function, so its Fourier transform is real and even (the imaginary part is zero); (b) since it is even, the Fourier coefficients that describe the autocorrelation must be zero for all sine terms and may be non-zero only for the cosine terms.
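This property can be illustrated with the triangular autocorrelation of the binary process: its transform is (approximately) a sinc²-shaped, real, non-negative spectrum. A numpy/FFT sketch with A = t_a = 1; the grid sizes are my choices:

```python
import numpy as np

# Sample the triangular R_XX(tau) = 1 - |tau| (A = t_a = 1) on a symmetric grid.
tau = np.linspace(-1.0, 1.0, 2001)
dtau = tau[1] - tau[0]
R = 1.0 - np.abs(tau)

# Approximate the Fourier transform with an FFT; ifftshift moves tau = 0 to index 0.
S = np.fft.fft(np.fft.ifftshift(R)) * dtau

print(np.max(np.abs(S.imag)))    # ~0: a real, even R_XX has a real transform
print(S.real.min() >= -1e-9)     # and the spectrum is non-negative
```

A candidate function whose transform went negative anywhere could not be a valid autocorrelation.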
Additional concept: scaling.

$$ X(t) = a\,N(t) $$

$$ R_{XX}(\tau) = E[X(t)\,X(t+\tau)] = E[a\,N(t)\cdot a\,N(t+\tau)] = a^2\,R_{NN}(\tau) $$