ECE 3800 Probabilistic Methods of Signal and System Analysis Review

Based on Probabilistic Methods of Signal and System Analysis (3rd ed.) by George R. Cooper and Clare D. McGillem; Oxford University Press.
1. Introduction to Probability
1.1. Engineering Applications of Probability
1.2. Random Experiments and Events
1.3. Definitions of Probability
Experiment
Possible Outcomes
Trials
Event
Equally Likely Events/Outcomes
Objects
Attribute
Sample Space
With Replacement and Without Replacement
1.4. The Relative-Frequency Approach
r(A) = N_A / N

Pr(A) = lim_{N→∞} r(A), where Pr(A) is defined as the probability of event A.

1. 0 ≤ Pr(A) ≤ 1
2. Pr(A) + Pr(B) + Pr(C) + … = 1, for mutually exclusive events
3. An impossible event, A, can be represented as Pr(A) = 0.
4. A certain event, A, can be represented as Pr(A) = 1.
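The relative-frequency definition can be sketched by simulation (the event and die below are my own hypothetical example, not from the text):

```python
import random

# Relative-frequency estimate of a probability: r(A) = N_A / N, and
# Pr(A) is the limit of r(A) as N grows. Hypothetical event A: rolling
# a 5 or 6 on a fair die, so the true Pr(A) = 2/6 = 1/3.
random.seed(0)

N = 100_000
N_A = sum(1 for _ in range(N) if random.randint(1, 6) > 4)  # count of A
r_A = N_A / N  # relative frequency after N trials; tends toward 1/3
```

As N increases, r(A) settles near 1/3, illustrating properties 1 and 2 above.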
1.5. Elementary Set Theory
Set
Subset
Space
Null Set or Empty Set
Venn Diagram
Equality
Sum or Union
Products or Intersection
Mutually Exclusive or Disjoint Sets
Complement
Notes and figures are based on or taken from materials in the course textbook: Probabilistic Methods of Signal and System
Analysis (3rd ed.) by George R. Cooper and Clare D. McGillem; Oxford Press, 1999. ISBN: 0-19-512354-9.
Differences
Proofs of Set Algebra
1.6. The Axiomatic Approach
1.7. Conditional Probability
Pr  A  B   Pr  A | B   Pr B  , for PrB   0
Pr  A  B 
Pr  A | B  
, for PrB   0
Pr B 
Joint Probability
Pr A, B 
Pr  A | B   Pr A when A follows B
Pr  A, B   Pr B, A  Pr  A | B   Pr B   Pr B | A  Pr  A
Marginal Probabilities
Total Probability
Pr B   Pr B | A1   Pr  A1   Pr B | A2   Pr  A2     Pr B | An   Pr  An 
Bayes Theorem
Pr B | Ai   Pr  Ai 
Pr  Ai | B  
Pr B | A1   Pr  A1   Pr B | A2   Pr  A2     Pr B | An   Pr  An 
1.8. Independence
Pr  A, B   Pr B, A  Pr  A  Pr B 
1.9. Combined Experiments
1.10. Bernoulli Trials
Pr(A occurring k times in n trials) = p_n(k) = C(n, k) · p^k · q^(n−k)
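The Bernoulli-trial result above is a one-liner in code; the example numbers are my own:

```python
from math import comb

# p_n(k) = C(n, k) * p^k * q^(n-k), the Bernoulli-trial result above.
def p_n(k: int, n: int, p: float) -> float:
    q = 1 - p
    return comb(n, k) * p**k * q**(n - k)

# e.g. exactly 3 successes in 10 trials with p = 0.5:
prob = p_n(3, 10, 0.5)  # C(10,3)/2^10 = 120/1024 ≈ 0.117
```

Summing p_n(k) over k = 0 … n gives 1, as any valid probability assignment must.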
1.11. Applications of Bernoulli Trials
2. Random Variables
2.1. Concept of a Random Variable
2.2. Distribution Functions
Probability Distribution Function (PDF)
• 0 ≤ F_X(x) ≤ 1, for −∞ < x < ∞
• F_X(−∞) = 0 and F_X(∞) = 1
• F_X is non-decreasing as x increases
• Pr(x1 < X ≤ x2) = F_X(x2) − F_X(x1)
For discrete events
For continuous events
2.3. Density Functions
Probability Density Function (pdf)
• f_X(x) ≥ 0, for −∞ < x < ∞
• ∫_{−∞}^{∞} f_X(x) dx = 1
• F_X(x) = ∫_{−∞}^{x} f_X(u) du
• Pr(x1 < X ≤ x2) = ∫_{x1}^{x2} f_X(x) dx

Probability Mass Function (pmf)
• f_X(x) ≥ 0, for −∞ < x < ∞
• Σ_{u=−∞}^{∞} f_X(u) = 1
• F_X(x) = Σ_{u=−∞}^{x} f_X(u)
• Pr(x1 < X ≤ x2) = Σ_{u=x1}^{x2} f_X(u)

Functions of random variables

f_Y(y) = f_X(x) · |dx/dy|
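The transformation rule f_Y(y) = f_X(x)·|dx/dy| can be sanity-checked by simulation; the transformation Y = X² with X uniform on (0, 1) is my own example, not from the text:

```python
import random

# For Y = X^2 with X uniform(0,1): x = sqrt(y), |dx/dy| = 1/(2 sqrt(y)),
# so f_Y(y) = 1 / (2 sqrt(y)). At y0 = 0.25 the formula gives exactly 1.0.
random.seed(1)

y0, half_width = 0.25, 0.01
N = 200_000
hits = sum(1 for _ in range(N)
           if abs(random.random() ** 2 - y0) < half_width)
f_Y_estimate = hits / (N * 2 * half_width)  # histogram-style density estimate
f_Y_formula = 1 / (2 * y0 ** 0.5)           # = 1.0
```

The narrow-bin frequency estimate lands close to the analytic density, confirming the Jacobian factor.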
2.4. Mean Values and Moments
1st, general, nth Moments

X̄ = E[X] = ∫_{−∞}^{∞} x · f_X(x) dx or X̄ = E[X] = Σ_{x=−∞}^{∞} x · Pr(X = x)

E[g(X)] = ∫_{−∞}^{∞} g(x) · f_X(x) dx or E[g(X)] = Σ_{x=−∞}^{∞} g(x) · Pr(X = x)

X̄^n = E[X^n] = ∫_{−∞}^{∞} x^n · f_X(x) dx or X̄^n = E[X^n] = Σ_{x=−∞}^{∞} x^n · Pr(X = x)

Central Moments

E[(X − X̄)^n] = ∫_{−∞}^{∞} (x − X̄)^n · f_X(x) dx or E[(X − X̄)^n] = Σ_{x=−∞}^{∞} (x − X̄)^n · Pr(X = x)

Variance and Standard Deviation

σ² = E[(X − X̄)²] = ∫_{−∞}^{∞} (x − X̄)² · f_X(x) dx or σ² = Σ_{x=−∞}^{∞} (x − X̄)² · Pr(X = x)

σ² = E[X²] − X̄²
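Sample versions of these moment definitions, for a hypothetical uniform(0, 1) random variable (true mean 1/2, true E[X²] = 1/3, true variance 1/12):

```python
import random
from statistics import mean

# Sample estimates of the 1st moment, 2nd moment, and central 2nd moment.
random.seed(2)
xs = [random.random() for _ in range(100_000)]

m1 = mean(xs)                   # 1st moment, E[X]
m2 = mean(x ** 2 for x in xs)   # 2nd moment, E[X^2]
var = m2 - m1 ** 2              # central 2nd moment, E[X^2] - (E[X])^2
```

This mirrors the σ² = E[X²] − X̄² identity above: the variance falls out of the first two moments.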
2.5. The Gaussian Random Variable
f_X(x) = (1 / (√(2π) · σ)) · exp(−(x − X̄)² / (2σ²)), for −∞ < x < ∞

where X̄ is the mean and σ² is the variance.

F_X(x) = (1 / (√(2π) · σ)) · ∫_{−∞}^{x} exp(−(v − X̄)² / (2σ²)) dv

Unit Normal (Appendix D)

Φ(x) = (1 / √(2π)) · ∫_{−∞}^{x} exp(−u² / 2) du

Φ(−x) = 1 − Φ(x)

F_X(x) = Φ((x − X̄) / σ) or F_X(x) = 1 − Φ(−(x − X̄) / σ)
The Q-function is the complement of the normal distribution function Φ (Appendix E):

Q(x) = (1 / √(2π)) · ∫_{x}^{∞} exp(−u² / 2) du = 1 − Φ(x)
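In code, Q(x) is conveniently computed from the complementary error function; the identity Q(x) = ½·erfc(x/√2) is standard, though the specific values checked below are my own examples:

```python
from math import erfc, sqrt

# Q(x) = 1 - Phi(x) = 0.5 * erfc(x / sqrt(2))
def Q(x: float) -> float:
    return 0.5 * erfc(x / sqrt(2))

q0 = Q(0.0)   # 0.5: half of the unit normal lies above the mean
q1 = Q(1.0)   # ≈ 0.1587, the familiar one-sigma tail probability
```

These values match the Appendix E style tables without any interpolation.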
2.6. Density Functions Related to Gaussian
2.7. Other Probability Density Functions
Exponential Distribution
f_T(τ) = (1/M) · exp(−τ/M), for 0 ≤ τ; 0, for τ < 0

F_T(τ) = 1 − exp(−τ/M), for 0 ≤ τ; 0, for τ < 0

T̄ = E[T] = M

E[T²] = 2M²

σ_T² = E[(T − T̄)²] = E[T²] − T̄² = 2M² − M² = M²
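The exponential moments above check out by simulation (M = 2.0 is an arbitrary hypothetical choice):

```python
import random
from statistics import mean

# Checking E[T] = M and sigma_T^2 = M^2 for the exponential density above.
random.seed(3)
M = 2.0
ts = [random.expovariate(1 / M) for _ in range(200_000)]

t_mean = mean(ts)                               # ≈ M
t_var = mean(t ** 2 for t in ts) - t_mean ** 2  # ≈ 2M^2 - M^2 = M^2
```

Both the mean and the standard deviation of an exponential equal M, a quick way to spot one in measured data.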
Binomial Distribution
f B x  
FB  x  
n
 n k
   p  1  p n  k    x  k 
k
k 0 

n
 n
  k   p k  1  p n  k  ux  k 
k 0
2.8. Conditional Probability Distribution and Density Functions
Pr  A  B   Pr  A | B   Pr B  , for PrB   0
Pr  A  B 
Pr  A | B  
, for PrB   0
Pr B 
Pr  A  B  Pr  A, B 
Pr  A | B  

, for PrB   0
Pr B 
Pr B 
It can be shown that F(x|M) is a valid probability distribution function with all the expected characteristics:

• 0 ≤ F(x|M) ≤ 1, for −∞ < x < ∞
• F(−∞|M) = 0 and F(∞|M) = 1
• F(x|M) is non-decreasing as x increases
• Pr(x1 < X ≤ x2 | M) = F(x2|M) − F(x1|M)
2.9. Examples and Applications
3. Several Random Variables
3.1. Two Random Variables
Joint Probability Distribution Function (PDF)
F  x, y   Pr  X  x, Y  y 
for    x   and    y  
 0  F  x, y   1,
 F  , y   F  x,   F  ,   0
 F ,    1
 F  x, y  is non-decreasing as either x or y increases
 F  x,    FX  x  and F , y   FY  y 
Joint Probability Density Function (pdf)
 2 FX x 
f  x, y  
xy
for    x   and    y  
 f  x, y   0,
 

  f x, y   dx  dy  1
 
y


F  x, y  
x
  f u, v  du  dv
 

f X x  
 f x, y   dy
and f Y  y  
 f x, y   dx




Pr  x1  X  x2 , y1  Y  y 2  
y 2 x2
  f x, y   dx  dy
y1 x1
Expected Values
E g  X , Y  
 
  g x, y   f x, y   dx  dy
  
Correlation
EX  Y  
 
  x  y  f x, y   dx  dy
 
3.2. Conditional Probability--Revisited
Pr  X  x | M  F  x, y 

FY  y 
Pr M 
F  x, y 2   F  x, y1 
FX  x | y1  Y  y 2  
FY  y 2   FY  y1 
f  x, y 
FX x | Y  y  
fY  y 
f  x, y 
FY  y | X  x  
f X x 
f  y | x   f X x 
f x | y  
fY  y 
f  x, y   f  x | Y  y   f Y  y   f  y | X  x   f X  x 
f  y | X  x   f X x 
f x | Y  y  
fY  y 
f x | Y  y   fY  y 
f  y | X  x 
f X x 
FX x | Y  y  
3.3. Statistical Independence
f  x, y   f X  x   f Y  y 
E X  Y   E X   E Y   X  Y
3.4. Correlation between Random Variables
E[X · Y] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x · y · f(x, y) dx dy

Covariance

E[(X − E[X]) · (Y − E[Y])] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − X̄) · (y − Ȳ) · f(x, y) dx dy

Correlation coefficient or normalized covariance, ρ

ρ = E[((X − X̄)/σ_X) · ((Y − Ȳ)/σ_Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} ((x − X̄)/σ_X) · ((y − Ȳ)/σ_Y) · f(x, y) dx dy

ρ = (E[X · Y] − X̄ · Ȳ) / (σ_X · σ_Y)
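A sample estimate of the normalized covariance, for hypothetical correlated data Y = X + independent noise (true ρ = 1/√2 ≈ 0.707):

```python
import random
from statistics import mean, pstdev

# rho = C_XY / (sigma_X * sigma_Y), estimated from samples.
random.seed(4)
xs = [random.gauss(0, 1) for _ in range(50_000)]
ys = [x + random.gauss(0, 1) for x in xs]

mx, my = mean(xs), mean(ys)
C_XY = mean((x - mx) * (y - my) for x, y in zip(xs, ys))  # covariance
rho = C_XY / (pstdev(xs) * pstdev(ys))                    # ≈ 0.707
```

The normalization guarantees −1 ≤ ρ ≤ 1 regardless of the scale of either variable.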
3.5. Density Function of the Sum of Two Random Variables
Z  X Y
y
x
  f u, v  du  dv
F  x, y  
 


FZ  z  
fY  y  

f Z z  
z y
 f X x dx  dy





 f X x  fY z  x  dx   fY  y   f X z  y   dy
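The convolution result can be checked by simulation; the choice of two independent uniform(0, 1) variables is my own example (their sum has a triangular density on (0, 2) with peak f_Z(1) = 1):

```python
import random

# Density of Z = X + Y near z0 = 1, estimated by narrow-bin counting.
random.seed(5)
N, z0, half_width = 200_000, 1.0, 0.01
hits = sum(1 for _ in range(N)
           if abs(random.random() + random.random() - z0) < half_width)
f_Z_estimate = hits / (N * 2 * half_width)  # ≈ 1.0, the triangular peak
```

The triangular shape is exactly the convolution of two unit rectangles, matching f_Z(z) = ∫ f_X(x)·f_Y(z − x) dx.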
3.6. Probability Density Function of a Function of Two Random Variables
3.7. The Characteristic Function
φ(u) = E[exp(j·u·X)]

φ(u) = ∫_{−∞}^{∞} f(x) · exp(j·u·x) dx

The inverse of the characteristic function is then defined as:

f(x) = (1/2π) · ∫_{−∞}^{∞} φ(u) · exp(−j·u·x) du

Computing other moments is performed similarly, where:

d^n φ(u)/du^n = ∫_{−∞}^{∞} f(x) · (j·x)^n · exp(j·u·x) dx

Evaluating at u = 0:

d^n φ(u)/du^n |_{u=0} = ∫_{−∞}^{∞} (j·x)^n · f(x) dx = j^n · ∫_{−∞}^{∞} x^n · f(x) dx = j^n · E[X^n]
4. Elements of Statistics
4.1. Introduction
4.2. Sampling Theory--The Sample Mean
Sample Mean

X̂ = (1/n) · Σ_{i=1}^{n} X_i, where X_i are random variables with a pdf.
Variance of the sample mean
Var(X̂) = E[X̂²] − X̄² = (1/n²) · [n · E[X²] + n(n − 1) · X̄²] − X̄² = (E[X²] − X̄²)/n = σ²/n

For sampling without replacement from a finite population of size N:

Var(X̂) = (σ²/n) · (N − n)/(N − 1)
4.3. Sampling Theory--The Sample Variance
S² = (1/n) · Σ_{i=1}^{n} (X_i − X̂)²

E[S²] = ((n − 1)/n) · σ² (biased)

For sampling without replacement from a finite population of size N:

E[S²] = (N/(N − 1)) · ((n − 1)/n) · σ²

The unbiased sample variance is

S̃² = (n/(n − 1)) · S² = (1/(n − 1)) · Σ_{i=1}^{n} (X_i − X̂)²

E[S̃²] = (n/(n − 1)) · E[S²] = σ²
4.4. Sampling Distributions and Confidence Intervals
Gaussian

Z = (X̂ − X̄) / (σ/√n)

Student's t distribution

T = (X̂ − X̄) / (S̃/√n) = (X̂ − X̄) / (S/√(n − 1))

Confidence interval:

X̂ − k·σ/√n ≤ X̄ ≤ X̂ + k·σ/√n
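A k-sigma interval from the Gaussian sampling statistic, with hypothetical measurement data and k = 1.96 (≈95% confidence); for n this small the Student's t multiplier would be the more careful choice:

```python
from math import sqrt
from statistics import mean, stdev

# Interval: X-hat ± k * S~ / sqrt(n), using the unbiased sample std dev.
data = [4.9, 5.1, 5.0, 4.8, 5.2, 5.0, 4.95, 5.05]
n = len(data)
x_hat = mean(data)   # sample mean
s = stdev(data)      # unbiased sample standard deviation S~
k = 1.96

lo = x_hat - k * s / sqrt(n)
hi = x_hat + k * s / sqrt(n)
```

The interval tightens as √n grows, consistent with Var(X̂) = σ²/n from section 4.2.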
4.5. Hypothesis Testing
One-tail or two-tail testing
4.6. Curve Fitting and Linear Regression
For the least-squares linear fit Ŷ = a + b·X:

b = C_XY / C_XX = (R_XY − X̂·Ŷ) / (R_XX − X̂²)

a = Ŷ − b·X̂
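A minimal sketch of the least-squares fit in the moment notation above (the data points are hypothetical, generated near y = 2x + 1):

```python
from statistics import mean

# Least-squares fit y ≈ a + b*x with b = C_XY / C_XX, a = Y-hat - b*X-hat.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 7.0, 8.8]

x_hat, y_hat = mean(xs), mean(ys)
C_XX = mean((x - x_hat) ** 2 for x in xs)
C_XY = mean((x - x_hat) * (y - y_hat) for x, y in zip(xs, ys))

b = C_XY / C_XX        # slope
a = y_hat - b * x_hat  # intercept
```

The slope is just the covariance normalized by the input variance, the same moment ratio used for the correlation coefficient.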
4.7. Correlation between Two Sets of Data
E[X²] = R_XX ≈ (1/n) · Σ_{i=1}^{n} x_i²

C_XX = R_XX − X̄² ≈ (1/n) · Σ_{i=1}^{n} x_i² − ((1/n) · Σ_{i=1}^{n} x_i)²

R_XY = E[X · Y] ≈ (1/n) · Σ_{i=1}^{n} x_i · y_i

C_XY = E[(X − X̄) · (Y − Ȳ)] = R_XY − X̄·Ȳ ≈ (1/n) · Σ_{i=1}^{n} x_i · y_i − X̄·Ȳ

ρ_XY = C_XY / (σ_X · σ_Y)
5. Random Processes
5.1. Introduction
Ensemble
For example, assume that there is a known AM signal transmitted:
st   1  b  At   sinw  t 
at an undetermined distance the signal is received as
yt   1  b  At   sinw  t   ,
0    2 
The received signal is mixed and low pass filtered …
xt   ht    yt   cosw  t   ht   1  b  At   sinw  t     cosw  t ,0    2  
xt   ht    yt   cosw  t   ht   1  b  At   0.5  sin 2  w  t     sin ,0    2  
If the filter removes the 2wt term, we have
1  b  At   sin  ,0    2  
xt   ht    y t   cosw  t  
2
Notice that based on the value of the random variable, the output can change significantly! From
producing no output signal, (   0,  ), to having the output be positive or negative (
  0 to  or  to 2 ). P.S. This is not how you perform non-coherent AM demodulation.
To perform coherent AM demodulation, all I need to do is measured the value of the random
variable and use it to insure that the output is a maximum (i.e. mix with
cosw  t   m , where m   t1  .
5.2. Continuous and Discrete Random Processes
5.3. Deterministic and Nondeterministic Random Processes
5.4. Stationary and Nonstationary Random Processes
The requirement that all marginal and joint density functions be independent of the choice of
time origin is frequently more stringent (tighter) than is necessary for system analysis. A more
relaxed requirement is called stationary in the wide sense: where the mean value of any random
variable is independent of the choice of time, t, and that the correlation of two random variables
depends only upon the time difference between them. That is
E  X t   X   X and
E X t1   X t 2   E X 0   X t 2  t1   X 0   X    R XX   for   t 2  t1
You will typically deal with Wide-Sense Stationary Signals.
5.5. Ergodic and Nonergodic Random Processes
A Process for Determining Stationarity and Ergodicity
a) Find the mean and the 2nd moment based on the probability
b) Find the time sample mean and time sample 2nd moment based on time
averaging.
c) If the means or 2nd moments are functions of time … non-stationary
d) If the time-average mean and moments are not equal to the probabilistic mean and moments, or if it is not stationary, then it is non-ergodic.
For ergodic processes, all the statistics can be determined from a single function of the process.
This may also be stated based on the time averages. For an ergodic process, the time averages
(expected values) equal the ensemble averages (expected values). That is to say,
X̄^n = ∫_{−∞}^{∞} x^n · f(x) dx = lim_{T→∞} (1/2T) · ∫_{−T}^{T} X^n(t) dt
Note that ergodicity cannot exist unless the process is stationary!
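A sketch of ergodicity with a random-phase sinusoid x(t) = sin(ω·t + θ) (my own example): the time average of x²(t) over one long realization approaches the ensemble average E[X²] = 1/2, regardless of the drawn phase.

```python
import math
import random

# Time-average second moment over a single realization of the process.
random.seed(7)
theta = random.uniform(0, 2 * math.pi)  # this realization's random phase
w = 2 * math.pi

N, T = 100_000, 1000.0                  # samples and averaging window
dt = T / N
time_avg_sq = sum(math.sin(w * k * dt + theta) ** 2
                  for k in range(N)) * dt / T
```

Any other draw of θ gives the same time average, which is exactly what the equality of time and ensemble averages claims.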
5.6. Measurement of Process Parameters
5.7. Smoothing Data with a Moving Window Average
6. Correlation Functions
6.1. Introduction
6.2. Example: Autocorrelation Function of a Binary Process
R XX t1 , t 2   E  X 1 X 2  




 dx1  dx2  x1x2 f x1, x2 
The above function is valid for all processes, stationary and non-stationary.
For WSS processes:
R XX t1 , t 2   E X t X t     R XX  
If the process is ergodic, the time average is equivalent to the probabilistic expectation, or
ℛ_XX(τ) = lim_{T→∞} (1/2T) · ∫_{−T}^{T} x(t) · x(t + τ) dt = ⟨x(t) · x(t + τ)⟩

and

ℛ_XX(τ) = R_XX(τ)
6.3. Properties of Autocorrelation Functions
1) R_XX(0) = E[X²] (the mean-square value), or ℛ_XX(0) = ⟨x(t)²⟩
2) R_XX(τ) = R_XX(−τ), an even function
3) |R_XX(τ)| ≤ R_XX(0)
4) If X has a DC component, then R_XX has a constant component.
5) If X has a periodic component, then R_XX will also have a periodic component of the same period.
6) If X is ergodic and zero mean and has no periodic component, then lim_{τ→∞} R_XX(τ) = 0.
7) Autocorrelation functions cannot have an arbitrary shape. One way of specifying permissible shapes is in terms of the Fourier transform of the autocorrelation function. That is, if

S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) · exp(−j·ω·τ) dτ

then the restriction states that

S_XX(ω) ≥ 0, for all ω
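A time-average autocorrelation estimate, for a hypothetical white unit-variance sequence (R(0) should be near the variance and nonzero lags near zero):

```python
import random

# R-hat(k) = mean of x[n] * x[n+k], the sample autocorrelation at lag k.
random.seed(8)
N = 50_000
x = [random.gauss(0, 1) for _ in range(N)]

def R_hat(k: int) -> float:
    return sum(x[n] * x[n + k] for n in range(N - k)) / (N - k)

r0, r5 = R_hat(0), R_hat(5)  # near 1.0 and near 0.0 respectively
```

This is exactly property 1 in action: the zeroth lag recovers the mean-square value.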
6.4. Measurement of Autocorrelation Functions
6.5. Examples of Autocorrelation Functions
6.6. Crosscorrelation Functions
The cross-correlation is defined as:
R XY t1 , t 2   E  X 1Y2  
RYX t1 , t 2   E Y1 X 2  
For jointly WSS processes:
and








 dx1  dy2  x1 y2 f x1, y2 
 dy1  dx2  y1x2 f  y1, x2 
R XY t1 , t 2   E X t Y t     R XY  
RYX t1 , t 2   EY t X t     RYX  
 XY    R XY  
YX    RYX  
6.7. Properties of Cross-correlation Functions
1) The properties of the zeroth lag have no particular significance and do not represent mean-square values. It is true that the ordered crosscorrelations are equal at 0:

R_XY(0) = R_YX(0), or ℛ_XY(0) = ℛ_YX(0)

2) Crosscorrelation functions are not generally even functions. There is an antisymmetry to the ordered crosscorrelations:

R_XY(τ) = R_YX(−τ)

3) The crosscorrelation does not necessarily have its maximum at the zeroth lag. This makes sense if you are correlating a signal with a time-delayed version of itself. The crosscorrelation should be a maximum when the lag equals the time delay!

4) If X and Y are statistically independent, then the ordering is not important:

R_XY(τ) = E[X(t) · Y(t + τ)] = E[X(t)] · E[Y(t + τ)] = X̄ · Ȳ

and

R_XY(τ) = X̄ · Ȳ = R_YX(τ)

5) If X is a stationary random process and is differentiable with respect to time, the crosscorrelation of the signal and its derivative is given by

R_XẊ(τ) = dR_XX(τ)/dτ
6.8. Examples and Applications of Crosscorrelation Functions
6.9. Correlation Matrices for Sampled Functions
7. Spectral Density
7.1. Introduction
Therefore, we can define a power spectral density for the ensemble as:
S XX w  R XX   

 R XX    exp iw   d

S XX w  R XX  
R XX     1 S XX w
1
R XX t  
2

 S XX w  expiwt   dw

7.2. Relation of Spectral Density to the Fourier Transform
S_XX(ω) = lim_{T→∞} (1/2T) · E[X_T(ω) · X_T(−ω)] = lim_{T→∞} (1/2T) · E[|X_T(ω)|²]
7.3. Properties of Spectral Density
The power spectral density as a function is always
• real,
• positive,
• and an even function in ω.
7.4. Spectral Density and the Complex Frequency Plane
7.5. Mean-Square Values From Spectral Density
The mean squared value of a random process is equal to the 0th lag of the autocorrelation
E[X²] = R_XX(0) = (1/2π) · ∫_{−∞}^{∞} S_XX(ω) · exp(j·ω·0) dω = (1/2π) · ∫_{−∞}^{∞} S_XX(ω) dω

E[X²] = R_XX(0) = ∫_{−∞}^{∞} S_XX(f) · exp(j·2π·f·0) df = ∫_{−∞}^{∞} S_XX(f) df


As a note, since the PSD is real and symmetric, the integral can be performed as
E[X²] = R_XX(0) = 2 · (1/2π) · ∫_{0}^{∞} S_XX(ω) dω

E[X²] = R_XX(0) = 2 · ∫_{0}^{∞} S_XX(f) df
7.6. Relation of Spectral Density to the Autocorrelation Function
The Fourier Transform in f
S_XX(f) = ∫_{−∞}^{∞} R_XX(τ) · exp(−j·2π·f·τ) dτ

R_XX(τ) = ∫_{−∞}^{∞} S_XX(f) · exp(j·2π·f·τ) df
7.7. White Noise
As a result, we define “White Noise” as
R XX    S 0   t 
N
S XX w   S 0  0
2
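A discrete sketch of the white-noise pair above (my own toy example): an impulse autocorrelation R[k] = S0·δ[k] has a flat spectrum S[m] = S0 at every frequency.

```python
import cmath

# DFT of an impulse autocorrelation: S[m] = sum_k R[k] exp(-j 2 pi m k / N).
S0, N = 3.0, 16
R = [S0 if k == 0 else 0.0 for k in range(N)]

S = [sum(R[k] * cmath.exp(-2j * cmath.pi * m * k / N) for k in range(N))
     for m in range(N)]
# every S[m] equals S0: a perfectly flat ("white") spectrum
```

Zero correlation between distinct samples is what spreads the power uniformly over all frequencies.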
7.8. Cross-Spectral Density
The Fourier Transform in ω

S_XY(ω) = ∫_{−∞}^{∞} R_XY(τ) · exp(−j·ω·τ) dτ and S_YX(ω) = ∫_{−∞}^{∞} R_YX(τ) · exp(−j·ω·τ) dτ

R_XY(τ) = (1/2π) · ∫_{−∞}^{∞} S_XY(ω) · exp(j·ω·τ) dω and R_YX(τ) = (1/2π) · ∫_{−∞}^{∞} S_YX(ω) · exp(j·ω·τ) dω

Properties of the functions

S_XY(ω) = conj(S_YX(ω))

Since the cross-correlation is real,
• the real portion of the spectrum is even
• the imaginary portion of the spectrum is odd
7.9. Autocorrelation Function Estimate of Spectral Density
7.10. Periodogram Estimate of Spectral Density
7.11. Examples and Applications of Spectral Density
8. Response of Linear Systems to Random Inputs
8.1. Introduction
8.2. Analysis in the Time Domain
8.3. Mean and Mean-Square Value of System Output
After defining the convolution, we can use the expected value operator … note that processing of
Wide-Sense Stationary Random Processes is desired and usually implied.
E[Y(t)] = E[∫_{0}^{∞} X(t − λ) · h(λ) dλ]

E[Y(t)] = ∫_{0}^{∞} E[X(t − λ)] · h(λ) dλ

For a wide-sense stationary process, this results in

E[Y(t)] = ∫_{0}^{∞} E[X] · h(λ) dλ = E[X] · ∫_{0}^{∞} h(λ) dλ

The coherent gain of a filter is defined as:

h_gain = ∫_{0}^{∞} h(t) dt = H(0)

Therefore,

E[Y(t)] = E[X] · h_gain = E[X] · H(0)

At f = 0: H(0) = ∫_{0}^{∞} h(t) dt
The Mean Square Value at a System Output
E[Y(t)²] = E[(∫_{0}^{∞} X(t − λ) · h(λ) dλ)²]

E[Y(t)²] = ∫_{0}^{∞} ∫_{0}^{∞} h(λ1) · R_XX(λ1 − λ2) · h(λ2) dλ1 dλ2
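The mean result E[Y] = E[X]·h_gain checks out in discrete form; the 3-tap filter and input mean below are hypothetical:

```python
import random
from statistics import mean

# E[Y] = E[X] * h_gain, with h_gain = sum of the impulse response = H(0).
random.seed(9)
h = [1.0, 0.5, 0.5]                                     # h_gain = 2.0
x = [2.0 + random.gauss(0, 1) for _ in range(100_000)]  # E[X] = 2

# discrete convolution output (skipping the start-up samples)
y = [sum(h[j] * x[n - j] for j in range(len(h)))
     for n in range(len(h) - 1, len(x))]

mean_y = mean(y)  # ≈ E[X] * h_gain = 4.0
```

Only the DC gain of the filter matters for the output mean; the filter shape affects the mean-square value, not the mean.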
Example: White Noise Inputs
R XX t  
Let


E Y t  
2
N0
  t 
2
N0 
2
  h1   d1
2 0
Signal-to-Noise-Ratio SNR (always done for powers)
SNR is defined as P_Signal / P_Noise

SNR = E[S²] / (N0 · B_EQ)
This assumes that the “filter” does not change the input signal, but strictly reduces the noise
power by the equivalent noise bandwidth of the filter.
The narrower the filter applied prior to signal processing, the greater the SNR of the signal!
Therefore, always apply an analog filter prior to processing the signal of interest!
Equivalent noise bandwidth
From the definition of band-limited noise power, the equation for the equivalent noise bandwidth
is defined. Based on the noise power, we want to define a “brick-wall” bandwidth for noise
computations even when the actual filter is not a “brick-wall” in the transition region. Therefore,
Based on

E[Y(t)²] = (N0/2) · ∫_{0}^{∞} h(λ1)² dλ1

and

E[Y(t)²] = N0 · B_EQ

the equivalent noise bandwidth is

B_EQ = (1/2) · ∫_{0}^{∞} h(t)² dt
8.4. Autocorrelation Function of System Output
RYY    EY t   Y t     Eht   xt   ht     xt   
 
 


 


RYY    E  xt  1   h1   d1    xt    2   h2   d 2 

 

 0

 0


RYY   
0
 h 1   R XX   1   h  1  d1

8.5. Crosscorrelation between Input and Output
R XY    E X t   Y t     Ext   ht     xt   



R XY    E  xt    xt    1   h1   d1 
0



R XY     E xt   xt    1   h1   d1
0

R XY     R XX   1   h1   d1
0
This is the convolution of the autocorrelation with the filter.
What about the other Autocorrelation?
RYX    EY t   X t     Eht   xt   xt   



RYX    E  xt      xt  1   h1   d1 
0



RYX     E xt     xt  1   h1   d1
0

RYX     R XX   1   h1   d1
0
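The convolution result has a neat discrete consequence, sketched here with hypothetical taps: for white input (R_XX[k] = δ[k]), R_XY[k] reduces to h[k] itself, so crosscorrelating input with output recovers the filter.

```python
import random

# Estimate R_XY[k] = E[x[n] * y[n+k]] for y = h * x with white Gaussian x.
random.seed(10)
h = [1.0, 0.6, 0.3]
L, N = len(h), 200_000
x = [random.gauss(0, 1) for _ in range(N)]
y = [sum(h[j] * x[n - j] for j in range(L)) if n >= L - 1 else 0.0
     for n in range(N)]

def Rxy_hat(k: int) -> float:
    count = N - k - L
    return sum(x[n] * y[n + k] for n in range(L, N - k)) / count

# Rxy_hat(0) ≈ h[0], Rxy_hat(1) ≈ h[1], Rxy_hat(2) ≈ h[2]
```

This is the basis of correlation-based system identification: probe with white noise, read the impulse response off the input-output crosscorrelation.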
8.6. Example of Time-Domain System Analysis
8.7. Analysis in the Frequency Domain
The Power Spectral Density at a System Output
Review Power Spectral Density
What can we describe for all sample functions of an ensemble that contains time- or period-based information? For WSS random processes, the autocorrelation function is time based and, for ergodic processes, describes all sample functions in the ensemble!
Therefore, we can define a power spectral density for the ensemble as:
S XX w  R XX   

 R XX    exp iw   d

Based on this definition, we also have
S XX w  R XX  
R XX     1 S XX w
1
R XX t  
2

 S XX w  expiwt   dw

8.8. Spectral Density at the System Output
The power spectral density is the Fourier Transform of the autocorrelation:

S_XX(ω) = ℱ{R_XX(τ)} = ∫_{−∞}^{∞} E[X(t) · X(t + τ)] · exp(−j·ω·τ) dτ

S_YY(ω) = ℱ{R_YY(τ)} = ∫_{−∞}^{∞} E[Y(t) · Y(t + τ)] · exp(−j·ω·τ) dτ

Taking the Fourier transform of

R_YY(τ) = ∫_{0}^{∞} ∫_{0}^{∞} h(λ1) · R_XX(τ + λ1 − λ2) · h(λ2) dλ1 dλ2

gives

S_YY(ω) = ℱ{R_YY(τ)} = S_XX(ω) · H(ω) · H(−ω)

S_YY(ω) = ℱ{R_YY(τ)} = S_XX(ω) · |H(ω)|²
8.9. Cross-Spectral Densities between Input and Output
Relation of Spectral Density to the Crosscorrelation Function
S XY s   S XX s   H s 
SYX s   S XX s   H  s 
8.10. Examples of Frequency-Domain Analysis
8.11. Numerical Computation of System Output
9. Optimum Linear Systems
9.1. Introduction
9.2. Criteria of Optimality
9.3. Restrictions on the Optimum System
9.4. Optimization by Parameter Adjustment
9.5. Systems That Maximize Signal-to-Noise Ratio
9.6. Systems That Minimize Mean-Square Error
Appendices
A. Mathematical Tables
A.1. Trigonometric Identities
A.2. Indefinite Integrals
A.3. Definite Integrals
A.4. Fourier Transform Operations
A.5. Fourier Transforms
A.6. One-Sided Laplace Transforms
B. Frequently Encountered Probability Distributions
B.1. Discrete Probability Functions
B.2. Continuous Distributions
C. Binomial Coefficients
D. Normal Probability Distribution Function
E. The Q-Function
F. Student's t Distribution Function
G. Computer Computations
H. Table of Correlation Function--Spectral Density Pairs
I. Contour Integration