EE599-020
Audio Signals and Systems
Linear Prediction
Kevin D. Donohue
Electrical and Computer Engineering
University of Kentucky
Related Web Sites
Linear Predictive Coding (LPC) has been a popular
modeling, analysis, and synthesis tool for speech since the 1970s.
Many web sites are devoted to education and research in
this area; a general search on Linear Prediction … will
turn up many interesting sites. A few examples are
given below:
http://www.otolith.com/pub/u/howitt/lpc.tutorial.html
http://www.dcs.shef.ac.uk/~martin/MAD/lpcspect/lpcspect.htm
http://www.speex.org/manual/node8.html
LPC Derivation
Derive an algorithm that computes, from a stream of data, the LPC
coefficients that minimize the mean squared prediction error.
Let $x(n)$ for $0 \le n \le N-1$ be the sequence of data points,
$a(m)$ for $1 \le m \le p$ be the coefficients of a $p$-th order linear predictor,
and $\hat{x}(n)$ be the prediction estimate.
The mean squared error for the prediction is given by:
$$\mathrm{mse} = \frac{1}{N}\sum_{n=0}^{N-1}\big[\hat{x}(n) - x(n)\big]^2 = \frac{1}{N}\sum_{n=0}^{N-1} e(n)^2$$
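The slide does not write out the predictor itself; consistent with the matrix formulation on the next slide, the estimate is a linear combination of the $p$ previous samples, so the quantity minimized over the $a(k)$ is:

$$\hat{x}(n) = \sum_{k=1}^{p} a(k)\,x(n-k), \qquad e(n) = \hat{x}(n) - x(n) = \sum_{k=1}^{p} a(k)\,x(n-k) - x(n).$$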
LPC Computation
The LPC coefficients can be computed from the following matrix
equation:
$$\mathbf{a} = \left(\mathbf{X}_D^T \mathbf{X}_D\right)^{-1} \mathbf{X}_D^T \mathbf{x}_p$$
where
$$\mathbf{X}_D = \begin{bmatrix}
x(m-1) & x(m-2) & x(m-3) & \cdots & x(0) \\
x(m)   & x(m-1) & x(m-2) & \cdots & x(1) \\
x(m+1) & x(m)   & x(m-1) & \cdots & x(2) \\
x(m+2) & x(m+1) & x(m)   & \cdots & x(3) \\
\vdots & \vdots & \vdots &        & \vdots \\
x(N-2) & x(N-3) & x(N-4) & \cdots & x(N-1-m)
\end{bmatrix}, \qquad
\mathbf{x}_p = \begin{bmatrix} x(m) \\ x(m+1) \\ x(m+2) \\ x(m+3) \\ \vdots \\ x(N-1) \end{bmatrix}, \qquad
\mathbf{a} = \begin{bmatrix} a(1) \\ a(2) \\ a(3) \\ a(4) \\ \vdots \\ a(m) \end{bmatrix}$$
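A minimal numerical sketch of this least-squares solution in Python, assuming NumPy; the test signal and the order 4 below are hypothetical placeholders for a real audio frame:

import numpy as np

def lpc_lstsq(x, m):
    # Least-squares LPC: a = (XD^T XD)^{-1} XD^T xp, solved via lstsq for numerical stability.
    N = len(x)
    # Row n of XD holds the m previous samples x(n-1), ..., x(n-m), for n = m, ..., N-1.
    XD = np.column_stack([x[m - k - 1 : N - 1 - k] for k in range(m)])
    xp = x[m:]
    a, *_ = np.linalg.lstsq(XD, xp, rcond=None)
    return a

# Hypothetical test signal: a decaying sinusoid plus a little noise for conditioning.
rng = np.random.default_rng(0)
n = np.arange(512)
x = np.exp(-0.01 * n) * np.sin(0.2 * np.pi * n) + 0.01 * rng.standard_normal(n.size)
print(lpc_lstsq(x, 4))

With the slide's convention these are predictor coefficients $a(k)$ such that $\hat{x}(n)=\sum_k a(k)\,x(n-k)$.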
Autocorrelation and LPC
Define the autocorrelation of a sequence as:
N 1
r(k ) 
 x(n  k ) x(n )
where
x( n )  0
for n  0 and n  N- 1
n0
Note that the LPC coefficients are computed from the
autocorrelation coefficients:
$$\mathbf{X}_D^T \mathbf{X}_D \approx \begin{bmatrix}
r(0)   & r(1)   & r(2)   & \cdots & r(m-1) \\
r(1)   & r(0)   & r(1)   & \cdots & r(m-2) \\
r(2)   & r(1)   & r(0)   & \cdots & r(m-3) \\
\vdots & \vdots & \vdots &        & \vdots \\
r(m-1) & r(m-2) & r(m-3) & \cdots & r(0)
\end{bmatrix}, \qquad
\mathbf{X}_D^T \mathbf{x}_p \approx \begin{bmatrix} r(1) \\ r(2) \\ r(3) \\ \vdots \\ r(m) \end{bmatrix}$$
Autocorrelation Matrix
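A sketch of the autocorrelation method in Python, assuming NumPy and SciPy; scipy.linalg.solve_toeplitz exploits the Toeplitz structure shown above, and the test signal is the same hypothetical one used in the least-squares sketch:

import numpy as np
from scipy.linalg import solve_toeplitz

def lpc_autocorr(x, m):
    # Autocorrelation-method LPC: solve the Toeplitz system above for a(1), ..., a(m).
    N = len(x)
    # r(k) = sum_n x(n+k) x(n), with x(n) = 0 outside 0 ... N-1.
    r = np.array([np.dot(x[k:], x[:N - k]) for k in range(m + 1)])
    # solve_toeplitz uses a Levinson-type recursion on the Toeplitz structure.
    return solve_toeplitz(r[:m], r[1:])

# Same hypothetical decaying sinusoid as in the least-squares sketch.
rng = np.random.default_rng(0)
n = np.arange(512)
x = np.exp(-0.01 * n) * np.sin(0.2 * np.pi * n) + 0.01 * rng.standard_normal(n.size)
print(lpc_autocorr(x, 4))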
Autocorrelation Examples
Compute examples of the autocorrelation of white
noise. What should it be theoretically? What should
the LPC coefficients of white noise look like?
Estimate the pitch (periodicity) of a guitar note using
its autocorrelation. (A numerical sketch of these first
two exercises follows this list.)
Examine the prediction error as a function of
increasing order for the bell sound. What should the
best order be based on the graph? What should the
best order be based on the Akaike Information
Criterion (AIC)? Could this order be predicted from
looking at the autocorrelation function? (An order-selection
sketch is given after the homework slide.)
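A numerical sketch of the first two exercises, assuming NumPy. The "guitar note" below is a synthetic two-harmonic tone standing in for a real recording; the sample rate, fundamental, and pitch search range are hypothetical choices:

import numpy as np

fs = 8000                      # assumed sample rate (Hz)
rng = np.random.default_rng(0)

# 1) Autocorrelation of white noise: theoretically an impulse at lag 0
#    (N * sigma^2 at k = 0, near zero elsewhere), so its LPC coefficients
#    should come out near zero -- white noise is unpredictable.
w = rng.standard_normal(4096)
r_w = np.correlate(w, w, mode="full")[len(w) - 1:]   # lags 0, 1, 2, ...
print(np.round(r_w[:5] / r_w[0], 3))                 # ~[1, 0, 0, 0, 0]

# 2) Pitch from the autocorrelation peak of a periodic "note"
#    (a synthetic tone at 196 Hz stands in for the guitar recording).
f0 = 196.0
n = np.arange(int(0.5 * fs))
note = np.sin(2 * np.pi * f0 * n / fs) + 0.5 * np.sin(4 * np.pi * f0 * n / fs)
r = np.correlate(note, note, mode="full")[len(note) - 1:]
lo, hi = int(fs / 500), int(fs / 50)                 # search the 50-500 Hz range
peak_lag = lo + np.argmax(r[lo:hi])
print("pitch estimate:", fs / peak_lag, "Hz")        # ~196 Hz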
Homework (1)
a) Prove that the DFT of the autocorrelation of a sequence is
equal to its magnitude spectrum squared:
$$\mathrm{DFT}\{r(m)\} = R(k)\,R^*(k) = |R(k)|^2$$
b) Compute the autocorrelation for the piano sound.
Examine the prediction error as a function of
increasing order for the piano sound. What should
the best order be based on the graph? What is the
best order based on the Akaike Information Criterion
(AIC)?
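For the prediction-error-versus-order and AIC questions (the bell example above and part b here), the following Python sketch tabulates the error for increasing order and one common AIC form for AR-type models, N·ln(mse) + 2p. The test signal is synthetic (a real bell or piano frame would replace it), and this AIC variant is an assumption, not the course's prescribed formula:

import numpy as np
from scipy.linalg import solve_toeplitz

def lpc_autocorr(x, m):
    # Autocorrelation-method LPC (same approach as the earlier sketch).
    N = len(x)
    r = np.array([np.dot(x[k:], x[:N - k]) for k in range(m + 1)])
    return solve_toeplitz(r[:m], r[1:])

def prediction_mse(x, a):
    # Mean squared error of the predictor xhat(n) = sum_k a(k) x(n-k), for n = m, ..., N-1.
    m = len(a)
    xhat = np.convolve(x, a, mode="full")[m - 1 : len(x) - 1]
    return np.mean((xhat - x[m:]) ** 2)

# Synthetic stand-in for the bell/piano frame: two decaying sinusoids plus noise,
# i.e. roughly a 4-pole signal, so the error should flatten out near order 4.
rng = np.random.default_rng(1)
n = np.arange(4096)
x = (np.exp(-0.001 * n) * np.sin(0.10 * np.pi * n)
     + 0.5 * np.exp(-0.002 * n) * np.sin(0.23 * np.pi * n)
     + 0.01 * rng.standard_normal(n.size))

for p in range(1, 13):
    mse = prediction_mse(x, lpc_autocorr(x, p))
    aic = len(x) * np.log(mse) + 2 * p      # one common AIC form for AR order selection
    print(f"order {p:2d}:  mse = {mse:.3e}   AIC = {aic:.1f}")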