
DSP-CIS
Chapter-8: Introduction to
Optimal & Adaptive Filters
Marc Moonen
Dept. E.E./ESAT, KU Leuven
marc.moonen@esat.kuleuven.be
www.esat.kuleuven.be/scd/
Optimal & Adaptive Filters
• Introduction
  – Optimal & Adaptive Filters
  – Applications
• Optimal/Wiener Filters and least squares estimation
• Recursive Least Squares (RLS) Estimation
• Least Mean Squares (LMS) Algorithm
• Other…
Introduction: Optimal and adaptive filters

1. 'Classical' filter design
   lowpass / bandpass / notch filters / ...

2. 'Optimal' filter design
   • signals are viewed as stochastic processes (H249-HB78)
   • filter optimisation/design in a statistical sense, based on a priori statistical information
   → Wiener filters

[Block diagram: filter input → filter (filter parameters) → filter output; error = desired signal − filter output]
Prototype optimal filtering set-up:

Design the filter such that for a given (i.e. 'statistical info available') input signal, the filter output signal is 'optimally close' (to be defined) to a given 'desired output signal'.

[Block diagram: filter input → filter (filter parameters) → filter output; error = desired signal − filter output]
When a priori statistical information is not available:

3. Adaptive filters
   • self-designing
   • adaptation algorithm to monitor environment
   • properties of adaptive filters:
     – convergence / tracking
     – numerical stability / accuracy / robustness
     – computational complexity
     – hardware implementation
Prototype adaptive filtering set-up:

Basic operation involves 2 processes:
1. filtering process
2. adaptation process: adjusting the filter parameters to the (time-varying) environment; adaptation is steered by the error signal

[Block diagram: filter input → adaptive filter (filter parameters) → filter output; error = desired signal − filter output]

• Depending on the application, either the filter parameters, the filter output or the error signal is of interest
Introduction: Applications

System identification / modeling

[Block diagram: the plant input drives both the plant and the adaptive filter; error = plant output − adaptive filter output]
example: channel identification

[Block diagram: a training sequence 0,1,1,0,1,0,0,... is transmitted from the base station antenna over the radio channel to the mobile receiver; the same training sequence drives an adaptive filter, and the error = received signal − adaptive filter output]
[Block diagram: an echo path adds an echo to the near-end signal, giving near-end signal + echo]
example: line echo cancellation

[Block diagram: two-wire/four-wire telephone connection with hybrids at both ends; the talker speech path leaks through the far-end hybrid, producing a talker echo]
example: line echo cancellation (continued)

[Block diagram: the far-end signal passes through the hybrid, producing near-end signal + echo; the same far-end signal drives an adaptive filter, and subtracting its output leaves near-end signal + residual echo]
example: echo cancellation in full-duplex modems

[Block diagram: the transmit signal goes through the D/A converter and the hybrid; the A/D-converted signal is receive signal + echo; an adaptive filter driven by the transmit signal is subtracted, leaving receive signal + residual echo]
example: interference cancellation

[Block diagram: the primary sensor picks up signal + noise (signal source plus noise source); a reference sensor close to the noise source picks up the noise and drives an adaptive filter; subtracting the filter output leaves signal + residual noise]
example: acoustic noise cancellation

[Block diagram: the primary sensor picks up signal + noise; reference sensors near the noise source drive an adaptive filter; subtracting the filter output leaves signal + residual noise]
Inverse modeling

[Block diagram: the plant output is fed to an adaptive filter acting as the inverse plant (+ delay); error = (delayed) plant input − adaptive filter output]
example: channel equalization (training mode)

[Block diagram: a training sequence 0,1,1,0,1,0,0,... is transmitted from the base station antenna over the radio channel; at the mobile receiver the received signal is passed through an adaptive filter, and the error is formed against the locally stored training sequence 0,1,1,0,1,0,0,...]
example: channel equalization (decision-directed mode)

[Block diagram: the symbol sequence is transmitted from the base station antenna over the radio channel; at the mobile receiver the adaptive filter output is fed to a decision device, whose estimated symbol sequence takes the place of the training sequence for adaptation]
Optimal filtering / Wiener filters

Prototype optimal filter revisited

filter structure? → FIR filters (= pragmatic choice)
cost function? → quadratic cost function (= pragmatic choice)

[Block diagram: filter input → filter (filter parameters) → filter output; error = desired signal − filter output]
FIR filters (= tapped-delay line filter / 'transversal' filter)

$$y_k = \sum_{l=0}^{N-1} w_l \cdot u_{k-l}$$

[Block diagram: tapped-delay line with samples u[k], u[k-1], u[k-2], u[k-3], ..., tap weights w0[k], w1[k], w2[k], w3[k], ..., summed into the filter output; error = desired signal − filter output]

$$y_k = \mathbf{w}^T \cdot \mathbf{u}_k \quad \text{where} \quad
\mathbf{w}^T = \begin{bmatrix} w_0 & w_1 & w_2 & \cdots & w_{N-1} \end{bmatrix}, \quad
\mathbf{u}_k^T = \begin{bmatrix} u_k & u_{k-1} & u_{k-2} & \cdots & u_{k-N+1} \end{bmatrix}$$
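As a minimal sketch (not from the slides; the function name and zero initial state are assumptions), the tapped-delay line output y_k = w^T u_k can be computed sample by sample as follows:

```python
import numpy as np

def fir_filter(w, u):
    """Tapped-delay line (FIR) filter: y_k = w^T u_k with
    u_k = [u[k], u[k-1], ..., u[k-N+1]] (zeros assumed before k = 0)."""
    w = np.asarray(w, dtype=float)
    u = np.asarray(u, dtype=float)
    N = len(w)
    y = np.zeros(len(u))
    u_k = np.zeros(N)                  # delay line, most recent sample first
    for k in range(len(u)):
        u_k = np.roll(u_k, 1)
        u_k[0] = u[k]
        y[k] = w @ u_k
    return y
```

For fixed weights this is equivalent to `np.convolve(u, w)[:len(u)]`; the explicit delay line is kept because adaptive algorithms later update w between samples.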
Quadratic cost function: minimum mean-square error (MMSE) criterion

$$J_{MSE}(\mathbf{w}) = E\{e_k^2\} = E\{|d_k - y_k|^2\} = E\{|d_k - \mathbf{w}^T \mathbf{u}_k|^2\}$$

E{x} is the 'expected value' (mean) of x
Wiener filter solution:

[Block diagram: filter input u[k] → filter (filter parameters) → filter output y[k]; error e[k] = desired signal d[k] − y[k]]

$$J_{MSE}(\mathbf{w}) = E\{|e_k|^2\} = E\{|d_k - \mathbf{w}^T \cdot \mathbf{u}_k|^2\}
= E\{d_k^2\} + \mathbf{w}^T\, \bar{X}_{uu}\, \mathbf{w} - 2\,\mathbf{w}^T \bar{X}_{du}$$

with $\bar{X}_{uu} = E\{\mathbf{u}_k \mathbf{u}_k^T\}$ the correlation matrix and $\bar{X}_{du} = E\{\mathbf{u}_k d_k\}$ the cross-correlation vector.
$$J_{MSE}(\mathbf{w}) = E\{d_k^2\} + \mathbf{w}^T\, \bar{X}_{uu}\, \mathbf{w} - 2\,\mathbf{w}^T \bar{X}_{du}$$

The cost function is convex, with a (mostly) unique minimum, obtained by setting the gradient equal to zero:

$$0 = \left[\frac{\partial J_{MSE}(\mathbf{w})}{\partial \mathbf{w}}\right]_{\mathbf{w} = \mathbf{w}_{WF}} = \left[2\bar{X}_{uu}\mathbf{w} - 2\bar{X}_{du}\right]_{\mathbf{w} = \mathbf{w}_{WF}}$$

Wiener-Hopf equations:

$$\bar{X}_{uu} \cdot \mathbf{w}_{WF} = \bar{X}_{du} \quad\rightarrow\quad \mathbf{w}_{WF} = \bar{X}_{uu}^{-1} \bar{X}_{du} \;\;\text{.....simple enough!}$$
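As a hedged illustration (a minimal numpy sketch, not from the slides; the function name, signal layout, and the replacement of expectations by time averages are assumptions), the Wiener-Hopf solution can be estimated from recorded data:

```python
import numpy as np

def wiener_filter(u, d, N):
    """Length-N Wiener filter w_WF = X_uu^{-1} X_du, with E{u_k u_k^T}
    and E{u_k d_k} replaced by time averages (ergodicity assumed)."""
    u = np.asarray(u, dtype=float)
    d = np.asarray(d, dtype=float)
    X_uu = np.zeros((N, N))
    X_du = np.zeros(N)
    for k in range(N - 1, len(u)):
        u_k = u[k - N + 1:k + 1][::-1]   # u_k = [u[k], u[k-1], ..., u[k-N+1]]
        X_uu += np.outer(u_k, u_k)
        X_du += u_k * d[k]
    # Solve the (estimated) Wiener-Hopf equations X_uu w_WF = X_du
    return np.linalg.solve(X_uu, X_du)
```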
Least Squares & RLS Estimation
1. Least Squares (LS) Estimation

Quadratic cost function

MMSE (see Lecture 8):

$$J_{MSE}(\mathbf{w}) = E\{e_k^2\} = E\{|d_k - y_k|^2\} = E\{|d_k - \mathbf{w}^T \mathbf{u}_k|^2\}$$

Least-squares (LS) criterion: if statistical info is not available, one may use an alternative 'data-based' criterion...

$$J_{LS}(\mathbf{w}) = \sum_{k=1}^{L} e_k^2 = \sum_{k=1}^{L} |d_k - y_k|^2 = \sum_{k=1}^{L} |d_k - \mathbf{w}^T \mathbf{u}_k|^2$$

Interpretation? : see below
filter input sequence: u_1, u_2, u_3, ..., u_L
corresponding desired response sequence: d_1, d_2, d_3, ..., d_L

$$\underbrace{\begin{bmatrix} e_1 \\ e_2 \\ \vdots \\ e_L \end{bmatrix}}_{\mathbf{e}}
= \underbrace{\begin{bmatrix} d_1 \\ d_2 \\ \vdots \\ d_L \end{bmatrix}}_{\mathbf{d}}
- \underbrace{\begin{bmatrix} \mathbf{u}_1^T \\ \mathbf{u}_2^T \\ \vdots \\ \mathbf{u}_L^T \end{bmatrix}}_{U}
\cdot \begin{bmatrix} w_0 \\ w_1 \\ \vdots \\ w_{N-1} \end{bmatrix}$$

cost function:

$$J_{LS}(\mathbf{w}) = \sum_{k=1}^{L} e_k^2 = \|\mathbf{e}\|_2^2 = \|\mathbf{d} - U\mathbf{w}\|_2^2$$

→ linear least squares problem: $\min_{\mathbf{w}} \|\mathbf{d} - U\mathbf{w}\|_2^2$
$$J_{LS}(\mathbf{w}) = \sum_{k=1}^{L} e_k^2 = \|\mathbf{e}\|_2^2 = \mathbf{e}^T \cdot \mathbf{e} = \|\mathbf{d} - U\mathbf{w}\|_2^2$$

The minimum is obtained by setting the gradient equal to zero:

$$0 = \left[\frac{\partial J_{LS}(\mathbf{w})}{\partial \mathbf{w}}\right]_{\mathbf{w} = \mathbf{w}_{LS}}
= \left[\frac{\partial}{\partial \mathbf{w}}\left(\mathbf{d}^T\mathbf{d} + \mathbf{w}^T U^T U \mathbf{w} - 2\,\mathbf{w}^T U^T \mathbf{d}\right)\right]_{\mathbf{w} = \mathbf{w}_{LS}}
= \left[2\, U^T U\, \mathbf{w} - 2\, U^T \mathbf{d}\right]_{\mathbf{w} = \mathbf{w}_{LS}}$$

With $X_{uu} = U^T U$ and $X_{du} = U^T \mathbf{d}$, this gives the 'normal equations':

$$X_{uu} \cdot \mathbf{w}_{LS} = X_{du} \quad\rightarrow\quad \mathbf{w}_{LS} = X_{uu}^{-1} X_{du}$$
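As a hedged sketch (variable and function names are assumptions), the normal equations translate directly into a few lines of numpy; rows with k < N−1, where the regression vector is not completely filled, are simply skipped here:

```python
import numpy as np

def ls_filter(u, d, N):
    """Least-squares FIR filter: build U and d and solve min_w ||d - U w||_2^2
    via the normal equations X_uu w_LS = X_du (equivalently np.linalg.lstsq)."""
    u = np.asarray(u, dtype=float)
    d = np.asarray(d, dtype=float)
    # Row for time k is u_k^T = [u[k], u[k-1], ..., u[k-N+1]]
    U = np.array([u[k - N + 1:k + 1][::-1] for k in range(N - 1, len(u))])
    d_vec = d[N - 1:len(u)]
    X_uu = U.T @ U
    X_du = U.T @ d_vec
    return np.linalg.solve(X_uu, X_du)
```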
Note: correspondences with Wiener filter theory?

♣ estimate $\bar{X}_{uu}$ and $\bar{X}_{du}$ by time-averaging (ergodicity!):

$$\text{estimate}\{\bar{X}_{uu}\} = \frac{1}{L}\sum_{k=1}^{L} \mathbf{u}_k \cdot \mathbf{u}_k^T = \frac{1}{L}\, U^T \cdot U = \frac{1}{L} X_{uu}$$

$$\text{estimate}\{\bar{X}_{du}\} = \frac{1}{L}\sum_{k=1}^{L} \mathbf{u}_k \cdot d_k = \frac{1}{L}\, U^T \cdot \mathbf{d} = \frac{1}{L} X_{du}$$

This leads to the same optimal filter:

$$\text{estimate}\{\mathbf{w}_{WF}\} = \left(\tfrac{1}{L} X_{uu}\right)^{-1} \cdot \left(\tfrac{1}{L} X_{du}\right) = X_{uu}^{-1} \cdot X_{du} = \mathbf{w}_{LS}$$
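As a small, hedged consistency check of this correspondence (reusing the wiener_filter and ls_filter sketches above, with an assumed synthetic FIR 'plant'):

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.standard_normal(10000)               # filter input (white noise)
w_true = np.array([1.0, -0.5, 0.25, 0.1])    # assumed plant coefficients
d = np.convolve(u, w_true)[:len(u)]          # desired signal = plant output

print(wiener_filter(u, d, N=4))              # time-averaged Wiener estimate
print(ls_filter(u, d, N=4))                  # least-squares estimate
# Both solve the same normal equations and should recover w_true.
```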
2. Recursive Least Squares (RLS)

For a fixed data segment 1...L, the least squares problem is

$$\min_{\mathbf{w}} \|\mathbf{d} - U\mathbf{w}\|_2^2, \qquad
\mathbf{d} = \begin{bmatrix} d_1 \\ d_2 \\ \vdots \\ d_L \end{bmatrix}, \quad
U = \begin{bmatrix} \mathbf{u}_1^T \\ \mathbf{u}_2^T \\ \vdots \\ \mathbf{u}_L^T \end{bmatrix}, \qquad
\mathbf{w}_{LS} = [U^T U]^{-1} \cdot U^T \mathbf{d}$$

Wanted: recursive/adaptive algorithms for L → L+1, sliding windows, etc...
• given the optimal solution at time L...

$$\mathbf{w}_{LS}(L) = [X_{uu}(L)]^{-1} X_{du}(L) = [U(L)^T U(L)]^{-1}\,[U(L)^T \mathbf{d}(L)]$$

$$\mathbf{d}(L) = \begin{bmatrix} d_1 \\ d_2 \\ \vdots \\ d_L \end{bmatrix}, \qquad
U(L) = \begin{bmatrix} \mathbf{u}_1^T \\ \mathbf{u}_2^T \\ \vdots \\ \mathbf{u}_L^T \end{bmatrix}$$

• ...compute the optimal solution at time L+1

$$\mathbf{w}_{LS}(L+1) = \ldots, \qquad
\mathbf{d}(L+1) = \begin{bmatrix} d_1 \\ d_2 \\ \vdots \\ d_{L+1} \end{bmatrix}, \qquad
U(L+1) = \begin{bmatrix} \mathbf{u}_1^T \\ \mathbf{u}_2^T \\ \vdots \\ \mathbf{u}_{L+1}^T \end{bmatrix}$$

Wanted: an O(N²) instead of O(N³) updating scheme
2.1 Standard RLS

It is observed that $X_{uu}(L+1) = X_{uu}(L) + \mathbf{u}_{L+1}\mathbf{u}_{L+1}^T$.

The matrix inversion lemma states that

$$[X_{uu}(L+1)]^{-1} = [X_{uu}(L)]^{-1}
- \frac{[X_{uu}(L)]^{-1}\mathbf{u}_{L+1}\mathbf{u}_{L+1}^T\,[X_{uu}(L)]^{-1}}
       {1 + \mathbf{u}_{L+1}^T\,[X_{uu}(L)]^{-1}\mathbf{u}_{L+1}}$$

Result:

$$\mathbf{w}_{LS}(L+1) = \mathbf{w}_{LS}(L) + [X_{uu}(L+1)]^{-1}\mathbf{u}_{L+1} \cdot \left(d_{L+1} - \mathbf{u}_{L+1}^T \mathbf{w}_{LS}(L)\right)$$

where $[X_{uu}(L+1)]^{-1}\mathbf{u}_{L+1}$ is the 'Kalman gain' and $d_{L+1} - \mathbf{u}_{L+1}^T \mathbf{w}_{LS}(L)$ is the 'a priori residual'

= standard recursive least squares (RLS) algorithm

Remark: O(N²) operations per time update
Remark: square-root algorithms with better numerical properties, see below
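A minimal per-sample sketch of this update, keeping P = [X_uu(L)]^{-1} in memory (the usual initialisation P = δ·I with large δ is an assumption, not stated on the slide):

```python
import numpy as np

def rls_update(w, P, u_new, d_new):
    """One standard-RLS time update L -> L+1.
    w: current estimate w_LS(L), P: current [X_uu(L)]^{-1},
    u_new: regression vector u_{L+1}, d_new: desired sample d_{L+1}."""
    Pu = P @ u_new
    k = Pu / (1.0 + u_new @ Pu)      # Kalman gain [X_uu(L+1)]^{-1} u_{L+1}
    e = d_new - u_new @ w            # a priori residual
    w = w + k * e
    P = P - np.outer(k, Pu)          # matrix inversion lemma update
    return w, P

# typical (assumed) initialisation: w = np.zeros(N), P = 1e4 * np.eye(N)
```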
2.3 Exponentially Weighted RLS

$$J_{LS}(\mathbf{w}) = \sum_{k=1}^{L} \lambda^{2(L-k)}\, e_k^2$$

0 < λ < 1 is the weighting factor or 'forget factor'
1/(1−λ) is a 'measure of the memory of the algorithm'

$$\mathbf{w}_{LS}(L) = \underbrace{[U(L)^T U(L)]^{-1}}_{[X_{uu}(L)]^{-1}}\, \underbrace{[U(L)^T \mathbf{d}(L)]}_{X_{du}(L)}$$

with

$$\mathbf{d}(L) = \begin{bmatrix} \lambda^{L-1} d_1 \\ \lambda^{L-2} d_2 \\ \vdots \\ \lambda^{0} d_L \end{bmatrix}, \qquad
U(L) = \begin{bmatrix} \lambda^{L-1} \mathbf{u}_1^T \\ \lambda^{L-2} \mathbf{u}_2^T \\ \vdots \\ \lambda^{0} \mathbf{u}_L^T \end{bmatrix}$$
It is observed that $X_{uu}(L+1) = \lambda^2 X_{uu}(L) + \mathbf{u}_{L+1}\mathbf{u}_{L+1}^T$, hence

$$[X_{uu}(L+1)]^{-1} = \frac{1}{\lambda^2}[X_{uu}(L)]^{-1}
- \frac{\frac{1}{\lambda^2}[X_{uu}(L)]^{-1}\mathbf{u}_{L+1}\,\mathbf{u}_{L+1}^T\,\frac{1}{\lambda^2}[X_{uu}(L)]^{-1}}
       {1 + \frac{1}{\lambda^2}\mathbf{u}_{L+1}^T\,[X_{uu}(L)]^{-1}\mathbf{u}_{L+1}}$$

$$\mathbf{w}_{LS}(L+1) = \mathbf{w}_{LS}(L) + [X_{uu}(L+1)]^{-1}\mathbf{u}_{L+1} \cdot \left(d_{L+1} - \mathbf{u}_{L+1}^T \mathbf{w}_{LS}(L)\right)$$
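The same per-sample sketch with a forget factor (again with assumed names, and λ = 0.99 chosen purely for illustration):

```python
import numpy as np

def ew_rls_update(w, P, u_new, d_new, lam=0.99):
    """One exponentially weighted RLS update; P is [X_uu(L)]^{-1},
    lam is the forget factor (0 < lam < 1)."""
    Pu = (P @ u_new) / lam**2
    k = Pu / (1.0 + u_new @ Pu)       # gain [X_uu(L+1)]^{-1} u_{L+1}
    e = d_new - u_new @ w             # a priori residual
    w = w + k * e
    P = P / lam**2 - np.outer(k, Pu)  # inversion-lemma update of [X_uu]^{-1}
    return w, P
```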
4. Least Mean Squares (LMS) Algorithm
(Bernard Widrow, 1965)
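As a hedged reference sketch (the standard textbook LMS recursion, not taken verbatim from these slides; the step size μ = 0.01 and the function name are assumptions): LMS replaces the exact RLS gain [X_uu(L+1)]^{-1} u_{L+1} by a fixed step size μ, giving the stochastic-gradient update w ← w + μ·u_k·e_k with e_k = d_k − w^T u_k, at only O(N) operations per sample.

```python
import numpy as np

def lms(u, d, N, mu=0.01):
    """Least Mean Squares adaptive filter (textbook form).
    Per sample: y_k = w^T u_k, e_k = d_k - y_k, w <- w + mu * u_k * e_k."""
    u = np.asarray(u, dtype=float)
    d = np.asarray(d, dtype=float)
    w = np.zeros(N)
    e = np.zeros(len(u))
    for k in range(N - 1, len(u)):
        u_k = u[k - N + 1:k + 1][::-1]   # [u[k], u[k-1], ..., u[k-N+1]]
        e[k] = d[k] - w @ u_k            # a priori error
        w = w + mu * u_k * e[k]          # stochastic-gradient update
    return w, e
```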
5. Other…: Square-root RLS Algorithms
• Standard RLS exhibits unstable round-off error accumulation, and hence is not the algorithm of choice in practice.
• Alternative algorithms ('square-root algorithms'), which have been proved to be numerically stable, are based on orthogonal matrix decompositions, namely the QR decomposition (+ QR updating, inverse QR updating, see below); a minimal batch sketch of the QR idea follows this list.
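A hedged illustration of the underlying idea only (solving the LS problem through a QR decomposition instead of the normal equations, which avoids explicitly forming U^T U); this is a batch sketch, not the recursive O(N²) QR-updating algorithm itself:

```python
import numpy as np

def ls_filter_qr(U, d):
    """Solve min_w ||d - U w||_2^2 via U = Q R (Q with orthonormal columns,
    R upper triangular), then solve the triangular system R w = Q^T d."""
    Q, R = np.linalg.qr(U)
    return np.linalg.solve(R, Q.T @ d)
```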
5. Other… : Fast RLS Algorithms