Exact Approximation of Monte Carlo Filters
Adam M. Johansen*, Arnaud Doucet‡ and Nick Whiteley†
* University of Warwick, Department of Statistics
† University of Bristol, School of Mathematics
‡ University of Oxford, Department of Statistics
a.m.johansen@warwick.ac.uk
http://go.warwick.ac.uk/amjohansen/talks
October 11th, 2012
SMC Methods and Efficient Simulation in Finance
Background
Outline
Context & Outline
Filtering in State-Space Models:
- SIR Particle Filters [GSS93]
- Rao-Blackwellized Particle Filters [AD02, CL00]
- Block-Sampling Particle Filters [DBS06]
Exact Approximation of Monte Carlo Algorithms:
- Particle MCMC [ADH10]
Approximating the RBPF:
- Approximated Rao-Blackwellized Particle Filters [CSOL11]
- Exactly-approximated RBPFs [JWD12]
Approximating the BSPF:
- Local SMC [JD12]
Particle MCMC & Related Concepts
Particle MCMC
- MCMC algorithms which employ SMC proposals [ADH10].
- The SMC algorithm viewed as a collection of random variables:
  - values,
  - weights,
  - ancestral lines.
- Construct MCMC algorithms:
  - with many auxiliary variables,
  - exactly invariant for a distribution on the extended space;
  - standard MCMC arguments justify the strategy.
- Does this suggest anything about SMC?
- Can something similar help with smoothing?
Ancestral Trees
[Figure: ancestral tree over times t = 1, 2, 3, with ancestor indices (e.g. a_3^1 = 1, a_3^4 = 3, a_2^1 = 1, a_2^4 = 3) and ancestral lines b_{3,1:3} = (1, 1, 2), (3, 3, 4), (4, 5, 6).]
SMC Distributions
We’ll need the SMC Distribution:
\[
\psi^M_{n,L}(a_{n-L+2:n}, x_{n-L+1:n}, k; x_{n-L})
= \left[\prod_{i=1}^{M} q\left(x^i_{n-L+1} \mid x_{n-L}\right)\right]
  \left[\prod_{p=n-L+2}^{n} \prod_{i=1}^{M} r\left(a^i_p \mid w_{p-1}\right) q\left(x^i_p \mid x^{a^i_p}_{p-1}\right)\right]
  r(k \mid w_n)
\]
and the conditional SMC Distribution:
\[
\tilde\psi^M_{n,L}\left(\tilde a_{n-L+2:n}, \tilde x_{n-L+1:n}, k; x_{n-L} \,\middle\|\, \tilde b^k_{n,n-L+1:n-1}, k, \tilde x^k_{n-L+1:n}\right)
= \frac{\psi^M_{n,L}(\tilde a_{n-L+2:n}, \tilde x_{n-L+1:n}, k; x_{n-L})}
       {q\left(\tilde x^{\tilde b^k_{n,n-L+1}}_{n-L+1} \mid x_{n-L}\right)
        \left[\prod_{p=n-L+2}^{n} r\left(\tilde b^k_{n,p} \mid \tilde w_{p-1}\right)
        q\left(\tilde x^{\tilde b^k_{n,p}}_{p} \mid \tilde x^{\tilde b^k_{n,p-1}}_{p-1}\right)\right]
        r(k \mid \tilde w_n)}
\]
Approximating the RBPF
A Class of Hidden Markov Models
A (Rather Broad) Class of Hidden Markov Models
[Figure: graphical model of the hidden chain (x_1, z_1), (x_2, z_2), (x_3, z_3) with observations y_1, y_2, y_3.]
- Unobserved Markov chain {(X_n, Z_n)} with transition density f.
- Observed process {Y_n} with conditional density g.
- Joint density:
\[
p(x_{1:n}, z_{1:n}, y_{1:n}) = f_1(x_1, z_1)\, g(y_1 \mid x_1, z_1) \prod_{i=2}^{n} f(x_i, z_i \mid x_{i-1}, z_{i-1})\, g(y_i \mid x_i, z_i).
\]
Formal Solutions
- Filtering and prediction recursions:
\[
p(x_n, z_n \mid y_{1:n}) = \frac{p(x_n, z_n \mid y_{1:n-1})\, g(y_n \mid x_n, z_n)}{\int p(x'_n, z'_n \mid y_{1:n-1})\, g(y_n \mid x'_n, z'_n)\, d(x'_n, z'_n)}
\]
\[
p(x_{n+1}, z_{n+1} \mid y_{1:n}) = \int p(x_n, z_n \mid y_{1:n})\, f(x_{n+1}, z_{n+1} \mid x_n, z_n)\, d(x_n, z_n)
\]
- Smoothing:
\[
p((x, z)_{1:n} \mid y_{1:n}) \propto p((x, z)_{1:n-1} \mid y_{1:n-1})\, f((x, z)_n \mid (x, z)_{n-1})\, g(y_n \mid (x, z)_n)
\]
Elementary Algorithms
A Simple SIR Filter
Algorithmically, at iteration n:
- Given {W^i_{n-1}, (X, Z)^i_{1:n-1}} for i = 1, ..., N:
- Resample, obtaining {1/N, (X̃, Z̃)^i_{1:n-1}}.
- Sample (X, Z)^i_n ~ q_n(· | (X̃, Z̃)^i_{n-1}).
- Weight
\[
W^i_n \propto \frac{f((X, Z)^i_n \mid (\tilde X, \tilde Z)^i_{n-1})\, g(y_n \mid (X, Z)^i_n)}{q_n((X, Z)^i_n \mid (\tilde X, \tilde Z)^i_{n-1})}
\]
In practice:
- Resample efficiently.
- Only resample when necessary.
- ...
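To make the recursion concrete, here is a minimal sketch of such a filter in Python with the bootstrap choice q_n = f; the samplers `sample_f1`, `sample_f` and the log-likelihood `log_g` are user-supplied placeholders, and multinomial resampling at every step is used purely for brevity.

```python
import numpy as np

def sir_filter(y, sample_f1, sample_f, log_g, N=1000, seed=0):
    """Minimal SIR particle filter with bootstrap proposal q_n = f.

    y         : sequence of observations of length T
    sample_f1 : (rng, N) -> initial particles, shape (N, d)
    sample_f  : (rng, x_prev) -> propagated particles, same shape as x_prev
    log_g     : (y_t, x) -> log observation densities, shape (N,)
    Returns the filtering means, shape (T, d).
    """
    rng = np.random.default_rng(seed)
    means = []
    for t, y_t in enumerate(y):
        if t == 0:
            x = sample_f1(rng, N)
        else:
            # multinomial resampling at every step (simplest choice)
            idx = rng.choice(N, size=N, p=w)
            x = sample_f(rng, x[idx])          # propose from the transition
        logw = log_g(y_t, x)                   # bootstrap incremental weight
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(w @ x)
    return np.array(means)

# Example: 1d Gaussian random walk observed in unit-variance Gaussian noise
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x_true = np.cumsum(rng.standard_normal(100))
    y = x_true + rng.standard_normal(100)
    est = sir_filter(
        y,
        sample_f1=lambda r, N: r.standard_normal((N, 1)),
        sample_f=lambda r, x: x + r.standard_normal(x.shape),
        log_g=lambda y_t, x: -0.5 * (y_t - x[:, 0]) ** 2,
    )
```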
A Rao-Blackwellized SIR Filter
Algorithmically, at iteration n:
- Given {W^{X,i}_{n-1}, (X^i_{1:n-1}, p(z_{1:n-1} | X^i_{1:n-1}, y_{1:n-1}))} for i = 1, ..., N:
- Resample, obtaining {1/N, (X̃^i_{1:n-1}, p(z_{1:n-1} | X̃^i_{1:n-1}, y_{1:n-1}))}.
- For i = 1, ..., N:
  - Sample X^i_n ~ q_n(· | X̃^i_{n-1}).
  - Set X^i_{1:n} ← (X̃^i_{1:n-1}, X^i_n).
  - Weight W^{X,i}_n ∝ p(X^i_n, y_n | X̃^i_{n-1}) / q_n(X^i_n | X̃^i_{n-1}).
  - Compute p(z_{1:n} | y_{1:n}, X^i_{1:n}).
Requires analytically tractable substructure.
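As an illustration of that requirement, here is a sketch of one RBPF step in Python for a hypothetical conditionally linear-Gaussian model in which, given the x-path, z_n = a(x_n) z_{n-1} + N(0, q) and y_n = z_n + N(0, r), so that p(z_{1:n} | y_{1:n}, x_{1:n}) is a Gaussian tracked exactly by a scalar Kalman filter; the model structure and all names are assumptions made for illustration, not the general algorithm.

```python
import numpy as np

def rbpf_step(rng, X, m, P, W, y_n, sample_q, log_q, log_f, a, q_var, r_var):
    """One Rao-Blackwellized SIR step for a conditionally linear-Gaussian model.

    X    : (N,) current x-particles
    m, P : (N,) Kalman means/variances of z_{n-1} given each x-path and y_{1:n-1}
    W    : (N,) normalised top-level weights
    sample_q / log_q : proposal sampler and log-density for x_n given x_{n-1}
    log_f            : log transition density of the x-chain
    a, q_var, r_var  : parameters of the assumed z-substructure
    """
    N = len(X)
    idx = rng.choice(N, size=N, p=W)            # resample, carrying the z-statistics along
    X, m, P = X[idx], m[idx], P[idx]
    X_new = sample_q(rng, X)                    # propose x_n
    # Kalman prediction for z_n given the new x_n
    m_pred = a(X_new) * m
    P_pred = a(X_new) ** 2 * P + q_var
    # predictive likelihood p(y_n | x_{1:n}, y_{1:n-1}): z_n marginalised exactly
    S = P_pred + r_var
    log_lik = -0.5 * (np.log(2 * np.pi * S) + (y_n - m_pred) ** 2 / S)
    # Kalman update giving p(z_n | x_{1:n}, y_{1:n})
    K = P_pred / S
    m_new = m_pred + K * (y_n - m_pred)
    P_new = (1.0 - K) * P_pred
    # top-level weight: p(x_n, y_n | x_{n-1}, ...) / q(x_n | x_{n-1})
    logw = log_f(X_new, X) + log_lik - log_q(X_new, X)
    w = np.exp(logw - logw.max())
    return X_new, m_new, P_new, w / w.sum()
```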
Exact Approximation of the RBPF
An Approximate Rao-Blackwellized SIR Filter
Algorithmically, at iteration n:
- Given {W^{X,i}_{n-1}, (X^i_{1:n-1}, p̂(z_{1:n-1} | X^i_{1:n-1}, y_{1:n-1}))} for i = 1, ..., N:
- Resample, obtaining {1/N, (X̃^i_{1:n-1}, p̂(z_{1:n-1} | X̃^i_{1:n-1}, y_{1:n-1}))}.
- For i = 1, ..., N:
  - Sample X^i_n ~ q_n(· | X̃^i_{n-1}).
  - Set X^i_{1:n} ← (X̃^i_{1:n-1}, X^i_n).
  - Weight W^{X,i}_n ∝ p̂(X^i_n, y_n | X̃^i_{n-1}) / q_n(X^i_n | X̃^i_{n-1}).
  - Compute p̂(z_{1:n} | y_{1:n}, X^i_{1:n}).
This is approximate; how does the error accumulate?
Exactly Approximated Rao-Blackwellized SIR Filter
At time n = 1
- Sample X^i_1 ~ q^x(· | y_1).
- Sample Z^{i,j}_1 ~ q^z(· | X^i_1, y_1).
- Compute and normalise the local weights
\[
w^z_1\left(X^i_1, Z^{i,j}_1\right) := \frac{p(X^i_1, y_1, Z^{i,j}_1)}{q^z(Z^{i,j}_1 \mid X^i_1, y_1)},
\qquad
W^{z,i,j}_1 := \frac{w^z_1(X^i_1, Z^{i,j}_1)}{\sum_{k=1}^{M} w^z_1(X^i_1, Z^{i,k}_1)},
\]
  and define
\[
\hat p(X^i_1, y_1) := \frac{1}{M} \sum_{j=1}^{M} w^z_1\left(X^i_1, Z^{i,j}_1\right).
\]
- Compute and normalise the top-level weights
\[
w^x_1\left(X^i_1\right) := \frac{\hat p(X^i_1, y_1)}{q^x(X^i_1 \mid y_1)},
\qquad
W^{x,i}_1 := \frac{w^x_1(X^i_1)}{\sum_{k=1}^{N} w^x_1(X^k_1)}.
\]
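A schematic Python version of this n = 1 step; the samplers and log-densities (`sample_qx`, `sample_qz`, `log_p_joint`, `log_qx`, `log_qz`) are user-supplied placeholders whose names are illustrative only.

```python
import numpy as np
from scipy.special import logsumexp

def init_exact_rbpf(rng, y1, N, M, sample_qx, sample_qz,
                    log_p_joint, log_qx, log_qz):
    """Time n = 1 of the exactly-approximated RBPF: local and top-level weights."""
    X = sample_qx(rng, y1, N)                        # (N,)   top-level particles
    Z = sample_qz(rng, X, y1, M)                     # (N, M) local particles
    # local log-weights  w^z_1(X^i, Z^{i,j}) = p(X^i, y_1, Z^{i,j}) / q^z(Z^{i,j} | X^i, y_1)
    log_wz = log_p_joint(X[:, None], y1, Z) - log_qz(Z, X[:, None], y1)
    Wz = np.exp(log_wz - logsumexp(log_wz, axis=1, keepdims=True))
    # unbiased estimate  p_hat(X^i, y_1) = (1/M) sum_j w^z_1(X^i, Z^{i,j})
    log_p_hat = logsumexp(log_wz, axis=1) - np.log(M)
    # top-level log-weights  w^x_1(X^i) = p_hat(X^i, y_1) / q^x(X^i | y_1)
    log_wx = log_p_hat - log_qx(X, y1)
    Wx = np.exp(log_wx - logsumexp(log_wx))
    return X, Z, Wz, Wx
```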
At times n ≥ 2
- Resample {W^{x,i}_{n-1}, (X^i_{1:n-1}, {W^{z,i,j}_{n-1}, Z^{i,j}_{1:n-1}}_j)}_i to obtain {1/N, (X̃^i_{1:n-1}, {W̃^{z,i,j}_{n-1}, Z̃^{i,j}_{1:n-1}}_j)}_i.
- Resample {W̃^{z,i,j}_{n-1}, Z̃^{i,j}_{1:n-1}}_j to obtain {1/M, Z̃^{i,j}_{1:n-1}}_j.
- Sample X^i_n ~ q^x(· | X̃^i_{1:n-1}, y_{1:n}); set X^i_{1:n} := (X̃^i_{1:n-1}, X^i_n).
- Sample Z^{i,j}_n ~ q^z(· | X^i_{1:n}, y_{1:n}, Z̃^{i,j}_{1:n-1}); set Z^{i,j}_{1:n} := (Z̃^{i,j}_{1:n-1}, Z^{i,j}_n).
- Compute and normalise the local weights
\[
w^z_n\left(X^i_{1:n}, Z^{i,j}_{1:n}\right) := \frac{p\left(X^i_n, y_n, Z^{i,j}_n \mid \tilde X^i_{n-1}, \tilde Z^{i,j}_{n-1}\right)}{q^z\left(Z^{i,j}_n \mid X^i_{1:n}, y_{1:n}, \tilde Z^{i,j}_{1:n-1}\right)},
\qquad
W^{z,i,j}_n := \frac{w^z_n(X^i_{1:n}, Z^{i,j}_{1:n})}{\sum_{k=1}^{M} w^z_n(X^i_{1:n}, Z^{i,k}_{1:n})},
\]
  and define
\[
\hat p\left(X^i_n, y_n \mid \tilde X^i_{1:n-1}, y_{1:n-1}\right) := \frac{1}{M} \sum_{j=1}^{M} w^z_n\left(X^i_{1:n}, Z^{i,j}_{1:n}\right).
\]
- Compute and normalise the top-level weights
\[
w^x_n\left(X^i_{1:n}\right) := \frac{\hat p\left(X^i_n, y_n \mid \tilde X^i_{1:n-1}, y_{1:n-1}\right)}{q^x\left(X^i_n \mid \tilde X^i_{1:n-1}, y_{1:n}\right)},
\qquad
W^{x,i}_n := \frac{w^x_n(X^i_{1:n})}{\sum_{k=1}^{N} w^x_n(X^k_{1:n})}.
\]
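Putting the n ≥ 2 recursion together schematically in Python; as before, the samplers and log-densities are user-supplied placeholders with illustrative names, and only the most recent x and z values are kept explicitly here for brevity.

```python
import numpy as np
from scipy.special import logsumexp

def exact_rbpf_step(rng, y_n, X, Z, Wx, Wz, sample_qx, sample_qz,
                    log_p_local, log_qx, log_qz):
    """One n >= 2 step of the exactly-approximated RBPF.

    X  : (N,)    top-level particles x_{n-1}
    Z  : (N, M)  local particles z_{n-1} attached to each top-level particle
    Wx : (N,)    normalised top-level weights; Wz : (N, M) normalised local weights
    log_p_local(x_new, y_n, z_new, x_old, z_old) -> log p(x_n, y_n, z_n | x_{n-1}, z_{n-1})
    """
    N, M = Z.shape
    # top-level resampling: each particle carries its whole local system with it
    ix = rng.choice(N, size=N, p=Wx)
    X, Z, Wz = X[ix], Z[ix], Wz[ix]
    # local resampling within each top-level particle
    for i in range(N):
        jx = rng.choice(M, size=M, p=Wz[i])
        Z[i] = Z[i, jx]
    X_new = sample_qx(rng, X, y_n)                     # (N,)
    Z_new = sample_qz(rng, X_new, y_n, Z)              # (N, M)
    # local weights and the unbiased estimate p_hat(x_n, y_n | past)
    log_wz = (log_p_local(X_new[:, None], y_n, Z_new, X[:, None], Z)
              - log_qz(Z_new, X_new[:, None], y_n, Z))
    Wz = np.exp(log_wz - logsumexp(log_wz, axis=1, keepdims=True))
    log_p_hat = logsumexp(log_wz, axis=1) - np.log(M)
    # top-level weights
    log_wx = log_p_hat - log_qx(X_new, X, y_n)
    Wx = np.exp(log_wx - logsumexp(log_wx))
    return X_new, Z_new, Wx, Wz
```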
How can this be justified?
- As an extended-space SIR algorithm.
- Via unbiased estimation arguments.
Note also the M = 1 and M → ∞ cases.
How does this differ from [CSOL11]?
Principally in the local weights, with benefits including:
- Valid (N-consistent) for all M ≥ 1, rather than (M, N)-consistent.
- Computational cost O(MN) rather than O(M²N).
- Only requires knowledge of the joint behaviour of x and z; doesn't require, say, p(x_n | x_{n-1}, z_{n-1}).
Toy Example: Model
We use a simulated sequence of 100 observations from the model
defined by the densities:
\[
\mu(x_1, z_1) = \mathcal{N}\left(\begin{pmatrix} x_1 \\ z_1 \end{pmatrix};
\begin{pmatrix} 0 \\ 0 \end{pmatrix},
\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}\right)
\]
\[
f(x_n, z_n \mid x_{n-1}, z_{n-1}) = \mathcal{N}\left(\begin{pmatrix} x_n \\ z_n \end{pmatrix};
\begin{pmatrix} x_{n-1} \\ z_{n-1} \end{pmatrix},
\begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}\right)
\]
\[
g(y_n \mid x_n, z_n) = \mathcal{N}\left(y_n;
\begin{pmatrix} x_n \\ z_n \end{pmatrix},
\begin{pmatrix} \sigma_x^2 & 0 \\ 0 & \sigma_z^2 \end{pmatrix}\right)
\]
Consider the IMSE (relative to the optimal filter) of the filtering estimates of the first-coordinate marginals.
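For reference, a short snippet simulating a data set from this toy model (x and z are independent standard Gaussian random walks started from N(0, 1), each observed in independent Gaussian noise); this is only the data-generation step, not the filtering experiment itself.

```python
import numpy as np

def simulate_toy_model(T=100, sigma_x=1.0, sigma_z=1.0, seed=0):
    """Simulate (x_{1:T}, z_{1:T}, y_{1:T}) from the toy model."""
    rng = np.random.default_rng(seed)
    xz = np.cumsum(rng.standard_normal((T, 2)), axis=0)    # random walks, x_1, z_1 ~ N(0, 1)
    noise = rng.standard_normal((T, 2)) * np.array([sigma_x, sigma_z])
    y = xz + noise                                          # y_n | x_n, z_n
    return xz[:, 0], xz[:, 1], y
```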
Mean Squared Filtering Error
[Figure: mean squared filtering error of the approximated RBPF against the number of lower-level particles M, for N = 10, 20, 40, 80, 160; σ_x² = σ_z² = 1.]
Computational Performance
[Figure: mean squared filtering error against computational cost N(M+1), for N = 10, 20, 40, 80, 160; σ_x² = σ_z² = 1.]
Computational Performance
[Figure: mean squared filtering error against computational cost N(M+1), for N = 10, 20, 40, 80, 160; σ_x² = 10² and σ_z² = 0.1².]
Block Sampling Particle Filters
Model
What About Other HMMs / Algorithms?
Returning to:
[Figure: graphical model of the hidden chain x_1, ..., x_6 with observations y_1, ..., y_6.]
- Unobserved Markov chain {X_n} with transition density f.
- Observed process {Y_n} with conditional density g.
- Joint density:
\[
p(x_{1:n}, y_{1:n}) = f_1(x_1)\, g(y_1 \mid x_1) \prod_{i=2}^{n} f(x_i \mid x_{i-1})\, g(y_i \mid x_i).
\]
Idealised Algorithms
Block Sampling: An Idealised Approach
At time n, given x_{1:n-1}, discard x_{n-L+1:n-1}:
- Sample from q(x_{n-L+1:n} | x_{n-L}, y_{n-L+1:n}).
- Weight with
\[
W(x_{1:n}) = \frac{p(x_{1:n} \mid y_{1:n})}{p(x_{1:n-L} \mid y_{1:n-1})\, q(x_{n-L+1:n} \mid x_{n-L}, y_{n-L+1:n})}.
\]
- Optimally,
\[
q(x_{n-L+1:n} \mid x_{n-L}, y_{n-L+1:n}) = p(x_{n-L+1:n} \mid x_{n-L}, y_{n-L+1:n}),
\qquad
W(x_{1:n}) \propto \frac{p(x_{1:n-L} \mid y_{1:n})}{p(x_{1:n-L} \mid y_{1:n-1})} = p(y_n \mid x_{1:n-L}, y_{n-L+1:n-1}).
\]
- This is typically intractable; an auxiliary-variable approach is used in [DBS06].
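The optimal quantities are tractable in simple cases. As an illustration, for a 1d linear-Gaussian random-walk model (x_t = x_{t-1} + N(0, 1), y_t = x_t + N(0, 1), the toy model used later in the talk), the optimal incremental weight p(y_n | x_{n-L}, y_{n-L+1:n-1}) can be computed by a short Kalman recursion started at the known value x_{n-L}; a sketch (not the auxiliary-variable scheme of [DBS06]):

```python
import numpy as np

def log_optimal_block_weight(x_nL, y_block):
    """log p(y_n | x_{n-L}, y_{n-L+1:n-1}) for x_t = x_{t-1}+N(0,1), y_t = x_t+N(0,1).

    x_nL    : the retained state x_{n-L}
    y_block : observations y_{n-L+1}, ..., y_n (length L)
    """
    m, P = x_nL, 0.0                      # x_{n-L} is known exactly
    log_w = 0.0
    for i, y_t in enumerate(y_block):
        m_pred, P_pred = m, P + 1.0       # predict one step of the random walk
        S = P_pred + 1.0                  # innovation variance (unit observation noise)
        if i == len(y_block) - 1:         # predictive density of the final observation
            log_w = -0.5 * (np.log(2 * np.pi * S) + (y_t - m_pred) ** 2 / S)
        K = P_pred / S                    # Kalman update with y_t
        m = m_pred + K * (y_t - m_pred)
        P = (1.0 - K) * P_pred
    return log_w
```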
Why Try To Block-Sample?
Explicit motivation from the linear Gaussian case:
\[
\operatorname{Var}_{p(x_{n-L} \mid y_{1:n-1})}\left[w_{n,L}(X_{n-L})\right]
= \int_{-\infty}^{\infty} \frac{\mathcal{N}^2(x_{n-L}; \mu_{n-L|n}, \Sigma_{n-L|n})}{\mathcal{N}(x_{n-L}; \mu_{n-L|n-1}, \Sigma_{n-L|n-1})}\, dx_{n-L} - 1
= \frac{\Sigma_{n-L|n-1}}{\sqrt{\Sigma_{n-L|n}\,(2\Sigma_{n-L|n-1} - \Sigma_{n-L|n})}}
\exp\left(\frac{(\mu_{n-L|n} - \mu_{n-L|n-1})^2}{2\Sigma_{n-L|n-1} - \Sigma_{n-L|n}}\right) - 1.
\]
Optimal Block Sampling Central Limit Theorem
- Let ϕ_n : X^n → R and ϕ̄_n = ∫ ϕ_n(x_{1:n}) p(x_{1:n} | y_{1:n}) dx_{1:n}.
- Let ϕ̂^N_{n,L} denote the estimate obtained with lag L.
- Then
\[
\hat\varphi^N_{n,L} = \frac{\sum_{i=1}^{N} \varphi(X^i_{n,\star}) \prod_{p=L+1}^{n} w_{p,L}(X^i_{p-1,p-L})}{\sum_{i=1}^{N} \prod_{p=L+1}^{n} w_{p,L}(X^i_{p-1,p-L})},
\qquad \text{where } X^i_{n,\star} := (X^i_{L,1}, X^i_{L+1,2}, \ldots, X^i_{n-1,n-L}, X^i_{n,n-L+1:n}).
\]
- Under basic regularity conditions,
\[
\sqrt{N}\,(\hat\varphi^N_{n,L} - \bar\varphi_n) \xrightarrow{d} \mathcal{N}(0, V_n(\varphi_n)) \quad \text{as } N \to \infty,
\qquad \text{where } V(\hat\varphi) = \int \prod_{p=L+1}^{n} \frac{p(x_{p-L} \mid y_{1:p})^2}{p(x_{p-L} \mid y_{1:p-1})}\, (\varphi(x_{1:n}) - \bar\varphi)^2\, dx_{1:n}.
\]
Linear Gaussian Model
Toy Model: Linear Gaussian HMM
- Linear, Gaussian state transition:
  f(x_t | x_{t-1}) = N(x_t; x_{t-1}, 1)
- and likelihood:
  g(y_t | x_t) = N(y_t; x_t, 1)
- Analytically: Kalman filter/smoother/etc.
- Simple bootstrap PF:
  - Proposal: q(x_t | x_{t-1}, y_t) = f(x_t | x_{t-1})
  - Weighting: W(x_{t-1}, x_t) ∝ g(y_t | x_t)
  - Resample residually every iteration.
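Since the model is linear-Gaussian, the exact reference filter is a short Kalman recursion; a minimal version for this specific model, assuming an N(0, 1) prior on x_1 (the slides do not state the initialisation, so that choice is an assumption):

```python
import numpy as np

def kalman_filter_rw(y, m1=0.0, P1=1.0):
    """Exact filtering for x_t = x_{t-1} + N(0, 1), y_t = x_t + N(0, 1).

    m1, P1 parameterise the assumed prior x_1 ~ N(m1, P1).
    Returns filtering means and variances of p(x_t | y_{1:t})."""
    m, P = m1, P1
    means, variances = [], []
    for t, y_t in enumerate(y):
        if t > 0:
            P = P + 1.0                 # predict: add the transition variance
        K = P / (P + 1.0)               # update with y_t (unit observation variance)
        m = m + K * (y_t - m)
        P = (1.0 - K) * P
        means.append(m)
        variances.append(P)
    return np.array(means), np.array(variances)
```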
More than one SMC Algorithm?
- Standard approach:
  - Run an SIR algorithm with N particles.
  - Use
\[
\pi^N_n(dx_{1:n}) = \sum_{i=1}^{N} W^i_n\, \delta_{X^i_{1:n}}(dx_{1:n}).
\]
- A crude alternative (sketched below):
  - Run L = ⌊N/M⌋ algorithms with M particles each.
  - Use
\[
\pi^{M,l}_n(dx_{1:n}) = \sum_{i=1}^{M} W^{l,i}_n\, \delta_{X^{l,i}_{1:n}}(dx_{1:n}).
\]
  - Guarantees L i.i.d. samples.
  - For small M their distribution may be poor.
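A sketch of the crude alternative, reusing the `sir_filter` sketch from earlier: run ⌊N/M⌋ independent filters with M particles each and average their independent estimates.

```python
import numpy as np

def crude_ensemble(y, sample_f1, sample_f, log_g, N=1000, M=50, seed=0):
    """Run floor(N/M) independent M-particle SIR filters and average their
    filtering-mean estimates; each run is independent of the others."""
    L = N // M
    runs = [sir_filter(y, sample_f1, sample_f, log_g, N=M, seed=seed + l)
            for l in range(L)]
    return np.mean(runs, axis=0)
```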
Covariance Estimation: 1d Linear Gaussian Model
[Figure: covariance estimates against time (0 to 1000) for the 1d linear-Gaussian model.]
Local Particle Filtering: Current Trajectories
[Figure: particle trajectories over time steps 0 to 16.]
Local Particle Filtering: First Particle
[Figure: trajectory of the first particle over time steps 0 to 16.]
Local Particle Filtering: SMC Proposal
[Figure: trajectories generated by the SMC proposal over time steps 0 to 16.]
Local Particle Filtering: CSMC Auxiliary Proposal
[Figure: trajectories generated by the CSMC auxiliary proposal over time steps 0 to 16.]
Exact Approximation of BSPFs / Local Particle Filtering
Local SMC
- Propose from:
\[
U^{\otimes (n-1)}_{1:M}(b_{1:n-2}, k)\; p(x_{1:n-1} \mid y_{1:n-1})\;
\psi^M_{n,L}(a_{n-L+2:n}, x_{n-L+1:n}, k; x_{n-L})\;
\tilde\psi^M_{n-1,L-1}\left(\tilde a^k_{n-L+2:n-1}, \tilde x^k_{n-L+1:n-1}; x_{n-L} \,\middle\|\, b_{n-L+2:n-1}, x_{n-L+1:n-1}\right)
\]
- Target:
\[
U^{\otimes n}_{1:M}(b_{1:n-L}, \bar b^{\bar k}_{n,n-L+1:n-1}, \bar k)\;
p(x_{1:n-L}, \bar x^{\bar k}_{n-L+1:n} \mid y_{1:n})\;
\tilde\psi^M_{n,L}\left(a_{n-L+2:n}, x_{n-L+1:n}; x_{n-L} \,\middle\|\, \bar b^{\bar k}_{n,n-L+1:n}, \bar x^{\bar k}_{n-L+1:n}\right)
\psi^M_{n-1,L-1}(\tilde a_{n-L+2:n-1}, \tilde x_{n-L+1:n-1}, k; x_{n-L}).
\]
- Weight: \bar Z_{n-L+1:n} / \tilde Z_{n-L+1:n-1}.
Key Identity
\[
\frac{\psi^M_{n,L}(a_{n-L+2:n}, x_{n-L+1:n}, k; x_{n-L})}
     {p(x^k_{n-L+1:n} \mid x_{n-L}, y_{n-L+1:n})\;
      \tilde\psi^M_{n,L}(a^k_{n-L+2:n}, x_{n-L+1:n}, k; x_{n-L} \,\|\, \ldots)}
= \frac{r(k \mid w_n)\, q\left(x^{b^k_{n,n-L+1}}_{n-L+1} \mid x_{n-L}\right)
        \prod_{p=n-L+2}^{n} r\left(b^k_{n,p} \mid w_{p-1}\right)
        q\left(x^{b^k_{n,p}}_{p} \mid x^{b^k_{n,p-1}}_{p-1}\right)}
       {p(x_{n-L+1:n} \mid x_{n-L}, y_{n-L+1:n})}
= \hat Z_{n-L+1:n} \big/ p(y_{n-L+1:n} \mid x_{n-L})
\]
Bootstrap Local SMC
- Top level:
  - Local SMC proposal.
  - Stratified resampling when ESS < N/2.
- Local SMC proposal:
  - Proposal: q(x_t | x_{t-1}, y_t) = f(x_t | x_{t-1})
  - Weighting: W(x_{t-1}, x_t) ∝ f(x_t | x_{t-1}) g(y_t | x_t) / f(x_t | x_{t-1}) = g(y_t | x_t)
  - Resample multinomially every iteration.
Bootstrap Local SMC: M=100
[Figure: average number of unique values against n (0 to 1000), for N = 100, M = 100 and L = 2, 3, 4, 5.]
Bootstrap Local SMC: M=1000
[Figure: average number of unique values against n (0 to 1000), for N = 100, M = 1000 and L = 2, 3, 4, 5.]
Bootstrap Local SMC: M=10000
[Figure: average number of unique values against n (0 to 1000), for N = 100, M = 10000 and L = 2, 3, 4, 5.]
Stochastic Volatility Model
Stochastic Volatility Bootstrap Local SMC
- Model:
  f(x_i | x_{i-1}) = N(x_i; φ x_{i-1}, σ²)
  g(y_i | x_i) = N(y_i; 0, β² exp(x_i))
- Top level:
  - Local SMC proposal.
  - Stratified resampling when ESS < N/2.
- Local SMC proposal:
  - Proposal: q(x_t | x_{t-1}, y_t) = f(x_t | x_{t-1})
  - Weighting: W(x_{t-1}, x_t) ∝ f(x_t | x_{t-1}) g(y_t | x_t) / f(x_t | x_{t-1}) = g(y_t | x_t)
  - Resample residually every iteration.
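For reference, the model-specific ingredients a bootstrap-style filter needs for this stochastic volatility model, written so that they could be plugged into the generic `sir_filter` sketch from earlier; the parameter values and the use of the stationary initial distribution are placeholder assumptions.

```python
import numpy as np

# Stochastic volatility model: x_t = phi x_{t-1} + sigma eps_t,  y_t ~ N(0, beta^2 exp(x_t))
phi, sigma, beta = 0.95, 0.25, 0.6    # placeholder parameter values

def sv_sample_f1(rng, N):
    # stationary distribution of the AR(1) volatility process, used as the initial law
    return rng.normal(0.0, sigma / np.sqrt(1.0 - phi ** 2), size=(N, 1))

def sv_sample_f(rng, x):
    return phi * x + sigma * rng.standard_normal(x.shape)

def sv_log_g(y_t, x):
    var = beta ** 2 * np.exp(x[:, 0])
    return -0.5 * (np.log(2 * np.pi * var) + y_t ** 2 / var)

# est = sir_filter(y, sv_sample_f1, sv_sample_f, sv_log_g, N=100)
```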
SV Simulated Data
[Figure: simulated observations against n (0 to 900).]
SV Bootstrap Local SMC: M=100
[Figure: average number of unique values against n (0 to 900), for N = 100, M = 100 and L = 2, 4, 6, 10; simulated data.]
SV Bootstrap Local SMC: M=1000
[Figure: average number of unique values against n (0 to 900), for N = 100, M = 1000 and L = 2, 4, 6, 10, 14, 18; simulated data.]
SV Bootstrap Local SMC: M=10000
[Figure: average number of unique values against n (0 to 900), for N = 100, M = 10000 and L = 2, 4, 6, 10, 14, 18, 22, 26; simulated data.]
SV Exchange Rate Data
[Figure: exchange rate observations against n (0 to 900).]
SV Bootstrap Local SMC: M=100
[Figure: average number of unique values against n (0 to 900), for N = 100, M = 100 and L = 2, 4, 6, 10; exchange rate data.]
SV Bootstrap Local SMC: M=1000
[Figure: average number of unique values against n (0 to 900), for N = 100, M = 1000 and L = 2, 4, 6, 10, 14, 18; exchange rate data.]
SV Bootstrap Local SMC: M=10000
[Figure: average number of unique values against n (0 to 900), for N = 100, M = 10000 and L = 2, 4, 6, 10, 14, 18, 22, 26; exchange rate data.]
Summary
In Conclusion
- SMC can be used hierarchically.
- Software implementation is not difficult [Joh09].
- The Rao-Blackwellized particle filter can be approximated exactly:
  - Can reduce estimator variance at fixed cost.
  - Attractive for distributed/parallel implementation.
  - Allows combination of different sorts of particle filter.
  - Can be combined with other techniques for parameter estimation etc.
- The optimal block-sampling particle filter can be approximated exactly:
  - Requires only simulation from the transition and evaluation of the likelihood.
  - Easy to parallelise.
  - Low storage cost.
References I
[AD02] C. Andrieu and A. Doucet. Particle filtering for partially observed Gaussian state space models. Journal of the Royal Statistical Society B, 64(4):827–836, 2002.
[ADH10] C. Andrieu, A. Doucet, and R. Holenstein. Particle Markov chain Monte Carlo. Journal of the Royal Statistical Society B, 72(3):269–342, 2010.
[CL00] R. Chen and J. S. Liu. Mixture Kalman filters. Journal of the Royal Statistical Society B, 62(3):493–508, 2000.
[CSOL11] T. Chen, T. Schön, H. Ohlsson, and L. Ljung. Decentralized particle filter with arbitrary state decomposition. IEEE Transactions on Signal Processing, 59(2):465–478, February 2011.
[DBS06] A. Doucet, M. Briers, and S. Sénécal. Efficient block sampling strategies for sequential Monte Carlo methods. Journal of Computational and Graphical Statistics, 15(3):693–711, 2006.
[GSS93] N. J. Gordon, D. J. Salmond, and A. F. M. Smith. Novel approach to nonlinear/non-Gaussian Bayesian state estimation. IEE Proceedings-F, 140(2):107–113, April 1993.
[JD12] A. M. Johansen and A. Doucet. Hierarchical particle sampling for intractable state-space models. CRiSM working paper, University of Warwick, 2012. In preparation.
[Joh09] A. M. Johansen. SMCTC: Sequential Monte Carlo in C++. Journal of Statistical Software, 30(6):1–41, April 2009.
[JWD12] A. M. Johansen, N. Whiteley, and A. Doucet. Exact approximation of Rao-Blackwellised particle filters. In Proceedings of the 16th IFAC Symposium on System Identification. IFAC, 2012.