# Hierarchical Bayesian Level Set Inversion

Matt Dunlop
Mathematics Institute, University of Warwick, U.K.

Joint work with Marco Iglesias (Nottingham) and Andrew Stuart (Warwick).

SIAM UQ 2016, Lausanne, Switzerland, April 7th.

Supported by EPSRC.
Matt Dunlop (University of Warwick)
Hierarchical Bayesian Level Set Inversion
April 7th 2016
1 / 21
## Outline

1. Introduction
2. Classical Level Set Inversion
3. Bayesian Level Set Inversion
4. Hierarchical Bayesian Level Set Inversion
5. Conclusions
## Introduction
## Level Set Representation

S. Osher and J. Sethian,
Fronts propagating with curvature-dependent speed...,
J. Comp. Phys. 79 (1988), 12–49.

A piecewise constant function $v$ is defined by thresholding a continuous level set function $u$. Let

$$-\infty = c_0 < c_1 < \cdots < c_{K-1} < c_K = +\infty,$$

$$v(x) = \sum_{k=1}^{K} v_k \, \mathbb{I}_{\{c_{k-1} < u \le c_k\}}(x); \qquad v = F(u).$$

$F : X \to Z$ is the level-set map, where $X$ denotes the continuous functions and $Z$ the piecewise continuous functions.
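The thresholding map $F$ above is straightforward to realise on a grid. The following is an illustrative sketch (the function name and the use of NumPy are my own, not from the talk):

```python
import numpy as np

def level_set_map(u, thresholds, values):
    """v = F(u): map the level set field u to a piecewise constant field.

    thresholds: interior levels c_1 < ... < c_{K-1};
    values: the K constants v_1, ..., v_K.
    """
    # digitize with right=True assigns each u the index k with c_{k-1} < u <= c_k.
    idx = np.digitize(np.asarray(u, dtype=float), thresholds, right=True)
    return np.asarray(values, dtype=float)[idx]

v = level_set_map([-1.0, 0.5, 2.0], thresholds=[0.0, 1.0], values=[1.0, 2.0, 3.0])
```

Here `u` would be a discretised level set function and `v` the resulting piecewise constant field.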
## Example 1: Image Reconstruction

H. W. Engl, M. Hanke and A. Neubauer,
Regularization of Inverse Problems,
Kluwer (1994).

**Forward Problem**

Define $K : Z \to \mathbb{R}^J$ by $(Kv)_j = v(x_j)$, $x_j \in D \subset \mathbb{R}^d$. Given $v \in Z$,
$$y = Kv.$$
Let $\eta \in \mathbb{R}^J$ be a realization of an observational noise.

**Inverse Problem**

Given prior information $v = F(u)$, $u \in X$, and $y \in \mathbb{R}^J$, find $v$:
$$y = Kv + \eta.$$
## Example 2: Groundwater Flow

M. Dashti and A. M. Stuart,
The Bayesian approach to inverse problems.
Handbook of Uncertainty Quantification, editors: R. Ghanem, D. Higdon and H. Owhadi, Springer, 2017. arXiv:1302.6989.

**Forward Problem: Darcy Flow**

Let $X^+ := \{v \in Z : \operatorname{ess\,inf}_{x \in D} v > 0\}$. Given the hydraulic conductivity $\kappa \in X^+$, find $y := \mathcal{G}(\kappa) \in \mathbb{R}^J$ where $y_j = \ell_j(p)$, $V := H^1_0(D)$, $\ell_j \in V^*$, $j = 1, \dots, J$, and

$$\begin{cases} -\nabla \cdot (\kappa \nabla p) = f & \text{in } D \\ \hfill p = 0 & \text{on } \partial D. \end{cases}$$

Let $\eta \in \mathbb{R}^J$ be a realization of an observational noise.

**Inverse Problem**

Given prior information $\kappa = F(u)$, $u \in X$, and $y \in \mathbb{R}^J$, find $\kappa$:
$$y = \mathcal{G}(\kappa) + \eta.$$
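To make the forward map concrete, here is a minimal 1D stand-in for the Darcy problem (the talk's example is 2D): a finite-difference solve of $-(\kappa p')' = f$ on $(0,1)$ with homogeneous Dirichlet conditions, observed at a few points. All names and the discretisation choice are illustrative assumptions, not from the talk:

```python
import numpy as np

def darcy_solve_1d(kappa, f, n):
    """Finite-difference solve of -(kappa p')' = f on (0, 1), p(0) = p(1) = 0.

    kappa, f: callables on arrays; n: number of interior grid points.
    """
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)                     # interior nodes
    k = kappa(np.linspace(h / 2, 1.0 - h / 2, n + 1))  # kappa at cell midpoints
    A = np.zeros((n, n))
    for i in range(n):
        A[i, i] = (k[i] + k[i + 1]) / h**2
        if i > 0:
            A[i, i - 1] = -k[i] / h**2
        if i < n - 1:
            A[i, i + 1] = -k[i + 1] / h**2
    return x, np.linalg.solve(A, f(x))

# Piecewise constant conductivity of level-set type, unit source term.
x, p = darcy_solve_1d(lambda s: np.where(s < 0.5, 1.0, 3.0),
                      lambda s: np.ones_like(s), 199)
obs = p[[49, 99, 149]]  # pointwise observations y_j = p(x_j)
```

Composing this solve with a thresholding map $F$ gives the composite forward map $u \mapsto \mathcal{G}(F(u))$ used throughout the talk.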
## Example 3: Electrical Impedance Tomography

(Figure: domain $D$ with boundary electrodes $e_\ell$.)

Apply currents $I_\ell$ on $e_\ell$, $\ell = 1, \dots, L$. This induces voltages $\Theta_\ell$ on $e_\ell$, $\ell = 1, \dots, L$. The input is $(\sigma, I)$, the output is $(\theta, \Theta)$. We have an Ohm's law
$$\Theta = R(\sigma) I.$$

$$
\begin{cases}
\nabla \cdot (\sigma(x) \nabla \theta(x)) = 0 & x \in D \\[4pt]
\displaystyle\int_{e_\ell} \sigma \frac{\partial \theta}{\partial \nu} \, dS = I_\ell & \ell = 1, \dots, L \\[4pt]
\sigma(x) \dfrac{\partial \theta}{\partial \nu}(x) = 0 & x \in \partial D \setminus \bigcup_{\ell=1}^{L} e_\ell \\[4pt]
\theta(x) + z_\ell \, \sigma(x) \dfrac{\partial \theta}{\partial \nu}(x) = \Theta_\ell & x \in e_\ell, \ \ell = 1, \dots, L
\end{cases}
\qquad \text{(PDE)}
$$
## Example 3: Electrical Impedance Tomography

M. M. Dunlop and A. M. Stuart,
The Bayesian formulation of EIT.
arXiv:1509.03136. Inverse Problems and Imaging, Submitted, 2015.

**Forward Problem**

Let $X \subseteq L^\infty(D)$, and denote $X^+ := \{u \in X : \operatorname{ess\,inf}_{x \in D} u > 0\}$.
Given $u \in X$, $F : X \to X^+$ and $\sigma = F(u)$, find $(\theta, \Theta) \in H^1(D) \times \mathbb{R}^L$ solving (PDE).
This gives $\Theta = R(F(u)) I$.
Let $\eta \in \mathbb{R}^L$ be a realisation of an observational noise.

**Inverse Problem**

Given prior information $\sigma = F(u)$, $u \in X$, and given currents $I$ and $y \in \mathbb{R}^L$, find $\sigma$:
$$y = R(\sigma) I + \eta.$$
## Classical Level Set Inversion
## Tikhonov Regularization

Here $\mathcal{G} : Z \to \mathbb{R}^J$ and $\eta \in \mathbb{R}^J$ is a realization of an observational noise.

**Inverse Problem**

Find $u \in X$, given $y \in \mathbb{R}^J$ satisfying $y = \mathcal{G}(F(u)) + \eta$.

**Model-Data Misfit**

The model-data misfit is
$$\Phi(u; y) = \frac{1}{2} \left| \Gamma^{-1/2} \big( y - \mathcal{G}(F(u)) \big) \right|^2.$$

**Tikhonov Regularization**

Minimize, for some $E$ compactly embedded into $X$,
$$I(u; y) := \Phi(u; y) + \frac{1}{2} \|u\|_E^2.$$
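The two terms of the Tikhonov functional can be sketched directly. This is a minimal illustration with generic callables standing in for $\mathcal{G}$, $F$, $\Gamma^{-1/2}$ and $\|\cdot\|_E$ (all interface names are my own assumptions):

```python
import numpy as np

def misfit(u, y, G, F, Gamma_inv_sqrt):
    """Model-data misfit Phi(u; y) = 0.5 |Gamma^{-1/2}(y - G(F(u)))|^2."""
    r = Gamma_inv_sqrt @ (y - G(F(u)))
    return 0.5 * float(r @ r)

def tikhonov(u, y, G, F, Gamma_inv_sqrt, E_norm):
    """Tikhonov functional I(u; y) = Phi(u; y) + 0.5 ||u||_E^2."""
    return misfit(u, y, G, F, Gamma_inv_sqrt) + 0.5 * E_norm(u) ** 2
```

In practice $E$ would be a smoother space than $X$, so `E_norm` penalises rough level set functions; the issues on the following slides show why this alone is not enough.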
## Issue 1: Discontinuity of Level Set Map

(Figure: two level set functions; $F(\cdot)$ is discontinuous at one and continuous at the other.)

$$v = F(u) := v^+ \, \mathbb{I}_{\{u \ge 0\}}(x) + v^- \, \mathbb{I}_{\{u < 0\}}(x).$$

This causes problems in classical level set inversion.
## Issue 2: Length-Scale Matters

(Figure demonstrates the role of the length-scale in the level-set function.)

Classical Tikhonov-Phillips regularization does not allow for direct control of the length-scale. New ideas are needed.
## Issue 3: Amplitude Matters

Y. Lu,
Probabilistic Analysis of Interface Problems,
PhD Thesis (2016), University of Warwick.

Consider the case $K = 2$, $c_1 = 0$. For contradiction, assume $u^*$ is a minimizer of $I(u; y)$. Now define the sequence $u_\epsilon = \epsilon u^*$. Since thresholding at $0$ is scale-invariant, $F(u_\epsilon) = F(u^*)$, so if $0 < \epsilon < 1$,

$$\Phi(u_\epsilon; y) = \Phi(u^*; y), \quad \|u_\epsilon\|_E = \epsilon \|u^*\|_E \ \implies \ I(u_\epsilon; y) < I(u^*; y).$$

Hence $(u_\epsilon)$ is an infimizing sequence. But $u_\epsilon \to 0$ and $I(0; y) > I(u_\epsilon; y)$ for $\epsilon \ll 1$. Thus the infimum cannot be attained.

This issue is caused by thresholding at 0, but the idea generalizes: the choice of threshold matters.
## Bayesian Level Set Inversion
## Bayes' Formula

**Prior**

Probabilistic information about $u$ before data is collected:
$$\mu_0(du).$$

**Likelihood**

Since $y = \mathcal{G}(u) + \eta$, if $\eta \sim N(0, \Gamma)$, then $y \,|\, u \sim N(\mathcal{G}(u), \Gamma)$. The model-data misfit is the negative log-likelihood:
$$\mathbb{P}(y \,|\, u) \propto \exp\big(-\Phi(u; y)\big), \qquad \Phi(u; y) = \frac{1}{2} \left| \Gamma^{-1/2} \big( y - \mathcal{G}(u) \big) \right|^2.$$

**Posterior**

Probabilistic information about $u$ after data is collected:
$$\mu^y(du) \propto \exp\big(-\Phi(u; y)\big) \, \mu_0(du).$$
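For a finite-dimensional illustration, the likelihood and the unnormalised posterior density with respect to the prior are one-liners (a minimal sketch; the function names are my own):

```python
import numpy as np

def neg_log_likelihood(u, y, G, Gamma):
    """Phi(u; y) = 0.5 |Gamma^{-1/2}(y - G(u))|^2: the negative
    log-likelihood when y | u ~ N(G(u), Gamma)."""
    r = y - G(u)
    return 0.5 * float(r @ np.linalg.solve(Gamma, r))

def unnormalised_posterior(u, y, G, Gamma):
    """Density of mu^y with respect to the prior mu_0, up to a constant."""
    return float(np.exp(-neg_log_likelihood(u, y, G, Gamma)))
```

Note that only the density of $\mu^y$ with respect to $\mu_0$ is needed for MCMC; no Lebesgue density exists in the infinite-dimensional setting.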
## Rigorous Statement

M. Iglesias, Y. Lu and A. M. Stuart,
A level-set approach to Bayesian geometric inverse problems.
arXiv:1504.00313.

M. Iglesias,
A regularizing iterative ensemble Kalman method for PDE constrained inverse problems.
arXiv:1505.03876.

$$L^2_\nu(X; S) = \{ f : X \to S : \mathbb{E}^\nu \|f(u)\|_S^2 < \infty \}.$$

**Theorem (Iglesias, Lu and Stuart)**

Let $\mu_0(du) = \mathbb{P}(du)$ and $\mu^y(du) = \mathbb{P}(du \,|\, y)$. Assume that $u \in X$ $\mu_0$-a.s. Then for Examples 1–3 (and more) $\mu^y \ll \mu_0$ and $y \mapsto \mu^y(du)$ is Lipschitz in the Hellinger metric. Furthermore, if $S$ is a separable Banach space and $f \in L^2_{\mu_0}(X; S)$, then
$$\left\| \mathbb{E}^{\mu^{y_1}} f(u) - \mathbb{E}^{\mu^{y_2}} f(u) \right\|_S \le C |y_1 - y_2|.$$

The key idea in the proof is that $F : X \to Z$ is continuous $\mu_0$-a.s.
## Whittle-Matérn Priors

$$c(x, y) = \sigma^2 \, \frac{2^{1-\nu}}{\Gamma(\nu)} \, (\tau |x - y|)^\nu \, K_\nu(\tau |x - y|).$$

- $\nu$ controls smoothness: draws from Gaussian fields with this covariance have $\nu$ fractional Sobolev and Hölder derivatives.
- $\tau$ is an inverse length-scale.
- $\sigma$ is an amplitude scale.

Ignoring boundary conditions, the corresponding covariance operator is given by
$$C_{\mathrm{WM}(\sigma, \tau)} \propto \sigma^2 \tau^{2\nu} (\tau^2 I - \triangle)^{-\nu - d/2}.$$
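Sampling a Gaussian field with this covariance on a 1D grid can be sketched via a Cholesky factorisation of the covariance matrix. This is an illustrative setup (parameter values, the jitter, and the small-$r$ regularisation are my own choices), using `scipy.special.kv` for the modified Bessel function $K_\nu$:

```python
import numpy as np
from scipy.special import gamma, kv

def matern_covariance(x, y, sigma=1.0, nu=1.5, tau=10.0):
    """Whittle-Matern covariance with amplitude sigma, smoothness nu,
    inverse length-scale tau."""
    r = tau * np.abs(np.subtract.outer(x, y))
    # r^nu K_nu(r) -> 2^{nu-1} Gamma(nu) as r -> 0, so replace 0 by a tiny value.
    r = np.where(r == 0.0, 1e-12, r)
    return sigma**2 * (2.0**(1.0 - nu) / gamma(nu)) * r**nu * kv(nu, r)

# Draw one sample path on a 1D grid.
grid = np.linspace(0.0, 1.0, 200)
C = matern_covariance(grid, grid)
L = np.linalg.cholesky(C + 1e-8 * np.eye(grid.size))  # jitter for stability
sample = L @ np.random.default_rng(1).standard_normal(grid.size)
```

Increasing `tau` visibly shortens the correlation length of the sample paths, which is exactly the role it plays in the hierarchical prior that follows.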
## Example 2: Groundwater Flow

(Figure: (Row 1, $\tau = 15$) Logarithm of the true hydraulic conductivity. (Row 2, $\tau = 10, 30, 50, 70, 90$) Samples of $F(u)$. (Row 3) $F(\mathbb{E}(u))$. (Row 4) $\mathbb{E}(F(u))$.)
## Hierarchical Bayesian Level Set Inversion
## Hierarchical Whittle-Matérn Priors (The Problem)

Prior:
$$\mathbb{P}(u, \tau) = \mathbb{P}(u \,|\, \tau) \, \mathbb{P}(\tau), \qquad \mathbb{P}(u \,|\, \tau) = N(0, C_{\mathrm{WM}(\sigma, \tau)}).$$

Recall $C_{\mathrm{WM}(\sigma, \tau)} \propto \sigma^2 \tau^{2\nu} (\tau^2 I - \triangle)^{-\nu - d/2}$.

**Theorem**

For fixed $\sigma$, the family of measures $N(0, C_{\mathrm{WM}(\sigma, \cdot)})$ are mutually singular.

Algorithms which attempt to move between mutually singular measures perform badly under mesh refinement, since they fail completely in the fully resolved limit.
## Hierarchical Whittle-Matérn Priors (The Solution)

M. Dunlop, M. Iglesias and A. M. Stuart,
Hierarchical Bayesian Level Set Inversion.
arXiv:1601.03605.

Hence choose $\sigma = \tau^{-\nu}$ and define
$$C_\tau \propto (\tau^2 I - \triangle)^{-\nu - d/2}.$$

Prior:
$$\mathbb{P}(u, \tau) = \mathbb{P}(u \,|\, \tau) \, \mathbb{P}(\tau), \qquad \mathbb{P}(u \,|\, \tau) = N(0, C_\tau).$$

**Theorem**

The family of measures $N(0, C_\tau)$ are mutually equivalent.

This suggests the need to scale the thresholds in $F$ by $\tau^{-\nu}$. Let
$$-\infty = c_0 < c_1 < \cdots < c_{K-1} < c_K = +\infty,$$
$$v(x) = \sum_{k=1}^{K} v_k \, \mathbb{I}_{\{\tau^{-\nu} c_{k-1} < u \le \tau^{-\nu} c_k\}}(x); \qquad v = F(u, \tau).$$
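The $\tau$-dependent thresholding is a one-line change to the earlier map. A sketch (names and the $\tau^{-\nu}$ scaling convention follow my reading of the rescaled thresholds above):

```python
import numpy as np

def hierarchical_level_set_map(u, tau, nu, thresholds, values):
    """v = F(u, tau): threshold u at the rescaled levels tau^{-nu} c_k,
    matching the tau-dependent amplitude of draws from N(0, C_tau)."""
    scaled = tau ** (-nu) * np.asarray(thresholds, dtype=float)
    idx = np.digitize(np.asarray(u, dtype=float), scaled, right=True)
    return np.asarray(values, dtype=float)[idx]
```

With this scaling, changing $\tau$ rescales the thresholds in step with the amplitude of the prior draws, so the geometry of the thresholded field is not distorted by the hierarchy.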
## Hierarchical Posterior

**Prior**

$$\mu_0(du, d\tau) = \mu_0(du \,|\, \tau) \, \pi_0(d\tau)$$
where $\mu_0(\cdot \,|\, \tau) = N(0, C_\tau)$ and $\pi_0$ is the hyperprior on $\tau$.

**Likelihood**

The likelihood is given by $\mathbb{P}(y \,|\, u, \tau) \propto \exp\big(-\Phi(u, \tau; y)\big)$, where the model-data misfit is now
$$\Phi(u, \tau; y) = \frac{1}{2} \left| \Gamma^{-1/2} \big( y - \mathcal{G}(F(u, \tau)) \big) \right|^2.$$

**Posterior**

Probabilistic information about $(u, \tau)$ after data is collected:
$$\mu^y(du, d\tau) \propto \exp\big(-\Phi(u, \tau; y)\big) \, \mu_0(du, d\tau).$$
## MCMC

We implement a Metropolis-within-Gibbs algorithm to generate a posterior-invariant Markov chain $(u_k, \tau_k)$:

- Propose $v_{k+1}$ from a $\mathbb{P}(u \,|\, \tau_k)$-reversible kernel. Set $u_{k+1} = v_{k+1}$ with probability $1 \wedge \exp\big(\Phi(u_k, \tau_k) - \Phi(v_{k+1}, \tau_k)\big)$; otherwise $u_{k+1} = u_k$.
- Propose $t_{k+1}$ from a $\mathbb{P}(\tau)$-reversible kernel. Set $\tau_{k+1} = t_{k+1}$ with probability $1 \wedge \exp\big(\Phi(u_{k+1}, \tau_k) - \Phi(u_{k+1}, t_{k+1})\big) \, w(\tau_k, t_{k+1})$; otherwise $\tau_{k+1} = \tau_k$.

Here $w(\tau, t)$ is the density of $N(0, C_t)$ with respect to $N(0, C_\tau)$, evaluated at the current state; it is well-defined since these measures are mutually equivalent.
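The two alternating moves above can be sketched as follows. This is a schematic skeleton, not the talk's implementation: the proposal kernels, the misfit and the density ratio `log_w` are passed in as callables, and all interface names are my own assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_within_gibbs(u0, tau0, phi, propose_u, propose_tau, log_w, n_steps):
    """Alternating u- and tau-moves targeting the hierarchical posterior.

    phi(u, tau): model-data misfit; propose_u(u, tau): a P(u|tau)-reversible
    proposal (e.g. pCN under N(0, C_tau)); propose_tau(tau): a reversible
    proposal for tau; log_w(tau, t, u): log density of N(0, C_t) with
    respect to N(0, C_tau), evaluated at u.
    """
    u, tau = u0, tau0
    chain = []
    for _ in range(n_steps):
        # u-move: accept with probability 1 ^ exp(phi(u, tau) - phi(v, tau)).
        v = propose_u(u, tau)
        if np.log(rng.uniform()) < phi(u, tau) - phi(v, tau):
            u = v
        # tau-move: acceptance also involves the prior density ratio w.
        t = propose_tau(tau)
        if np.log(rng.uniform()) < phi(u, tau) - phi(u, t) + log_w(tau, t, u):
            tau = t
        chain.append((u, tau))
    return chain
```

Because both acceptance probabilities are expressed through densities with respect to the prior, the scheme remains well-defined in the function-space limit, which is the source of the mesh-invariance discussed in the conclusions.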
## Example 2: Groundwater Flow

(Figure: (Row 1, $\tau = 15$) Logarithm of the true hydraulic conductivity (middle, top). (Row 2, $\tau = 10, 30, 50, 70, 90$) Samples of $F(u, \tau)$. (Row 3) $F(\mathbb{E}(u), \mathbb{E}(\tau))$. (Row 4) $\mathbb{E}(F(u, \tau))$.)
## Conclusions
## Summary

The last 5 years have seen the development of a theoretical and computational framework for infinite-dimensional Bayesian inversion with wide applicability:

1. The framework allows for noisy data and uncertain prior information.
2. Probabilistic well-posedness.
3. The theory clearly delineates (and links) analysis and probability.
4. The theory leads to new algorithms (defined on Banach space).
5. Grid-independent convergence rates for MCMC.

The methodology has been extended to solve interface problems. This is achieved via the level set representation.

1. The level set method becomes well-posed in this setting.
2. Discontinuity of the level set map is a probability zero event.
3. A hierarchical choice of length-scale improves performance.
4. The amplitude scale is linked to the length-scale via measure equivalence.
5. Algorithms which reflect this link are mesh-invariant.

There is potential for many future developments in both applications and theory:

1. Applications: subsurface imaging, medical imaging.
2. Theory: inhomogeneous length-scales, new hierarchies.