Sparsity in Bayesian Inversion
of Parametric Operator Equations
Christoph Schwab
joint with Claudia Schillings
Ackn.: A. Cohen, R. DeVore, A. Stuart
June 20, 2013
research supported by the European Research Council under FP7 Grant AdG247277
Outline
1. Bayesian Inversion of Parametric Operator Equations
2. Sparsity of the Parametric Forward Solution
3. Sparsity of the Posterior Density
4. Sparse Quadrature
5. Numerical Results: Parametric Parabolic Problem
6. Summary
Bayesian Inverse Problems (Stuart 2010)
Goal: Expected response in the QoI over all uncertain parameters u ∈ X, conditional on noisy data δ:

    δ = G(u) + η ,   G = O ◦ G

X (separable Banach) space of uncertainties
G : X → 𝒳 uncertainty-to-solution map
O : 𝒳 → R^K observation operator, O ∈ (𝒳′)^K
G : X → R^K uncertainty-to-observation map, G = O ◦ G
η ∈ R^K additive observational noise, η ∼ N(0, Γ), with noise (co)variance Γ > 0.
Linear Operator Equation with operator uncertainty
Given f ∈ 𝒴′ and u ∈ X, find q ∈ 𝒳 such that

    A(u; q) = f

with A ∈ L(𝒳, 𝒴′), 𝒳, 𝒴 reflexive Banach spaces, and induced bilinear form a(v, w) := _𝒴⟨w, Av⟩_{𝒴′} for all v ∈ 𝒳, w ∈ 𝒴;
q(u) = G(u) := A(u)^{−1} f ,  G(u) := O(G(u)).
Bayes LSQ potential ΦΓ : X × R^K → R

    ΦΓ(u; δ) := ½ (δ − G(u))ᵀ Γ⁻¹ (δ − G(u))
Bayesian Inverse Problems (Stuart 2010)
Bayes' Theorem:
Expected value of the QoI φ(u) with respect to the posterior measure µδ, conditional on given data δ:

    E^{µδ}[φ] = (1/ZΓ) ∫_{u∈X} exp(−ΦΓ(u; δ)) φ(u) dµ0(u)

Normalization constant ZΓ = E^{µ0}[exp(−ΦΓ(u; δ))].
µδ-expectation over all uncertainties u ∈ X
Standard approach: sampling (with respect to the unknown measure µδ!), e.g. MCMC; convergence rate ∼ #(PDE solves)^{−1/2}
Present work: reformulation of the µδ-expectation over all u ∈ X as an infinite-dimensional, deterministic quadrature problem; sparsity of the integrands; adaptive Smolyak quadrature; massively parallel implementation.
Bayesian Inverse Problems (Stuart 2010)
Parametrization of uncertainty u ∈ X (e.g. KL, MRA, ...):

    u = u(y) := ⟨u⟩ + Σ_{j∈J} yj ψj ∈ X ,   y = (yj)_{j∈J} ∈ U

y = (yj)_{j∈J} i.i.d. sequence of real-valued random variables, yj ∼ U(−1, 1)
bj := ‖ψj‖_X , (bj)_{j≥1} ∈ ℓ¹(J) ,  ⟨u⟩, ψj ∈ X
J finite (J = {1, 2, ..., J}) or countably infinite (J = N) index set

Bayesian prior probability measure on the uncertain parameters y:

    µ0 := ⊗_{j∈J} πj ,   non-Gaussian, e.g. uniform: πj = ½ λ¹ ,

on the measurable space (U, B) = ([−1, 1]^J, ⊗_{j∈J} B[−1, 1]).
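For concreteness, a minimal Python sketch of such an affine parametrization with uniform priors; the piecewise-constant modes ψj = αj χ_{Dj} and the decay αj = 1.8/j^ζ are borrowed from the numerical example later in the talk and are only one possible choice.

import numpy as np

def make_psi(J=64, zeta=2, amplitude=1.8):
    """Piecewise-constant modes psi_j = alpha_j * chi_{D_j} on D = (0, 1),
    with alpha_j = amplitude / j**zeta (the example used later in the talk)."""
    alphas = amplitude / np.arange(1, J + 1) ** zeta
    breaks = np.linspace(0.0, 1.0, J + 1)          # D_j = [(j-1)/J, j/J]
    return alphas, breaks

def evaluate_u(x, y, alphas, breaks, mean=1.0):
    """Affine-parametric coefficient u(x, y) = <u> + sum_j y_j * psi_j(x)."""
    j = np.clip(np.searchsorted(breaks, x, side="right") - 1, 0, len(alphas) - 1)
    return mean + y[j] * alphas[j]

# draw a parameter vector from the uniform prior mu_0 = prod_j U(-1, 1)
rng = np.random.default_rng(0)
alphas, breaks = make_psi()
y = rng.uniform(-1.0, 1.0, size=len(alphas))
x = np.linspace(0.0, 1.0, 5)
print(evaluate_u(x, y, alphas, breaks))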
(p, ε) Analyticity (Chkifa, Cohen, DeVore & CS 2012)
(p, ε):1 (uniform well-posedness)
For each y ∈ U, there exists a unique realization u(y) ∈ X and a unique solution q(y) ∈ 𝒳 of the forward problem. This solution satisfies the a-priori estimate

    ∀y ∈ U :  ‖q(y)‖_𝒳 ≤ C0(y) ,   where U ∋ y ↦ C0(y) ∈ L¹(U, µ0).
(p, ε):2 (holomorphy)
There exist 0 ≤ p ≤ 1 and b = (bj)_{j∈J} ∈ ℓ^p(J) such that for every 0 < ε ≤ 1 there exist 0 < Cε < ∞ and ρ = (ρj)_{j∈J}, ρj > 1, such that

    Σ_{j∈J} (ρj − 1) bj ≤ ε ,

and U ∋ y ↦ q(y) ∈ 𝒳 admits a holomorphic extension to Eρ := Π_{j∈J} E_{ρj} ⊂ C^J with

    ∀z ∈ Eρ :  ‖q(z)‖_𝒳 ≤ Cε .
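The summability constraint on ρ can be checked constructively. A small Python sketch of one admissible choice of ρ (the specific formula for ρj and the example sequence bj = 1.8/j^ζ are illustrative assumptions, not the choice made in the talk):

import numpy as np

def admissible_polyradii(b, eps, p):
    """One admissible choice: rho_j = 1 + eps * b_j**(p-1) / sum_k b_k**p,
    which gives sum_j (rho_j - 1) * b_j = eps exactly (other choices work too)."""
    b = np.asarray(b, dtype=float)
    rho = 1.0 + eps * b ** (p - 1.0) / np.sum(b ** p)
    assert np.isclose(np.sum((rho - 1.0) * b), eps)
    return rho

b = 1.8 / np.arange(1, 65) ** 2.0        # the (b_j) of the numerical example, zeta = 2
print(admissible_polyradii(b, eps=0.5, p=0.5)[:5])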
(p, ε)-Analyticity =⇒ Sparsity
Theorem (Sparsity) (Chkifa, Cohen, DeVore & CS)
Assume q(y) = G(u(y)) is (p, ε)-analytic. Then

    ∀y ∈ U :  q(y) = Σ_{ν∈F} q^P_ν Pν(y)

with unconditional convergence in L∞(U, µ0; 𝒳), where

    Pν(y) := ⊗_{j≥1} P_{νj}(yj)   (tensorized Legendre polynomials),

ν ∈ F, the set of finitely supported multi-indices over J, and Pk(1) = 1, ‖Pk‖_{L∞(−1,1)} = 1, k = 0, 1, ....

There exists a sequence of nested, monotone sets Λ^P_N ⊂ F with #(Λ^P_N) ≤ N such that

    sup_{y∈U} ‖ q(y) − Σ_{ν∈Λ^P_N} q^P_ν Pν(y) ‖_𝒳 ≤ C(p, q) N^{−(1/p−1)} .
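For illustration, the tensorized Legendre polynomials Pν with the normalization Pk(1) = 1 can be evaluated in a few lines of Python (a sketch; representing the finitely supported multi-index as a dictionary of its nonzero entries is an implementation choice, not the authors' code):

import numpy as np
from numpy.polynomial import legendre

def tensor_legendre(nu, y):
    """Evaluate P_nu(y) = prod_j P_{nu_j}(y_j) with the standard Legendre
    normalization P_k(1) = 1, for a finitely supported multi-index nu."""
    val = 1.0
    for j, k in nu.items():              # nu as dict {coordinate: order}, zeros omitted
        c = np.zeros(k + 1); c[k] = 1.0  # coefficient vector selecting P_k
        val *= legendre.legval(y[j], c)
    return val

y = np.array([0.3, -0.7, 0.1])
print(tensor_legendre({0: 2, 2: 1}, y))  # P_2(0.3) * P_1(0.1)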
Bayesian Inverse Problem
Bayes' Theorem (parametric; CS & A.M. Stuart 2010)
Assume that G(u)|_{u = ⟨u⟩ + Σ_{j∈J} yj ψj} : U → R^K is bounded and continuous.
Then µδ is absolutely continuous with respect to µ0:

    (dµδ/dµ0)(y) = (1/ZΓ) ΘΓ(y)

with posterior density ΘΓ given by

    ΘΓ(y) = exp(−ΦΓ(u; δ))|_{u = ⟨u⟩ + Σ_{j∈J} yj ψj} ,   y ∈ U ,

and normalization constant

    ZΓ := ∫_U ΘΓ(y) dµ0(y) .
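A minimal sketch of how the parametric posterior density ΘΓ can be evaluated pointwise for any uncertainty-to-observation map G; the toy linear G below is a placeholder for the PDE-based map of the talk.

import numpy as np

def potential(G, y, delta, Gamma):
    """Least-squares potential Phi_Gamma(u(y); delta)
    = (1/2) (delta - G(y))^T Gamma^{-1} (delta - G(y)),
    where G is the user-supplied parametric map y -> G(u(y)) in R^K."""
    r = delta - G(y)
    return 0.5 * float(r @ np.linalg.solve(Gamma, r))

def posterior_density(G, y, delta, Gamma):
    """Parametric posterior density Theta_Gamma(y) = exp(-Phi_Gamma(u(y); delta)),
    i.e. the density of mu^delta w.r.t. the prior mu_0 up to the constant Z_Gamma."""
    return np.exp(-potential(G, y, delta, Gamma))

# toy example: a linear surrogate observation map (placeholder for the PDE-based G)
G = lambda y: np.array([y[:4].sum()])
delta, Gamma = np.array([0.2]), np.eye(1)
print(posterior_density(G, np.zeros(8), delta, Gamma))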
Bayesian Inverse Problem
Expectation of the Quantity of Interest (QoI) φ : X → S

    E^{µδ}[φ(u)] = (1/ZΓ) ∫_U exp(−ΦΓ(u; δ)) φ(u)|_{u = ⟨u⟩ + Σ_{j∈J} yj ψj} dµ0(y) =: Z'Γ / ZΓ

with ZΓ := ∫_U exp(−½ (δ − G(u))ᵀ Γ⁻¹ (δ − G(u))) dµ0(y) ,  Γ > 0.
Reformulation of the forward problem with uncertain, distributed input
parameter u ∈ X as infinite-dimensional, parametric-deterministic
problem
Parametric posterior density ΘΓ (y) of µδ with respect to the
(non-Gaussian) prior µ0
Deterministic, adaptive quadrature for Z'Γ and ZΓ to compute the posterior expectation of the QoI, given data δ
⇒ Efficient algorithm to approximate expectations conditional on given
data with dimension-independent rates of convergence > 1/2
Sparsity of the Posterior Density
Theorem (Cl. Schillings and CS 2013)
Assume that the forward solution map U ∋ y ↦ q(y) is (p, ε)-analytic for
some 0 < p < 1 and ε > 0.
Then the Bayesian posterior density ΘΓ (y) is, as a function of the
parameter y, likewise (p, ε)-analytic and, therefore, p-sparse.
Examples:
affine-parametric, linear operator equations
semilinear elliptic PDEs (Hansen & CS; Math. Nachr. 2013)
parametric initial value ODEs
(Hansen & CS; Vietnam J. Math. 2013)
elliptic multiscale problems
(Hoang & CS; Analysis and Applications 2012)
N-term Approximation Rates
Theorem (Cl. Schillings and CS 2013)
Assume that the uncertainty-to-observation map G(u(y)) : U → R^K is (p, ε)-analytic for some 0 < p < 1.
Then there exists a nested sequence (Λ^Θ_N)_{N≥1} ⊂ F of monotone sets Λ^Θ_N ⊂ F such that #(Λ^Θ_N) ≤ N and, for all N,

    sup_{y∈U} | ΘΓ(y) − Σ_{ν∈Λ^Θ_N} θν Pν(y) | ≤ C(Γ, p) N^{−s} ,   s := 1/p − 1 .

Evaluation of E^{µδ}[φ] by an adaptive Smolyak quadrature algorithm with convergence rate N^{−(1/p−1)} (N = # PDE solves).
How to find Λ^P_N? Either a) direct construction from the sets Λ^G_N in adaptive sGFEM (CS & Stuart 2010), or b) adaptive Smolyak, or ...
Smolyak: Univariate Quadratures
Family of univariate quadratures (for the coordinate measures)

    Qk(g) = Σ_{i=0}^{nk} w^k_i g(z^k_i)

with g : [−1, 1] → S for a state space S
(Qk)_{k≥0} sequence of univariate quadrature formulas
(z^k_j)_{j=0}^{nk} ⊂ [−1, 1] with z^k_j ∈ [−1, 1] ∀j, k and z^k_0 = 0 ∀k : quadrature points
w^k_j , 0 ≤ j ≤ nk , ∀k ∈ N0 : quadrature weights

Assumption
(i)  (I − Qk)(vk) = 0  ∀vk ∈ Sk := Pk ⊗ S ,  Pk = span{y^j : j ∈ N0, j ≤ k} ,  where I(vk) = ∫_{[−1,1]} vk(y) λ¹(dy)
(ii) w^k_j > 0 ,  0 ≤ j ≤ nk , ∀k ∈ N0 .
Univariate quadrature difference operators

    ∆j := Qj − Q_{j−1} ,  j ≥ 0 ,   with Q_{−1} := 0 and z^0_0 = 0, w^0_0 = 1.
Univariate quadrature operator rewritten as a telescoping sum:

    Qk = Σ_{j=0}^{k} ∆j

with Z^k = {z^k_j : 0 ≤ j ≤ nk} ⊂ [−1, 1] the set of points associated with Qk.
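A short Python sketch of such a family of univariate rules and their differences ∆j; the Gauss-Legendre rule used here is only a stand-in satisfying the assumptions above (z = 0 is always a node), whereas the talk uses Clenshaw-Curtis, Leja and R-Leja sequences.

import numpy as np

def univariate_rule(k):
    """Stand-in rule Q_k for the uniform probability measure (1/2) dy on [-1, 1]:
    Gauss-Legendre with n_k = 2k + 1 points, so the origin is always a node."""
    z, w = np.polynomial.legendre.leggauss(2 * k + 1)
    return z, 0.5 * w

def apply_Q(k, g):
    """Q_k(g) = sum_i w^k_i g(z^k_i)."""
    z, w = univariate_rule(k)
    return sum(wi * g(zi) for zi, wi in zip(z, w))

def apply_Delta(j, g):
    """Difference operator Delta_j = Q_j - Q_{j-1}, with Q_{-1} := 0."""
    return apply_Q(j, g) - (apply_Q(j - 1, g) if j > 0 else 0.0)

# telescoping sum: Q_k = sum_{j=0}^{k} Delta_j
g = lambda y: np.exp(y)
print(apply_Q(3, g), sum(apply_Delta(j, g) for j in range(4)))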
Smolyak: Tensorization
Tensorized multivariate quadrature operators

    Qν = ⊗_{j≥1} Q_{νj} ,   ∆ν = ⊗_{j≥1} ∆_{νj}

with associated set of collocation points Zν = ×_{j≥1} Z^{νj} ⊂ U.
If ν = 0_F, then ∆ν g = Qν g = g(z_{0_F}) = g(0_F).
If 0_F ≠ ν ∈ F, then with ν̂ = (νj)_{j≠i},

    Qν g = Q_{νi}( t ↦ ⊗_{j≥1} Q_{ν̂j} g_t ) ,   i ∈ Iν ,
and
    ∆ν g = ∆_{νi}( t ↦ ⊗_{j≥1} ∆_{ν̂j} g_t ) ,   i ∈ Iν ,

where g_t is the function of the remaining coordinates ŷ defined by g_t(ŷ) = g(y), with y = (..., y_{i−1}, t, y_{i+1}, ...) for i > 1 and y = (t, y2, ...) for i = 1.
Sparse Smolyak Quadrature Operator
For any finite monotone set Λ ⊂ F, QΛ is defined by

    QΛ = Σ_{ν∈Λ} ∆ν = Σ_{ν∈Λ} ⊗_{j≥1} ∆_{νj}

with associated Smolyak grid ZΛ = ∪_{ν∈Λ} Zν.

Theorem (Cl. Schillings and CS 2013)
For any finite monotone index set ΛN ⊂ F, the sparse quadrature QΛN is exact for every polynomial g ∈ SΛN, i.e.

    QΛN(g) = I(g)   ∀g ∈ SΛN := PΛN ⊗ S ,

with PΛN = span{y^ν : ν ∈ ΛN}, i.e. SΛN = { Σ_{ν∈ΛN} sν y^ν : sν ∈ S }, and I(g) = ∫_U g(y) µ0(dy).
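A self-contained Python sketch of this construction; the Gauss-Legendre stand-in for the univariate rules and the small two-dimensional test are assumptions for illustration (the talk uses Clenshaw-Curtis, Leja and R-Leja sequences in up to 64 dimensions).

import itertools
import numpy as np

def univariate_rule(k):
    # stand-in univariate rule Q_k for the uniform measure (1/2) dy on [-1, 1]
    z, w = np.polynomial.legendre.leggauss(2 * k + 1)
    return z, 0.5 * w

def tensor_Q(nu, g):
    """Full tensor quadrature Q_nu(g) = (x)_j Q_{nu_j} applied to g on [-1,1]^len(nu)."""
    rules = [univariate_rule(k) for k in nu]
    total = 0.0
    for pairs in itertools.product(*[list(zip(z, w)) for z, w in rules]):
        y = np.array([p[0] for p in pairs])
        total += np.prod([p[1] for p in pairs]) * g(y)
    return total

def delta_Q(nu, g):
    """Smolyak detail Delta_nu = (x)_j (Q_{nu_j} - Q_{nu_j - 1}), expanded over supp(nu)."""
    support = [j for j, k in enumerate(nu) if k > 0]
    total = 0.0
    for signs in itertools.product([0, 1], repeat=len(support)):
        mu = list(nu)
        for j, s in zip(support, signs):
            mu[j] -= s
        total += (-1) ** sum(signs) * tensor_Q(tuple(mu), g)
    return total

def smolyak(Lambda, g):
    """Sparse quadrature Q_Lambda(g) = sum_{nu in Lambda} Delta_nu(g), Lambda monotone."""
    return sum(delta_Q(nu, g) for nu in Lambda)

# exactness on P_Lambda for the monotone set Lambda = {(0,0), (1,0), (0,1)}
Lambda = [(0, 0), (1, 0), (0, 1)]
g = lambda y: 1.0 + y[0] + y[1]
print(smolyak(Lambda, g))   # = 1.0, the exact integral of g against mu_0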
Convergence Rates for Adaptive Smolyak Integration
Corollary (Cl. Schillings and CS 2013)
Assume that the forward solution map U ∋ y ↦ q(y) is (p, ε)-analytic for some 0 < p < 1 and ε > 0.
Then there exists a sequence (ΛN)_{N≥1} of monotone index sets ΛN ⊂ F such that #ΛN ≤ N and

    |ZΓ − QΛN[ΘΓ]| ≤ C1(Γ, p) N^{−s} ,

with s = 1/p − 1, and, for ΨΓ(y) := ΘΓ(y) φ(u(y)),

    ‖Z'Γ − QΛN[ΨΓ]‖_S ≤ C2(Γ, p) N^{−s} ,

with Z'Γ = I[ΨΓ] = ∫_U ΨΓ(y) dµ0(y) and C1, C2 > 0 independent of N.

Remark: the SAME index set ΛN works for BOTH ZΓ and Z'Γ.
Sketch of proof
Relating the quadrature error to the Legendre gpc coefficients:

    |I[ΘΓ] − QΛ[ΘΓ]| ≤ 2 Σ_{ν∉Λ} γν |θ^P_ν|
and
    ‖I[ΨΓ] − QΛ[ΨΓ]‖_S ≤ 2 Σ_{ν∉Λ} γν ‖ψ^P_ν‖_S

for any monotone index set Λ ⊂ F, where γν := Π_{j∈J} (1 + νj)².

(γν |θ^P_ν|)_{ν∈F} ∈ ℓ^p(F) and (γν ‖ψ^P_ν‖_S)_{ν∈F} ∈ ℓ^p(F).
⇒ There exists a sequence (ΛN)_{N≥1} of monotone sets ΛN ⊂ F, #ΛN ≤ N, such that the Smolyak quadrature with respect to (ΛN)_{N≥1} converges with order 1/p − 1.
Adaptive Construction of {ΛN }N≥1
Successive identification of the N largest Smolyak contributions

    |∆ν(Θ)| = | ⊗_{j≥1} ∆_{νj}(Θ) | ,   ν ∈ F

A. Chkifa, A. Cohen and Ch. Schwab. High-dimensional adaptive sparse polynomial interpolation and applications to parametric PDEs, JFoCM 2013.

Set of reduced neighbors

    N(Λ) := { ν ∉ Λ : ν − ej ∈ Λ ∀j ∈ Iν and νj = 0 ∀j > j(Λ) + 1 }

with j(Λ) = max{j : νj > 0 for some ν ∈ Λ} and Iν = {j ∈ N : νj ≠ 0} ⊂ N.
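A sketch of the reduced-neighbor computation in Python for multi-indices stored as fixed-length tuples; the finite truncation of the dimension is an assumption for illustration, while the actual construction works with anisotropic, effectively infinite-dimensional index sets.

def reduced_neighbors(Lambda):
    """Reduced neighbors N(Lambda) as defined above: nu not in Lambda with
    nu - e_j in Lambda for every active coordinate j of nu, and nu_j = 0 for
    all j > j(Lambda) + 1 (0-based indexing here)."""
    Lambda = set(Lambda)
    dim = len(next(iter(Lambda)))
    active = [j for j in range(dim) if any(nu[j] > 0 for nu in Lambda)]
    j_max = max(active) if active else -1        # j(Lambda), or -1 if Lambda = {0}
    neighbors = set()
    for nu in Lambda:
        for j in range(min(j_max + 2, dim)):     # at most one new coordinate opens up
            cand = tuple(nu[i] + (1 if i == j else 0) for i in range(dim))
            if cand in Lambda:
                continue
            if all(tuple(cand[i] - (1 if i == k else 0) for i in range(dim)) in Lambda
                   for k in range(dim) if cand[k] > 0):
                neighbors.add(cand)
    return neighbors

print(sorted(reduced_neighbors({(0, 0, 0), (1, 0, 0)})))   # [(0, 1, 0), (2, 0, 0)]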
Adaptive Construction of {ΛN }N≥1
function ASG
  Set Λ1 = {0}, k = 1 and compute ∆0(Θ).
  Determine the set of reduced neighbors N(Λ1).
  Compute ∆ν(Θ) for all ν ∈ N(Λ1).
  while Σ_{ν∈N(Λk)} |∆ν(Θ)| > tol do
    Pick ν from N(Λk) with largest |∆ν| and set Λ_{k+1} := Λk ∪ {ν}.
    Determine the set of reduced neighbors N(Λ_{k+1}).
    Compute ∆ν(Θ) for all ν ∈ N(Λ_{k+1}).
    Set k = k + 1.
  end while
end function
T. Gerstner and M. Griebel. Dimension-adaptive tensor-product quadrature, Computing, 2003
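A compact Python sketch of this greedy loop, reusing the reduced_neighbors routine sketched above; delta_contribution(nu) stands in for the user-supplied evaluation of |∆ν(Θ)|, and the toy surrogate below is an illustration rather than the authors' implementation.

def adaptive_smolyak(delta_contribution, tol, dim, max_iter=200):
    """Greedy ASG loop: repeatedly add the reduced neighbor with the largest
    contribution |Delta_nu(Theta)|, until the summed neighbor contributions
    fall below the tolerance."""
    root = (0,) * dim
    Lambda = {root}
    cache = {root: delta_contribution(root)}
    for _ in range(max_iter):
        neighbors = reduced_neighbors(Lambda)
        for nu in neighbors:
            if nu not in cache:
                cache[nu] = delta_contribution(nu)
        if sum(cache[nu] for nu in neighbors) <= tol:
            break
        Lambda.add(max(neighbors, key=lambda nu: cache[nu]))
    return Lambda

toy = lambda nu: 0.25 ** sum(nu)    # surrogate for |Delta_nu(Theta)|, decays with |nu|
print(sorted(adaptive_smolyak(toy, tol=1e-3, dim=3)))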
Numerical Experiments
Model parametric parabolic problem

    ∂t q(t, x) − div(u(x) ∇q(t, x)) = 100 · t · x ,   (t, x) ∈ T × D ,
    q(0, x) = 0 ,   x ∈ D ,
    q(t, 0) = q(t, 1) = 0 ,   t ∈ T ,

with

    u(x, y) = ⟨u⟩ + Σ_{j=1}^{64} yj ψj ,   where ⟨u⟩ = 1 and ψj = αj χ_{Dj} ,

where Dj = [(j − 1)/64, j/64], y = (yj)_{j=1,...,64} and αj = 1.8 / j^ζ, ζ = 2, 3, 4.

Finite element method using continuous, piecewise linear ansatz functions in space, backward Euler scheme in time
Uniform mesh with meshwidth hT = hD = 2^{−10}
LAPACK's DPTSV routine
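A coarse, self-contained Python sketch of such a forward solver, assuming piecewise linear FEM with mass lumping, backward Euler in time, and a Thomas solver standing in for LAPACK's DPTSV; mesh and time step are much coarser here than the 2^{-10} used in the experiments.

import numpy as np

def solve_tridiag(lower, diag, upper, rhs):
    """Thomas algorithm for a tridiagonal system (stand-in for LAPACK's DPTSV)."""
    n = len(diag)
    c, d = np.empty(n), np.empty(n)
    c[0], d[0] = upper[0] / diag[0], rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - lower[i - 1] * c[i - 1]
        c[i] = upper[i] / m if i < n - 1 else 0.0
        d[i] = (rhs[i] - lower[i - 1] * d[i - 1]) / m
    x = np.empty(n)
    x[-1] = d[-1]
    for i in range(n - 2, -1, -1):
        x[i] = d[i] - c[i] * x[i + 1]
    return x

def forward_solve(u_cells, nt=64, T=1.0):
    """P1 finite elements in space, backward Euler in time, for
    dq/dt - d/dx( u(x) dq/dx ) = 100*t*x, q(0,.) = 0, q(t,0) = q(t,1) = 0;
    u_cells holds the diffusion coefficient on the mesh cells."""
    nc = len(u_cells)                          # number of cells; nodes 0..nc
    h, dt = 1.0 / nc, T / nt
    x = np.linspace(0.0, 1.0, nc + 1)[1:-1]    # interior nodes
    M = h * np.ones(nc - 1)                    # lumped mass matrix (diagonal)
    Kd = (u_cells[:-1] + u_cells[1:]) / h      # stiffness diagonal
    Ko = -u_cells[1:-1] / h                    # stiffness off-diagonals
    q = np.zeros(nc - 1)
    for n in range(1, nt + 1):
        t = n * dt
        rhs = M * q + dt * h * 100.0 * t * x   # lumped load for f = 100*t*x
        q = solve_tridiag(dt * Ko, M + dt * Kd, dt * Ko, rhs)
    return np.concatenate(([0.0], q, [0.0]))

# one forward solve at the prior mean u = <u> = 1 on a coarse mesh
print(forward_solve(np.ones(64)).max())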
Numerical Experiments
Find the expected system response, given (noisy) data δ = G(u) + η.

Expectation of interest Z'Γ / ZΓ:

    Z'Γ = ∫_U exp(−ΦΓ(u; δ)) φ(u)|_{u = ⟨u⟩ + Σ_{j=1}^{64} yj ψj} dµ0(y)

    ZΓ = ∫_U exp(−ΦΓ(u; δ))|_{u = ⟨u⟩ + Σ_{j=1}^{64} yj ψj} dµ0(y)

The observation operator O consists of system responses at K observation points in T × D, with
ti = i / 2^{N_{K,T}}, i = 1, ..., 2^{N_{K,T}} − 1, xj = j / 2^{N_{K,D}}, j = 1, ..., 2^{N_{K,D}} − 1, and ok(·, ·) = δ(· − tk) δ(· − xk);
K = 1: N_{K,D} = 1, N_{K,T} = 1;  K = 3: N_{K,D} = 2, N_{K,T} = 1;  K = 9: N_{K,D} = 2, N_{K,T} = 2.
G : L∞(D) → R^K with K = 1, 3, 9, and φ(u) = G(u).
η = (ηj)_{j=1,...,K} i.i.d. with ηj ∼ N(0, 1), ηj ∼ N(0, 0.5²) or ηj ∼ N(0, 0.1²).
Posterior Expectation of QoI Z'Γ

[Figure: Error curves for the posterior expectation of the QoI, Z'Γ, versus the cardinality of the index set ΛN, for the Clenshaw-Curtis (CC), Leja (L) and R-Leja (RL) sequences, with K = 1, 3, 9 observation points, ηj ∼ N(0, 1), and ζ = 2 (left), ζ = 3 (middle), ζ = 4 (right).]
[Figure: Same comparison versus #ΛN with ηj ∼ N(0, 0.5²), ζ = 2 (left), ζ = 3 (middle), ζ = 4 (right).]
[Figure: Same comparison versus #ΛN with ηj ∼ N(0, 0.1²), ζ = 2 (left), ζ = 3 (middle), ζ = 4 (right).]
[Figure: Error curves for the posterior expectation of the QoI, Z'Γ, versus the number of PDE solves, for the sequences CC, L and RL, with K = 1, 3, 9, ηj ∼ N(0, 1), and ζ = 2 (left), ζ = 3 (middle), ζ = 4 (right).]
[Figure: Same comparison versus the number of PDE solves with ηj ∼ N(0, 0.5²).]
[Figure: Same comparison versus the number of PDE solves with ηj ∼ N(0, 0.1²).]
Posterior Expectation of QoI Z'

[Figure: L∞ error curves for the QoI Z' versus the cardinality of the index set ΛN, for the sequences CC, L and RL, with K = 1, 3, 9, ηj ∼ N(0, 1), and ζ = 2 (left), ζ = 3 (middle), ζ = 4 (right).]
[Figure: Same comparison with ηj ∼ N(0, 0.5²).]
[Figure: Same comparison with ηj ∼ N(0, 0.1²).]
Conclusions and Outlook
deterministic, gpc-based, data-adaptive quadrature for Bayesian inversion and estimation problems for parametric operator equations with distributed uncertainty u
Dimension-independent convergence bound CΓ N^{−s}:
  Γ-independent rate s = 1/p − 1 > 1/2, N = # PDE solves
  Γ-dependent constant CΓ ∼ C exp(−b/Γ), b, C > 0 independent of Γ
Γ ↓ 0? Asymptotic expansion

    E^{µδ}[φ(u)] = Z'Γ / ZΓ ∼ Σ_{k≥0} ak Γ^k   as Γ ↓ 0

⟹ Richardson extrapolation with respect to Γ ↓ 0 (CS & Schillings '13)
Adaptive control of the discretization error of the forward problem
with respect to significance in Smolyak detail ∆ν [ΨΓ ]
No MCMC burn-in
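Returning to the Γ ↓ 0 expansion above, a toy Python sketch of the extrapolation idea with respect to the noise level; the synthetic data and the simple least-squares fit of a truncated expansion are assumptions for illustration, not the extrapolation scheme of Schwab & Schillings '13.

import numpy as np

def richardson_extrapolate(gammas, values, order=2):
    """Given evaluations E(Gamma) = Z'_Gamma / Z_Gamma at decreasing noise levels,
    fit the truncated expansion sum_{k < order} a_k Gamma^k and return a_0,
    the extrapolated value at Gamma -> 0."""
    V = np.vander(np.asarray(gammas, dtype=float), order, increasing=True)
    coeffs, *_ = np.linalg.lstsq(V, np.asarray(values, dtype=float), rcond=None)
    return coeffs[0]

# toy data following E(Gamma) ~ a0 + a1*Gamma + a2*Gamma^2
gammas = [0.4, 0.2, 0.1, 0.05]
values = [1.0 + 0.3 * g + 0.05 * g**2 for g in gammas]
print(richardson_extrapolate(gammas, values, order=3))   # close to 1.0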
References
Calvi J-P and Phung Van M 2011 On the Lebesgue constant of Leja sequences for the unit disk and its applications to
multivariate interpolation Journal of Approximation Theory 163 608-622
Calvi J-P and Phung Van M 2012 Lagrange interpolation at real projections of Leja sequences for the unit disk
Proceedings of the American Mathematical Society 140 4271–4284
Chkifa A 2013 On the Lebesgue constant of Leja sequences for the unit disk Journal of Approximation Theory 166 176-200
Chkifa A, Cohen A and Schwab C 2013 High-dimensional adaptive sparse polynomial interpolation and applications to
parametric PDEs JFoCM 2013
Cohen A, DeVore R and Schwab C 2011 Analytic regularity and polynomial approximation of parametric and stochastic
elliptic PDEs Analysis and Applications 9 1-37
Gerstner T and Griebel M 2003 Dimension-adaptive tensor-product quadrature Computing 71 65-87
Hansen M and Schwab C 2013 Sparse adaptive approximation of high dimensional parametric initial value problems
Vietnam J. Math. 1 1-35
Hansen M, Schillings C and Schwab C 2012 Sparse approximation algorithms for high dimensional parametric initial value problems Proceedings 5th Conf. High Performance Computing, Hanoi (Springer LNCSE 2013)
Schillings C and Schwab C 2013 Sparse, adaptive Smolyak algorithms for Bayesian inverse problems (Inverse Problems)
Schwab C and Stuart A M 2012 Sparse deterministic approximation of Bayesian inverse problems Inverse Problems 28
045003
Thank you for your attention & comments
Postdoctoral Fellowships
www.sam.math.ethz.ch