L1 Techniques for various Problems in:
Nonlinear PDEs, Numerics, Signal and Image Processing
Eitan Tadmor
University of Maryland
www.cscamm.umd.edu/~tadmor
Nonlinear Approximation Techniques Using L1
Texas A&M, May 2008
The Inviscid, 2D Nonlinear Transport: A BV Approach∗
∗ with Dongming Wei (Maryland)
2D nonlinear transport: from viscous to the inviscid

• The 1D setup — u^ε_t + u^ε u^ε_x = ε u^ε_xx (Burgers eq.):
  1. Compactness: ‖u^ε(·,t)‖_BV ≤ ‖u_0‖_BV ⟹ u^ε → u ("L1 hard");
  2. Conservation form: u^ε_t + (1/2 (u^ε)²)_x = ε u^ε_xx (obvious);
  3. Passing to the limit: u_t + (1/2 u²)_x = 0 ("diagonal" regularity);

• The 2D setup — u^ε_t + u^ε · ∇_x u^ε = ε Δ_x u^ε:
  1. We prove compactness: ‖u^ε(·,t)‖_BV ≲ ‖u_0‖_{C¹_0} ("L1 hard");
  2. Conservation form is not obvious: u_t + ∇_x·(u ⊗ u) ≠ ε Δu;
  3. The limit u^ε → u exists — but what is the dynamics of u(·,t):
     "u_t + u · ∇_x u = 0"?  (Zeldovich system)
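A minimal numerical check of item 1, not from the talk: an explicit centered scheme for the viscous Burgers equation with an illustrative viscosity ε = 0.05; the discrete total variation of u^ε(·,t) should come out no larger than that of u_0. Grid size, time step and initial data are assumptions of the sketch.

```python
import numpy as np

# Viscous Burgers: u_t + u u_x = eps * u_xx on a periodic grid.
L, nx, eps, T = 2 * np.pi, 400, 0.05, 1.0
dx = L / nx
x = np.linspace(0.0, L, nx, endpoint=False)
u = np.sin(x) + 0.5                      # smooth, periodic BV initial data u_0
dt = 0.4 * dx**2 / (2 * eps)             # explicit diffusive stability limit

def tv(v):                               # discrete BV seminorm, periodic grid
    return np.abs(np.diff(v, append=v[0])).sum()

tv0, t = tv(u), 0.0
while t < T:
    up, um = np.roll(u, -1), np.roll(u, 1)
    conv = (0.5 * up**2 - 0.5 * um**2) / (2 * dx)   # centered (u^2/2)_x
    diff = eps * (up - 2 * u + um) / dx**2          # centered eps * u_xx
    u = u + dt * (diff - conv)
    t += dt

print(f"TV(u_0) = {tv0:.4f},  TV(u(.,T)) = {tv(u):.4f}")  # TV does not grow
```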
BV bound by spectral dynamics

u_t + u · ∇_x u = ∇_x F,   F = F[u, ∂_{x_i}u, ∂²_{x_i x_j}u, …]

• λ = λ(M) — an eigenvalue of M = (∂u_i/∂x_j), w/eigenpair ⟨ℓ, r⟩ = 1:

  ∂_t λ_i + u · ∇_x λ_i + λ_i² = ⟨∇_x F ℓ_i, r_i⟩,   i = 1, 2, …, N

— Difficult interaction of eigenstructure–forcing ⟨∇_x F ℓ, r⟩

• Viscosity forcing: F[…] = Δu.
  If M = ∇u is symmetric with λ_1(x,t) < … < λ_N(x,t):

  ∂_t λ_1 + u · ∇_x λ_1 + λ_1² ≥ Δλ_1
  ∂_t λ_N + u · ∇_x λ_N + λ_N² ≤ Δλ_N

• Riccati equation ⟹ λ_1 < … < λ_N ≤ Const. ⟹ ∇_x u ≤ Const.
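For orientation, a standard comparison argument behind the Riccati bullet (added here; it is not a line from the slides). Dropping the favorable diffusion and forcing terms, along particle paths:

```latex
\frac{D\lambda}{Dt} + \lambda^{2} \le 0,
\qquad \frac{D}{Dt} := \partial_t + u\cdot\nabla_x,
\quad\Longrightarrow\quad
\lambda(t) \le \frac{\lambda(0)}{1 + t\,\lambda(0)} \le \frac{1}{t}.
```

The comparison with m' = −m² is valid while 1 + tλ(0) > 0; for λ(0) < 0 that threshold is the finite-time blow-down of shock formation, while the upper bound 1/t is the one-sided bound feeding the OSLC on the next slide.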
Uniform BV bounds on viscous transport eqs

• A bound on the spectral gap η := λ_N − λ_1:

  η_t + u · ∇_x η + (λ_1 + λ_N)η ≤ Δη

• N = 2:
  η_t + u · ∇_x η + (∇_x · u)η ≤ Δη
  η_t + ∇_x · (uη) ≤ Δη ⟹ ‖η(·,t)‖_{L¹} ≤ ‖η_0‖_{L¹}

  OSLC: λ_1 + λ_2 = ∇_x · u ≤ Const. (majorized by Riccati's λ²)

• Conclusions: λ_1 ± λ_2 ∈ L¹ ⟹ ‖u(·,t)‖_BV ≲ ‖u_0‖_{C¹}

#1. 2D symmetric M: u = ∇ϕ.
    Uniform BV bound on ∇ϕ of the eikonal eq.: ϕ_t + 1/2 |∇_x ϕ|² = Δϕ

#2. 2D nonsymmetric M: L¹ spectral dynamics of symm M.
    Uniform BV bound on u of the transport eq.: (∂_t + u · ∇_x)u = Δu
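The quantities on this slide are easy to inspect numerically. A sketch, with an assumed irrotational sample field u = ∇ϕ, ϕ = sin x cos y, and centered differences standing in for the exact derivatives:

```python
import numpy as np

n = 128
x = np.linspace(-np.pi, np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.sin(X) * np.cos(Y)
u1 = np.gradient(phi, x, axis=0)        # u = grad(phi): M = (du_i/dx_j) is the
u2 = np.gradient(phi, x, axis=1)        # (symmetric) Hessian of phi

M11 = np.gradient(u1, x, axis=0); M12 = np.gradient(u1, x, axis=1)
M21 = np.gradient(u2, x, axis=0); M22 = np.gradient(u2, x, axis=1)

# eigenvalues of the 2x2 field: lam = (tr +- sqrt(tr^2 - 4 det)) / 2
tr = M11 + M22
det = M11 * M22 - M12 * M21
disc = np.sqrt(np.maximum(tr**2 - 4 * det, 0.0))
lam1, lam2 = (tr - disc) / 2, (tr + disc) / 2

eta = lam2 - lam1                       # the spectral gap of the slide
dA = (x[1] - x[0]) ** 2
print("||eta||_L1      =", (eta * dA).sum())
print("max(lam1+lam2)  =", tr.max(), "  (OSLC: div u <= Const.)")
```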
Passage to the limit:   u^ε_t + u^ε · ∇_x u^ε = ε Δ_x u^ε

‖u^ε(·,t)‖_BV ≤ Const. ⟹ u^ε(x,t) → u(x,t) in L^∞([0,T], L¹(ℝ²))

• Irrotational case — u_0 = ∇_x ϕ_0: u = ∇_x ϕ ↦ ∂_t ϕ + 1/2 |∇_x ϕ|² = 0

• Non-conservative product (Volpert, LeFloch, …): ?

• (ρ, u) is a solution of the pressureless eqs: ?
  ∂_t ρ + ∇_x · (ρu) = 0,   ∂_t(ρu) + ∇_x · (ρu ⊗ u) = 0

• Dual solution: the viscous case
  φ_t + ∇_x · (u^ε φ) = −ε Δφ :   ⟨φ_T, u^ε(·,T)⟩ = ⟨φ^ε(·,0), u_0⟩

  The inviscid case: an L^∞ φ exists by the OSLC ∇_x · u ≤ Const.
  φ_t + ∇_x · (uφ) = 0,   φ_T(·) ↦_u φ(·,t), t ∈ [0,T];
  u is a 'dual solution' if for all φ_T(·): ⟨φ_T, u(·,T)⟩ = ⟨φ(·,0), u_0⟩
L1 Convergence Rate Estimates for Hamilton-Jacobi Eqs∗
∗ with Chi-Tien Lin
Convex HJ eqs: ϕ_t + H(∇_x ϕ) ≈ 0

HJ eqs with convex Hamiltonian: D²_p H(p) > 0.
Example: eikonal eq.: H(p) = 1/2 |p|²

Viscosity slns, ϕ_t + H(∇_x ϕ) = 0, are demi-concave: D²ϕ(·,t) ≤ k(t)

• APPROXIMATE SLNs: ϕ^ε_t + H(∇_x ϕ^ε) ≈ 0
• Stability assumption: D²ϕ^ε(x,t) ≤ k(t) ∈ L¹[0,T]

⟹ L¹ ERROR ESTIMATE (supp ϕ = Ω_0 ⊂ ℝ^N):

  ‖ϕ^ε(·,t) − ϕ(·,t)‖_{L¹} ≲_t ‖ϕ^ε_0 − ϕ_0‖_{L¹(Ω_0)} + ‖ϕ^ε_t + H(∇ϕ^ε)‖_{L¹([0,t]×Ω_t)}
                              (initial error)        (L¹ consistency)
Examples of approximate viscosity solutions

• Convex Hamiltonians — L¹-framework: ϕ^ε_t + H(∇ϕ^ε) = ε Δϕ^ε

  ‖ε Δϕ^ε‖_{L¹([0,T]×Ω_t)} ≲ ε ⟹ ‖ϕ(·,t) − ϕ^ε(·,t)‖_{L¹} ≲_t ‖ϕ_0 − ϕ^ε_0‖_{L¹} + ε

• General Hamiltonians — L^∞ framework:
  Crandall-Lions, Souganidis, Oberman, …

  ‖ϕ^ε(·,t) − ϕ(·,t)‖_{L^∞} ≲_t √ε

  L¹-rate + demi-concavity (interpolation): L^∞-rate of order ∼ √ε

• Godunov type schemes: ϕ^{Δx}(·, t^{n+1}) = P_{Δx} E(t^n + Δt, t^n) ϕ^{Δx}(·, t^n)

  ‖ϕ^{Δx}(·,t) − ϕ(·,t)‖_{L¹} ≲_t ‖ (I − P_{Δx}) ϕ^{Δx}(·,t) / Δt ‖_{L¹([0,t]×Ω_t)}

  1st- and 2nd-order demi-concave projections: ‖I − P_{Δx}‖_{L¹} ∼ |Δx|²
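A minimal sketch of the monotone end of this story (a first-order Lax-Friedrichs scheme, not the second-order limited central scheme of the next slide), for the 1D eikonal equation ϕ_t + 1/2 ϕ_x² = 0 with ϕ_0 = |x|, whose viscosity solution is known in closed form; domain, α and step sizes are illustrative choices:

```python
import numpy as np

n, T, alpha = 400, 0.5, 1.5            # alpha >= max|H'(phi_x)| = max|phi_x|
x = np.linspace(-1.0, 1.0, n, endpoint=False)
dx = x[1] - x[0]
phi = np.abs(x)                        # convex kink: opens into a rarefaction
dt = 0.4 * dx / alpha

t = 0.0
while t < T:
    pp = (np.roll(phi, -1) - phi) / dx               # one-sided gradients
    pm = (phi - np.roll(phi, 1)) / dx
    H_hat = 0.5 * (0.5 * (pp + pm)) ** 2 - 0.5 * alpha * (pp - pm)
    phi = phi - dt * H_hat                           # monotone LxF update
    t += dt

# exact viscosity solution: x^2/(2t) inside the rarefaction, |x| - t/2 outside
exact = np.where(np.abs(x) < t, x**2 / (2 * t), np.abs(x) - t / 2)
print("L1 error:", np.abs(phi - exact).sum() * dx)   # O(dx) for monotone schemes
```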
Beyond monotone schemes for HJ eqs
Central-scheme solution of eikonal eq.
with second-order limiters (Lin-T. and Kurganov-T.)
Edge Detection in piecewise smooth spectral data by Separation of Scales: A BV approach∗
∗ with S. Engelberg (Jerusalem) and J. Zou (Wachovia)
Gibbs phenomenon and concentration factors

S_N[f](x) = Σ_{k=−N}^{N} f̂(k) e^{ikx},   f̂(k) := (1/2π) ∫_{−π}^{π} f(y) e^{−iky} dy

• Piecewise smooth f's — Gibbs' phenomenon:

[Figure: a piecewise smooth f(x), and its partial sum S_N[f] oscillating near the jumps]

• First-order accuracy (global): f̂(k) ∼ [f](c) e^{−ikc}/(2πik) + O(1/|k|²)
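A sketch of S_N[f] for the square wave f = sign(x), whose coefficients are available in closed form; N and the grid are arbitrary choices:

```python
import numpy as np

N, n = 40, 2048
xg = np.linspace(-np.pi, np.pi, n, endpoint=False)
k = np.arange(-N, N + 1)

# f = sign(x): f_hat(k) = (1 - (-1)^k) / (i*pi*k) for k != 0, f_hat(0) = 0
fhat = np.zeros(k.size, dtype=complex)
nz = k != 0
fhat[nz] = (1 - (-1.0) ** k[nz]) / (1j * np.pi * k[nz])

SN = (fhat[None, :] * np.exp(1j * k[None, :] * xg[:, None])).sum(axis=1).real
print("max S_N[f] =", SN.max())   # ~1.18: the ~9% Gibbs overshoot of the jump
```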
Concentration factors

f̂(k) ↦ f̃(k) := (πi/c_σ) sgn(k) σ(|k|/N) f̂(k),   c_σ = ∫_0^1 (σ(θ)/θ) dθ

S_N f(x) ↦ S_N^σ f(x) := Σ_{k∈Ω} f̃(k) e^{ikx}

S_N^σ[f] = [f](x) + O(ε_N) ∼ { O(1),    x ∼ singsupp[f]
                              { O(ε_N),  f(·) smooth at x

Edge detection by separation of scales: ε_N^σ ∼ 1/(N × dist(x))^s
S_N^σ[f] = K_N^σ ∗ S_N f ≡ K_N^σ ∗ f = Σ (πi/c_σ) sgn(k) σ(|k|/N) f̂(k) e^{ikx} — how to choose σ(θ_k)?

K_N^σ(x) := −(1/c_σ) Σ_{|k|≤N} σ(|k|/N) sin kx

• 'Heroic' cancelation of oscillations — all σ's are admissible:

  K_N^σ(x) ∼ cos((N + 1/2)x)/(2π sin(x/2)) + 1/(Nx) + … ∼ δ'(Nx)

• Example: Linear (Fejer): σ(θ_k) = θ_k;
• Example: Exponential (Gelb-Tadmor): σ(θ_k) = θ_k · e^{c/(θ_k(θ_k − 1))}

[Figure: left: reconstructed f; center: Minmod detection mm{S_N^{σ=lin}, S_N^{σ=exp}}; right: enhanced detection]
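Continuing the square-wave sketch above with the Fejer factor σ(θ) = θ (so c_σ = 1): the concentration sum S_N^σ[f] concentrates at the jumps, recovering [f](0) = 2, and is small where f is smooth.

```python
import numpy as np

N, n = 64, 1024
xg = np.linspace(-np.pi, np.pi, n, endpoint=False)
k = np.arange(-N, N + 1)
fhat = np.zeros(k.size, dtype=complex)
nz = k != 0
fhat[nz] = (1 - (-1.0) ** k[nz]) / (1j * np.pi * k[nz])   # f = sign(x)

sigma = np.abs(k) / N                   # Fejer factor; c_sigma = int_0^1 1 = 1
ftilde = np.pi * 1j * np.sign(k) * sigma * fhat
S_sig = (ftilde[None, :] * np.exp(1j * k[None, :] * xg[:, None])).sum(1).real

print("at the jump x=0: ", S_sig[np.argmin(np.abs(xg))])             # -> [f](0) = 2
print("in smooth region:", S_sig[np.argmin(np.abs(xg + np.pi / 2))]) # -> ~0
```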
Noisy data: f̂(k) ∼ [f](c) e^{−ikc}/(2πik) + n̂(k)

S_N^σ[f](x) = (1/c_σ) Σ_{k=1}^{N} (σ(θ_k)/(N θ_k)) cos(k(x − c)) "⊕" Σ_{k=1}^{N} σ(θ_k) n̂(k) cos(kx)

• Concentration factors — σ(θ_k), θ_k := |k|/N:   ∫_0^1 (σ(θ)/θ) dθ = 1

Least squares:  min over σ(θ_k) with Σ_k σ(θ_k)/(N θ_k) = 1 of

  Σ_k (σ(θ_k)/(N θ_k))² + λ Σ_k σ²(θ_k) E(|n̂(k)|²)

• A new small scale: white noise with variance η := E(|n̂(k)|²):

  σ_η(θ) ∼ √(ηλ) N θ / (1 + ηλ (N θ)²),   c_σ ∼ { N,               η = 0 (noiseless)
                                                 { tan⁻¹(√(ηλ) N),  η > 0

  (i)  if √(ηλ) N ≲ 1: the pointwise error ε_N := log(N)/N
  (ii) if √(ηλ) N ≳ 1: the pointwise error ε_η := √(ηλ) log(1/√(ηλ)).
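The noise-adapted factor is a one-liner to evaluate; a sketch with assumed η, λ and N, checking the c_σ ∼ tan⁻¹(√(ηλ)N) normalization quoted above:

```python
import numpy as np

eta, lam, N = 1e-4, 1.0, 128
theta = np.arange(1, N + 1) / N           # theta_k = k/N
sig = np.sqrt(eta * lam) * N * theta / (1 + eta * lam * (N * theta) ** 2)

c_sigma = np.sum(sig / (N * theta))       # discrete int_0^1 sigma(theta)/theta
print("c_sigma =", c_sigma, " vs  atan(sqrt(eta*lam)*N) =",
      np.arctan(np.sqrt(eta * lam) * N))  # low modes damped, atan normalization
```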
Noisy data: f(x) = r(x) + j(x) + n(x)   (regular + jump + noise)

• (BV, L²)-minimization:

  min over σ(θ_k) with Σ_k σ(θ_k)/(N θ_k) = 1 of

  k_0 ‖S_N^σ(r)(x)‖_BV + ‖S_N^σ(j)(x)‖²_{L²} + λ ‖S_N^σ(n)(x)‖²_{L²}

Concentration factors:  σ_η(|k|/N) = (|k| − k_0)_+ / (c_σ · (1 + ηλk²)),   k_0 ≤ |k| ≤ N_0 < N

[Figure: signal plus noise and the resulting concentration factor, for η = 0.000020 (top) and η = 0.000045 (bottom)]
Incomplete data: f̂(k), k ∈ Ω ⊂ {−N, …, N}

incomplete S_Ω^σ f(x) = Σ_{k∈Ω} f̃(k) e^{ikx},   f̃(k) := (πi/c_σ) sgn(k) σ(|k|/N) f̂(k).

• Jump function: S_N^σ(x) ∼ J_f(x) := Σ_j [f](c_j) 1_{[x_{ν_{c_j}}, x_{ν_{c_j}+1}]}(x)

  J_f(x) = Σ_{k∈Ω} f̃(k) e^{ikx} (prescribed data) + Σ_{k∉Ω} J̃(k) e^{ikx} (missing data).

• {J̃(k) | k ∉ Ω} — sought as minimizers of ‖J(x)‖_BV:

  J_f(x) = argmin_{J̃(k)} ‖J‖_BV,   J_f(x) = S_Ω^σ f(x) (prescribed) + Σ_{k∉Ω} J̃(k) e^{ikx}.

• The rationale: the missing {J̃_f(k)}_{k∉Ω} complement the prescribed {f̃(k)}_{k∈Ω} as the approximate Fourier coefficients of the jump function J_f.
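One way to realize the BV minimization over the missing modes is a projections-style iteration (a sketch under that substitution, not the paper's exact minimization): alternately shrink the total variation of J in physical space and re-impose the prescribed coefficients on Ω. The FFT ordering and the TV step via skimage's Chambolle solver are assumptions.

```python
import numpy as np
from skimage.restoration import denoise_tv_chambolle

def complete_jump(coeffs, known, n_iter=100, w=0.02):
    """coeffs: full-length array in numpy FFT ordering, zeros at missing modes;
    known: boolean mask, True where the coefficient is prescribed (k in Omega)."""
    c = coeffs.copy()
    for _ in range(n_iter):
        J = np.fft.ifft(c).real
        J = denoise_tv_chambolle(J, weight=w)        # decrease ||J||_BV
        c = np.where(known, coeffs, np.fft.fft(J))   # keep prescribed data
    return np.fft.ifft(c).real
```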
Incomplete data: f̂(k), k ∈ Ω ⊂ {−N, …, N}

[Figure] Left: the original image; center: recovered by the back-projection method; right: edges recovered by the compressed-sensing edge detection method.

Here, 3000 Fourier coefficients are uniformly and randomly distributed over the 128² gridpoints. We construct the spectral data of the edges captured by the Sobel edge detection method.

REF: The Computation of Gibbs Phenomenon [Acta Numerica 2007]
Multiscale Decomposition of Images in (BV, Lp)∗
∗ with Prashant Athavale (Maryland), Suzanne Nezzar (Stockton) and Luminita Vese (UCLA)
Image processing: (X, Y)-multiscale decompositions

f ∈ L²(Ω) (grayscale),  f = (f_1, f_2, f_3) ∈ L²(Ω)³ (colored):  Ω ⊂ ℝ²

Noticeable features: edges, quantified in X = BV, …, ‖u‖_X := ∫_Ω φ(D_p u), …
Other features: blur, patterns, noise, texture, …: in Y = L², …, L¹, …

Images in intermediate spaces 'between' X = BV and Y = L², …

• T: Y ↪ Y is a blurring operator:

  Q_T(f, λ; X, Y) := inf_{u∈X(Ω)} { ‖u‖_X + λ ‖f − Tu‖²_Y }

Example: λ as a fixed threshold of the noisy part: ‖f − Tu‖_{L²} ≤ σ_λ

  f = Tu_λ + v_λ,   [u_λ, v_λ] = arg inf_{Tu+v=f} Q(f, λ; X, Y).

• u_λ extracts the edges;  v_λ captures textures
Multiscale decompositions — blurry and noisy images

• Distinction is scale dependent — 'texture' at a λ-scale consists of significant edges when viewed under a refined 2λ-scale:

  v_λ = Tu_{2λ} + v_{2λ},   [u_{2λ}, v_{2λ}] = arg inf_{Tu+v=v_λ} Q(v_λ, 2λ).

• A better 2-scale representation: f = Tu_λ + v_λ ≈ Tu_λ + Tu_{2λ} + v_{2λ}

This process can continue (a numerical sketch follows below):

  [u_{j+1}, v_{j+1}] = arg inf_{Tu_{j+1}+v_{j+1}=v_j} Q_T(v_j, λ_{j+1}),   λ_j := λ_0 2^j,  j = 0, 1, …

After k hierarchical steps:

  f = Tu_0 + v_0
    = Tu_0 + Tu_1 + v_1
    = …
    = Tu_0 + Tu_1 + · · · + Tu_k + v_k.
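The numerical sketch promised above: hierarchical (BV, L²) decomposition with T = Id (pure denoising) and a TV solver standing in for the Q_T minimizer. The mapping of skimage's weight parameter onto 1/(2λ), and the value of λ_0, are assumptions of the sketch.

```python
import numpy as np
from skimage import data, img_as_float
from skimage.restoration import denoise_tv_chambolle

f = img_as_float(data.camera())
lam0, k_steps = 0.05, 6
v, layers = f, []
for j in range(k_steps):
    lam = lam0 * 2 ** j                                  # dyadic scales lambda_j
    u = denoise_tv_chambolle(v, weight=1.0 / (2 * lam))  # ~ ROF at scale lambda_j
    layers.append(u)
    v = v - u                                            # residual: finer 'texture'
recon = sum(layers)
print("||f - (u_0+...+u_k)||_2 = ||v_k||_2 =", np.linalg.norm(f - recon))
```

Each u_j captures features invisible at the coarser scale λ_{j−1}; summing the layers reproduces f up to the last residual v_k.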
Hierarchical expansion and energy decomposition

• Iterated Tikhonov regularization — expansion of the denoised/deblurred image:

  u ≅ Σ_{j≥0}^{k} u_j ⟸ f = Tu_0 + Tu_1 + … + Tu_{k−1} + Tu_k + v_k

Example. f = χ_{B_R}:  v_j = (1/(λ_j R)) χ_{B_R} − (1/(λ_j R)) (|B_R|/|Ω∖B_R|) χ_{Ω∖B_R} ∼ 2^{−j} (universal)

• Energy decomposition:

  Σ_{j=0}^{∞} [ (1/λ_j) ‖u_j‖_X + ‖Tu_j‖²_{L²} ] = ‖f‖²_{L²},   f ∈ X

Three basic questions about multiscale decompositions

• Where to start: u ≅ Σ_{j≥j_0} u_j:   1/2 < 2^{j_0} λ_0 ‖T*f‖_* ≤ 1

• When to stop? u ≅ Σ_{j≥j_0}^{k} u_j

• Why dyadic scales — λ_0 < λ_1 < … < λ_k,  λ_k ∼ 2^k?
Examples of multiscale image decompositions

• Multiscale X = BV, Y = L²-decomposition

  Chambolle-Lions: [u_λ, v_λ] = arginf_u { ‖u‖_BV + λ ‖f − u‖²_{L²} }  (= Q(f, λ; BV, L²))

  Rudin-Osher-Fatemi: inf_u { ‖u‖_BV : ‖f − u‖²_{L²} = σ² }

• Chan-Esedoglu: X = BV, Y = L¹-decomposition

• We advocate (BV, L²₁): [u_λ, v_λ] = arginf_u { ‖u‖_BV + λ ‖f − u‖²_{L¹} }  (= Q(f, λ; BV, L²₁))

• Mumford-Shah: X = SBV, Y = L²-decomposition

• Osher-Vese: X = BV, Y = BV* …

• Multiplicative noise:  inf_{u∈BV₊(Ω)} { ‖u‖_{BV(Ω)} + λ ‖f/u − 1‖²_{L²(Ω)} }
(BV, L²)-decomposition of Barbara

u_{j+1} − (1/(2λ_{j+1})) div( ∇u_{j+1}/|∇u_{j+1}| ) = −(1/(2λ_j)) div( ∇u_j/|∇u_j| )

[Figure: the hierarchical (BV, L²) decomposition of the Barbara image]
(BV, L²)-decomposition of colored images

[Figure: f and the partial sums Σ_{i=0}^{k} u_{λ_i}, k = 2, …, 8]
Multiscale Q_T(BV, L²) deblurring + denoising

[Figure: f, u_RO, v_RO + 128, u_{λ_0}, the partial sums Σ_{i=0}^{k} u_{λ_i}, k = 1, …, 5, and v_{λ_5}]

1st row, from left to right: blurry-noisy version f, and the Rudin-Osher restoration u_RO, v_RO = f − u_RO + 128, rmse = 0.2028. RO parameters: λ = 0.5, h = 1, Δt = 0.025. The other rows, left to right: the hierarchical recovery of u from the same blurry-noisy initial image f using 6 steps. Parameters: λ_k = .01 · 2^{k+1}, Δt = 0.025, h = 1; rmse = 0.2011: avoids staircasing.
(BV, L¹) vs. (BV, L²₁)-decomposition

Σ_{j≤k} u_j = f + ( |u_k − f| / (2λ_k ‖u_k − f‖_{L¹}) ) div( ∇u_k/|∇u_k| ),   λ_k = 2λ_{k−1}.

[Figure: (BV, L²₁) decompositions at λ = 6.5536, 13.1072, 26.2144, 52.4288, and the original noisy image]
(BV, L¹) vs. (BV, L²₁)-decomposition enhanced

Σ_{j≤k} u_j = f + ( |u_k − f| g(|G ∗ ∇u_k|) / (2λ_k ‖u_k − f‖_{L¹}) ) div( ∇u_k/|∇u_k| ),   λ_k = 2λ_{k−1}.

[Figure: (BV, L²₁) decompositions with g(|G∗Du|), at λ = 6.5536, 13.1072, 26.2144, 52.4288, and the original noisy image]
(BV, L¹) vs. (BV, L²₁) in the normal direction

Σ_{j≤k} u_j = f + ( |u_k − f| g(|G ∗ ∇u_k|) |∇u_k| / (2λ_k ‖u_k − f‖_{L¹}) ) div( ∇u_k/|∇u_k| ),   λ_k = 2λ_{k−1}.

[Figure: (BV, L²₁) decompositions with g(|G∗Du|)|Du|, at λ = 6.5536, 13.1072, 26.2144, 52.4288, and the original noisy image]
Multiscale (SBV, L²) decomposition

• The regularized Mumford-Shah by Ambrosio and Tortorelli (AT):

  AT_ρ(f, λ) := inf_{u,v,w | u+v=f} { μ √( ∫_Ω (w² + ρ_ρ)|∇u|² dx ) + ρ ‖∇w‖²_{L²(Ω)} + (1/4ρ) ‖w − 1‖²_{L²(Ω)} + λ ‖v‖²_{L²(Ω)} },   ρ_ρ → 0, ρ ↓ 0

  [u_{j+1}, v_{j+1}] = arg inf_{u,v,w, u+v=v_j} AT_ρ(v_j, λ_j),   λ_0 < λ_1 < λ_2 < …

• Energy scaling:

  Σ_{j=0}^{∞} [ (1/λ_j) |u_j|²_{H¹_{w_j}(Ω)} + ‖u_j‖²_{L²} ] ≤ ‖f‖²_{L²}

Hierarchical decomposition: f ≅ Σ_j u_j,   ‖f − Σ_{j=0}^{k} u_j‖_{H^{−1}_{w_k}(Ω)} = μ/λ_{k+1}

• Discrete AT-weights, 1 − w_j — edge detectors:

  1 − w_j(x, y) ≈ { 0, in regions of smoothness
                   { 1, near edges
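A crude sketch of how an AT functional of this kind is typically minimized, by alternating gradient steps in u and in the edge indicator w. The squared form of the first term (in place of the slide's square root), the identification of ρ_ρ with a fixed small eps, and all step sizes are assumptions.

```python
import numpy as np

def grad(a):                               # forward differences, periodic
    return np.roll(a, -1, 0) - a, np.roll(a, -1, 1) - a

def div(px, py):                           # negative adjoint of grad
    return px - np.roll(px, 1, 0) + py - np.roll(py, 1, 1)

def at_minimize(f, mu=5.0, lam=0.25, rho=2e-4, eps=1e-3, iters=200, tau=1e-4):
    """Alternating descent on  mu*int (w^2+eps)|grad u|^2 + rho*|grad w|^2
       + (w-1)^2/(4 rho) + lam*(u-f)^2;  1 - w then highlights the edge set."""
    u, w = f.copy(), np.ones_like(f)
    for _ in range(iters):
        ux, uy = grad(u)
        gu2 = ux**2 + uy**2
        wx, wy = grad(w)
        dw = 2 * mu * w * gu2 - 2 * rho * div(wx, wy) + (w - 1) / (2 * rho)
        w = np.clip(w - tau * dw, 0.0, 1.0)
        cu = w**2 + eps                     # degenerate diffusion coefficient
        du = -2 * mu * div(cu * ux, cu * uy) + 2 * lam * (u - f)
        u = u - tau * du
    return u, w
```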
[Figure: f, u_{λ_0}, and the partial sums Σ_{i=0}^{k} u_{λ_i}, k = 1, …, 9]

Ambrosio-Tortorelli hierarchical decomposition in 10 steps. Parameters: λ_0 = .25, μ = 5, ρ = .0002, and λ_k = 2^k λ_0.
[Figure: f, w_{λ_0}, and the weighted sums Σ_{i=0}^{k} (λ_0/λ_i) w_{λ_i}, k = 1, …, 9]

The weighted sum of the w_i's using the AT decomposition in 10 steps.
[Figure: f, u_{λ_0}, and the partial sums Σ_{i=0}^{k} u_{λ_i}, k = 1, 3, 4, 6, 7, 9]

The AT hierarchical decomposition of a fingerprint.
Remove multiplicative noise

[Figure: f, u_{λ_0}, the partial products Π_{i=0}^{k} u_{λ_i}, k = 1, …, 9, and v_{λ_9} · 120]

The recovery of u given an initial image of a woman with multiplicative noise, for 10 steps. Parameters: λ_0 = .02, and λ_k = 2^k λ_0.
Evolution in inverse scale space

u_{j+1} − (1/(2λ_{j+1})) div( ∇u_{j+1}/|∇u_{j+1}| ) = −(1/(2λ_j)) div( ∇u_j/|∇u_j| )

• Recursive; in fact, telescoping…

  Σ_{j=1}^{k} u_j − f = (1/(2λ_k)) div_x( ∇_x u_k/|∇_x u_k| )

  ↦  ∫_{s=0}^{t} u(s) ds − f = (1/λ(t)) div_x( ∇_x u(t)/|∇_x u(t)| )

… compared w/ Perona-Malik:  u_t = (1/λ(t)) div_x( ∇_x u(t)/|∇_x u(t)| )

• The role of λ(t)? It dictates the size of texture: ‖U(·,t) − f‖_{BV*}!

• Restrict the diffusion in directions orthogonal to edges:

  ∫_{s=0}^{t} u(x,s) ds − f(x) = (1/λ(t)) g(G ∗ ∇_x u(x,t)) |∇_x u(x,t)| · div_x( ∇_x u(x,t)/|∇_x u(x,t)| )
[Figure: the evolving U and V parts of the inverse-scale-space flow at successive times, each shown against the original image]
THANK YOU