lec06_beyond_snakes

CS 4487/9587: Algorithms for Image Analysis
The University of Western Ontario

Beyond Snakes:
• Explicit vs. implicit representation of contours
• Level-sets
• Geodesic Active Contours
Beyond the Snakes
Examples of Representation Models for Contours and Surfaces

Parametric (snakes)
• Explicit (point-based)
• Spline snakes

Non-parametric (implicit models)
• Level-sets (mainly for Geometric Active Contours)
• Graph-based
The University of
Previous (polygonal) snakes
can be seen as
v1
Ontario
v3
v2
v0
N 1
C ( s )   vi  Bi ( s )
0  s  ( N  1)
i 0
N 1
BN 1 ( s)
B2 (s)
B0 ( s)
i 0
…
0
1
2
3
 B ( s)  1
N-2
s
i
N-1
• Contour as a linear combination of N basis functions
s
6-3
Other parametric representation models for contours:
B-spline snakes (see A. Blake and M. Isard, "Active Contours")

C(s) = Σ_{i=0}^{N-1} v_i·B_i(s),   0 ≤ s ≤ N−1

[figure: control points v_0 ... v_3 and B-spline basis functions B_0(s), B_2(s), ..., B_{N-1}(s) plotted over s]

• By convention, B-splines sum to 1 at all points s:  Σ_{i=0}^{N-1} B_i(s) = 1
• Each B_i(s) is typically a simple combination of polynomials
Other parametric representation models for contours:
B-spline snakes (see A. Blake and M. Isard, "Active Contours")

Explicit representation of contour:  C(s) = Σ_{i=0}^{N-1} v_i·B_i(s)

• The smooth (differentiable) curve passes close to the poly-line joining the knot (control) points v_0, v_1, v_2, v_3, ...
• B-spline snakes are still a parametric representation of continuous contours via a finite number of parameters {v_0, v_1, v_2, ..., v_{N-1}}
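To make the formula concrete, here is a minimal NumPy sketch (not from the slides; the hat basis, function names, and sample control points are illustrative assumptions) that evaluates C(s) = Σ_i v_i·B_i(s) with linear "hat" basis functions, the special case that reproduces the polygonal snake:

```python
import numpy as np

def hat_basis(s, i):
    """Linear B-spline ("hat") basis function B_i(s) centered at integer knot i."""
    d = np.abs(np.asarray(s, dtype=float) - i)
    return np.clip(1.0 - d, 0.0, None)

def contour(s, v):
    """Evaluate C(s) = sum_i v_i * B_i(s) for an (N, 2) array of control points v."""
    N = len(v)
    B = np.stack([hat_basis(s, i) for i in range(N)], axis=-1)   # shape (len(s), N)
    return B @ v                                                 # shape (len(s), 2)

v = np.array([[0.0, 0.0], [2.0, 0.5], [2.5, 2.0], [0.5, 2.5]])   # v_0 ... v_3
s = np.linspace(0.0, len(v) - 1, 50)
C = contour(s, v)   # 50 points tracing the poly-line through v_0 ... v_3
```

Swapping hat_basis for cubic B-spline basis functions gives the smooth B-spline snake with the same control points.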
Evolution of Explicit Boundaries   [a slide borrowed from Daniel Cremers]

C(s) = Σ_{j=0}^{N-1} v_j·B_j(s)        (control points v_j, basis functions B_j)

Given some force F(s) acting on the contour,

dC = Σ_j dv_j·B_j(s) = F(s)·dt

Projecting onto each basis function B_i turns this into a linear system for the control-point increments:

Σ_j dv_j·⟨B_i, B_j⟩ = ⟨B_i, F⟩·dt        (matrix B_ij = ⟨B_i, B_j⟩, vector b_i = ⟨B_i, F⟩)

dv = B⁻¹·b·dt        (control point evolution)
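A minimal sketch of this control-point update (the sampling-based inner products, function names, and signatures are illustrative assumptions, not the slide's implementation):

```python
import numpy as np

def evolve_control_points(v, basis, force, s_grid, dt):
    """One step of dv = B^{-1} b dt.
    v: (N, 2) control points; basis: list of callables B_i(s);
    force: callable F(s) returning a (len(s), 2) array;
    s_grid: sample points used to approximate the inner products <.,.>."""
    Bvals = np.stack([B(s_grid) for B in basis], axis=0)   # (N, len(s_grid))
    Gram = Bvals @ Bvals.T                                 # B_ij = <B_i, B_j>
    b = Bvals @ force(s_grid)                              # b_i  = <B_i, F>, shape (N, 2)
    dv = np.linalg.solve(Gram, b) * dt                     # dv = B^{-1} b dt
    return v + dv
```

The Gram matrix depends only on the basis, so in practice it can be factored once and reused at every step.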
Evolution of Explicit Boundaries   [slides borrowed from Daniel Cremers]

[figures: segmentation examples from Cremers, Tischhäuser, Weickert, Schnörr, "Diffusion Snakes", IJCV '02]
Limitations of Explicit Representations   [a slide borrowed from Daniel Cremers]

• Insufficient resolution / control point density: requires control-point regridding mechanisms
• Fixed topology: requires heuristic splitting mechanisms
Non-parametric implicit representation of contours via Level Sets

z = u(x, y),        C = {(x, y) : u(x, y) = 0}        (implicit contour representation)

• Let contour C be the zero-level set of some function u(x, y)
• The function u(x, y) is called a level-set function
Non-parametric implicit representation of contours via Level Sets

z = u(x, y),        C = {(x, y) : u(x, y) = 0}        (implicit contour representation)

• For a given u(x, y) one can get C by thresholding; the contour interior consists of the points where u(x, y) is negative
• How to get some level-set function u(x, y) for a given C?
  Signed distance map:  u(x, y) = ±dist((x, y), C)   (negative inside C, positive outside)
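A minimal sketch (assuming SciPy; the mask-based setup is an illustrative assumption) of building such a signed distance map when C is given as the boundary of a binary region:

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def signed_distance(mask):
    """mask: boolean array, True inside the contour C.
    Returns u with u < 0 inside, u > 0 outside, u ~ 0 along the boundary."""
    inside = distance_transform_edt(mask)     # distance of interior pixels to the outside
    outside = distance_transform_edt(~mask)   # distance of exterior pixels to the inside
    return np.where(mask, -inside, outside)
```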
Example of distance map   [visualization courtesy of O. Juan]

[figure]
Advantages of level-set representation of contours

• Level-set functions can represent contours corresponding to object boundaries with arbitrary topological properties (holes, isolated parts)
• No self-intersections
• The contour can change topology as it evolves
Some more complex examples   [visualizations courtesy of O. Juan]

[figures]
Level-sets: how does this work? (easier than you may think)

• The level-set function u(x, y) is normally discretized/stored over the image pixels:  u_p = u(x_p, y_p)
• The values u(p) can be interpreted as distances or heights of image pixels
• A contour may be approximated from u(x, y) with sub-pixel accuracy

[figure: a grid of pixel values u_p, e.g. -1.7, -0.8, -0.8, 0.2 / -0.6, -0.4, -0.5, 0.5 / -0.2, 0.6, 0.3, 0.7, with the zero-level contour C passing between pixels of opposite sign]
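A minimal sketch of such a sub-pixel contour extraction (assuming scikit-image is available; marching squares linearly interpolates between pixels of opposite sign):

```python
import numpy as np
from skimage import measure

# the grid of u_p values from the figure above
u = np.array([[-1.7, -0.8, -0.8, 0.2],
              [-0.6, -0.4, -0.5, 0.5],
              [-0.2,  0.6,  0.3, 0.7]])

# list of (row, col) polylines with sub-pixel coordinates along u = 0
contours = measure.find_contours(u, level=0.0)
```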
Contour evolution via Level Sets   [Dervieux, Thomasset '79, '81; Osher and Sethian '89]

Consider some motion or evolution of contour C, where each contour point moves along the contour normal N_p with speed α_p:

dC_p = α_p·N_p        (motion of contour points, ‖dC_p‖ = |α_p|)

Note: a contour's motion can be detected visually only in the direction orthogonal to the contour (i.e. along N). Tangential motion is geometrically irrelevant.

FACT: the contour evolution above can be implicitly replicated by updating the level-set function u(x, y) as follows:

du_p = −α_p·|∇u_p|        (change of the "pixel's height"; ∇u is the gradient of u(x, y))

The scaling by |∇u| is easily verified in one dimension:  du = −|∇u|·dC.
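A minimal sketch of this update on a pixel grid (central differences and the explicit step are simplifying assumptions; a careful implementation would use the upwind scheme of Osher and Sethian):

```python
import numpy as np

def level_set_step(u, alpha, dt):
    """Replicate the contour motion dC = alpha*N by updating u.
    u: level-set function on the pixel grid;
    alpha: speed (scalar or per-pixel array); dt: time step."""
    uy, ux = np.gradient(u)                  # central-difference partial derivatives
    grad_norm = np.sqrt(ux**2 + uy**2)
    return u - dt * alpha * grad_norm        # du_p = -alpha_p * |grad u_p| * dt
```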
Simple evolution   [visualization courtesy of O. Juan]

[figure]
Where could contour speed α come from in image segmentation?

dC = α·N        (this geometric contour-motion formula can be seen as a mathematical way of describing a generic form of region growing)

Empirical approach: choose α based on some image heuristics
• BOUNDARY BIAS: contour speed α could be large far from object boundaries and approach zero near high image gradients
• REGIONAL BIAS: contour speed α at pixel p could measure how much contour C likes (α > 0) or dislikes (α < 0) to have p in its interior

BAD IDEA: empirical choices of α are often equivalent to "region growing" or "thresholding", ... (examples?)
  - no energy formulation => no quality or convergence guarantees

BETTER APPROACH: formulate an energy functional and optimize it
  - generally a much safer route to robust and numerically stable algorithms
May contour motion correspond to some energy minimization?

Gradient descent w.r.t. an energy E (as in slide 5-42):

dC = −dt·∇E,        where  ∇E_p = ( dE/dx_p, dE/dy_p )

Visible motion:  dC = α·N_p

POTENTIAL PROBLEMS:
• The energy gradient ∇E may not be orthogonal to the contour, creating invisible, geometrically irrelevant motion ... (example?)
• The contour parameterization (e.g. the choice of control points) may affect the gradient ∇E and, therefore, the visible contour motion
What if the energy E evaluates only the appearance of contour C?

Gradient descent w.r.t. energy E:  dC = −dt·∇E        visible motion:  dC = α·N

• Energy E(C) is geometric if it evaluates only geometric properties of contour C that are visually observable and independent of the contour representation or parameterization choice
  – then gradient descent can generate only visible motion (Why?)
  – the implicit representation of contours via level-sets fits well (Why?)
MAIN MOTIVATION FOR USING GEOMETRIC ENERGIES IN IMAGE ANALYSIS:
the appearance/geometry of contour C defines the segmentation, and that is all that really matters in the context of image analysis; invisible parameters of contours are, in the end, irrelevant.
Bringing geometry into snakes: first proposed by Caselles et al. in 1993-95

Elastic Snake (Kass, Witkin, Terzopoulos '87), motivated by physics: the energy of an elastic band with an external force, for a uniform parameterization C: [0, 1] → R²,

E = ∫_0^1 α·|dv/ds|² ds − ∫_0^1 |∇I(v(s))|² ds

• A reparameterization s → f(s) changes the form of the energy and affects the contour's behavior, much like choosing different, irregularly spaced control points for a discrete snake

Geodesic Active Contours (Caselles, Kimmel, Sapiro '95), motivated by geometry: the Riemannian length of C, for the arc-length parameterization C: [0, L] → R²,

E = ∫_C g(C(s)) ds = ∫_0^L g_p ds

• if g ≡ 1, then E(C) is simply the Euclidean length of contour C
• if a function g ≥ 0 weights each interval ds, then E(C) is the "weighted" (Riemannian) length of C
Geodesic Active Contours   [Caselles, Kimmel, Sapiro '95]

Energy E(C) = ∫ g(|∇I_{C(s)}|) ds  =  "weighted length" of C

Geodesic: the shortest curve between two points A and B (w.r.t. E).

[figures: distance maps between points A and B under the two metrics]
• Euclidean metric (constant g(·) ≡ 1):  E(C) = Euclidean length of C
• Riemannian metric (space-varying g(|∇I|) ≥ 0):  E(C) = image-weighted length of C (very similar to live-wire)
Geodesic Active Contours   [Caselles et al. '93, Caselles et al. '95, Kichenassamy et al. '95]

[figure: image I → smoothed image gradients → "density" function g_p]

Ĩ = H_σ ∗ I  (smoothed image),        g_p = g(|∇Ĩ(p)|)

The function g(·) transforms image gradients into a density function g_p over image pixels. The scalar function g_p defines a non-Euclidean metric for measuring the "length" of any curve or path C in the image:

‖C‖_g = ∫_C g_p ds

NOTE: live-wire uses a similar approach by defining a discrete image "metric" via graph edge weights:  ‖C‖_w = Σ_{e∈C} w_e

A typical choice is a function decreasing from 1 to 0 as the gradient grows, e.g.

g(|∇I|) = 1 / (1 + |∇I|²)
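A minimal sketch of computing this density (assuming SciPy; the smoothing scale σ is an illustrative choice):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def edge_density(image, sigma=2.0):
    """g_p = 1 / (1 + |grad of smoothed I|^2):
    close to 1 in flat regions, close to 0 on strong edges."""
    smoothed = gaussian_filter(image.astype(float), sigma)   # I smoothed by a Gaussian
    gy, gx = np.gradient(smoothed)
    return 1.0 / (1.0 + gx**2 + gy**2)
```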
Geodesic Active Contours   [Caselles et al. '93, Caselles et al. '95, Kichenassamy et al. '95]

Ĩ = H_σ ∗ I,        g_p = g(|∇Ĩ(p)|),        ‖C‖_g = ∫_C g_p ds

For strong enough image edges and for an appropriate g we have E(C₂) << E(C₁): a contour C₂ lying on an edge accumulates near-zero densities, while a contour C₁ crossing flat regions accumulates densities near one,

∫_{C₂} (≈0) ds  <<  ∫_{C₁} (≈1) ds

[figure: plot of g(|∇I|), with g_p ≈ 1 at small gradients and g_p ≈ 0 at large gradients]

Other examples of g(·)?  Restrictions:  g(·) ≥ 0  and  g′(·) ≤ 0
Implementing Geodesic Active Contours   [Caselles et al. '93, Caselles et al. '95, Kichenassamy et al. '95]

Gradient descent for  E(C) = ‖C‖_g = ∫_C g_p ds:        dC_p = −dt·∇E_p

FACT ("closed" formula): for any g ≥ 0,

∇E_p = −( g_p·κ_p − ⟨∇g_p, N_p⟩ )·N_p

where κ_p = 1/r_p is the curvature of contour C at p, N_p is its unit normal, and ⟨·,·⟩ is the dot product.

Since dC_p ∼ ∇E_p ∼ N_p, there is NO tangential (geometrically invisible) motion.
Implementing Geodesic Active Contours   [Caselles et al. '93, Caselles et al. '95, Kichenassamy et al. '95]

Gradient descent:  dC = −dt·∇E = dt·( g·κ − ⟨∇g, N⟩ )·N

From dC to du via level-sets, using  N = ∇u/|∇u|  and  κ = 1/r = div( ∇u/|∇u| ):

du = dt·( g·|∇u|·div(∇u/|∇u|) + ⟨∇g, ∇u⟩ )

[figure: level sets of u(x, y) at values -2, -1, 0, +1, +2]
Implementing Geodesic Active Contours   [Caselles et al. '93, Caselles et al. '95, Kichenassamy et al. '95]

The same level-set update can be written as a single divergence:

du = dt·|∇u|·div( g·∇u/|∇u| )
Implementing Geodesic Active Contours   [Caselles et al. '93, Caselles et al. '95, Kichenassamy et al. '95]

If u(x, y) is a signed distance map, then |∇u| = 1 and the update simplifies to

du = dt·div( g·∇u )

For g ≡ 1 this is  div(∇u) = Δu,  the Laplacian of u.
Geodesic Active Contours via Level-sets   [example from Goldenberg, Kimmel, Rivlin, Rudzsky, IEEE TIP '01]

u_p += (dt)·( ∂/∂x( g·∂u/∂x ) + ∂/∂y( g·∂u/∂y ) )

[figure: segmentation example]
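A minimal sketch of this update (finite differences via np.gradient and the time step are illustrative assumptions; it presumes u stays close to a signed distance map, so periodic re-initialization would be needed in practice):

```python
import numpy as np

def gac_step(u, g, dt=0.1):
    """One explicit step of u += dt * div(g * grad u).
    u: level-set function (ideally a signed distance map, |grad u| ~ 1);
    g: edge-stopping density g_p over the pixel grid."""
    uy, ux = np.gradient(u)
    flux_y, flux_x = g * uy, g * ux                                  # g * grad u
    div = np.gradient(flux_x, axis=1) + np.gradient(flux_y, axis=0)  # divergence
    return u + dt * div
```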
Simple Example: Mean Curvature evolution

If there are no image gradients, then g ≡ 1 (and ∇g = 0), so

‖C‖_g = ∫_C g_p ds = ∫_C 1 ds = ‖C‖

i.e. the contour energy is simply its Euclidean length.

Gradient descent:  dC = −dt·∇E = dt·κ·N

Via level-sets (for a signed distance map u):

u_p += (dt)·( ∂²u/∂x² + ∂²u/∂y² ) = (dt)·Δu        (Laplacian of u)
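A minimal sketch of this evolution (assuming SciPy; the step size and iteration count are illustrative, and in practice u would be periodically re-initialized to keep |∇u| ≈ 1):

```python
from scipy.ndimage import laplace

def mean_curvature_steps(u, dt=0.2, n_steps=50):
    """Shrink the zero-level contour of a signed distance map u by
    gradient descent on its Euclidean length: u_p += dt * (u_xx + u_yy)."""
    for _ in range(n_steps):
        u = u + dt * laplace(u)
    return u
```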
Simple Example: Mean Curvature evolution

[figures: a contour evolving under gradient descent of its Euclidean length, i.e. u_p += dt·(∂²u/∂x² + ∂²u/∂y²)]
More examples of Mean Curvature Motion   [visualizations courtesy of O. Juan]

[figures]
Other geometric energy functionals besides length   [courtesy of Ron Kimmel]

Geometric measures commonly used in segmentation, with the corresponding gradient-descent evolution dC = α·N:

  weighted length:    E(C) = ∫_C g(·) ds              α ∼ g·κ − ⟨∇g, N⟩
  weighted area:      E(C) = ∫_{int(C)} f da          α ∼ f
  alignment (flux):   E(C) = ∫_C ⟨v, N⟩ ds            α ∼ div(v)
Optimizing Geometric Functionals E(C)

Differential approach (local optimization):
• gradient descent:  dC = −dt·∇E = α·N
• explicit contours, or
• implicit via level-sets:  du = −α·|∇u|

Integral approach (global optimization via combinatorial graph algorithms):
• DP (2D): remember live-wire
• Graph Cuts (N-D): starting next lecture
Continuous Global Optimization

TV approach for image segmentation:

E(u) = ∫_Ω ( f_ob·u + f_bg·(1 − u) + |∇u| ) dx₁ dx₂

• With u(x) ∈ {0, 1} this cannot be solved directly.
• E(u) is convex if we allow u(x) ∈ [0, 1], but the globally optimal u*(x) may then take values outside {0, 1}.
• Threshold Theorem (Chan, Esedoglu, Nikolova, SIAM 2006): the thresholded function

  û(x) = 0 if u*(x) < θ,   û(x) = 1 if u*(x) ≥ θ

  is also globally optimal (over binary u) for any θ ∈ (0, 1).
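A minimal sketch of how the theorem is used in practice (u_star is assumed to be the minimizer of the relaxed convex energy over u ∈ [0, 1], obtained by any convex solver):

```python
import numpy as np

def threshold_solution(u_star, theta=0.5):
    """By the Chan-Esedoglu-Nikolova theorem, thresholding the relaxed
    minimizer at any theta in (0, 1) yields a globally optimal binary segmentation."""
    return (u_star >= theta).astype(np.uint8)
```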