Interactive Segmentation with Super-Labels
Andrew Delong, Lena Gorelick, Frank Schmidt, Olga Veksler, Yuri Boykov
University of Western Ontario
Natural Images: GMM or MRF?
• Are the pixels in this image i.i.d.? No!
Boykov-Jolly / Grab-Cut
[Boykov & Jolly, ICCV 2001] [Rother, Kolmogorov, Blake, SIGGRAPH 2004]
A Spectrum of Complexity
• Objects within an image can be as complex as the image itself
• Where do we draw the line? A Gaussian? A GMM? An MRF? Full object recognition?
Single Model Per Class Label
Multiple Models Per Class Label
Our Energy ≈ Supervised Zhu & Yuille!
• Zhu & Yuille, PAMI'96; Tu & Zhu, PAMI'02
• Unsupervised MDL clustering of pixels:
  color similarity + boundary length + MDL regularizer
Interactive Segmentation Example
Boykov-Jolly / Grab-Cut: segmentation + colour models
Ours: segmentation + "sub-labeling" + colour models
Main Idea
• Standard MRF: an image-level MRF with one object GMM and one background GMM
• Two-level MRF: an image-level MRF whose object and background regions are themselves MRFs over several GMMs
• Unknown number of labels in each group!
The “Super-Pixel” View
• Complex object ≈ group of super-pixels
• Interactive segmentation ≈ a “user-constrained super-pixel grouping”
The “Super-Pixel” View
• Why not just pre-compute super-pixels?
  – boundaries may contradict user constraints
  – the user is helpful for making fine distinctions
• Combine the automatic (unsupervised) and interactive (supervised) criteria into a single energy:
  spatially coherent clustering (like Zabih & Kolmogorov, CVPR 2004)
  + MDL/complexity penalty (Label Costs, CVPR 2010)
  + user constraints (like Boykov & Jolly, ICCV 2001)
  = 2-level MRF
Process Overview
1. Propose models from the current super-labeling and user constraints
2. Solve the 2-level MRF via α-expansion
3. Refine all sub-models; repeat until converged (the energy decreases, e.g. E = 503005 → E = 452288)
≈ Boykov-Jolly + unsupervised clustering (random sampling) + iterated multi-label graph cuts (like Grab-Cut)
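The three-step loop above can be sketched as follows. This is a toy stand-in, not the authors' code: scalar means replace the GMM/plane sub-models, and a greedy per-pixel assignment replaces α-expansion.

```python
import random

def segment(pixels, K=2, max_iters=20):
    """Toy version of the loop: (1) propose models by random sampling,
    (2) label each pixel with its cheapest model (greedy stand-in for
    alpha-expansion), (3) re-fit every sub-model to its assigned pixels;
    stop once the energy no longer decreases."""
    random.seed(0)                                    # deterministic proposal
    models = random.sample(pixels, K)                 # 1. propose by random sampling
    prev_E = float('inf')
    f = [0] * len(pixels)
    for _ in range(max_iters):
        # 2. "solve": assign each pixel to its best model
        f = [min(range(K), key=lambda l: (p - models[l]) ** 2) for p in pixels]
        E = sum((p - models[fp]) ** 2 for p, fp in zip(pixels, f))
        # 3. refine all sub-models (here: recompute each mean)
        for l in range(K):
            assigned = [p for p, fp in zip(pixels, f) if fp == l]
            if assigned:
                models[l] = sum(assigned) / len(assigned)
        if E >= prev_E:                               # converged
            break
        prev_E = E
    return f, models
```

On two well-separated intensity clusters this converges in a few iterations; the real method differs in every component but shares this propose/solve/refit structure.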
Our Problem Statement
• Input: a set S of super-labels (e.g. {fg, bg})
  and user constraints g : P → S ∪ {any}
  [figure: user strokes mark pixels fg or bg; unmarked pixels are “any”]
Our Problem Statement
• Output: a set L of sub-labels,
  a sub-labeling f : P → L,
  model parameters θ_ℓ for each ℓ ∈ L,
  and a label grouping π : L → S
  [figure: sub-labels ℓ1, ℓ2, ℓ3 with GMMs for white / dark green / light green; the composition π ∘ f gives the super-labeling]
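Concretely, the output above fits in plain containers. The sub-label names, pixels, and placeholder parameters below are illustrative, not from the paper:

```python
# Output of the method: sub-labels L, model parameters theta per
# sub-label, a pixelwise sub-labeling f, and a grouping pi: L -> S.
L = ["white", "dark_green", "light_green"]                      # sub-labels
theta = {ell: "GMM params for " + ell for ell in L}             # placeholder params
pi = {"white": "bg", "dark_green": "fg", "light_green": "fg"}   # grouping pi: L -> S
f = {(0, 0): "white", (3, 7): "dark_green", (3, 8): "light_green"}  # f: P -> L

# The user-facing segmentation is the composition pi o f:
super_labeling = {p: pi[f[p]] for p in f}
```

Both green sub-labels collapse to "fg" in the super-labeling, which is exactly the point: the user sees fg/bg while the energy works over the finer sub-labels.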
Our Energy Functional
• Minimize a single energy w.r.t. L, θ, π, f:

E(\mathcal{L}, \theta, \pi, f) \;=\; \underbrace{\sum_{p \in P} D_p(f_p)}_{\text{data costs}} \;+\; \underbrace{\sum_{pq \in N} w_{pq}\, V(f_p, f_q)}_{\text{smooth costs}} \;+\; \underbrace{\sum_{\ell \in \mathcal{L}} h_\ell\, \delta_\ell(f)}_{\text{label costs}}

D_p(\ell) \;=\; \begin{cases} -\ln \Pr(I_p \mid \theta_\ell) & \text{if } g_p = \text{any} \,\vee\, g_p = \pi(\ell) \\ \infty & \text{otherwise} \end{cases}
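A direct transcription of the energy into code may help. This is a sketch only; the pixel set, neighbor system, and likelihood function passed in are hypothetical stand-ins:

```python
import math

def data_cost(p, ell, g, pi, neg_log_lik):
    """D_p(ell): negative log-likelihood under model theta_ell when the
    user constraint allows it (g_p = any, or g_p matches ell's
    super-label pi(ell)); infinite otherwise."""
    if g[p] == "any" or g[p] == pi[ell]:
        return neg_log_lik(p, ell)
    return math.inf

def energy(P, N, f, D, w, V, h):
    """E = sum_p D_p(f_p) + sum_{pq in N} w_pq * V(f_p, f_q)
         + sum_{ell} h_ell * delta_ell(f)."""
    data = sum(D(p, f[p]) for p in P)
    smooth = sum(w[(p, q)] * V(f[p], f[q]) for (p, q) in N)
    used = {f[p] for p in P}            # delta_ell(f) = 1 iff ell is used by f
    label = sum(h[ell] for ell in used)
    return data + smooth + label
```

The label-cost term only charges h_ℓ for sub-labels that actually appear in f, which is what makes the MDL pressure toward fewer models work.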
Our Energy Functional
• Minimize a single energy w.r.t. L, θ, π, f (data costs + smooth costs + label costs, as above)
• Smooth costs: V(\cdot,\cdot) \in \{0, c_1, c_2\} — pay c_1 within a super-label group, c_2 between groups
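A minimal sketch of this two-tier Potts potential; the constants and the grouping used below are illustrative values, not from the paper:

```python
def make_V(pi, c1, c2):
    """V(l1, l2): 0 for identical sub-labels, c1 for different sub-labels
    inside the same super-label group, c2 across groups (typically
    c1 < c2, so boundaries between super-labels cost more)."""
    def V(l1, l2):
        if l1 == l2:
            return 0.0
        return c1 if pi[l1] == pi[l2] else c2
    return V
```

Choosing c1 < c2 lets sub-model boundaries inside one object stay cheap while the fg/bg boundary remains strongly regularized.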
Our Energy Functional
• Minimize a single energy w.r.t. L, θ, π, f (data costs + smooth costs + label costs, as above)
• Label costs penalize the number of GMMs used:
  – prefer fewer, simpler models
  – MDL / information criterion
  – regularizes the “unsupervised” aspect
More Examples
• Boykov-Jolly vs. 2-level MRF (grad students, baby panda; the GMM density for the blue model is shown)
Interactive Co-segmentation (like “iCoseg”, Batra et al., CVPR 2010)
• Boykov-Jolly vs. 2-level MRF on an image collection
Beyond GMMs
• GMMs only vs. GMMs + planes
Synthetic Example
• Object = two planes in (x, y, grey) space
• Noise = one bi-modal GMM (black, white)
• Compared: Boykov-Jolly (1 GMM per label), 2-level MRF (GMMs only), 2-level MRF (GMMs + planes)
Synthetic Example
• [plot in (x, y): with GMMs only, just 1 GMM is detected (white); with GMMs + planes, the 2 planes (black, white) are detected]
As Semi-Supervised Learning
• Interactive segmentation ≈ semi-supervised learning
  – Duchenne, Audibert, Keriven, Ponce, Segonne. Segmentation by Transduction. CVPR 2008.
  – s-t min cut [Blum & Chawla, ICML'01]
  – random walker [Szummer & Jaakkola, NIPS'01]
Conclusions
• A single GMM is not good enough for a whole image ⇒ a single GMM is not good enough for complex objects either
• Energy based on a 2-level MRF: data costs + smooth costs + label costs
• Algorithm: iterative random sampling, re-fitting, and α-expansion
• Semi-supervised learning of complex subspaces with α-expansion
Download