Belief Propagation in
Large, Highly Connected
Graphs for 3D Part-Based
Object Recognition
Frank DiMaio and Jude Shavlik
Computer Sciences Department
University of Wisconsin – Madison
USA
Part-Based Object
Recognition

A part-based model describes an
object using a pairwise Markov random field
(Felzenszwalb et al. 2000,
Sudderth et al. 2004, Isard 2003)

Object described using

Undirected part graph G = (V, E)

Vertex potential functions φ_s(u_s | I)

Edge potential functions ψ_st(u_s, u_t)
Part-Based Object
Recognition

Probability of a configuration U = {u_i}
– given an image I – is the product
of potential functions

 
P(U | I) ∝ ∏_{vertices s} φ_s(u_s | I) · ∏_{edges (s,t)} ψ_st(u_s, u_t)

For part-based object recognition:

Skeletal graph links tightly coupled parts

Occupancy graph ensures no two parts
collide in 3D space
Inferring Part Locations with
Belief Propagation (BP)

Want to find part configuration maximizing
product of potential functions

Use belief propagation (BP) to approximate
marginal distributions

Iterative, message-passing method (Pearl 1988)

A message, m_i→j, from part i to part j
indicates where part i currently expects to find part j
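For a discrete set of K candidate part locations, one sum-product message update can be sketched as follows (a minimal numpy sketch; the array shapes and normalization choice are assumptions, not the authors' implementation):

```python
import numpy as np

def bp_message(phi_i, psi_ij, incoming):
    """Sum-product message m_{i->j} over K discrete candidate locations.

    phi_i:    (K,) vertex potential phi_i(u_i | I)
    psi_ij:   (K, K) edge potential, psi_ij[u_i, u_j]
    incoming: list of (K,) messages m_{k->i} from i's neighbors k != j
    """
    pre = phi_i.copy()
    for m in incoming:           # fold in evidence from i's other neighbors
        pre = pre * m
    msg = pre @ psi_ij           # marginalize out u_i
    return msg / msg.sum()       # messages are defined up to scale
```

Each BP iteration recomputes such a message for every directed edge; beliefs are the (normalized) product of a node's vertex potential and all its incoming messages.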
Belief Propagation Example
[Figure: initial beliefs b(part | image) for the head, left/right arms, torso, and left/right legs of a human model]
Belief Propagation Example
[Figure: messages m_head→torso, m_L.arm→torso, m_R.arm→torso, m_L.leg→torso, and m_R.leg→torso combining into b(torso | image)]
Belief Propagation Example
[Figure: updated beliefs b(part | image) for all six parts after message passing]
What if the Graph has
Thousands of Parts?

In a graph with N parts and E edges



BP running time and memory requirements: O(E)
Skeleton graph typically sparse: O(N) edges
Occupancy graph fully connected: O(N^2) edges

In very large graphs, O(N^2) runtime is intractable

AggBP (our system) approximates the O(N^2)
occupancy messages using O(N) messages
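The saving can be sketched as follows (a hypothetical Python sketch, not the authors' code): rather than computing, for each of node i's N neighbors, a separate product over the other N−1 incoming messages, accumulate the full belief b_i once and divide each neighbor's own message back out — the b_i / m_{j→i} quantities shown on the next slide.

```python
import numpy as np

def approx_outgoing(phi_i, incoming):
    """Outgoing pre-messages from node i in O(N) total work, not O(N^2).

    phi_i:    (K,) vertex potential of node i
    incoming: dict neighbor j -> (K,) message m_{j->i}
    Returns:  dict neighbor j -> b_i / m_{j->i}, i.e. the product of
              phi_i and all incoming messages except j's own.
    """
    b = phi_i.copy()
    for m in incoming.values():      # one pass: accumulate belief b_i
        b = b * m
    return {j: b / m for j, m in incoming.items()}   # divide j's message out
```

The divide-out step assumes strictly positive messages; in practice one would work in the log domain or floor the messages to keep the division stable.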
Message Approximation
Illustrated
[Figure: node 4 and its eight neighbors; each outgoing pre-message is approximated as b_4 / m_{i→4}, e.g. b_4 / m_{1→4}, b_4 / m_{2→4}, ..., b_4 / m_{8→4}]
Message Approximation
Illustrated
[Figure: the same eight-node neighborhood, with a single accumulator collecting the incoming messages to node 4]
Experiment I: Density Map
Interpretation
[Figure: a protein's amino-acid sequence (GLU, TYR, PHE, THR, LEU, GLN, ILE, ARG, GLY, ...) modeled as a chain of parts – GLU1, TYR2, PHE3, ... – fit into an electron-density map]
LoopyBP vs. AggBP:
Runtime
[Plot: normalized runtime (y-axis, 0–30) vs. number of parts (x-axis, 15–95) for LoopyBP and AggBP]
LoopyBP vs. AggBP:
Accuracy
[Plot: RMS error (y-axis, 0–10) vs. BP iteration (x-axis, 0–20) for LoopyBP and AggBP]
Experiment II: Synthetic Graph
Generator
[Figure: a skeleton graph, modified by increasing the branching factor, varying part radii, and allowing spatial overlap]
LoopyBP vs. AggBP:
Accuracy
[Plot: RMS error vs. stdev(part size) for LoopyBP and AggBP]
Conclusions

AggBP makes belief propagation tractable in
large, highly connected graphs

For part-based modeling, runtime and
storage are reduced from O(N^2) to O(N)

AggBP's solutions on two datasets are as
good as standard BP's, and found in less time
Acknowledgements



Dr. George Phillips
NLM Grant 1R01 LM008796
NLM Grant 1T15 LM007359