Mean Shift
Machine Vision and Image Processing Group
Mean Shift: Theory and Applications
Presented by: Reza Hemati
Dey 1389 (December 2010)
1
Mean Shift
Theory and Applications
Yaron Ukrainitz & Bernard Sarel
2
Agenda
• Mean Shift Theory
  • What is Mean Shift?
  • Density Estimation Methods
  • Deriving the Mean Shift
  • Mean Shift Properties
• Applications
  • Clustering
  • Discontinuity Preserving Smoothing
  • Segmentation
  • Object Tracking
  • Object Contour Detection
3
Mean Shift Theory
4
Intuitive Description
[Figure, animated over slides 5-11: a circular region of interest is repeatedly moved by the mean shift vector toward the center of mass of the points it contains, until it settles on the densest region.]
Objective: find the densest region
Distribution of identical billiard balls
5-11
What is Mean Shift?
A tool for: finding modes in a set of data samples, manifesting an underlying probability density function (PDF) in R^N
PDF in feature space:
• Color space
• Scale space
• Actually any feature space you can conceive
• …
[Diagram: Data → Discrete PDF Representation → Non-parametric Density Estimation / Non-parametric Density GRADIENT Estimation (Mean Shift) → PDF Analysis]
12
Non-Parametric Density Estimation
Assumption: the data points are sampled from an underlying PDF.
Data point density implies PDF value!
[Figures, slides 13-15: real data samples shown against the assumed underlying PDF]
13-15
Parametric Density Estimation
Assumption: the data points are sampled from an underlying PDF.
Estimate:  PDF(x) = \sum_i c_i \, e^{-\frac{(x-\mu_i)^2}{2\sigma_i^2}}
[Figure: assumed underlying PDF and real data samples]
16
Kernel Density Estimation
Parzen Windows - Function Forms
P(x) = \frac{1}{n} \sum_{i=1}^{n} K(x - x_i)
A function of some finite number of data points x_1 ... x_n (the data).
In practice one uses the forms:
K(x) = c \prod_{i=1}^{d} k(x_i)   (same function on each dimension)
or
K(x) = c \, k(\|x\|)   (function of vector length only)
18
Kernel Density Estimation
Various Kernels
P(x) = \frac{1}{n} \sum_{i=1}^{n} K(x - x_i)
A function of some finite number of data points x_1 ... x_n (the data). Examples:
• Epanechnikov kernel:  K_E(x) = c\,(1 - \|x\|^2)  if \|x\| \le 1,  0 otherwise
• Uniform kernel:       K_U(x) = c  if \|x\| \le 1,  0 otherwise
• Normal kernel:        K_N(x) = c \exp\left(-\tfrac{1}{2}\|x\|^2\right)
(A small numerical sketch using these kernels follows below.)
19
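As an illustration of the estimator and the kernels above, here is a minimal NumPy sketch of a Parzen-window density estimate; the bandwidth h, the two-blob toy data, and the omitted normalization constants are assumptions made for the example, not part of the slides.

    import numpy as np

    def epanechnikov(u):
        # K_E(u) = 1 - ||u||^2 for ||u|| <= 1, else 0 (constant c dropped).
        norm2 = np.sum(u * u, axis=-1)
        return np.where(norm2 <= 1.0, 1.0 - norm2, 0.0)

    def normal(u):
        # K_N(u) = exp(-||u||^2 / 2) (constant c dropped).
        return np.exp(-0.5 * np.sum(u * u, axis=-1))

    def parzen_density(x, samples, h, kernel=normal):
        # P(x) proportional to (1/n) * sum_i K((x - x_i) / h); the constant c
        # and the usual 1/h^d factor are omitted for simplicity.
        u = (x - samples) / h
        return kernel(u).sum() / len(samples)

    # Tiny usage example on made-up two-blob data.
    rng = np.random.default_rng(0)
    data = rng.normal(loc=[[0.0, 0.0]] * 50 + [[3.0, 3.0]] * 50, scale=0.5)
    print(parzen_density(np.array([0.0, 0.0]), data, h=0.7))                       # dense region
    print(parzen_density(np.array([1.5, 1.5]), data, h=0.7, kernel=epanechnikov))  # sparse region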
Kernel Density Estimation
Gradient
Give up estimating the PDF! Estimate ONLY the gradient:
\nabla P(x) = \frac{1}{n} \sum_{i=1}^{n} \nabla K(x - x_i)
Using the kernel form  K(x - x_i) = c\,k\left(\left\|\frac{x - x_i}{h}\right\|^2\right)   (h = size of window)
we get:
\nabla P(x) = \frac{c}{n} \sum_{i=1}^{n} \nabla k_i = \frac{c}{n}\left[\sum_{i=1}^{n} g_i\right]\left[\frac{\sum_{i=1}^{n} x_i g_i}{\sum_{i=1}^{n} g_i} - x\right],   where  g_i = g\left(\left\|\frac{x - x_i}{h}\right\|^2\right)  and  g(x) = -k'(x)
20
Computing The Mean Shift
\nabla P(x) = \frac{c}{n}\left[\sum_{i=1}^{n} g_i\right]\left[\frac{\sum_{i=1}^{n} x_i g_i}{\sum_{i=1}^{n} g_i} - x\right],   g(x) = -k'(x)
21
Computing The Mean Shift
\nabla P(x) = \frac{c}{n}\left[\sum_{i=1}^{n} g_i\right]\left[\frac{\sum_{i=1}^{n} x_i g_i}{\sum_{i=1}^{n} g_i} - x\right]
Yet another kernel density estimation!
Simple Mean Shift procedure:
• Compute the mean shift vector
m(x) = \frac{\sum_{i=1}^{n} x_i\,g\left(\left\|\frac{x - x_i}{h}\right\|^2\right)}{\sum_{i=1}^{n} g\left(\left\|\frac{x - x_i}{h}\right\|^2\right)} - x
• Translate the kernel window by m(x)  (a minimal sketch of this procedure follows below)
22
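Below is a minimal sketch of the simple procedure above, assuming a normal (Gaussian) kernel so that g has the same exponential form as k; the bandwidth, tolerance, and toy data are illustrative choices, not prescribed by the slides.

    import numpy as np

    def mean_shift(x, samples, h, tol=1e-4, max_iter=200):
        # Iterate x <- x + m(x) until the mean shift vector m(x) is tiny.
        x = np.asarray(x, dtype=float)
        for _ in range(max_iter):
            # g(||(x - x_i)/h||^2) with a Gaussian profile: g(u) ∝ exp(-u/2)
            d2 = np.sum((x - samples) ** 2, axis=1) / h ** 2
            g = np.exp(-0.5 * d2)
            m = (samples * g[:, None]).sum(axis=0) / g.sum() - x   # m(x)
            x = x + m                                              # translate the window
            if np.linalg.norm(m) < tol:
                break
        return x   # converged near a mode of the estimated PDF

    # Example: start from an arbitrary point in a two-blob data set.
    rng = np.random.default_rng(1)
    pts = np.vstack([rng.normal(0, 0.4, (100, 2)), rng.normal(3, 0.4, (100, 2))])
    print(mean_shift([2.0, 2.0], pts, h=0.8))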
Mean Shift Mode Detection
What happens if we reach a saddle point?
Perturb the mode position and check if we return back.
Updated Mean Shift Procedure:
• Find all modes using the Simple Mean Shift Procedure
• Prune modes by perturbing them (find saddle points and plateaus)
• Prune nearby modes – take the highest mode in the window (a small sketch of this pruning step follows below)
23
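A tiny sketch of the "prune nearby" step, assuming the modes found by the simple procedure and their estimated densities are already available as arrays; the saddle-point perturbation check is left out here.

    import numpy as np

    def prune_modes(modes, densities, h):
        # Keep only the highest mode within each window of radius h;
        # modes has shape (k, d), densities has shape (k,).
        order = np.argsort(densities)[::-1]            # strongest modes first
        kept = []
        for i in order:
            if all(np.linalg.norm(modes[i] - modes[j]) >= h for j in kept):
                kept.append(i)
        return modes[kept], densities[kept]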
Mean Shift Properties
• Automatic convergence speed – the mean shift vector size depends on the gradient itself.
• Near maxima, the steps are small and refined ⇒ adaptive gradient ascent.
• For the Uniform Kernel, convergence is achieved in a finite number of steps.
• The Normal Kernel exhibits a smooth trajectory, but is slower than the Uniform Kernel.
24
Real Modality Analysis
Tessellate the space with windows
Run the procedure in parallel
25
Real Modality Analysis
The blue data points were traversed by the windows towards the mode.
26
Real Modality Analysis
An example: window tracks signify the steepest ascent directions.
27
Mean Shift Strengths & Weaknesses
Strengths:
• Application independent tool
• Suitable for real data analysis
• Does not assume any prior shape (e.g. elliptical) on data clusters
• Can handle arbitrary feature spaces
• Only ONE parameter to choose
• h (window size) has a physical meaning, unlike K-Means
Weaknesses:
• The window size (bandwidth selection) is not trivial
• Inappropriate window size can cause modes to be merged, or generate additional “shallow” modes ⇒ use adaptive window size
28
Mean Shift Applications
29
Clustering
Cluster: all data points in the attraction basin of a mode
Attraction basin: the region for which all trajectories lead to the same mode
30
Mean Shift: A Robust Approach Toward Feature Space Analysis, by Comaniciu, Meer
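For clustering in practice, one option (not mentioned in the slides) is scikit-learn's MeanShift, which assigns every point to the mode of its attraction basin; note that this implementation uses a flat (uniform) kernel, so the bandwidth plays the role of the window size h. The data and quantile below are illustrative.

    import numpy as np
    from sklearn.cluster import MeanShift, estimate_bandwidth

    rng = np.random.default_rng(2)
    X = np.vstack([rng.normal(0, 0.3, (150, 2)), rng.normal(2.5, 0.3, (150, 2))])

    bw = estimate_bandwidth(X, quantile=0.2)        # heuristic bandwidth choice
    ms = MeanShift(bandwidth=bw).fit(X)
    print("modes:\n", ms.cluster_centers_)          # one mode per cluster
    print("cluster sizes:", np.bincount(ms.labels_))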
Clustering
Synthetic Examples
Simple Modal Structures
Complex Modal Structures
31
Clustering
Real Example
Feature space: L*u*v representation
[Figure panels: initial window centers, modes found, modes after pruning, final clusters]
32
Clustering
Real Example
L*u*v space representation
33
Clustering
Real Example
2D (L*u) space representation
Final clusters
34
Discontinuity Preserving Smoothing
Feature space: joint domain = spatial coordinates + color space
K(x) = C \, k_s\left(\left\|\frac{x^s}{h_s}\right\|^2\right) k_r\left(\left\|\frac{x^r}{h_r}\right\|^2\right)
Meaning: treat the image as data points in the spatial and gray level domain.
[Figure: image data (slice), mean shift vectors, smoothing result]
35
Mean Shift: A Robust Approach Toward Feature Space Analysis, by Comaniciu, Meer
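A minimal, deliberately slow sketch of this filtering step on a gray-level image, using uniform spatial and range kernels with window sizes hs and hr; each pixel's joint (position, gray value) point is shifted to convergence and the pixel is replaced by the converged gray value. All parameter values are illustrative, and none of the speed-ups used in practice are attempted.

    import numpy as np

    def ms_filter(img, hs=8.0, hr=16.0, max_iter=20):
        # Discontinuity preserving smoothing of a 2-D gray-level image.
        rows, cols = img.shape
        ys, xs = np.mgrid[0:rows, 0:cols]
        out = np.empty_like(img, dtype=float)
        for r in range(rows):
            for c in range(cols):
                y, x, v = float(r), float(c), float(img[r, c])
                for _ in range(max_iter):
                    # Uniform kernels: keep pixels inside both the spatial and range windows.
                    mask = ((ys - y) ** 2 + (xs - x) ** 2 <= hs ** 2) & \
                           ((img - v) ** 2 <= hr ** 2)
                    ny, nx, nv = ys[mask].mean(), xs[mask].mean(), img[mask].mean()
                    shift = (ny - y) ** 2 + (nx - x) ** 2 + (nv - v) ** 2
                    y, x, v = ny, nx, nv
                    if shift < 1e-3:
                        break
                out[r, c] = v   # keep the pixel position, replace its range value
        return out

    # Usage (slow, small images only): smoothed = ms_filter(small_gray_image)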
Discontinuity Preserving Smoothing
Flat regions induce the modes!
[Figure: 3-D surface plot (axes y, z)]
37
Discontinuity Preserving Smoothing
The effect of window size in spatial and range spaces
38
Discontinuity Preserving Smoothing
Example
39
Discontinuity Preserving Smoothing
Example
40
Segmentation
Segment = cluster, or cluster of clusters
Algorithm:
• Run filtering (discontinuity preserving smoothing)
• Cluster the clusters which are closer than the window size (a minimal grouping sketch follows below)
[Figure: image data (slice), mean shift vectors, smoothing result, segmentation result]
41
Mean Shift: A Robust Approach Toward Feature Space Analysis, by Comaniciu, Meer
http://www.caip.rutgers.edu/~comanici
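Given the smoothed image from the filtering step, here is a simplified sketch of the grouping step: spatially connected pixels whose filtered gray values differ by less than hr are merged into one segment via a flood fill. This is an assumption-laden stand-in for "cluster the clusters", not the full algorithm.

    import numpy as np
    from collections import deque

    def segment(filtered, hr=16.0):
        # Label connected pixels whose smoothed gray values differ by less than hr.
        rows, cols = filtered.shape
        labels = -np.ones((rows, cols), dtype=int)
        current = 0
        for r in range(rows):
            for c in range(cols):
                if labels[r, c] != -1:
                    continue
                labels[r, c] = current
                queue = deque([(r, c)])
                while queue:                      # flood fill one segment
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols and labels[ny, nx] == -1 \
                                and abs(filtered[ny, nx] - filtered[y, x]) < hr:
                            labels[ny, nx] = current
                            queue.append((ny, nx))
                current += 1
        return labels      # one integer label per pixel = segmentation result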
Segmentation
Example
…when feature space is only gray levels…
42
Segmentation
Example
43
Segmentation
Example
44
Segmentation
Example
45
Segmentation
Example
46
Segmentation
Example
47
Segmentation
Example
48
Non-Rigid Object Tracking
…
…
49
Mean-Shift Object Tracking
General Framework: Target Representation
Choose a reference model in the current frame → Choose a feature space → Represent the model in the chosen feature space
[Figure: current frame with the selected reference region]
51
Mean-Shift Object Tracking
General Framework: Target Localization
Start from the position of the model in the current frame → Search in the model’s neighborhood in the next frame → Find the best candidate by maximizing a similarity function → Repeat the same process in the next pair of frames
[Figure: model region in the current frame and candidate region in the next frame]
52
Mean-Shift Object Tracking
Target Representation
Choose a reference target model → Choose a feature space (e.g. a quantized color space) → Represent the model by its PDF in the feature space
[Figure: color histogram – probability vs. quantized color bin (1 ... m)]
53
Kernel Based Object Tracking, by Comaniciu, Ramesh, Meer
Mean-Shift Object Tracking
PDF Representation
Target model (centered at 0):  q = \{q_u\}_{u=1..m},  \sum_{u=1}^{m} q_u = 1
Target candidate (centered at y):  p(y) = \{p_u(y)\}_{u=1..m},  \sum_{u=1}^{m} p_u(y) = 1
Similarity function:  f(y) = f[\,q, p(y)\,]
[Figure: model and candidate color histograms – probability vs. color bin (1 ... m)]
54
Mean-Shift Object Tracking
Finding the PDF of the target model
Target pixel locations: \{x_i\}_{i=1..n}
k(x): a differentiable, isotropic, convex, monotonically decreasing kernel
• Peripheral pixels are affected by occlusion and background interference
b(x): the color bin index (1..m) of pixel x
Probability of feature u in the model (pixel weight k, normalization factor C):
q_u = C \sum_{b(x_i)=u} k\left(\|x_i\|^2\right)
Probability of feature u in the candidate (normalization factor C_h):
p_u(y) = C_h \sum_{b(x_i)=u} k\left(\left\|\frac{y - x_i}{h}\right\|^2\right)
[Figure: model and candidate color histograms over the m bins]
56
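A small sketch of such a kernel-weighted histogram for a square gray-level patch, assuming an Epanechnikov profile k(u) = 1 - u (the choice made later in the slides) and 8-bit pixel values quantized into m bins; the function name, bin count and normalization are illustrative.

    import numpy as np

    def kernel_histogram(patch, m=16):
        # q_u = C * sum over pixels with b(x_i) = u of k(||x_i||^2),
        # with pixel locations normalized so the patch center sits at 0.
        rows, cols = patch.shape
        ys, xs = np.mgrid[0:rows, 0:cols]
        ny = (ys - (rows - 1) / 2) / ((rows - 1) / 2)
        nx = (xs - (cols - 1) / 2) / ((cols - 1) / 2)
        d2 = ny ** 2 + nx ** 2
        k = np.clip(1.0 - d2, 0.0, None)                         # Epanechnikov profile
        bins = (patch.astype(float) / 256.0 * m).astype(int)     # b(x_i) in 0..m-1
        q = np.bincount(bins.ravel(), weights=k.ravel(), minlength=m)
        return q / q.sum()                                       # normalization factor C

    # For the candidate p_u(y), run the same function on the patch centered at y.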
Mean-Shift Object Tracking
Similarity Function
Target model:  q = (q_1, \dots, q_m)
Target candidate:  p(y) = (p_1(y), \dots, p_m(y))
Similarity function:  f(y) = f[\,p(y), q\,] = ?
The Bhattacharyya Coefficient: with  q' = (\sqrt{q_1}, \dots, \sqrt{q_m})^T  and  p'(y) = (\sqrt{p_1(y)}, \dots, \sqrt{p_m(y)})^T,
f(y) = \frac{p'(y)^T q'}{\|p'(y)\|\,\|q'\|} = \cos\theta_y = \sum_{u=1}^{m} \sqrt{p_u(y)\, q_u}
57
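The coefficient itself is a one-liner; a sketch assuming p and q are normalized histograms like those above.

    import numpy as np

    def bhattacharyya(p, q):
        # f(y) = sum_u sqrt(p_u(y) * q_u); equals 1 only for identical normalized histograms.
        return np.sum(np.sqrt(np.asarray(p) * np.asarray(q)))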
Mean-Shift Object Tracking
Target Localization Algorithm
Start from the position of the model in the current frame (q) → Search in the model’s neighborhood in the next frame (p(y)) → Find the best candidate by maximizing the similarity function f[p(y), q]
58
Mean-Shift Object Tracking
Approximating the Similarity Function
f(y) = \sum_{u=1}^{m} \sqrt{p_u(y)\, q_u}
Linear approximation around y_0 (model location y_0, candidate location y):
f(y) \approx \frac{1}{2} \sum_{u=1}^{m} \sqrt{p_u(y_0)\, q_u} + \frac{1}{2} \sum_{u=1}^{m} p_u(y) \sqrt{\frac{q_u}{p_u(y_0)}}
Substituting  p_u(y) = C_h \sum_{b(x_i)=u} k\left(\left\|\frac{y - x_i}{h}\right\|^2\right)  gives
f(y) \approx (\text{term independent of } y) + \frac{C_h}{2} \sum_{i=1}^{n} w_i \, k\left(\left\|\frac{y - x_i}{h}\right\|^2\right),
where  w_i = \sum_{u=1}^{m} \sqrt{\frac{q_u}{p_u(y_0)}}\, \delta[b(x_i) - u]  are the pixel weights.
The second term is a density estimate (as a function of y)!
59
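A sketch of these pixel weights, assuming the color-bin index of each candidate pixel and the two histograms q and p(y0) are available; the small epsilon guarding against empty candidate bins is an implementation detail, not from the slides.

    import numpy as np

    def pixel_weights(bin_indices, q, p_y0, eps=1e-12):
        # w_i = sum_u sqrt(q_u / p_u(y0)) * delta[b(x_i) - u]
        #     = sqrt(q_{b(x_i)} / p_{b(x_i)}(y0))  for each candidate pixel i.
        ratio = np.sqrt(np.asarray(q) / (np.asarray(p_y0) + eps))
        return ratio[np.asarray(bin_indices)]       # one weight per candidate pixel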
Mean-Shift Object Tracking
Maximizing the Similarity Function
The mode of  \frac{C_h}{2} \sum_{i=1}^{n} w_i \, k\left(\left\|\frac{y - x_i}{h}\right\|^2\right)  = sought maximum
Important assumption: the target representation provides sufficient discrimination, i.e. there is only one mode in the searched neighborhood.
60
Mean-Shift Object Tracking
Applying Mean-Shift
The mode of  \frac{C_h}{2} \sum_{i=1}^{n} w_i \, k\left(\left\|\frac{y - x_i}{h}\right\|^2\right)  = sought maximum
Original Mean-Shift: find the mode of  c \sum_{i=1}^{n} k\left(\left\|\frac{y - x_i}{h}\right\|^2\right)  using
y_1 = \frac{\sum_{i=1}^{n} x_i \, g\left(\left\|\frac{y_0 - x_i}{h}\right\|^2\right)}{\sum_{i=1}^{n} g\left(\left\|\frac{y_0 - x_i}{h}\right\|^2\right)}
Extended Mean-Shift: find the mode of  c \sum_{i=1}^{n} w_i \, k\left(\left\|\frac{y - x_i}{h}\right\|^2\right)  using
y_1 = \frac{\sum_{i=1}^{n} x_i w_i \, g\left(\left\|\frac{y_0 - x_i}{h}\right\|^2\right)}{\sum_{i=1}^{n} w_i \, g\left(\left\|\frac{y_0 - x_i}{h}\right\|^2\right)}
61
Mean-Shift Object Tracking
About Kernels and Profiles
A special class of radially symmetric kernels:  K(x) = c \, k\left(\|x\|^2\right),  where k is the profile of the kernel K and  g(x) = -k'(x).
Extended Mean-Shift: find the mode of  c \sum_{i=1}^{n} w_i \, k\left(\left\|\frac{y - x_i}{h}\right\|^2\right)  using
y_1 = \frac{\sum_{i=1}^{n} x_i w_i \, g\left(\left\|\frac{y_0 - x_i}{h}\right\|^2\right)}{\sum_{i=1}^{n} w_i \, g\left(\left\|\frac{y_0 - x_i}{h}\right\|^2\right)}
62
Mean-Shift Object Tracking
Choosing the Kernel
A special class of radially symmetric kernels:  K(x) = c \, k\left(\|x\|^2\right)
Epanechnikov kernel (profile):  k(x) = 1 - x  if x \le 1,  0 otherwise
Uniform kernel (its derivative):  g(x) = -k'(x) = 1  if x \le 1,  0 otherwise
With this choice, the update
y_1 = \frac{\sum_{i=1}^{n} x_i w_i \, g\left(\left\|\frac{y_0 - x_i}{h}\right\|^2\right)}{\sum_{i=1}^{n} w_i \, g\left(\left\|\frac{y_0 - x_i}{h}\right\|^2\right)}
reduces to the weighted centroid
y_1 = \frac{\sum_{i=1}^{n} x_i w_i}{\sum_{i=1}^{n} w_i}
63
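Putting the pieces together, here is a sketch of one localization step with the Epanechnikov/uniform choice above, so each update is just the weighted centroid y_1 = sum_i x_i w_i / sum_i w_i. It assumes an 8-bit gray-level frame, a square window that stays fully inside the frame, and illustrative parameter values; the helper names are made up for this sketch.

    import numpy as np

    def epan_weights(half):
        # Epanechnikov profile k(||x_i||^2) on a (2*half+1)^2 window, center at 0.
        ax = np.arange(-half, half + 1) / half
        d2 = ax[:, None] ** 2 + ax[None, :] ** 2
        return np.clip(1.0 - d2, 0.0, None)

    def histogram(patch, k, m):
        # Kernel-weighted color histogram and per-pixel bin indices.
        bins = (patch.astype(float) / 256.0 * m).astype(int)
        q = np.bincount(bins.ravel(), weights=k.ravel(), minlength=m)
        return q / q.sum(), bins

    def track_step(frame, q_model, y0, half, m=16, max_iter=10, eps=1e-12):
        # One Mean-Shift localization step: y1 = sum_i x_i w_i / sum_i w_i.
        y = np.asarray(y0, dtype=float)
        k = epan_weights(half)
        for _ in range(max_iter):
            r0, c0 = int(round(y[0])), int(round(y[1]))
            patch = frame[r0 - half:r0 + half + 1, c0 - half:c0 + half + 1]
            p, bins = histogram(patch, k, m)                  # candidate PDF p(y0)
            w = np.sqrt(q_model / (p + eps))[bins]            # pixel weights w_i
            ys, xs = np.mgrid[r0 - half:r0 + half + 1, c0 - half:c0 + half + 1]
            y_new = np.array([(ys * w).sum(), (xs * w).sum()]) / w.sum()
            if np.linalg.norm(y_new - y) < 0.5:               # sub-pixel shift: stop
                return y_new
            y = y_new
        return y

    # Usage sketch: q_model, _ = histogram(model_patch, epan_weights(half), m)
    #               y1 = track_step(next_frame, q_model, y0=(row, col), half=half)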