OBJECT TRACKING USING PARTICLE FILTERS

Table of Contents

 Tracking
 Tracking as a probabilistic inference problem
 Applications of Tracking
 Different approaches for Object Tracking
 Particle Filter
 A Simple Particle Filter Algorithm
 Basic steps implemented in the project
 Files used in the project
 Demo
TRACKING
Tracking is the problem of generating an inference
about the motion of an object given a sequence
of images.
In a typical tracking problem, we have a model for
the object's motion and some set of measurements
from a sequence of images. The measurements could
be the positions of some image points, the positions
and moments of some image regions, etc.
Tracking is an inference problem. The moving object
has some form of internal state, which is measured
at each frame. We need to combine our measurements
as effectively as possible to estimate the object's
state.
Tracking as a probabilistic inference
problem

Prediction:
P(Xi | Y0=y0, ..., Yi-1=yi-1)
Data Association:
P(Yi | Y0=y0, ..., Yi-1=yi-1)
Correction:
P(Xi | Y0=y0, ..., Yi=yi)
Independence Assumptions

 Only the immediate past matters:
P(Xi | X1, ..., Xi-1) = P(Xi | Xi-1)
 Measurements depend only on the current state:
P(Yi, Yj, ..., Yk | Xi) = P(Yi | Xi) P(Yj, ..., Yk | Xi)
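The predict/correct recursion above can be sketched for a toy discrete state space; the transition and measurement matrices below are made-up illustrations, not part of the project:

```python
import numpy as np

# Transition model P(X_i | X_{i-1}) for 2 states
# (only the immediate past matters).
T = np.array([[0.9, 0.1],
              [0.2, 0.8]])

# Measurement model P(Y_i | X_i): M[s, y] = P(Y=y | X=s).
M = np.array([[0.8, 0.2],
              [0.3, 0.7]])

belief = np.array([0.5, 0.5])  # prior P(X_0)

for y in [0, 0, 1]:            # a short observation sequence
    belief = T.T @ belief      # prediction: P(X_i | y_0, ..., y_{i-1})
    belief = M[:, y] * belief  # correction: multiply by P(y_i | X_i)
    belief /= belief.sum()     # normalise

print(belief)                  # posterior P(X_3 | y_0, y_1, y_2)
```

Each pass through the loop is one prediction step followed by one correction step of the recursion.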
Applications of Tracking
 Motion Capture
 Recognition from motion
 Surveillance
 Targeting
Different approaches for Object
Tracking
 Correlation-based
 Feature-based
 Gradient-based
 Color Histograms
 Kalman Filter
 Particle Filter
PARTICLE FILTER
Particle Filters are powerful tools for Bayesian state
estimation in non-linear systems.
The basic idea of particle filters is to approximate
a posterior distribution over unknown state
variables by a set of particles drawn from this
distribution.
Particle Filters require two types of information:
 Data
 Controls
 Measurements
 Probabilistic model of the system
The data is given by the measurements z1:t = z1, z2, ..., zt
and the controls u1:t = u1, u2, ..., ut.
Particle Filters, like any member of the family of Bayes
filters such as Kalman filters and HMMs, estimate the
posterior distribution of the state of the dynamical
system conditioned on the data, p(xt | z1:t, u1:t). They do so
via the following recursive formula:
p(xt | z1:t, u1:t) = η p(zt | xt) ∫ p(xt | ut, xt-1) p(xt-1 | z1:t-1, u1:t-1) dxt-1
where η is a normalising constant.
To calculate this posterior, three probability
distributions are required (the probabilistic model of
the dynamical system):
1) A measurement model, p(zt | xt), which describes the
probability of measuring zt when the system is in
state xt.
2) A control model, p(xt | ut, xt-1), which characterizes
the effect of the control ut on the system state by
specifying the probability that the system is in state
xt after executing control ut in state xt-1.
3) An initial state distribution, p(x0), which specifies
the user's knowledge about the initial system state.
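The three distributions can be sketched as sampling and likelihood functions for a hypothetical 1-D system; the Gaussian noise levels here are illustrative assumptions, not the project's actual models:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_initial_state():
    """Initial state distribution p(x0): a standard normal here."""
    return rng.normal(0.0, 1.0)

def sample_control_model(u, x_prev):
    """Control model p(xt | ut, xt-1): move by u plus Gaussian noise."""
    return x_prev + u + rng.normal(0.0, 0.5)

def measurement_likelihood(z, x):
    """Measurement model p(zt | xt): unit-variance Gaussian around the state."""
    return np.exp(-0.5 * (z - x) ** 2) / np.sqrt(2 * np.pi)
```

A particle filter only ever needs to sample from the first two and evaluate the third, which is what makes it applicable to non-linear, non-Gaussian models as well.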
Problems with the probabilistic filter
In many applications, the key concern in
implementing this probabilistic filter is the
continuous nature of the states x, controls u, and
measurements z. Even in discrete versions,
these spaces might be prohibitively large to
compute the entire posterior.
How the particle filter tackles the problem
The particle filter addresses these concerns by
approximating the posterior using a set of state
samples (particles):
Xt = {xt[i]}, i = 1, ..., N
The set Xt consists of N particles xt[i], for some large
number N. Together these particles approximate
the posterior p(xt | z1:t, u1:t). Xt is calculated recursively.
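The idea that a sample set stands in for a full distribution can be seen in a minimal sketch (a Gaussian is used here only because its mean and spread are easy to check):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000

# N particles drawn from a Gaussian posterior with mean 2.0, std 0.5.
# Expectations under the posterior become plain sample averages.
particles = rng.normal(2.0, 0.5, size=N)

print(particles.mean())  # close to the true mean 2.0
print(particles.std())   # close to the true std 0.5
```

The larger N is, the better the particle set approximates the posterior, at a proportional cost in computation.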
A Simple Particle Filter
Algorithm
Given a prior p(X1), a transition prior p(Xt | Xt-1), and a
likelihood p(Yt | Xt), the algorithm is as follows:
1) Initialization, t=1
For i=1, ..., N, sample X1(i) ~ p(X1) and set t=2.
2) Importance Sampling step
For i=1, ..., N, sample Xpt(i) ~ p(Xt | Xt-1(i))
and set Xp1:t(i) = (Xpt(i), X1:t-1(i)).
For i=1, ..., N, evaluate the importance weights
wt(i) = p(Yt | Xpt(i)).
Normalise the importance weights.
3) Selection Step
Resample with replacement N particles (X1:t(i); i=1, ..., N)
from the set (Xp1:t(i); i=1, ..., N) according to the
normalised importance weights.
Set t = t+1 and go to step 2.
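The three steps above can be sketched in Python (the project itself is in MATLAB); the 1-D random-walk motion model, the noise levels, and the simulated measurements are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500        # number of particles
steps = 30     # number of time steps

# Simulate a 1-D random-walk target and noisy position measurements
# (stand-ins for the project's image measurements).
true_x = np.cumsum(rng.normal(0.0, 1.0, steps))
obs = true_x + rng.normal(0.0, 0.5, steps)

# 1) Initialization: sample N particles from the prior p(X1).
particles = rng.normal(0.0, 1.0, N)

for t in range(steps):
    # 2) Importance Sampling: propagate through the transition prior,
    #    then weight each particle by the likelihood p(Yt | Xt).
    particles = particles + rng.normal(0.0, 1.0, N)
    w = np.exp(-0.5 * ((obs[t] - particles) / 0.5) ** 2) + 1e-99
    w /= w.sum()  # normalise the importance weights
    # 3) Selection: resample N particles with replacement according
    #    to the normalised weights.
    particles = particles[rng.choice(N, size=N, p=w)]

print(abs(particles.mean() - true_x[-1]))  # tracking error stays small
```

The tiny 1e-99 floor on the weights (the same trick as in the project's MATLAB sample) prevents division by zero when all likelihoods underflow.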
Basic steps implemented in the
project
Initially, at time t=0, the particles x0[i] are generated from the
initial state distribution p(x0). The t-th particle set Xt is then
calculated recursively from Xt-1 as follows:
1   Set Xt = Xtaux = ∅
2   For j=1 to N do
3     pick the j-th sample xt-1[j] ∈ Xt-1
4     draw xt[j] ~ p(xt | ut, xt-1[j])
5     set wt[j] = p(zt | xt[j])
6     add (xt[j], wt[j]) to Xtaux
7   End for
8   For i=1 to N do
9     draw xt[i] from Xtaux with probability proportional to wt[i]
10    add xt[i] to Xt
11  End for
Files used in the project
 A set of image files, which were later converted
to a video file.
 A matrix file consisting of data about the image.
 pf_proj.m, which contains the main algorithm
for particle filters and drawing of the trajectories
for the moving object.
 multinomialR.m, which contains the code for
resampling.
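The contents of multinomialR.m are not shown here, but the role it plays, multinomial resampling, can be sketched in Python: draw N indices with probability proportional to the weights via the inverse-CDF method.

```python
import numpy as np

def multinomial_resample(weights, rng):
    """Return N indices drawn with probability proportional to weights."""
    w = np.array(weights, dtype=float)
    w /= w.sum()                    # normalise the weights
    cdf = np.cumsum(w)              # cumulative distribution over indices
    u = rng.random(len(w))          # one uniform draw per particle
    return np.searchsorted(cdf, u)  # first index whose cdf value >= u

rng = np.random.default_rng(0)
idx = multinomial_resample([0.1, 0.1, 0.8], rng)
print(idx)  # indices drawn in proportion to the weights
```

Heavily weighted particles are selected many times and lightly weighted ones tend to disappear, which is exactly the selection step of the algorithm.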
Sample of the code
% Propagate each particle through the motion model and
% evaluate its importance weight.
for i = 1:N
    states(:, t, i) = A(:,:,i) * states(:, t-1, i) + B(:,:,i) * randn(100, 1);
    % Evaluate importance weights (Gaussian likelihood, floored at
    % 1e-99 to avoid division by zero when normalising).
    w(t, i) = exp(-0.5 * states(:,t,i)' * states(:,t,i)) + 1e-99;
    w_real(t, i) = w(t, i);
end
% Normalise the weights.
w(t, :) = w(t, :) ./ sum(w(t, :));
DEMO
[ Shown separately]
Acknowledgement
Thanks to Dr. Longin Jan Latecki for providing me
with the opportunity to work on this project.