Formation et Analyse d'Images
Session 8
Daniela Hall
14 November 2005

Course overview
• Session 1 (19/09/05)
  – Overview
  – Human vision
  – Homogeneous coordinates
  – Camera models
• Session 2 (26/09/05)
  – Tensor notation
  – Image transformations
  – Homography computation
• Session 3 (3/10/05)
  – Camera calibration
  – Reflection models
  – Color spaces
• Session 4 (10/10/05)
  – Pixel-based image analysis
• 17/10/05: course is replaced by Modelisation surfacique

Course overview
• Session 5 + 6 (24/10/05, 9:45–12:45)
  – Contrast description
  – Hough transform
• Session 7 (7/11/05)
  – Kalman filter
• Session 8 (14/11/05)
  – Tracking of regions, pixels, and lines
• Session 9 (21/11/05)
  – Gaussian filter operators
• Session 10 (5/12/05)
  – Scale space
• Session 11 (12/12/05)
  – Stereo vision
  – Epipolar geometry
• Session 12 (16/01/06): exercises and questions

Session overview
1. Tracking of objects
2. Architecture of the robust tracker
3. Tracking using Kalman filter
4. Tracking using CONDENSATION

Robust tracking of objects
[Block diagram of the tracking loop: Detection, Measurements, Trigger regions, List of predictions, Predict, Correct, List of targets, Detection, New targets]

Tracking system
• Tracking system: detects the position of the targets at each time instant (using e.g. background differencing).

Tracking system
• Supervisor
  – calls image acquisition, target observation and detection in a cycle
• Target observation module
  – ensures robust tracking by predicting target positions using a Kalman filter
• Detection module
  – verifies the predicted positions by measuring the detection energy within the search region given by the Kalman filter
  – creates new targets by evaluating the detection energy within trigger regions
• Parameters
  – noise threshold, detection energy threshold, parameters for splitting and merging

Detection by background differencing
• I = (IR, IG, IB) image, B = (BR, BG, BB) background.
• Compute a binary difference image Id, where all pixels whose difference diff is larger than the noise threshold w are set to one.
• Then compute the connected components of Id to detect the pixels that belong to a target.
• For each target, compute the mean and the covariance of its pixels. The covariance is transformed into the width and height of the bounding box and the orientation of the target.

Real-time target detection
• Computing connected components for a whole image is computationally expensive.
• Idea:
  – Restrict the search for targets to a small number of search regions.
• These regions are:
  – Entry regions marked by the user.
  – Search regions obtained from the Kalman filter, which predicts the next most likely position of a current target.

Background adaptation to increase robustness of detection
• In long-term tracking, the illumination of the scene changes. Image differencing with a static background causes many false detections.
• The background is updated regularly by Bt = (1 − α) Bt-1 + α It, where t is the time and α = 0.1 is the background adaptation parameter.
• Background adaptation allows the background to incorporate slow illumination changes.

Example
• Detection module
• Parameter: detection energy threshold
  – energy threshold too high: targets are missed or targets are split
  – energy threshold too low: false detections
• Problem: the energy threshold depends on illumination and target appearance
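The detection by background differencing and the background adaptation described above could be sketched as follows. This is a minimal sketch assuming NumPy and SciPy; the function names, the min_pixels filter and the 1-sigma scaling of the bounding-box axes are illustrative choices and not part of the original system. For real-time use, the same computation would only be run inside the trigger regions and the Kalman search regions, as described above.

```python
import numpy as np
from scipy import ndimage

def detect_targets(image, background, noise_threshold, min_pixels=20):
    """Background differencing: binary difference image, connected components,
    then mean / covariance of each component."""
    # Binary difference image Id: pixels whose colour difference to the
    # background exceeds the noise threshold w are set to one.
    diff = np.linalg.norm(image.astype(float) - background.astype(float), axis=2)
    Id = diff > noise_threshold

    # Connected components of Id give the candidate targets.
    labels, n_components = ndimage.label(Id)

    targets = []
    for k in range(1, n_components + 1):
        ys, xs = np.nonzero(labels == k)
        if xs.size < min_pixels:                 # discard tiny noise blobs
            continue
        pts = np.stack([xs, ys], axis=1).astype(float)
        mean = pts.mean(axis=0)                  # target position
        cov = np.cov(pts, rowvar=False)          # target covariance
        # Eigen-decomposition of the covariance gives the orientation of the
        # target and axis lengths proportional to the bounding-box size.
        evals, evecs = np.linalg.eigh(cov)                      # ascending order
        axis_minor, axis_major = 2.0 * np.sqrt(np.maximum(evals, 0.0))
        angle = float(np.arctan2(evecs[1, -1], evecs[0, -1]))   # major-axis orientation
        targets.append({"position": mean, "covariance": cov,
                        "width": axis_major, "height": axis_minor,
                        "orientation": angle})
    return targets

def update_background(background, image, alpha=0.1):
    """Background adaptation: Bt = (1 - alpha) * Bt-1 + alpha * It."""
    return (1.0 - alpha) * background + alpha * image.astype(float)
```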
Session overview
1. Tracking of objects
2. Architecture of the robust tracker
3. Tracking using Kalman filter
4. Tracking using CONDENSATION

Tracking
• Targets are represented by their position (x, y) and covariance.
• A first order Kalman filter is used to predict the position of the target in the next frame.
• The Kalman filter provides a ROI in which to look for the target. The ROI is computed from the a posteriori estimate x̂k and from the a posteriori error covariance Pk.

Example
[figures]

Example: tracking a bouncing ball
• Specifications:
  – constant background
  – colored ball
• Problems:
  – noisy observations
  – motion blur
  – rapid motion changes
Thanks to B. Fisher (University of Edinburgh) for providing the slides and figures of this example: http://homespages.inf.ed.ac.uk/rbf/AVAUDIO/lect8.pdf

Ball physical model
• Position zk = (x, y)
• Position update zk = zk-1 + vk-1 Δt
• Velocity update vk = vk-1 + ak-1 Δt
• Acceleration (gravity, pointing down) ak = (0, g)T

Robust tracking of objects
• State vector xk = (x, y, x', y')T
• Measurement zk = (x, y)T
• State equation zk = H xk + vk, with H = [1 0 0 0; 0 1 0 0]
• Prediction x̂k = A x̂k-1 + B uk (a priori estimate), with state control B uk = (0, 0, 0, gΔt)T

Robust tracking of objects
• Measurement noise error covariance Rk = [0.285 0.005; 0.005 0.046]
• Temporal (state transition) matrix A = [1 0 Δt 0; 0 1 0 Δt; 0 0 1 0; 0 0 0 1]
• Process noise error covariance Qk = a·I, here a = 0.01
• a affects the computation speed: a large a increases the uncertainty and therefore the size of the search regions.
(A code sketch of this filter is given at the end of this part.)

Kalman filter successes
[figures]

Kalman filter failures
[figures]

Kalman filter analysis
• smoothes noisy observations
• the dynamic model fails at the bounce and at the stop
• could also estimate the ball radius
• could plot a boundary of 95% likelihood of the ball position (the boundary would grow when the fit is bad)

Session overview
1. Tracking of objects
2. Architecture of the robust tracker
3. Tracking using Kalman filter
4. Tracking using CONDENSATION

Tracking by CONDENSATION
• CONDENSATION: Conditional Density Propagation. Also known as particle filtering.
• Ref: M. Isard and A. Blake: CONDENSATION – conditional density propagation for visual tracking, International Journal of Computer Vision, 29(1), 1998. http://www.robots.ox.ac.uk/%7Econtours/

CONDENSATION tracking
• keeps multiple hypotheses
• updates them using new data
• selects hypotheses probabilistically
• copes with very noisy data and with process state changes
• tunable computation load (by choosing the number of particles)

CONDENSATION algorithm
• Given a set of N hypotheses at time k, Hk = {x1,k, ..., xN,k}, with associated probabilities {p(x1,k), ..., p(xN,k)}.
• Repeat N times to generate Hk+1:
  1. randomly select a hypothesis xu,k from Hk with probability p(xu,k)
  2. generate a new state vector sk from a distribution centered at xu,k
  3. get the new state vector using the dynamic model xk+1 = f(sk) and the Kalman filter
  4. evaluate the probability p(zk+1 | xk+1) of the observed data zk+1 given the state xk+1
  5. use Bayes' rule to get p(xk+1 | zk+1)

CONDENSATION algorithm
[Figure from the book Isard, Blake: Active Contours]

Why does CONDENSATION tracking work?
• Many slightly different hypotheses increase the chance that one of them fits the data well.
• The dynamic model allows switching between different motion models.
  – Motion models of the bouncing ball: bounce, freefall, stop.
• Sampling by probability weeds out bad hypotheses.

Tracking of the bouncing ball
1. Select 100 hypotheses xk with probabilities p(xk).
2. Use the estimated covariance P to create state samples sk.
3. Define a situation switching model.

Tracking of the bouncing ball
• If in the STOP situation: y' = 0.
• If in BOUNCE: x' = -0.7 x'; also add some random y' motion, y' = y' + r.
• If in FREEFALL: use the freefall motion model, y' = y' + gΔt and x' = x' + r.
• Then use the Kalman filter to predict x̂k.
4. Estimate the goodness of a hypothesis by 1 / ||H xk − zk||².
• p(xk) is estimated from the goodness by normalization.
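The Kalman filter for the ball, with the matrices given on the "Robust tracking of objects" slides above, could be sketched as follows. This is a minimal sketch assuming NumPy; the values of dt and g, the downward image y axis, and the search_region helper with its n_sigma parameter are illustrative assumptions, while the predict/correct equations are the standard ones from the Kalman filter session.

```python
import numpy as np

# Constant-velocity Kalman filter for the ball: state x = (x, y, x', y')^T,
# measurement z = (x, y)^T.  dt and g are illustrative values.
dt, g = 1.0, 9.81

A = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)   # temporal (state transition) matrix
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], dtype=float)    # measurement matrix
Q = 0.01 * np.eye(4)                         # process noise covariance, Qk = a*I
R = np.array([[0.285, 0.005],
              [0.005, 0.046]])               # measurement noise covariance
Bu = np.array([0.0, 0.0, 0.0, g * dt])       # state control (gravity acting on y')

def predict(x, P):
    """Prediction: x_k = A x_k-1 + B u_k, P_k = A P_k-1 A^T + Q."""
    return A @ x + Bu, A @ P @ A.T + Q

def correct(x_pred, P_pred, z):
    """Correction with a measurement z = (x, y) from the detection module."""
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x_pred + K @ (z - H @ x_pred)
    P = (np.eye(4) - K @ H) @ P_pred
    return x, P

def search_region(x, P, n_sigma=3.0):
    """ROI for the detector, derived from the position part of the covariance."""
    sx, sy = np.sqrt(P[0, 0]), np.sqrt(P[1, 1])
    return (x[0] - n_sigma * sx, x[0] + n_sigma * sx,
            x[1] - n_sigma * sy, x[1] + n_sigma * sy)
```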
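One CONDENSATION iteration for the bouncing ball, following steps 1–5 above, could be sketched as shown below. This is a minimal sketch assuming NumPy; the sampling spread SPREAD, the noise magnitudes, the small constant that avoids division by zero, and the user-supplied choose_situation function are illustrative assumptions (the slides do not detail how the situation is selected per hypothesis), and the Kalman prediction of x̂k is reduced here to a plain constant-velocity update.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                     # number of hypotheses (particles), as on the slide
SPREAD = 2.0                # sampling spread around a selected hypothesis (illustrative)
H = np.array([[1.0, 0.0, 0.0, 0.0],
              [0.0, 1.0, 0.0, 0.0]])          # measurement matrix, z = (x, y)

def apply_situation_model(x, situation, dt=1.0, g=9.81):
    """Situation switching model from the slide (STOP / BOUNCE / FREEFALL)."""
    x = x.copy()
    if situation == "STOP":
        x[3] = 0.0                            # y' = 0
    elif situation == "BOUNCE":
        x[2] = -0.7 * x[2]                    # x' = -0.7 x'
        x[3] += rng.normal(0.0, 1.0)          # add some random y' motion, y' = y' + r
    else:                                     # FREEFALL
        x[3] += g * dt                        # y' = y' + g*dt
        x[2] += rng.normal(0.0, 1.0)          # x' = x' + r
    # Constant-velocity position update (stands in for the Kalman prediction
    # step of the slide).
    x[0] += x[2] * dt
    x[1] += x[3] * dt
    return x

def condensation_step(hypotheses, probs, z, choose_situation):
    """One CONDENSATION iteration: select, diffuse, predict, weight, normalize."""
    new_hypotheses, goodness = [], []
    for _ in range(N):
        # 1. randomly select a hypothesis x_u,k according to p(x_u,k)
        i = rng.choice(len(hypotheses), p=probs)
        # 2. generate a new state sample s_k centered at the selected hypothesis
        s = hypotheses[i] + rng.normal(0.0, SPREAD, size=4)
        # 3. apply the dynamic model x_k+1 = f(s_k), with situation switching
        x_new = apply_situation_model(s, choose_situation(s))
        # 4. goodness of the hypothesis: 1 / ||H x - z||^2
        w = 1.0 / (np.sum((H @ x_new - z) ** 2) + 1e-6)
        new_hypotheses.append(x_new)
        goodness.append(w)
    goodness = np.array(goodness)
    # 5. p(x_k+1) obtained from the goodness values by normalization
    return new_hypotheses, goodness / goodness.sum()
```

To use the sketch, one would start from N copies of an initial state with uniform probabilities and call condensation_step once per frame with the new ball measurement z = (x, y).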
Example of sampling effects
[figures]

Kalman filter failures fixed
[figures]

Comparison: Kalman filter vs CONDENSATION
• Kalman filter:
  – assumes a Gaussian motion model
  – easy to parametrize
  – fast
• CONDENSATION:
  – can track objects with non-Gaussian motion
  – very good for multi-modal motion models
  – simple algorithm
  – reasonably fast