CVPR 2010
AAM based Face Tracking with Temporal Matching and Face Segmentation
Mingcai Zhou1, Lin Liang2, Jian Sun2, Yangsheng Wang1
1 Institute of Automation, Chinese Academy of Sciences, Beijing, China
2 Microsoft Research Asia, Beijing, China
Outline
• AAM Introduction
• Related Work
• Method and Theory
• Experiment
AAM Introduction
• A statistical model of shape and grey-level appearance
  – Shape model
  – Appearance model
Shape Model Building
The shape is represented as a mean shape plus a linear combination of shape bases:

$s = s_0 + \sum_{i=1}^{n} p_i s_i$

where $s_0$ is the mean shape, $\{s_i\}$ are the shape bases, and $\{p_i\}$ are the shape parameters. The mean shape and shape bases are learned by PCA from the training shapes.
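A minimal sketch, not the authors' code, of how such a shape model can be built with PCA, assuming the training shapes are already aligned (e.g., by Procrustes analysis) and flattened into row vectors:

```python
import numpy as np

def build_shape_model(aligned_shapes, var_to_keep=0.95):
    """Build a PCA shape model s = s0 + sum_i p_i * s_i.

    aligned_shapes: (N, 2V) array; each row is one training shape
    (V landmarks, x/y coordinates flattened).
    """
    s0 = aligned_shapes.mean(axis=0)                   # mean shape
    centered = aligned_shapes - s0
    _, singular_values, vt = np.linalg.svd(centered, full_matrices=False)
    variance = singular_values ** 2
    ratio = np.cumsum(variance) / variance.sum()
    n_bases = int(np.searchsorted(ratio, var_to_keep)) + 1
    shape_bases = vt[:n_bases]                         # each row is one basis s_i
    return s0, shape_bases

def synthesize_shape(s0, shape_bases, p):
    """Generate a shape from shape parameters p."""
    return s0 + p @ shape_bases
```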
Texture Model Building
The appearance is defined over the mean shape $s_0$ (the shape-free patch): each training image is warped by $W(x)$ onto the mean shape, and the grey-level values are modeled as

$A(x) = A_0(x) + \sum_{i=1}^{m} \lambda_i A_i(x)$

where $A_0(x)$ is the mean appearance, $\{A_i(x)\}$ are the appearance bases, and $\{\lambda_i\}$ are the appearance parameters, learned by PCA over the shape-free patches.
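A matching sketch for the appearance model, assuming a hypothetical helper warp_to_mean_shape(image, shape) (e.g., a piecewise-affine warp) that samples each training image into a 1-D shape-free patch of grey-level values:

```python
import numpy as np

def build_appearance_model(images, shapes, warp_to_mean_shape, n_bases=20):
    """Build A(x) = A0(x) + sum_i lambda_i * A_i(x) by PCA on shape-free patches."""
    patches = np.stack([warp_to_mean_shape(img, s) for img, s in zip(images, shapes)])
    a0 = patches.mean(axis=0)                          # mean appearance A0(x)
    _, _, vt = np.linalg.svd(patches - a0, full_matrices=False)
    appearance_bases = vt[:n_bases]                    # rows are the bases A_i(x)
    return a0, appearance_bases
```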
AAM Model Building
AAM Model Search
• Find the optimal shape parameters p and appearance parameters λ that minimize the difference between the warped-back appearance and the synthesized appearance:

$\min_{p,\lambda} \sum_{x \in s_0} \Big[ A_0(x) + \sum_{i=1}^{m} \lambda_i A_i(x) - I(W(x; p)) \Big]^2$

where $W(x; p)$ maps every pixel x in the model coordinate frame (the mean shape $s_0$) to its corresponding image point, and $I(W(x; p))$ is the image appearance warped back onto $s_0$.
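A minimal sketch of the quantity being minimized, assuming a hypothetical helper warp_back(image, p) that samples I(W(x; p)) over the mean-shape domain; a real fitter (e.g., Gauss-Newton or inverse compositional) would update p and λ iteratively from this residual:

```python
import numpy as np

def aam_residual(image, p, lam, a0, appearance_bases, warp_back):
    """Difference between the synthesized and the warped-back appearance."""
    warped = warp_back(image, p)                       # I(W(x; p)) on the mean shape
    synthesized = a0 + lam @ appearance_bases          # A0(x) + sum_i lambda_i A_i(x)
    return synthesized - warped

def fitting_cost(image, p, lam, a0, appearance_bases, warp_back):
    """Sum-of-squared-differences fitting cost."""
    r = aam_residual(image, p, lam, a0, appearance_bases, warp_back)
    return float(r @ r)
```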
Problems - AAM tracker
• Difficult to generalize to unseen images
• Cluttered backgrounds
How do we address these problems?
• Add a temporal matching constraint to the AAM fitting
  – Enforce a local appearance constraint between consecutive frames
• Introduce color-based face segmentation as a soft constraint
Related Work
• Temporal matching constraints
  – Feature-based (suffers from mismatched local features):
    W.-K. Liao, D. Fidaleo, and G. G. Medioni. Integrating multiple visual cues for robust real-time 3D face tracking. 2007.
  – Intensity-based (sensitive to fast illumination changes):
    X. Liu, F. Wheeler, and P. Tu. Improved face model fitting on video sequences. 2007.
Method and Theory
• Extend the basic AAM to a multi-band AAM
  – The texture (appearance) is a concatenation of three texture band values (see the sketch below):
    • The intensity (b)
    • The x-direction gradient strength (c)
    • The y-direction gradient strength (d)
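A sketch of how the three texture bands could be computed for one shape-free patch; the simple finite-difference gradient and the per-band normalization are assumptions, not the paper's prescribed operators:

```python
import numpy as np

def multiband_texture(patch):
    """Concatenate intensity, x-gradient strength, and y-gradient strength.

    patch: 2-D array of grey-level values in the shape-free frame.
    """
    patch = patch.astype(np.float64)
    gy, gx = np.gradient(patch)                        # finite-difference gradients
    bands = [patch, np.abs(gx), np.abs(gy)]
    # normalize each band so no single band dominates the fitting
    bands = [(b - b.mean()) / (b.std() + 1e-8) for b in bands]
    return np.concatenate([b.ravel() for b in bands])
```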
Temporal Matching Constraint
1. Select feature points with salient local appearance in the previous frame
2. Warp I(t−1) into the model coordinate frame to obtain the appearance A(t−1)
3. The warping function W(x; p_t) maps each patch R(t−1) to a patch R(t) at frame t; the fitting then constrains the appearance of R(t) to match that of R(t−1) (see the sketch below)
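A sketch of the inter-frame matching term under these assumptions: the salient points have already been selected and mapped into frame t by the current warp W(x; p_t), and sample_patch is a simplified helper without boundary handling; the SSD form is illustrative, not necessarily the paper's exact energy:

```python
import numpy as np

def sample_patch(image, x, y, size):
    """Extract a size x size patch centred at (x, y); simplified helper."""
    h = size // 2
    xi, yi = int(round(x)), int(round(y))
    return image[yi - h:yi + h + 1, xi - h:xi + h + 1].astype(np.float64)

def temporal_matching_cost(prev_frame, cur_frame, prev_points, cur_points, patch_size=11):
    """Sum of squared differences between corresponding local patches.

    prev_points: salient feature point locations in frame t-1.
    cur_points:  the same points mapped into frame t by W(x; p_t).
    """
    cost = 0.0
    for (x0, y0), (x1, y1) in zip(prev_points, cur_points):
        diff = sample_patch(prev_frame, x0, y0, patch_size) - \
               sample_patch(cur_frame, x1, y1, patch_size)
        cost += float(np.sum(diff * diff))
    return cost
```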
Shape parameter Initialization
[Figure: face motion direction]
Shape parameter Initialization
When r reaches the noise level expected in the correspondences, the algorithm stops.
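One way the described initialization could look in code: a robust, iteratively re-weighted estimate of the inter-frame face motion from feature correspondences that stops once the residual r reaches the expected noise level. The translation-only motion model and the threshold value are assumptions for illustration:

```python
import numpy as np

def estimate_face_motion(prev_pts, cur_pts, noise_level=1.0, max_iters=20):
    """Robustly estimate the inter-frame face motion (translation only here).

    prev_pts, cur_pts: (N, 2) arrays of matched feature points.
    """
    weights = np.ones(len(prev_pts))
    motion = np.zeros(2)
    for _ in range(max_iters):
        diffs = cur_pts - prev_pts                     # per-point displacement
        motion = np.average(diffs, axis=0, weights=weights)
        residuals = np.linalg.norm(diffs - motion, axis=1)
        r = np.average(residuals, weights=weights)
        if r <= noise_level:                           # expected noise level reached
            break
        # down-weight correspondences that disagree with the current estimate
        weights = 1.0 / (1.0 + residuals ** 2)
    return motion

# The estimated motion can then shift the previous frame's shape to
# initialize the shape parameters at frame t.
```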
Shape parameter Initialization - Comparison
[Figure: initialization compared using the motion direction, feature matching, and the previous frame's shape]
Face Segmentation Constraint
[Equation: the face segmentation constraint, defined over selected outline points]
where {x_k} are the locations of the selected outline points in the model coordinate frame
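A sketch of how a segmentation-based soft constraint could be evaluated, assuming a binary face mask from the color segmentation and a distance transform that measures how far each projected outline point W(x_k; p) falls from the segmented face boundary; this formulation is illustrative, not necessarily the paper's exact energy:

```python
import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def segmentation_constraint_cost(face_mask, outline_points_img):
    """Penalize outline points that fall far from the segmented face boundary.

    face_mask: binary (H, W) array from the color-based face segmentation.
    outline_points_img: (K, 2) array of the outline points x_k mapped into
    the image by W(x_k; p), given as (x, y).
    """
    mask = face_mask.astype(bool)
    boundary = mask & ~binary_erosion(mask)            # one-pixel-wide boundary
    dist_to_boundary = distance_transform_edt(~boundary)
    cost = 0.0
    for x, y in outline_points_img:
        xi = int(np.clip(round(x), 0, mask.shape[1] - 1))
        yi = int(np.clip(round(y), 0, mask.shape[0] - 1))
        cost += float(dist_to_boundary[yi, xi] ** 2)
    return cost
```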
Face Segmentation Constraint - Face Segmentation
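A minimal sketch of a color-based face segmentation using a per-pixel skin-color test in YCrCb space; the thresholds are common illustrative values, and the paper's actual segmentation method may differ:

```python
import numpy as np

def skin_color_mask(rgb_image):
    """Binary skin mask from simple YCrCb thresholds.

    rgb_image: (H, W, 3) uint8 RGB image. Thresholds are illustrative.
    """
    img = rgb_image.astype(np.float64)
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b              # luma
    cr = (r - y) * 0.713 + 128.0                       # red-difference chroma
    cb = (b - y) * 0.564 + 128.0                       # blue-difference chroma
    return (cr >= 133) & (cr <= 173) & (cb >= 77) & (cb <= 127)
```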
Face Segmentation Constraint
Experiments
[Table: number of lost frames]
Experiments
Conclusion
─ Our tracking algorithm accurately localizes the facial components, such as the eyes, brows, nose, and mouth, under illumination changes as well as large expression and pose variations.
─ Our tracking algorithm runs in real time: on a Pentium 4 3.0 GHz computer, it runs at about 50 fps on 320 × 240 video.
Future Work
─ Our tracker cannot robustly track profile views at large angles.
─ The tracker's ability to handle large occlusions also needs to be improved.