Computer Vision for Interactive Computer Graphics

Mrudang Rawal
Introduction
• Human-computer interaction
• Computers interpret user movements, gestures,
and glances via fundamental visual algorithms.
• Visual algorithms: tracking, shape recognition,
and motion analysis
• Interactive applications require fast response
times, algorithms that work across different
subjects and environments, and low cost.
Tracking Objects
• Interactive applications track objects both
large and small.
• Different techniques are used for each.
Large Object Tracking
• Large objects such as a hand
or the whole body are tracked.
• The object is held in front
of the camera.
• Image properties
(image moments) and an
artificial retina chip
do the trick.
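The image moments mentioned above can be sketched in a few lines. This is a minimal NumPy illustration of computing the zeroth and first moments of a binary silhouette to locate an object's centroid; the silhouette and values are made up for the example, and this is not the artificial-retina-chip implementation.

```python
import numpy as np

# Hypothetical 6x6 binary silhouette of a tracked object (1 = object pixel).
img = np.zeros((6, 6))
img[2:5, 1:4] = 1.0

ys, xs = np.indices(img.shape)
m00 = img.sum()            # zeroth moment: object area
m10 = (xs * img).sum()     # first moments
m01 = (ys * img).sum()

# Centroid = mean position of the object pixels; higher-order (central)
# moments would similarly give the object's orientation and extent.
cx, cy = m10 / m00, m01 / m00
print(m00, cx, cy)  # area 9.0, centroid at (2.0, 3.0)
```

An interactive application can track the large object frame-to-frame just by following this centroid, which is cheap enough to run at interactive rates.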
Step 1: Shape recognition
• Train on the object's shapes, then test.
• Technique: orientation histograms
• Store a set of histograms for each shape in
every possible orientation.
• Match the current shape's histogram against
the ones in the stored set.
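The training-and-matching loop above can be sketched as follows. This is an illustrative NumPy version, assuming a simple gradient-based orientation histogram and nearest-neighbor matching by Euclidean distance; the real system's binning and matching details may differ.

```python
import numpy as np

def orientation_histogram(img, n_bins=36):
    """Histogram of local edge orientations, weighted by gradient strength."""
    gy, gx = np.gradient(img.astype(float))
    angle = np.arctan2(gy, gx)              # orientation at each pixel
    mag = np.hypot(gx, gy)                  # gradient magnitude
    hist, _ = np.histogram(angle, bins=n_bins,
                           range=(-np.pi, np.pi), weights=mag)
    s = hist.sum()
    return hist / s if s > 0 else hist      # normalize so scale cancels

def match(hist, library):
    """Return the stored shape whose histogram is closest (Euclidean)."""
    return min(library, key=lambda k: np.linalg.norm(hist - library[k]))
```

Training fills `library` with one histogram per shape per orientation; at run time, `match(orientation_histogram(frame), library)` names the recognized shape.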
Step 2: Motion analysis
• Optical flow: senses movements and gestures.
• The frequency of alternation between horizontal and vertical
velocity (frame averages) is used to determine gestures.
• Fast optical flow algorithm:
– Compute the temporal difference: current frame – previous frame.
– If a pixel's temporal difference is nonzero:
if negative, motion is toward the adjacent pixel with greater luminance in the current frame;
if positive, toward the adjacent pixel with lower luminance in the current frame.
– Apply the 1-d direction-estimation rule to four orientations at each pixel.
– Average the motion estimates at each pixel, then average each pixel's
flow estimate with its eight neighboring pixels.
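The 1-d direction-estimation rule above can be sketched directly. This is an illustrative one-row version in NumPy, returning +1 for rightward and -1 for leftward motion at each pixel; the full algorithm applies the same rule along four orientations and averages, which is omitted here.

```python
import numpy as np

def flow_1d(prev_row, cur_row):
    """1-d direction estimate from the temporal-difference rule.

    A negative temporal difference means this pixel got darker, so the
    bright pattern moved toward the brighter neighbor; a positive
    difference means it moved toward the darker neighbor.
    """
    n = len(cur_row)
    motion = np.zeros(n)
    for i in range(1, n - 1):
        dt = cur_row[i] - prev_row[i]       # temporal difference
        if dt == 0:
            continue
        left, right = cur_row[i - 1], cur_row[i + 1]
        if dt < 0:   # motion toward the brighter adjacent pixel
            motion[i] = 1 if right > left else -1
        else:        # motion toward the darker adjacent pixel
            motion[i] = 1 if right < left else -1
    return motion
```

For a luminance ramp shifted one pixel to the right between frames, every interior pixel darkens and its brighter neighbor lies to the right, so the rule reports rightward motion throughout.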
Small Object Tracking
• Large-object tracking
techniques are not
adequate.
• Small objects are tracked
using a template-based
technique:
normalized correlation.
Normalized Correlation
• Examine the fit of an object template at every position in
the analyzed image.
• The location of maximum correlation gives the position
of the candidate hand.
• The value of that correlation indicates how likely the
image region is to be a hand.
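The sliding-template search above can be sketched as follows. This is a minimal brute-force NumPy illustration of normalized correlation, not an optimized implementation; real systems typically use FFT-based or hardware-accelerated variants.

```python
import numpy as np

def normalized_correlation(image, template):
    """Correlation map: mean-subtracted NCC at every valid position."""
    th, tw = template.shape
    t = template - template.mean()
    tnorm = np.sqrt((t * t).sum())
    H, W = image.shape
    out = np.zeros((H - th + 1, W - tw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p * p).sum()) * tnorm
            # Score in [-1, 1]; 1 means the patch matches the template
            # exactly up to brightness and contrast.
            out[y, x] = (p * t).sum() / denom if denom > 0 else 0.0
    return out
```

The argmax of the returned map gives the candidate hand position, and the value at that location serves as the confidence that the region really is a hand.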
Example : Television Remote
• To turn on the television, the user holds up a
hand.
• A graphical hand icon with sliders and buttons
appears on the graphics display.
• The user moves the hand to control the hand icon.
Conclusion
• Simple vision algorithms with restricted
interactivity already make human-computer
interaction possible.
• Advances in algorithms and the availability
of low-cost hardware will bring interactive
human-computer interaction into
everyday life.