
Master Project
Motion Gesture Recognition for Human-Machine Interaction
Smita Karikatti
1. Committee Members and Signatures:
Approved by
Date
__________________________________
Advisor: Dr. Edward Chow
_____________
__________________________________
Committee member: Dr. Tim Chamillard
_____________
__________________________________
Committee member: Dr. Chuan Yue
_____________
2. Introduction
In order to meet human requirements better, motion estimation at a distance has recently
gained more and more interests from computer vision researchers. It has a particularly
attractive modality from a surveillance perspective, such as human computer interaction
(HCI) or more generally human machine interaction (HMI) etc.
Hand gesture recognition is one of the most important gesture recognition problems, because the motion of human hands conveys abundant information about human intention and implicit meaning to machines in the real world. Many approaches to intelligent human-machine interaction using hand gesture recognition have already been reported [1]; these can be broadly divided into "Data Glove-based" and "Vision-based" approaches.
The "Data Glove-based" methods use a special input device, a hand data sensor glove, to digitize hand and finger motions into multi-parametric data. With this sensing data it is possible to analyse hand motion in 3D space. However, the device is expensive, and users may feel uncomfortable when communicating with a machine through it.
Without specialized tracking devices, one of the greatest challenges for such a system is to reliably detect and track the position of the hands using computer vision techniques. The "Vision-based" methods use only a vision sensor, i.e., a camera [2]. In general, a vision-based hand gesture recognition system is simpler than the Data Glove-based approach and allows human-friendly interaction with no extra device. At the same time, vision-based hand gesture recognition is a challenging problem in computer vision and pattern analysis, since it involves difficult algorithmic subproblems such as camera calibration, image segmentation, and feature extraction.
3. Project Plan
The main objective of this project is to develop a Human-Computer Interface (HCI) based on motion gestures. Fig. 1 below depicts the block diagram of the proposed algorithm.
Fig. 1. Block diagram of the proposed system: webcam capture; hand segmentation; detection of fingers or colour caps on the hand; tracking of the pointer; filtering of the binary map by morphological operations; marking the path of the pointer to form a binary pattern; feature extraction from the pattern; training/classification of the pattern using SVM; and performing the relevant operation.
The project starts with hand detection in a live video stream. For hand segmentation we will use either a skin-colour detection or a motion detection method.
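As an illustration of the skin-colour option, the following is a minimal MATLAB sketch that captures a frame and thresholds it in the YCbCr colour space; the 'winvideo' adapter name and the Cb/Cr threshold values are assumptions (commonly quoted starting points) and would need tuning for our camera and lighting.

    % Grab one frame from the webcam (Image Acquisition Toolbox;
    % the 'winvideo' adapter name is an assumption for a Windows setup).
    vid = videoinput('winvideo', 1);
    frame = getsnapshot(vid);

    % Skin-colour segmentation in YCbCr: threshold the Cb and Cr channels.
    ycbcr = rgb2ycbcr(frame);
    cb = ycbcr(:,:,2);
    cr = ycbcr(:,:,3);
    skinMask = (cb >= 77 & cb <= 127) & (cr >= 133 & cr <= 173);

    % Remove small noisy regions so only the hand-sized blob remains.
    skinMask = bwareaopen(skinMask, 200);
    imshow(skinMask);

The resulting binary mask would then be passed to the marker detection stage described below.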
The second and most important step of this project is to detect the key point, i.e., the marker on the hand. The project aims to develop a motion gesture recognition system in which the user draws a pattern in the air, so the effectiveness of the project depends on accurate marker detection. Two markers are required: one marker guides the pattern, while the other is used to start and stop the motion gesture. There are two candidate solutions for marker detection (a sketch of the colour-cap option follows this list):
1. Fingertip detection, using the fingertip as the marker along with the thumb.
2. Wearing caps of two different colours on two fingers.
Both methods will be tested and one will be selected based on accuracy.
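As an illustration of the colour-cap option, the sketch below thresholds a red cap in HSV and takes the centroid of the largest blob as the marker position; the hue/saturation limits and the minimum blob size are assumptions to be tuned experimentally.

    % Detect a red colour cap in the current frame (hue wraps around 0/1).
    hsv = rgb2hsv(frame);
    capMask = (hsv(:,:,1) > 0.95 | hsv(:,:,1) < 0.05) & hsv(:,:,2) > 0.5;
    capMask = bwareaopen(capMask, 100);      % discard small specks

    % Use the centroid of the largest remaining blob as the marker position.
    stats = regionprops(capMask, 'Centroid', 'Area');
    if ~isempty(stats)
        [~, idx] = max([stats.Area]);
        marker = stats(idx).Centroid;        % [x y] pixel coordinates
    end

A second cap of a different colour would be detected in the same way, with its own hue range, to start and stop the gesture.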
The coordinates of the marker in consecutive frames will be recorded and used to create a binary pattern in an image. After the binary pattern has been created, any noise will be filtered out using morphological operations.
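A minimal sketch of this step is given below, assuming the tracked positions have been accumulated in an N-by-2 matrix named trajectory (a name introduced here for illustration); the image size and the structuring-element radii are assumptions.

    % Draw the recorded marker positions into a blank binary image.
    pattern = false(240, 320);
    for k = 1:size(trajectory, 1)
        c = max(1, min(320, round(trajectory(k, 1))));   % column (x)
        r = max(1, min(240, round(trajectory(k, 2))));   % row (y)
        pattern(r, c) = true;
    end

    % Morphological filtering: join the sampled points into a continuous
    % stroke, bridge small gaps, and remove isolated noise pixels.
    pattern = imdilate(pattern, strel('disk', 3));
    pattern = imclose(pattern, strel('disk', 2));
    pattern = bwareaopen(pattern, 20);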
Another key step of the project is to extract meaningful features from the pattern. More relevant features yield higher accuracy, so different feature extraction methods will be tested and the best one will be chosen.
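As one simple baseline, the sketch below crops the pattern to its bounding box, resizes it to a fixed 16x16 grid, and vectorizes it into a 256-dimensional feature vector; the grid size is an assumption, and PCA (one of the methods under study) could then reduce this vector further.

    % Crop the binary pattern to the bounding box of its foreground pixels.
    [r, c] = find(pattern);
    cropped = pattern(min(r):max(r), min(c):max(c));

    % Resize to a fixed grid and vectorize, so every gesture yields a
    % feature vector of the same length regardless of its drawn size.
    featureVec = reshape(imresize(double(cropped), [16 16]), 1, []);

    % Optionally, PCA over a matrix of such vectors (one row per training
    % sample) can reduce the dimensionality before classification:
    % [coeff, score] = pca(trainFeatures);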
The extracted features will be used to train the classifier in an initial stage. Once the classifier has been trained with sufficient data, the system will be ready to interact with the user. A Support Vector Machine (SVM) will be used as the classifier.
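A minimal sketch of the training and classification step is shown below, assuming the labelled training data have been collected into trainFeatures (one feature vector per row) and trainLabels (illustrative names), and that featureVec is a newly extracted pattern as in the previous sketch; fitcecoc from the Statistics and Machine Learning Toolbox trains one-vs-one SVMs for the multiclass case.

    % Train a multiclass SVM on the extracted feature vectors.
    svmModel = fitcecoc(trainFeatures, trainLabels);

    % Rough accuracy estimate via 5-fold cross-validation.
    cvLoss = kfoldLoss(crossval(svmModel, 'KFold', 5));

    % Classify a newly drawn gesture and trigger the mapped PC operation.
    predictedGesture = predict(svmModel, featureVec);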
3.1 Tasks:
3.1.1 Already Complete
- Study of different feature extraction methods such as characteristic loci, PCA, etc.
- Study of various classifiers such as ANN, SVM, etc.
- Acquiring frames from the webcam and processing them.
- Familiarization with the MATLAB programming language.
3.1.2 In Progress
- Colour detection and segmentation algorithm development for the gesture pointer.
3.1.3 Future - (Listed from highest to lowest priority)
Must be done
- Development of the motion gesture algorithm on a live video stream.
- Perform various operations on the PC based on gestures.
- Write a user manual for the gesture recognition system.
- Test the code with different feature extraction methods.
- Test the code with different methods of detecting the pointer in the motion gesture.
- Compare the results of SVM and ANN for pattern recognition.
- Write the thesis.
3.2 Deliverables:
- A motion gesture recognition system developed in MATLAB.
- A thesis report documenting the design and implementation of the motion gesture recognition system and a comparative study of the results obtained with the different methods.
4. References
1. Y. Wu and T. S. Huang, "Vision-based gesture recognition: A review," in Gesture-Based Communication in Human-Computer Interaction: International Gesture Workshop, LNCS vol. 1739, p. 103, 2004.
2. F. Quek, "Toward a vision-based hand gesture interface," in Proceedings of Virtual Reality Software and Technology, Singapore, 1994, pp. 17-31.
3. T. Watanabe and M. Yachida, "Real-time gesture recognition using eigenspace from multi-input image sequences," Systems and Computers in Japan, vol. 30, no. 13, pp. 810-821, 1999.
4. R. Bloem, H. N. Gabow, and F. Somenzi, "An algorithm for strongly connected component analysis in n log n symbolic steps," LNCS vol. 1954, pp. 37-54, Nov. 2000.
5. Gesture recognition, Wikipedia: http://en.wikipedia.org/wiki/Gesture_recognition
6. J. Davis and M. Shah, "Recognizing hand gestures," in Proceedings of the European Conference on Computer Vision, Stockholm, Sweden, May 1994, pp. 331-340.
7. A. Just and S. Marcel, "A comparative study of two state-of-the-art sequence processing techniques for hand gesture recognition," Computer Vision and Image Understanding, vol. 113, no. 4, pp. 532-543, Apr. 2009.
8. Y. Nam and K. Wohn, "Recognition of space-time hand-gestures using hidden Markov model," in Proceedings of the ACM Symposium on Virtual Reality Software and Technology.