SYNOPSIS OF MINOR PROJECT
Topic Undertaken:
“Hand Gesture and Face Recognition, Detection and Tracking using MATLAB”
Submitted in partial fulfillment of the requirement for the award of degree of
Bachelor of Technology
In
Electronics and Communication Engineering
MAHARAJA SURAJMAL INSTITUTE OF TECHNOLOGY
C-4, Janakpuri, New Delhi-58
Affiliated to Guru Gobind Singh Indraprastha University, Delhi
December, 2012
S.No    Name of Students      Roll No./Enrollment No./Branch
(1)     Divyanshi Sharma      02696302819
(2)     Chahat Bhatia         00596307320
(3)     Aditya Baloda         02396302819

Topic of Minor Project: Hand Gesture and Face Recognition, Detection and Tracking using MATLAB
Name & Sign of Supervisor: Dr. Pradeep Sangwan (HOD, ECE Department)
Sign of Students in the Group:
1. AIM OF THE PROJECT
This project’s focus is to create a method to recognize hand gestures, and to detect, recognize and track faces using the Viola-Jones algorithm together with the Kanade-Lucas-Tomasi (KLT) algorithm.
2. INTRODUCTION
Literature review
Over the last several years, numerous algorithms have been proposed for face detection. While much progress has been made toward detecting faces under small variations in lighting, facial expression and pose, reliable techniques for recognition under more extreme variations have proven elusive. Face detection is essential to many face applications, including face recognition and facial expression analysis. However, the large visual variations of faces, such as occlusions, large pose changes and extreme lighting, pose significant challenges for these tasks in real-world applications.
Hand gesture recognition is still an open research area. Different techniques and tools have been applied to gesture recognition systems, ranging from mathematical models such as the Hidden Markov Model (HMM) and Finite State Machine (FSM) to approaches based on soft computing methods such as fuzzy clustering, Genetic Algorithms (GAs) and Artificial Neural Networks (ANNs), since the human hand is a complex articulated object with many connected joints and links. Typically, the implementation of a gesture recognition system requires different kinds of devices for capturing and tracking images or video, such as camera(s), instrumented (data) gloves, and colored markers. These devices are used to model the communication between humans and the environment, rather than traditional interface devices such as keyboards and mice, which are inconvenient and unnatural for an HCI (human-computer interaction) system.
Brief overview of the project
“Gestures are defined as a physical activity that conveys some message, whether through facial expressions, body language, hand movements, etc. A gesture can be defined as a motion of the body intended to communicate with others.”
Gestures are the very first form of communication. The goal is to make sure that no individual, whether physically impaired, suffering from a developmental disability, or anyone else for that matter, has to feel that they lack even the basics of communication.
This project is done in two parts, which are later combined to make a single system.
The first part is face detection, recognition and tracking.
The second part is hand gesture recognition.
Part 1: Face Detection, Recognition and Tracking
Object detection and tracking are important in many computer vision applications. Here we are developing a simple face tracking system by dividing the tracking problem into three parts:
1. Detect a face
2. Identify facial features to track
3. Track the face
Steps:
1. We detect the face by using the vision.CascadeObjectDetector object to detect the location of a face in a video frame.
2. To track the face over time, we use the Kanade-Lucas-Tomasi (KLT) algorithm. This project detects the face only once, and then the KLT algorithm tracks the face across the video frames.
3. The KLT algorithm tracks a set of feature points across the video frames. Once the detection locates the face, the system identifies feature points that can be reliably tracked. We use the “good features to track” proposed by Shi and Tomasi.
4. Initialize a tracker to track the points.
5. Initialize a video player to display the results.
6. Track the face.
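As a rough illustration, the MATLAB sketch below strings these steps together, along the lines of the MathWorks KLT face-tracking example. It assumes a recent MATLAB release with the Computer Vision Toolbox; the input file name 'input.avi', the tracker settings and the marker style are placeholders chosen for illustration, not part of this synopsis.

% Face detection (Viola-Jones) followed by KLT point tracking - illustrative sketch
faceDetector = vision.CascadeObjectDetector();     % Viola-Jones face detector
videoReader  = VideoReader('input.avi');           % placeholder input video
videoPlayer  = vision.VideoPlayer();               % display window

frame = readFrame(videoReader);
bbox  = step(faceDetector, frame);                 % detect the face once
                                                   % (assumes a face is present
                                                   % in the first frame)

% Shi-Tomasi "good features to track" inside the detected face box
points  = detectMinEigenFeatures(rgb2gray(frame), 'ROI', bbox(1, :));
tracker = vision.PointTracker('MaxBidirectionalError', 2);
initialize(tracker, points.Location, frame);       % initialize the KLT tracker

while hasFrame(videoReader)
    frame = readFrame(videoReader);
    [points, isFound] = step(tracker, frame);      % track points frame to frame
    frame = insertMarker(frame, points(isFound, :), '+', 'Color', 'white');
    step(videoPlayer, frame);                      % show the tracked points
end
release(videoPlayer);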
Part 2: Hand Gesture Recognition
A gesture in a sign language is a particular movement of the hands with a specific shape made out of them. A sign language usually provides signs for whole words. Some people do not have the ability to speak, or they lose it in an accident, and they find it difficult to express their thoughts or convey their message to other people. This project can be a medium between deaf-mute people and society. Deaf-mute people throughout the world use sign language to communicate with others, but this is possible only for those who have undergone special training. Ordinary people also find it difficult to understand sign language. To overcome these real-time issues, we are developing this system, which reduces the communication gap between speech-impaired and ordinary people.
• “Speech” and “gestures” are the expressions most commonly used in communication between human beings. In human communication, the use of speech and gestures is completely coordinated. Therefore we have used gestures as the key element of our project.
• A number of hardware techniques are used for gathering information about body positioning; they are typically either image-based (using cameras or moving lights) or device-based (using instrumented gloves or position trackers). A rough sketch of the image-based approach is given below.
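As an illustrative sketch of the image-based route (not necessarily the final method of this project), the following MATLAB fragment segments a hand from a single image using a skin-colour threshold in YCbCr and extracts a few simple shape features that a classifier such as an ANN or HMM (see the literature review) could consume. The file name 'hand.jpg' and the threshold values are assumptions made for illustration.

% Static hand segmentation and simple shape features - illustrative sketch
img   = imread('hand.jpg');                        % placeholder input image
ycbcr = rgb2ycbcr(img);                            % skin tones cluster well in YCbCr
cb = ycbcr(:, :, 2);  cr = ycbcr(:, :, 3);
mask = (cb >= 77 & cb <= 127) & (cr >= 133 & cr <= 173);   % assumed skin-colour range
mask = bwareaopen(mask, 500);                      % remove small noisy blobs
mask = imfill(mask, 'holes');                      % fill holes inside the hand region

stats = regionprops(mask, 'Area', 'Perimeter', 'Solidity', 'Centroid');
[~, idx] = max([stats.Area]);                      % keep the largest blob (the hand)
hand = stats(idx);

% Simple shape descriptors that a gesture classifier could use
circularity = 4 * pi * hand.Area / hand.Perimeter^2;
fprintf('Area: %.0f, Solidity: %.2f, Circularity: %.2f\n', ...
        hand.Area, hand.Solidity, circularity);
imshow(mask); title('Segmented hand region');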
3. Material and Methodology to be used
1. MATLAB 7.01 (or any version after R2012a)
2. Viola-Jones algorithm
3. KLT algorithm
4. Markov model
4. Expected Outcome
Most deaf and speech-disabled people use sign language to communicate. The aim of this application is to provide a platform that interprets signs with high accuracy, enabling an ordinary person to understand the gesture.
We present the implementation of a system that recognizes hand gestures and helps in communication with differently abled people. In addition to this, it is also used for face detection and tracking.
In the future we aspire to complete a software application with emotion detection and speech processing for autistic and differently abled people, and to apply it in fields such as law enforcement to help with video surveillance, using a system which is both easy to use and affordable.
5. Applications
Visually impaired people can make use of hand gestures for human-computer interaction, for example controlling a television, playing games, and gesture-to-speech conversion.
Georgia Institute of Technology researchers have created the Gesture Panel System to replace traditional vehicle dashboard controls. Drivers would change, for example, the temperature or sound-system volume by maneuvering their hand in various ways over a designated area. This could increase safety by eliminating drivers’ current need to take their eyes off the road to search for controls.
Face detection can be used in the following fields:
• Biometrics – Using the face has proved to be a successful approach, since it is the way humans recognize each other.
• Identification access – The face can be used to check whether or not a person is on the list of approved individuals.
• Law enforcement – Improved performance of surveillance.
• A modification of this project to add emotion detection could help autistic people with their recovery.
6. Challenges
The main challenge in the project could be detecting the face in the early stages using the vision object, since for it to work the video must be recorded either with the camera fixed in one particular location, or with the person’s location fixed and the camera varying.
The input should be in standard formats.
Another challenge could be using neural networks in hand gesture recognition so that the chosen image is pre-processed and clear for further use.
Also, making the system compatible to run on different platforms will be a big task.
The system currently works with one image and one face at a time, but we are working on modifying it.
7. References
• “Static Hand Gesture Recognition for ASL” – Ramesh Dadi.
• “Gesture Recognition Using MATLAB” – Amee Vishwakarma, Apoorva Srivastava, Debolina Sur, Momita Saha & Monalisa Hazra.
• “MATLAB Application in Face Recognition” – Pavan Vadapalli.
• “Face Detection and Tracking in a Video Sequence” – Karthik G N.
• Resources on Mathworks.com.
• M. M. Abdelwahab, S. A. Aly, I. Yousry, “Efficient Web-Based Facial Recognition System Employing 2DHOG”, arXiv:1202.2449v1 [cs.CV].
• W. Zhao, R. Chellappa, P. J. Phillips, “Face Recognition: A Literature Survey”, ACM Computing Surveys (CSUR), December 2003.
• Jianbo Shi and Carlo Tomasi, “Good Features to Track”, IEEE Conference on Computer Vision and Pattern Recognition, 1994.
• Face Recognition Data, University of Essex, UK, Face 94, http://cswww.essex.ac.uk/mv/all faces/faces94.html