International Journal of Engineering Trends and Technology (IJETT) – Volume 9 Number 14 - Mar 2014
Implementation of Fingerspelling Glove
Jina Jain1, Leena Deodhar 2, Pooja Parmar 3, Vaishali Maheshkar 4
B.Tech Students, Department of Electrical and Electronics Engineering,
Veermata Jijabai Technological Institute, Matunga- 400 019,
ABSTRACT — In this paper, we describe an embedded system which aims at translating sign language to synthesized text and voice. The system consists of a glove that can be worn by a deaf or mute person to facilitate real-time communication with other people. It translates hand gestures to the corresponding letters using seven flex sensors, five contact sensors and a 3-axis accelerometer. The data gathered from each finger's position and the hand's motion is used to differentiate between the letters. The signals are converted to digital data using comparator circuits and the microcontroller's ADC, and are then compared with a lookup table to obtain the signed letter. Using a Bluetooth module, the translation is transmitted to a PC, which displays as well as pronounces the letter. On the computer there is a Java-based game which tests the user's ability to sign and can be used for sign language education. This system will simplify the communication of mute people with people capable of normal speech, without the need for a human translator.

Keywords — Embedded system, sign language, flex sensors, 3-axis accelerometer, translator.
I. INTRODUCTION
Sign language is now seen as the native communication and education method for deaf and mute people [1]. It makes use of a three-dimensional signing space and is visual-gestural, in contrast to spoken languages, which are oral-aural. American Sign Language (ASL) is the predominant sign language of the deaf communities in the United States and English-speaking parts of Canada, and its dialects are used in many countries around the world [2]. ASL is ranked as the 6th most widely used language in the USA [3]. ASL possesses a set of 26 signs known as the American manual alphabet, which can be used to spell out words from the English language.

Common current options for alternative communication include writing and interpreters. The ambiguity and slowness of handwriting make it a very frustrating mode of communication: conversational rates (both spoken and signed) range from 175 to 225 words per minute (WPM), while handwriting rates range from 15 to 25 WPM [4]. In addition, English is often the deaf person's second language, American Sign Language being their first. Although many deaf people achieve a high level of proficiency in English, not all deaf people can communicate well through written language. Since the average deaf adult reads at approximately a fourth-grade level, communication through written English can be too slow and often is not preferred. Interpreters are commonly used within the deaf community, but they can charge high hourly rates and be awkward in situations where privacy is of high concern, such as at a doctor's or lawyer's office. It can also be difficult to find an interpreter in unforeseen emergencies where timely communication is extremely important, such as car accidents.

To help overcome these issues, we built a system for sign language translation using a glove with seven flex sensors to detect the bending of the fingers, a 3-axis accelerometer on the back of the palm to measure different orientations, contact sensors for sensing the spacing between different fingers, and a small circuit board containing a programmable microcontroller, to detect hand gestures of American Sign Language. The device currently translates only the alphabet, but many more gestures can be added to the library, and a hand movement can be customized to mean a particular word or perform a particular function. The remainder of the paper is organized as follows: Section 2 introduces the finger-spelling used in sign language translation, Section 3 gives the system description, Section 4 lists the specifications of the components used, Section 5 describes the system functionality, and Section 6 explains the software part of the project. We then mention the limitations. Finally, we conclude with the future scope of our project.

II. FINGER SPELLING
Finger-spelling is the process of spelling out words by using signs that correspond to the letters of the word. An ASL user would use the American Finger-spelled Alphabet (also called the American Manual Alphabet), which consists of 22 hand shapes that, when held in certain positions and/or produced with certain movements, represent the 26 letters of the American alphabet [5].
III. SYSTEM DESCRIPTION

Fig. Block diagram. Transmitter: glove unit (flex sensors + contact sensors + accelerometer) → detection unit (comparator circuits + microcontroller) → Bluetooth module. Receiver: computer Bluetooth → PC → display unit and speaker (synthesized voice output).

ISSN: 2231-5381  http://www.ijettjournal.org  Page 732
Block Diagram
The system consists of flex sensors, contact sensors and a 3-axis accelerometer mounted on a glove unit which is worn by the user. When the user signs a particular letter, the analog voltage level captured from each flex sensor is given to its dedicated comparator circuit for conversion from analog to digital. The output from the accelerometer is fed to the microcontroller, which has an in-built ADC for conversion to digital. Further, the outputs of the contact sensors are directly given to the microcontroller ports for recognition of the signed letter. The microcontroller analyses the data from the sensors, searches a library of gestures and generates output signals which are transmitted via a Bluetooth module to a personal computer to produce synthesized voice or written text. The system consists of three units: the glove unit, the detection unit and the PC.
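To make the acquisition step concrete, the digitization of the three sensor types can be sketched as follows (Python, purely illustrative; the comparator reference, ADC reference and sample voltages are assumptions, not the authors' calibration):

```python
# Illustrative digitization of the three sensor types (hypothetical values).

FLEX_THRESHOLD_V = 2.5   # assumed comparator reference voltage
ADC_BITS = 10            # the ATmega16's ADC is 10-bit
V_REF = 5.0              # assumed ADC reference voltage

def comparator(flex_voltage):
    """Comparator circuit output: 1 = joint flexed, 0 = not flexed."""
    return 1 if flex_voltage > FLEX_THRESHOLD_V else 0

def adc(voltage):
    """10-bit ADC conversion of an accelerometer axis voltage."""
    code = int(voltage / V_REF * (2 ** ADC_BITS - 1))
    return max(0, min(code, 2 ** ADC_BITS - 1))

# A signed letter is then represented by a tuple of digital readings:
flex_bits = tuple(comparator(v) for v in [3.1, 0.8, 0.9, 3.3, 0.7, 0.6, 3.0])
contact_bits = (0, 1, 0, 0)                               # read from port pins
accel_codes = tuple(adc(v) for v in [1.65, 1.65, 2.45])   # x, y, z axes
```

These digital tuples are what the microcontroller compares against the gesture library.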
IV. HARDWARE IMPLEMENTATION
We have designed and implemented a glove unit with 7 flex sensors, 4 contact sensors and an accelerometer. Initially, the flex sensors were connected to the ADC of the microcontroller to obtain digital values corresponding to different bent angles of the flex sensors. After analysing the plot of bent angle vs. voltage across the flex sensors, we soon realised that fitting a linear curve was inefficient due to the non-linearity between the two. This gave rise to a further error-computation problem; moreover, the time response increased. To overcome this problem, the flex sensor outputs were sent to comparator circuits, so that only two states were possible: the joint is flexed or the joint is not flexed. To calculate the threshold value, we considered the worst-case scenario. The glove was worn by five volunteers and we tested each sensor to obtain the voltage values for the flexed and non-flexed positions. The minimum bent angle for flexing is taken to be around 30 degrees, and the root mean square of all the voltage values gave us the threshold value. To address the issue of noise, we conducted a preliminary evaluation, looking at the response of the glove's sensors when the hand was held still in each of two positions: flat on the table (non-flexed) and with the fist closed (flexed). The response showed very little noise over a period of a few seconds.

Fig. Glove prototype

V. SYSTEM FUNCTIONALITY
Once the power is switched ON, the microcontroller periodically reads the sensors to detect a hand gesture made using the glove unit. These values are then used to determine the signed letter through the microcontroller code. Once a letter is recognized, the microcontroller sends a packet to the Bluetooth module. If there is a match, the signed letter is displayed on the computer screen and the corresponding audio file is played through the MATLAB interface. Finally, a Java-based sign language game, which helps the user become comfortable with the glove and also helps increase their signing speed, is implemented.

Fig. Flowchart of the capturing and recognition process: Start (power ON) → sign a gesture with the glove (signals generated by the flex sensors, contact sensors and accelerometer) → values captured by the ADC of the ATmega16 and the comparator circuits → do the digital values match a letter? If no, return to sensing; if yes, send the letter via Bluetooth to the PC and display it using software, then play the respective audio file using MATLAB.
Fig. Lower joint of little finger non-flexed
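The worst-case threshold calibration described in the hardware section can be sketched as follows (Python, illustrative only; the per-volunteer voltage readings are made-up values, not the authors' measurements):

```python
import math

# Hypothetical readings for one flex sensor (volts) from five volunteers:
# voltage in the non-flexed position and at the ~30-degree minimum flex.
non_flexed = [1.10, 1.05, 1.20, 0.95, 1.15]
flexed_30deg = [3.60, 3.40, 3.75, 3.50, 3.55]

def rms_threshold(samples):
    """Root mean square of all recorded voltage values, used as the
    comparator threshold separating 'flexed' from 'non-flexed'."""
    return math.sqrt(sum(v * v for v in samples) / len(samples))

threshold = rms_threshold(non_flexed + flexed_30deg)
# The comparator then reports 'flexed' whenever the sensor voltage
# exceeds this threshold.
```

With these sample values the RMS lands between the highest non-flexed and lowest flexed reading, which is what makes it usable as a comparator reference.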
Fig. Lower joint of little finger flexed

Fig. Lower joint of middle finger non-flexed

Fig. Lower joint of middle finger flexed

The microcontroller used was the ATmega16, an 8-bit high-performance microcontroller of Atmel's megaAVR family with low power consumption. The output from the accelerometer is connected to the ADC of the microcontroller to produce a digital signal; the multiplexer of the microcontroller's ADC is utilized to change channels and process the analog output of the 3-axis accelerometer MMA7361. The UART of the microcontroller is utilized for serial communication with the Bluetooth module, and the data transmitted by the Bluetooth Bee module is received by the computer's Bluetooth.

VI. SOFTWARE
While testing, the contact sensors were found to be the most reliable of all, followed by the flex sensors and, lastly, the accelerometer readings. For this reason, our detection code is based on hierarchical detection [7], in which all the sensor readings are first captured by the microcontroller and a tree-search algorithm with three levels is used to find the signed letter. The first level of the algorithm gives a list of candidates that have the same contact values as the signed letter. This list of candidates then goes through the second level of the algorithm to give another list of candidates having the same flexes as the signed letter. If the second list contains more than one candidate, it is sent through the third level, which matches the readings of the accelerometer against those of the remaining candidates. If the decision tree fails to yield a unique letter, a backup routine which tries to find the most likely character is called. This backup code uses a method of data analysis in which the input values are matched against all the entries and the one resulting in the least error is picked; based on our analysis and testing, each letter entry also has a tolerance value, so the backup does not guarantee that a letter is always selected. While taking the sensor readings, de-bouncing of the input is done to avoid taking readings during a transition, using the following state transition mechanism [8]. For the letters 'J' and 'Z', de-bouncing is not done, since they are recognized by their movement.

Fig. State transition diagram for input de-bouncing: successive '1' readings advance WAIT → DETECTED_1 → DETECTED_2 → CONFIRM; a '0' reading returns the machine to WAIT.

Once in the CONFIRM state, the recognized letter is sent via the Bluetooth module to the PC, where it is displayed. Packets containing the signed letter, an address field and a sync field are sent to the PC; the packet also contains compressed data about the signed letter. This data is then forwarded to the Java application and the MATLAB audio synthesizer program.

VII. LIMITATIONS
The system is trained on a limited database, so the possibility of misinterpretation exists between closely related gestures. Also, the facial expressions of the user are not considered, though it is known that they form an important part of sign language.
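A minimal sketch of the three-level hierarchical detection and the least-error backup routine described in Section VI (Python, purely illustrative; the gesture table, sensor layout, tolerance and acceptance bound are hypothetical, not the authors' actual data):

```python
# Hypothetical gesture library: letter -> (contact bits, flex bits, accel codes).
LIBRARY = {
    "A": ((0, 0, 0, 0), (0, 1, 1, 1, 1, 1, 1), (512, 512, 700)),
    "B": ((0, 0, 0, 0), (1, 0, 0, 0, 0, 0, 0), (512, 512, 700)),
    "C": ((0, 0, 0, 0), (1, 0, 0, 0, 0, 0, 0), (600, 512, 650)),
}
ACCEL_TOLERANCE = 40  # assumed per-axis tolerance on accelerometer codes

def detect(contacts, flexes, accel):
    # Level 1: keep candidates with identical contact-sensor values.
    cands = [l for l, (c, f, a) in LIBRARY.items() if c == contacts]
    # Level 2: keep candidates with identical flex states.
    cands = [l for l in cands if LIBRARY[l][1] == flexes]
    if len(cands) > 1:
        # Level 3: match the accelerometer readings within tolerance.
        cands = [l for l in cands
                 if all(abs(r - e) <= ACCEL_TOLERANCE
                        for r, e in zip(accel, LIBRARY[l][2]))]
    if len(cands) == 1:
        return cands[0]
    # Backup: pick the library entry with the least total error, but only
    # accept it if the error is within an assumed tolerance bound.
    def error(letter):
        c, f, a = LIBRARY[letter]
        return (sum(x != y for x, y in zip(c, contacts))
                + sum(x != y for x, y in zip(f, flexes))
                + sum(abs(x - y) for x, y in zip(a, accel)) / ACCEL_TOLERANCE)
    best = min(LIBRARY, key=error)
    return best if error(best) <= 3 else None
```

Here 'B' and 'C' deliberately share flex states so that the accelerometer level is exercised, mirroring why the third level exists in the paper's design.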
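The de-bouncing state machine cited from [8] can be sketched as follows (Python, illustrative; this assumes the four states advance on consecutive identical '1' samples, as the diagram suggests):

```python
# Debounce FSM: a reading is accepted only after three consecutive
# '1' samples; any '0' sends the machine back to WAIT.
WAIT, DETECTED_1, DETECTED_2, CONFIRM = range(4)

def step(state, bit):
    """Advance the de-bouncing state machine by one sensor sample."""
    if bit == 0:
        return WAIT                    # any bounce restarts the count
    if state == WAIT:
        return DETECTED_1
    if state == DETECTED_1:
        return DETECTED_2
    return CONFIRM                     # DETECTED_2 advances; CONFIRM holds

def debounce(samples):
    """Return True once a stable '1' input has been confirmed."""
    state = WAIT
    for bit in samples:
        state = step(state, bit)
        if state == CONFIRM:
            return True
    return False
```

This is why a momentary glitch during a hand transition never reaches the CONFIRM state, and why the movement-based letters 'J' and 'Z' bypass it.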
VIII. CONCLUSION AND FUTURE SCOPE
We have developed a system to eliminate the barrier in communication between the mute community and people capable of normal speech by discarding the need for an interpreter. It facilitates effective real-time communication.

The performance and accuracy of the translation can be improved by enhancing the quality of the database used. The system can be applied in many areas, including accessing government websites where no video clip for the deaf and mute is available, or filling out forms online where no interpreter may be present to help.
ACKNOWLEDGEMENT
We would like to express our profound gratitude to our guide, Ms. Amutha Jeyakumar, for her continual support. We would also like to thank the volunteers who agreed to help us test the system.
REFERENCES
[1] Michelle Jay, "History of Sign Language," Start ASL, 2008-2014. [Online]. Available: http://www.start-american-signlanguage.com/history-of-sign-language.html
[2] "American Sign Language," Wikipedia, 2014. [Online]. Available: http://en.wikipedia.org/wiki/American_Sign_Language
[3] SignGenius.com, "Sign Language Software." [Online]. Available: http://www.signgenius.com/sign-language/ranking-of-asl-as-spokenlanguage.shtml
[4] R. Martin McGuire, Jose Hernandez-Rebollar, Thad Starner, Valerie Henderson, Helen Brashear and Danielle S. Ross, "Towards a One-Way American Sign Language Translator," Sixth IEEE International Conference on Automatic Face and Gesture Recognition (FGR'04), 0-7695-2122-3/04.
[5] "American Sign Language, Fingerspelling & Numbers: Introduction." [Online]. Available: http://www.lifeprint.com/asl101/fingerspelling/fingerspelling.html
[6] Anuja Golliwar, Harshada Patil, Rohita Watpade, Sneha Moon, Sonal Patil and V. D. Bondre, "Sign Language Translator Using Hand Gloves," International Journal of Electrical, Electronics and Computer Systems, Volume 2, Issue 1, pp. 90, January 2014.
[7] Jose L. Hernandez-Rebollar, "Method and apparatus for translating hand gestures," U.S. Patent 20100063794A1, March 11, 2010.
[8] "Sign Language Translation, The Sound of Signing." [Online]. Available: http://people.ece.cornell.edu/land/courses/ece4760/FinalProjects/s2012/sl787_rak248_sw525_fl229/sl787_rak248_sw525_fl229