International Journal of Engineering Trends and Technology (IJETT) – Volume 21 Number 6 – March 2015
In-situ stand-off control of Robotic arm using
human gestures
Anitha G.K.#1, Poonguzhali R.#2, Mythili S.#3, JebaSingh Kirubakaran S.J.#4
1,2,3 UG Scholar, 4 Assistant Professor, Vel Tech Multi Tech Dr. Rangarajan Dr. Sakunthala Engineering College, Avadi, Chennai-62, India.
Abstract— This paper presents a hand gesture recognition system based on a web camera. Since hand tracking is quite difficult, better resolution is obtained by using a white background. For background extraction, a direct inverse algorithm is used. A self-built database of histogram values of hand gestures is maintained for the experimental evaluation. Based on the pixel values, the corresponding gesture is identified and a specific command is given to the robotic system. The main aim of this work is to develop Human-Robot Interaction.
Keywords— Hand gesture recognition, direct inverse algorithm,
Human-Robot Interaction (HRI).
I. INTRODUCTION
In the current trend, robots are mainly used in various fields such as search operations and protection against dangerous conditions. Therefore, natural communication between human and robot is needed. This paper suggests the control of a robotic arm using hand gestures. Instead of using a remote control, the robotic arm is controlled by various gestures performed by the human body. Human-robot interaction is an active area of robotic research. Research in robotics results in the development of two fields: robotic manipulation and the input feeding system. In in-situ stand-off control, human gestures are given as the input.
Gesture recognition is achieved using a camera and methodologies such as grey scale conversion, the inverse algorithm and the thresholding model. It involves image processing, machine learning and application development. Apart from this, it also needs some hardware for interfacing the system with gesture control. The overall aim is to make the robotic arm understand human body language, thereby building a bridge between human and robot. Because of the difficulty of hand gesture spotting, hand gesture recognition is a challenging task.
The robotic system consists of a robotic arm on a mechanical base. Using hand gestures, the mechanical base can be made to move forward, backward, left and right, and the robotic arm can change its position (forward, backward, rotate, etc.). The interaction used in this project is remote interaction, i.e., the human and the robot are separated spatially. The distance can be around 100 meters. In the entire field of robotics, learning plays an important role: robots should adapt their behaviour to their environment and the needs of their users. This requirement is met by approaches that use learning techniques for learning new gestures. Since no additional device is needed, this method provides a simple and natural kind of Human-Robot Interaction.
II. LITERATURE SURVEY
Many approaches have been developed for recognizing hand movement and controlling robotic systems. Previously, the keyboard and mouse were the main interface between man and machine. Then machines were controlled by the human voice [1]; this has the disadvantage that, in noisy environments, sound causes disturbance. Afterwards, robots were controlled by face tracking [2], which works well in noisy environments, but the tracking depends on the speed of the CPU. Later, real-time hand tracking using a set of classifiers [3] was used for controlling robots, but this needs a more efficient algorithm. Then robots were controlled by motion trajectory [4], but the motion trajectory is determined using the centroid of the hand image, which is inaccurate.
In this paper, we propose hand gesture recognition with a web camera. The user can control the wireless robot using different gestures. The operator controls the robotic arm from a station such as a laptop or computer with the web camera. The gestures are captured by the web camera to give commands to the robotic arm.
III. PROPOSED HUMAN ROBOT INTERACTIVE
SYSTEM
A. BASE UNIT
The base unit consists of a web camera, a computer, an Arduino board and an XBee transmitter.
B. REMOTE UNIT
The remote unit consists of an XBee module, an Arduino board and the robotic arm module.
Fig.1 Block diagram of the base unit
First, the real-time image (the hand gesture) is captured via the web camera and processed in the computer using LabVIEW. In LabVIEW, the real-time image is converted into a grey level image. Using the inverse algorithm, the grey level image is transformed into an inverse image. Then, by means of the thresholding model, the pixel values are calculated and compared with the database, thereby generating a code. The corresponding Arduino pin is enabled by this code. After processing, the code for the specific gesture is identified and sent to the XBee transmitter.
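The same processing chain can be sketched outside LabVIEW. The fragment below is a minimal C++/OpenCV illustration of the base-unit loop (an assumption for exposition only: the paper's actual implementation is a LabVIEW block diagram, and the serial write to the Arduino is reduced here to a console print). The mean-value ranges follow the pixel-value table of Section VI, with the ON/OFF sub-ranges collapsed for brevity; the code bytes themselves are hypothetical.

    #include <opencv2/opencv.hpp>
    #include <cstdio>

    // Map a mean pixel value to a one-byte motor code. Ranges follow the
    // pixel-value table in Section VI (ON/OFF sub-ranges merged); the byte
    // values are hypothetical, chosen only for this sketch.
    static char codeForMean(double m) {
        if (m >= 7.0  && m <= 30.0)  return '1';   // gripper motion
        if (m >= 31.0 && m <= 50.0)  return '2';   // wrist motion
        if (m >  50.0 && m <= 77.0)  return '3';   // elbow motion
        if (m >= 78.0 && m <= 150.0) return '4';   // shoulder motion
        return '0';                                // no gesture recognized
    }

    int main() {
        cv::VideoCapture cam(0);                   // the web camera of the base unit
        cv::Mat frame, grey, inv, bin;
        while (cam.read(frame)) {
            cv::cvtColor(frame, grey, cv::COLOR_BGR2GRAY);        // grey level image
            cv::bitwise_not(grey, inv);                           // inverse image
            cv::threshold(inv, bin, 128, 255, cv::THRESH_BINARY); // thresholding model
            double m = cv::mean(bin)[0];                          // mean pixel value
            std::printf("%c\n", codeForMean(m)); // in the real system this byte goes
                                                 // to the Arduino over the serial port
        }
        return 0;
    }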
Fig.2 Block diagram of the remote unit
The transmitted signal is received by the XBee module in the remote unit, which operates at the same frequency as the transmitter. The next step is to send the signal to the Arduino board, where the command for the robotic arm is generated. When the command is given to the robotic arm, the motor corresponding to the specific gesture is operated.
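On the remote side, the corresponding behaviour can be pictured with a short Arduino-style sketch (an illustrative reconstruction, not the authors' code; the pin numbers and code bytes are hypothetical). A byte arriving from the XBee on the hardware serial port enables the matching motor channel of the driver:

    // Remote unit: read a one-byte gesture code and enable the matching motor.
    const int MOTOR_PINS[4] = {4, 5, 6, 7};   // hypothetical enable pins, one per arm motor

    void setup() {
        Serial.begin(9600);                    // the XBee link is configured for 9600 bps
        for (int i = 0; i < 4; i++) pinMode(MOTOR_PINS[i], OUTPUT);
    }

    void loop() {
        if (Serial.available() > 0) {
            char code = Serial.read();
            for (int i = 0; i < 4; i++)        // stop everything first
                digitalWrite(MOTOR_PINS[i], LOW);
            if (code >= '1' && code <= '4')    // run the motor named by the code
                digitalWrite(MOTOR_PINS[code - '1'], HIGH);
        }
    }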
IV. TECHNOLOGIES USED
A. LABVIEW
Fig.3 Base unit module
LabVIEW is short for Laboratory Virtual Instrument Engineering Workbench. It is a platform-independent development environment for a visual programming language from National Instruments. A virtual instrument has three components, namely the block diagram, the front panel and the connector panel. The block diagram contains the graphical source code, while the front panel is constructed using controls and indicators. LabVIEW is very easy to learn, as the different function nodes are connected simply by drawing wires. The benefits of using LabVIEW are easy interfacing with devices, code compilation, large libraries, code reuse, parallel programming and low cost.
B. ARDUINO UNO
Arduino is a single-board microcontroller. It provides a set of analog and digital input/output pins and can be interfaced with different extension boards and circuits. The Arduino is a microcontroller board with an on-board power supply, a USB port to communicate with the PC or laptop, and an Atmel microcontroller chip. It is open source hardware, so anyone can modify it according to their needs. The Arduino can take inputs from switches or sensors for controlling motors and lights.
C. XBEE TECHNOLOGY
XBee is a wireless communication module based on the IEEE 802.15.4 network protocol. It has low power consumption and covers a range of around 100 meters; using intermediate devices, longer distances can be achieved with XBee technology. It is simpler and less expensive compared to other technologies such as WiFi, and provides fast point-to-multipoint and peer-to-peer networking.
D. L293D DRIVER CIRCUIT
Making a robot move is not easy: it cannot move merely by being connected to the main circuit, because the motors of the robot need extra power. This additional power is supplied by a motor driver. The L293D can be used in circuits where space is limited and the power requirement is high, and it is very flexible for robotic applications.
V. DESIGN
A. LABVIEW
The human gestures captured by the web camera are processed in the LabVIEW software. There are two major sections in LabVIEW, namely the front panel and the block diagram. Since it is a simple graphical representation, it is easy for beginners to learn. Several methodologies are involved in the generation of the codes.
GREY SCALE CONVERSION
The real-time colour images are converted into grey scale images. A grey scale image contains only shades ranging between black and white, with intensity values from 0 to 255: after the conversion, the darkest colour is represented as 0 and the brightest as 255.
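For a single pixel, the conversion can be written out as follows. The luminosity weights used here are the common ITU-R 601 ones; this is an assumption, as the paper does not state which weighting its LabVIEW VI applies.

    #include <cstdint>
    #include <cstdio>

    // Grey value of one RGB pixel under the usual luminosity weighting
    // (an assumption; the paper's LabVIEW VI may use a different formula).
    std::uint8_t toGrey(std::uint8_t r, std::uint8_t g, std::uint8_t b) {
        return static_cast<std::uint8_t>(0.299 * r + 0.587 * g + 0.114 * b);
    }

    int main() {
        // A bright red pixel maps to a fairly dark grey value (about 76).
        std::printf("%u\n", toGrey(255, 0, 0));
        return 0;
    }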
INVERSE ALGORITHM
The inverse algorithm is applied to the previously converted grey scale images. As a result, colour inversion occurs: black pixels become white, and white pixels become black.
Fig.4 Real and corresponding inverse images of a sample gesture
THRESHOLDING MODEL
The pixel values are automatically estimated for each gesture in LabVIEW in the form of a two-dimensional matrix. The range of mean values required for a specific motor operation is decided and pre-fed into the database.
HISTOGRAM COMPARISON
Whenever a gesture is recognized, the generated pixel values are compared with the histogram mean value ranges in the database, and the code corresponding to the robotic arm movement is generated and transmitted to the Arduino via the USB cable.
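A compact sketch of these two steps together, again in C++/OpenCV rather than LabVIEW (the file name, threshold and database ranges are illustrative assumptions):

    #include <opencv2/opencv.hpp>
    #include <cstdio>

    int main() {
        // A grey-scale gesture image stands in for a frame from the web camera.
        cv::Mat grey = cv::imread("gesture.png", cv::IMREAD_GRAYSCALE);
        if (grey.empty()) return 1;
        cv::Mat inv, bin;
        cv::bitwise_not(grey, inv);                            // inverse algorithm
        cv::threshold(inv, bin, 128, 255, cv::THRESH_BINARY);  // two-dimensional 0/255 matrix
        double m = cv::mean(bin)[0];                           // mean of the pixel matrix
        // Histogram comparison: test the mean against the pre-fed ranges.
        if (m >= 7.0 && m <= 30.0)       std::puts("gripper code");
        else if (m >= 31.0 && m <= 50.0) std::puts("wrist code");
        else                             std::puts("no match in database");
        return 0;
    }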
B. ARDUINO UNO
The Arduino UNO boards used in the base unit and the remote unit are set up differently. In the transmitter section, the board is configured so that it can easily be interfaced with the LabVIEW software. On the other hand, the Arduino board in the remote unit is used mainly for the control of the DC motors of the robotic arm and the mechanical base. The code generated in LabVIEW enables the desired pin on the Arduino. In order to interface the LabVIEW program and the Arduino board, an interlink file is used; before running the LabVIEW software, it is important to download this interlinking file. The interlink file may be any of the .pde format files. While compiling, if any firmware problem occurs, the file has to be dumped to the board again.
C. XBEE SERIES2 MODULE
In this paper, we use XBee modules in both the transmitter and the receiver. XBee is a low-powered device. In the transmitter section, the enabled signal from the Arduino is sent to the XBee. For interfacing the XBee and the Arduino, a PIC microcontroller is used; this is mainly done to reduce cost. The transmitted signal is received by the XBee receiver. Though XBee is a low-powered device, it can operate over long distances by using intermediate devices, forming a mesh network. Both the XBee transmitter and receiver operate at the same frequency of 2.4 GHz. The distance between the base unit and the remote unit can be around 100 meters. The XBee is configured for 9600 bps serial communication. XBee modules also have an interactive command interface, which allows us to change the current transmission channel being used along with many transmitter and receiver options.
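As a concrete example of that command interface, a typical terminal exchange for setting the serial rate and network identity might look like the following (the PAN ID value is a placeholder, not the project's actual setting):

    +++           enter AT command mode; the module answers OK
    ATID 2015     set the PAN ID shared by transmitter and receiver
    ATBD 3        select 9600 bps on the serial interface
    ATWR          save the settings to non-volatile memory
    ATCN          leave command mode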
D. ROBOTIC ARM
Fig.5 Robotic arm module
The robotic arm is a mechanical arm that is programmable according to the user's needs, and it acts as a substitute for the human arm. There are five motors in the robotic arm. It is highly flexible and adaptable to any kind of environment, and it performs a specific task depending on the application. The robotic arm is the heart of this project, since we control the remote unit (the robotic section) using gestures. Its structure is similar to the human arm, consisting of a shoulder, an elbow, a wrist and a gripper instead of fingers, with an LED at the centre of the gripper. The gripper can do the same work as the fingers in pick-and-place operations. The degrees of freedom of the robotic arm are provided by DC motors, which are preferred since they assure step-by-step robotic movement.
E. L293D MOTOR DRIVER
The motor driver supplies the extra current required by the motors to run. Each motor is connected to an L293D motor driver unit, which supplies the additional current for the working of the DC motors. A single L293D can control the operation of two motors at a time. The robotic arm is placed on a mechanical base provided with four wheels for ease of movement; two of the wheels are connected to DC motors and the other two remain as dummy wheels. As a whole, seven motors are involved in the operation: the five motors of the arm and the two motors of the mechanical base. The ZigBee receiver, after receiving the transmitted signal, transfers it to the Arduino board, where the corresponding command for the operation of the motors is produced. A MAX232 is used as the interface between the XBee and the Arduino; the MAX232 converts TTL (Transistor-Transistor Logic) levels into RS232 levels. The L293D works on the concept of an H-bridge, a circuit in which the voltage can be applied in either direction. In order to rotate a motor in the clockwise and anticlockwise directions, the voltage must change its direction; therefore, H-bridge ICs are ideal for driving DC motors.
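The direction change described above can be illustrated with a short Arduino-style sketch (the pin numbers are hypothetical). Driving the two L293D inputs of one channel to opposite levels sets the direction of current through the motor, and therefore its sense of rotation:

    const int IN1 = 8;   // L293D input 1A of one channel
    const int IN2 = 9;   // L293D input 2A of the same channel

    void setup() {
        pinMode(IN1, OUTPUT);
        pinMode(IN2, OUTPUT);
    }

    void clockwise()     { digitalWrite(IN1, HIGH); digitalWrite(IN2, LOW);  }
    void anticlockwise() { digitalWrite(IN1, LOW);  digitalWrite(IN2, HIGH); }
    void stopMotor()     { digitalWrite(IN1, LOW);  digitalWrite(IN2, LOW);  }

    void loop() {
        clockwise();     delay(1000);   // run one way for a second
        anticlockwise(); delay(1000);   // reverse the voltage, reverse the motor
        stopMotor();     delay(1000);
    }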
F. PIC MICROCONTROLLER
The PIC microcontroller is a 40-pin IC. In this paper, it is used only for interfacing the XBee and the Arduino. Since there is only one transmitter port and one receiver port on the Arduino board, the PIC microcontroller is placed between the Arduino and the XBee Series 2 module. The Arduino pins are connected to all six pins of port A and two pins of port E on the PIC. The PIC is a processor with built-in memory and RAM, and it allows the design to communicate with the world through USB, Ethernet or wireless connectivity.
VI. GESTURE COMMANDS FOR ROBOTIC SYSTEM
The hand gestures are recognized and converted into commands for robot navigation. There are five DC motors in the robotic arm, and each gesture specifies the movement of a particular motor. Since this is a remote interaction, the distance can be around 100 meters. The code for the gesture is sent to the receiver via the antenna. Here, we use a few simple gestures in order to control the motors of the robotic arm.

LABEL                 ANGLE            PIXEL VALUES
1. Gripper motion     0-1.77 degree    ON: 7-17    OFF: 18-30
2. Wrist motion       120 degree       ON: 31-40   OFF: 41-50
3. Elbow motion       180 degree       ON: 50-65   OFF: 66-77
4. Shoulder motion    300 degree       ON: 78-100  OFF: 101-150

Fig.6 Few simple hand gestures
VII. CONCLUSION
Robots play an important role in making our lives easier and better. Robotics is a vast field, and recognition methods are improving day by day. In this paper, we proposed hand gesture recognition with the help of a web camera in order to control a robotic arm. By calculating the pixel values and comparing them with the pre-fed database, the specific command, depending on the given gesture, is provided to the robotic arm. The system is cost-effective and highly flexible. A prototype was developed to demonstrate the proposed system. Experimental results show that the system can recognize hand gestures in real time and perform the desired actions.
VIII. APPLICATIONS AND FUTURE ENHANCEMENT
A. Applications
The system can be used for mass production in industry, for cutting-edge precision within the medical field, for sign language recognition, as a gaming interface, and for remote mining and landmine detection.
B. Future enhancement
The system can be improved by the following techniques:
Touch screens are the current familiar trend; by integrating our technology, we can operate a machine from a far distance.
It can be used by the physically impaired to interact with machines.
It can also be used as a scrub nurse.
REFERENCES
[1] Juang, B.H., Rabiner, L.R.: Hidden Markov models for speech recognition. Technometrics 33(3), 251–272 (1991)
[2] Bradski, G.: Computer vision face tracking for use in a perceptual user interface. Intel Technol. J. (1998)
[3] Barczak, A., Dadgostar, F.: Real-time hand tracking using a set of cooperative classifiers based on Haar-like features. Res. Lett. Inform. Math. Sci. 7, 29–42 (2005)
[4] Elmezain, M., Al-Hamadi, A., Appenrodt, J., Michaelis, B.: A hidden Markov model-based continuous gesture recognition system for hand motion trajectory. In: Proceedings of the International Conference on Pattern Recognition (ICPR), pp. 1–4 (2008)
[5] Xu, D., Wu, X., Chen, Y.L., Xu, Y.: Online dynamic gesture recognition for human robot interaction. J. Intell. Robot. Syst. DOI 10.1007/s10846-014-0039-4 (2014)
[6] Bengio, Y., Frasconi, P.: Input-output HMMs for sequence processing. IEEE Trans. Neural Netw. 7(5), 1231–1249 (1996)
[7] Chen, Q., Georganas, N., Petriu, E.: Hand gesture recognition using Haar-like features and a stochastic context-free grammar. IEEE Trans. Instrum. Meas. 57(8), 1562–1571 (2008)
[8] Cheng, Y.: Mean shift, mode seeking, and clustering. IEEE Trans. Pattern Anal. Mach. Intell. 17(8), 790–799 (1995)
[9] Corradini, A.: Dynamic time warping for off-line recognition of a small gesture vocabulary. In: ICCV Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems, pp. 82–89. IEEE (2001)
[10] Elmezain, M., Al-Hamadi, A., Sadek, S., Michaelis, B.: Robust methods for hand gesture spotting and recognition using hidden Markov models and conditional random fields. In: IEEE International Symposium on Signal Processing and Information Technology (ISSPIT), pp. 131–136. IEEE (2010)
[11] Garg, P., Aggarwal, N., Sofat, S.: Vision based hand gesture recognition. World Acad. Sci. Eng. Technol. 49(1), 972–977 (2009)
[12] Holte, M.B., Moeslund, T.B., Fihl, P.: View-invariant gesture recognition using 3D optical flow and harmonic motion context. Comput. Vis. Image Understanding 114(12), 1353–1361 (2010)
[13] Keogh, E.J., Pazzani, M.J.: Derivative dynamic time warping. In: The 1st SIAM International Conference on Data Mining (SDM 2001), Chicago (2001)
[14] Kurakin, A., Zhang, Z., Liu, Z.: A real time system for dynamic hand gesture recognition with a depth sensor. In: Proceedings of the 20th European Signal Processing Conference (EUSIPCO), pp. 1975–1979. IEEE (2012)
[15] Lee, J., Yoo, S.: An elliptical boundary model for skin color detection. In: International Conference on Imaging Science, Systems, and Technology, pp. 572–584 (2002)
[16] Maggio, E., Cavallaro, A.: Hybrid particle filter and mean shift tracker with adaptive transition model. In: International Conference on Acoustics, Speech, and Signal Processing, pp. 221–224 (2005)
[17] Manders, C., Farbiz, F., Chong, J., Tang, K., Chua, G., Loke, M., Yuan, M.: Robust hand tracking using a skin tone and depth joint probability model. In: The 8th IEEE International Conference on Automatic Face & Gesture Recognition, pp. 1–6. IEEE (2008)
[18] Mitra, S., Acharya, T.: Gesture recognition: a survey. IEEE Trans. Syst. Man Cybern. Part C Appl. Rev. 37(3), 311–324 (2007)
[19] OpenCvSharp: http://code.google.com/p/opencvsharp/ (2011)
[20] Rabiner, L., Juang, B.: An introduction to hidden Markov models. IEEE ASSP Mag. 3(1), 4–16 (1986)
[21] Ramamoorthy, A., Vaswani, N., Chaudhury, S., Banerjee, S.: Recognition of dynamic hand gestures. Pattern Recogn. 36(9), 2069–2081 (2003)
[22] Ren, Z., Meng, J., Yuan, J., Zhang, Z.: Robust hand gesture recognition with Kinect sensor. In: Proceedings of the 19th ACM International Conference on Multimedia, pp. 759–760. ACM (2011)
[23] Song, Y., Demirdjian, D., Davis, R.: Continuous body and hand gesture recognition for natural human-computer interaction. ACM Trans. Interact. Intell. Syst. (TiiS) 2(1), 5 (2012)
[24] Stiefmeier, T., Roggen, D., Tröster, G.: Gestures are strings: efficient online gesture spotting and classification using string matching. In: Proceedings of the ICST 2nd International Conference on Body Area Networks, p. 16. ICST (Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering) (2007)
[25] Wang, S., Quattoni, A., Morency, L., Demirdjian, D., Darrell, T.: Hidden conditional random fields for gesture recognition. In: IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR), vol. 2, pp. 1521–1527. IEEE (2006)
[26] Wang, Z., Yang, X., Xu, Y., Yu, S.: CamShift guided particle filter for visual tracking. Pattern Recogn. Lett. 30(4), 407–413 (2009)
[27] Wu, Y., Liu, Q., Huang, T.S.: An adaptive self-organizing color segmentation algorithm with application to robust real-time human hand localization. In: Asian Conference on Computer Vision (ACCV), pp. 1106–1111 (2000)
[28] Xia, L., Chen, C., Aggarwal, J.: Human detection using depth information by Kinect. In: Workshop on Human Activity Understanding from 3D Data in Conjunction with CVPR 2011 (HAU3D) (2011)
[29] Xu, J., Wu, Y., Katsaggelos, A.: Part-based initialization for hand tracking. In: The 17th IEEE International Conference on Image Processing (ICIP), pp. 3257–3260. IEEE (2010)
[30] Yang, H.D., Park, A.Y., Lee, S.W.: Gesture spotting and recognition for human–robot interaction. IEEE Trans. Robot. 23(2), 256–270 (2007)
[31] Yang, J., Lu, W., Waibel, A.: Skin-color modeling and adaptation. Computer Vision – ACCV'98, pp. 687–694 (1997)