Design and Implementation of Gesture Controlled Robot with a Robotic Arm

Chinmay Patil, Shivani Sharma, Sneha Singh
Students of B.Tech Electronics & Telecommunication, NMIMS University's MPSTME, Mumbai, India 400056.

Abstract - This paper puts forward the design and implementation of a gesture-controlled robot with a robotic arm in a simple and efficient way. A robot is a mechanical device that can perform physical tasks. A robotic arm is a mechanical system used for lifting, moving and manipulating a workpiece, lightening the work of a human operator. The 3D-printed robotic arm is operated and controlled with a joystick, while the robotic car is controlled wirelessly through hand gestures: a transmitter fixed on a glove worn by the user sends signals to the prototype, replacing conventional control methods. An accelerometer and a microcontroller convert the hand gestures into signals for controlling the robot.

Key Words: Accelerometer, Gesture Controlled Robot, Joystick, 3D printed robotic arm.

1. INTRODUCTION

Robotics is an emerging field of science and technology, and many researchers across the globe are coming up with new products to help humans. Robots can be controlled wirelessly or through wired controllers. In many industries robots have taken the place of humans, but they still need to be controlled by humans. Earlier, robots were controlled using physical devices; nowadays they can be controlled through various methods. Hand-gesture-controlled robots were developed to help humans do things that are difficult or dangerous for any person, such as disarming a bomb or handling nuclear materials, as well as day-to-day activities.

A robotic arm is constructed with several joints that replicate the joints of a human arm. Each joint is motor driven and operated through a joystick controller. The main aim is to design a robotic arm mounted on a robotic car, as shown in Figure 1, so that it can be used in places that human beings cannot reach. It can be used as a firefighting robot and in the construction, medical and military fields.

Figure 1 - Prototype

2. Related Work

Several studies in the field of robotics have aimed at creating user-friendly robots by implementing user interfaces such as 3D joysticks and touch screens, but these techniques do not always give accurate results and can respond slowly. A gesture-controlled robotic car and a 3D-printed arm controlled with a joystick are usually two separate systems, and integrating them together is something new.

A hand-gesture-controlled robot using an accelerometer, flex sensors and ZigBee, based on the Sixth Sense technology, was implemented in [1]. Basic hardware components were used along with the ZigBee modules and simple assembly language, and it had several pick-and-place applications in the machine industry. Another gesture-controlled robot for research and development was implemented using wireless 3D motion tracking over Bluetooth in [2].
It was developed primarily so that the operator does not need special competences to control the manipulator and can instead pay full attention to the process or object of interest. It made use of Python code along with the Lynxmotion AL5A module. An accelerometer-based gesture-controlled arm for an ROV (Remotely Operated Vehicle) was developed using a simple gyroscope and joystick and discussed in [3]; it required no previous training of the operator and provided smooth and accurate movement. A wireless robotic arm that uses a GUI for wireless communication was developed in [4]. It had five separate movements controlled by five servo motors; going from wired to wireless reduces hassle, and the GUI makes communication between systems easier. A simple hand-gesture robotic arm using an accelerometer was put into effect in [5], where an algorithm for gesture sensing was developed to replace the traditional robot-control mechanism with innovative hand-gesture-based control. It made use of WinAVR/AVR Studio 4.0 and Proteus and had a large communication range. A lightweight robotic arm using a joystick was implemented in [6]. The arm is constructed almost entirely of plastic elements: inflatable links, airbag actuators and acrylonitrile butadiene styrene (ABS) joints. A new control method was used, designed especially for medical and healthcare purposes. A joystick-controlled arm with auditory feedback was developed in [7]. Joystick-based operation is a popular method for operating various types of robots, such as excavators, cranes and space telerobotics, and the goal is to create efficient methods for controlling such devices. A gesture-controlled car using simple methods was implemented in [8]; it shows a robotic system that allows anyone, especially people with no technical background, to instruct a robot just by showing it what it should do. A gesture-controlled robot developed solely for military purposes was presented in [9]. While there are numerous robots that use explicit user commands or are self-controlled, the demand for gesture-controlled robots, known as unmanned ground vehicles (UGVs), is rising for military purposes; the main advantage is gesture operation without base-station assistance. A hand-motion-controlled robotic vehicle with obstacle detection, which identifies trends in technology applications and usability, is discussed in [10]; the approach controls vehicle movement by detecting the motion of the hand and restrains the vehicle if an obstacle is detected in its path.

3. System Description

Figure 2 shows the block diagram of the transmitter section of the gesture-controlled robot. The accelerometer initiates the process by sending data to the 8-bit ATmega328 microcontroller according to the gesture performed by the user. The microcontroller processes this data and passes it to the RF transmitter module (433.92 MHz).

Figure 2 - Block schematic of the transmitter section of the gesture-controlled robot
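The paper does not include the transmitter firmware, so the following Arduino-style sketch is only a minimal illustration of the flow just described: read the accelerometer, compare the readings against the thresholds given later in Section 4, and push a single command character over the RF link. The analog pins, the 2400-baud UART feeding an ASK transmitter and the one-character message framing are assumptions, not details taken from the prototype.

```cpp
// Minimal transmitter sketch (assumptions: ADXL337 X/Y outputs on A0/A1 and a
// 433 MHz ASK transmitter data pin driven from the hardware UART at 2400 baud;
// the actual wiring and message framing of the prototype are not stated).
const int X_PIN = A0;   // accelerometer X axis (hypothetical wiring)
const int Y_PIN = A1;   // accelerometer Y axis (hypothetical wiring)

void setup() {
  Serial.begin(2400);   // UART feeds the RF transmitter module
}

void loop() {
  int x = analogRead(X_PIN);       // 10-bit reading, 0..1023
  int y = analogRead(Y_PIN);

  char cmd = 's';                  // default: stop
  if      (x < 340) cmd = 'f';     // tilt forward
  else if (x > 400) cmd = 'b';     // tilt backward
  else if (y < 340) cmd = 'r';     // tilt right
  else if (y > 400) cmd = 'l';     // tilt left

  Serial.write(cmd);               // one command character per cycle
  delay(100);                      // simple rate limiting
}
```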
Figure 3 shows the block schematic of the receiver section of the gesture-controlled robot. The RF receiver module (433 MHz) receives the data sent by the transmitter and passes it to the microcontroller at the receiver end. The 8-bit microcontroller then directs the corresponding signal to the L293D motor driver, which performs the required physical movement.

Figure 3 - Block schematic of the receiver section of the gesture-controlled robot

Figure 4 shows the block diagram of the transmitter section of the joystick-controlled robotic arm. The analog USB joystick is operated by the user at the transmitter end. Once the user performs a specific gesture on the joystick, the corresponding message is determined from the assigned threshold values and sent over an RS-232 link towards the receiver end.

Figure 4 - Block schematic of the implemented robotic arm

At the receiver end the message is received and decoded by the SCB (Servo Controller Board), which is responsible for the required movement of the robotic arm.

4. System Methodology

Figure 5 shows the flow of the transmitter section. The process is initiated by setting the X and Y pins; these pins are then read, and threshold values are set to adjust the sensitivity.

Figure 5 - Flowchart of the transmitter section

A particular threshold is assigned to every gesture in a loop. If X < 340, a forward move is directed; otherwise, if X > 400, a backward move is directed. Similarly, if Y < 340, a right move is directed; otherwise, if Y > 400, a left move is directed. This step then terminates once the whole message has been processed, and the system waits for the next message to arrive.

In a similar way, the X and Y pins of the joystick are set. If X-pos < 300, an up-arm move is directed; otherwise, if X-pos > 700, a down-arm move is directed. Similarly, if Y-pos < 300, a left-arm move is directed; otherwise, if Y-pos > 700, a right-arm move is directed. Again, this step terminates once the whole message has been processed and the system waits for a new message. The following message determines the opening and closing of the robotic arm: its value is read, and if Val < 300 the arm closes, otherwise the arm opens. After this message is processed, the cycle for that particular gesture terminates and the system waits for a new message signal.
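As a companion to the transmitter flowchart, the sketch below shows one way the joystick thresholds above could be mapped to the arm command characters decoded later in this section ('u'/'d' for up and down, 'h'/'g' for left and right arm, 't' for stop arm). The joystick pins and the plain serial link are assumptions; the prototype actually routes the joystick data through an RS-232 link to a servo controller board rather than this exact code.

```cpp
// Hypothetical mapping of joystick position to arm command characters
// (thresholds 300/700 from Section 4; pins and serial link are assumed).
const int JOY_X_PIN = A2;   // joystick X axis (hypothetical wiring)
const int JOY_Y_PIN = A3;   // joystick Y axis (hypothetical wiring)

char armCommand(int xPos, int yPos) {
  if (xPos < 300) return 'u';   // up-arm move
  if (xPos > 700) return 'd';   // down-arm move
  if (yPos < 300) return 'h';   // left-arm move
  if (yPos > 700) return 'g';   // right-arm move
  return 't';                   // stop arm (joystick centred)
}

void setup() {
  Serial.begin(2400);
}

void loop() {
  Serial.write(armCommand(analogRead(JOY_X_PIN), analogRead(JOY_Y_PIN)));
  delay(100);
}
```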
Figure 6 shows the flow of the receiver section. Once a signal is received from the transmitter, the Q1, Q2, Q3 and Q4 servo pins are initialized. The receiver's PLL is then started and the output for motor control is set. Incoming data is handled in a non-blocking manner using loops in the receiver code: starting from int i = 0 and incrementing with i++ while i < buffer, the buffer is refreshed when needed; otherwise the received data is checked character by character.

Figure 6 - Flowchart of the receiver section

If the received character buff[i] is 's' (stop), the car stops; if it is 'f' (forward), a forward move is processed; 'b' (backward) processes a backward move; 'l' (left) a left move; 'r' (right) a right move; 'u' (up) an upward arm movement; and 'd' (down) a downward arm movement.

In a similar manner, the remaining arm movements, including opening and closing, are handled: if the downward-movement condition is false, the code checks whether the character is 'g' (right arm), which processes a right-arm move; otherwise 'h' (left arm), which processes a left-arm move; otherwise 't' (stop arm), which stops the arm; otherwise the final character 'c' (close), which processes a close move for the arm. Whenever any one of these conditions is true, the code progresses to the next iteration (i++), which is again followed by the conditions for the robotic car and for the joystick-controlled robotic arm, and finally by the termination step 'Stop'.
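The receiver logic described above can be summarised in a short Arduino-style sketch. It decodes one command character at a time and drives the inputs of the L293D motor driver listed in the next section; the pin wiring (IN1..IN4 on digital pins 4..7), the baud rate and the omission of PWM speed control are assumptions made for illustration only.

```cpp
// Hedged receiver sketch: decode one command character per loop and drive the
// L293D inputs accordingly. IN1..IN4 on pins 4..7 is an assumed wiring; the
// paper does not specify the actual pin mapping or any speed control.
const int IN1 = 4, IN2 = 5, IN3 = 6, IN4 = 7;   // L293D inputs (assumed)

void drive(bool a1, bool a2, bool b1, bool b2) {
  digitalWrite(IN1, a1); digitalWrite(IN2, a2);   // left motor
  digitalWrite(IN3, b1); digitalWrite(IN4, b2);   // right motor
}

void setup() {
  Serial.begin(2400);
  pinMode(IN1, OUTPUT); pinMode(IN2, OUTPUT);
  pinMode(IN3, OUTPUT); pinMode(IN4, OUTPUT);
  drive(LOW, LOW, LOW, LOW);                      // start stopped
}

void loop() {
  if (Serial.available() == 0) return;            // non-blocking: nothing received yet
  switch ((char)Serial.read()) {
    case 'f': drive(HIGH, LOW,  HIGH, LOW);  break;  // forward
    case 'b': drive(LOW,  HIGH, LOW,  HIGH); break;  // backward
    case 'l': drive(LOW,  HIGH, HIGH, LOW);  break;  // left: counter-rotate
    case 'r': drive(HIGH, LOW,  LOW,  HIGH); break;  // right: counter-rotate
    case 's': drive(LOW,  LOW,  LOW,  LOW);  break;  // stop
    default:  break;  // 'u','d','g','h','t','c' are forwarded to the arm servos
  }
}
```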
5. Hardware and Software Modules

Microcontroller (ATmega328) - A single-chip microcontroller with an 8-bit AVR core, used in many autonomous systems.

Accelerometer (ADXL337) - A device that measures proper acceleration.

Analog Joystick - Used to control the robotic arm.

L293D Motor Driver - A dual H-bridge motor driver integrated circuit (IC). It drives the DC motors and is suited to inductive loads. The L293D is a 16-pin IC that can control a set of two DC motors simultaneously in either direction.

RF Module (Transmitter and Receiver) - A small electronic device used to transmit and/or receive radio signals between two devices; in an embedded system it is often desirable to communicate with another device wirelessly.

Car chassis - Provides room for the DC motors as well as the other modules.

Wheels - Attached to the chassis with screws and connected to the DC motors for movement.

6. Implementation

In this prototype, the gesture-controlled car is developed using an accelerometer and an RF module and is controlled by the ATmega328 microcontroller. The accelerometer and the transmitter module, along with the microcontroller, make up the transmitter section; the data is transmitted over radio frequency from the transmitter end. The receiver section consists of the receiver module and the microcontroller, which in turn commands the motor driver: the data received by the RF receiver module is converted into commands for the motors via the motor driver. The weight of the car is evaluated precisely to balance the weight of the robotic arm mounted on it, so that the arm can move freely. A specific set of threshold values is assigned to the respective hand-gesture movements: forward, backward, left and right.

On the transmitter end, the accelerometer is mounted on a glove worn on the user's hand; the hand gestures performed with this glove are responsible for the movement of the robotic car. On the receiver end, the motor driver takes the message sent by the transmitter and received by the receiver and converts these signals into mechanical movement.

The robotic arm is a 3D-printed prototype, kept lightweight for easy mounting on the robotic car. The design of the arm is kept simple to implement and easy to handle. The joystick is used to control the arm for smooth movement: with the help of the microcontroller, the joystick commands are transferred to the arm and the movements are carried out. Commands such as forward, backward, left, right and stop are used. Like the robotic car, the joystick side also comprises a transmitter and a receiver end, and in a similar way a specific set of threshold values is assigned to the respective joystick commands, which include arm up, arm down, arm left, arm right, arm close and arm open.
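For the arm side, the prototype relies on a dedicated Servo Controller Board whose firmware is not given in the paper. The hedged sketch below substitutes the Arduino Servo library to illustrate how the arm command characters could be turned into servo motion; the Q1/Q2 pin numbers, the assignment of Q1 to base rotation and Q2 to shoulder lift, and the step size are all hypothetical.

```cpp
// Illustrative arm-side receiver using the Arduino Servo library instead of the
// prototype's Servo Controller Board. Pin numbers, servo roles and step size
// are assumptions, not taken from the paper.
#include <Servo.h>

Servo q1, q2;                 // Q1: base rotation, Q2: shoulder lift (assumed roles)
int q1Angle = 90, q2Angle = 90;

void setup() {
  Serial.begin(2400);
  q1.attach(9);               // assumed servo pins
  q2.attach(10);
  q1.write(q1Angle);
  q2.write(q2Angle);
}

void loop() {
  if (Serial.available() == 0) return;
  const int STEP = 5;         // degrees moved per received command
  switch ((char)Serial.read()) {
    case 'u': q2Angle = min(q2Angle + STEP, 180); break;  // arm up
    case 'd': q2Angle = max(q2Angle - STEP, 0);   break;  // arm down
    case 'g': q1Angle = min(q1Angle + STEP, 180); break;  // right arm
    case 'h': q1Angle = max(q1Angle - STEP, 0);   break;  // left arm
    case 't': break;           // stop arm: hold the current angles
    default:  break;           // 'c' (close) would drive a gripper servo similarly
  }
  q1.write(q1Angle);
  q2.write(q2Angle);
}
```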
Figure 7 - Various gestures for movement of the vehicle

7. Result and Conclusion

Figures 8 and 9 show the final hardware models of the joystick-controlled robotic arm and the gesture-controlled robotic car, respectively.

Figure 8 - Robotic arm

Figure 9 - Gesture-controlled robotic car

The objective of this project, developing the hardware and software for a gesture-controlled robotic car and a joystick-controlled arm, has been accomplished. From the observations it is clear that the movement of the arm and the car is precise, easy to control and user friendly. This control method is expected to overcome the accuracy and response-time limitations of the conventional interfaces discussed in the related work. The system can be used as a firefighting robot to rescue people from fire accidents, in the construction field, in industries to control trolleys and lifts, and in military and medical applications.

8. References

[1] Raheja J. L., Shyam R., Kumar U., Prasad P. B., "Robot Controlled Using Hand Motion Recognition", IEEE Second International Conference on Machine Learning and Computing, 2010.

[2] Karolis Root, Renaldas Urniezius, "Research and development of a gesture-controlled robot manipulator system", IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), 2016.

[3] Ronny Mardiyanto, Mochamad Fajar Rinaldi Utomo, Djoko Purwanto, Heri Suryoatmojo, "Development of hand gesture recognition sensor based on accelerometer and gyroscope for controlling arm of underwater remotely operated robot", IEEE International Seminar on Intelligent Technology and Its Applications (ISITIA), 2017.

[4] Muhammed M. B., Mohd. Muji S. Z. B., Zakaria S. R., bin Mohd. Jenu M. Z., "MR999-E Wireless Robotic Arm", IEEE Annual India Conference (INDICON), Bangalore, India, December 2016.

[5] Setia Archika, Mittal Surbhi, Nigam Padmini, Singh Shalini, Gangwar Surendra, "Hand Gesture Recognition Based Robot Using Accelerometer Sensor", International Journal of Advanced Research in Electrical, Electronics and Instrumentation Engineering.

[6] Hye-Jong Kim, Yuto Tanaka, Akihiro Kawamura, Sadao Kawamura, Yasutaka Nishioka, "Development of an inflatable robotic arm system controlled by a joystick", 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), 2015.

[7] Nikolaos Mavridis, Georgios Pierris, Paolo Gallina, Zacharoula Papamitsiou, Umair Saad, "On the subjective difficulty of Joystick-based robot arm teleoperation with auditory feedback", IEEE 8th GCC Conference & Exhibition, 2015.

[8] Asmita Jadhav, Deepika Pawar, Kashmira Pathare, Prachi Sale, Prof. Thakare R., "Hand Gesture Controlled Robot Using Arduino", International Journal for Research in Applied Science & Engineering Technology (IJRASET).

[9] Sathyanarayanan M., Azharuddin S., Kumar S., Khan G., "Gesture Controlled Robot for Military Purpose", International Journal For Technological Research In Engineering, Volume 1, Issue 11, July 2014.

[10] Dang Ankur, Chandorkar Vibhum, Pawar Aishwarya, Pawar Dhairya, "Hand Motion Controlled Robotic Vehicle With Obstacle Detection", International Research Journal of Engineering and Technology (IRJET), Volume 4, Issue 9, September 2017.