CALIFORNIA STATE UNIVERSITY, NORTHRIDGE
TELEOPERATED HAND MOTION TRACKING ROBOTIC MANIPULATOR
WITH HAPTIC FEEDBACK
A graduate project submitted in partial fulfillment of the requirements
For the degree of Master of Science in Electrical Engineering
By
Hasala Ranmal Senevirathne
May 2014
The graduate project of Hasala R. Senevirathne is approved:
____________________________
______________________
Dr. Ali Amini
Date
____________________________
______________________
Dr. C.T. Lin
Date
____________________________
______________________
Dr. Kourosh Sedghisigarchi, Chair
Date
California State University, Northridge
ACKNOWLEDGEMENTS
I would like to thank Dr. Kourosh Sedghisigarchi for his guidance and generous
support for this project. I also extend my gratitude to my other committee
members, Dr. Ali Amini and Dr. C.T. Lin, who participated in my project
review process.
DEDICATION
This project is dedicated to my parents, who are behind me all the time.
TABLE OF CONTENTS
Signature Page
Acknowledgements
Dedication
List of Figures
List of Tables
Abstract
Chapter 1: Introduction
1.1 Robotics
1.2 Robotic Manipulators
1.3 Objective
Chapter 2: System Overview
2.1 System Block Diagram
2.2 Functionality of the System
2.2.1 Hand Tracking Vision System
2.2.2 Host Computer
2.2.3 Handheld Master Gripper
2.2.4 Robotic Manipulator with the Slave Gripper
Chapter 3: Modeling and Calibration
3.1 Mathematical Modeling of the Robotic Arm
3.1.1 Kinematics
3.1.2 Forward Kinematic Model
3.1.3 Inverse Kinematic Model
3.2 Servo Motor Control and Calibration
3.3 Handheld Gripper Modeling
3.3.1 Force Feedback System Modeling
3.3.2 PID Parameter Tuning
3.3.3 Pitch, Yaw and Roll Detection of Master Gripper
3.4 Force Sensor Calibration
Chapter 4: Hardware Design and Implementation
4.1 Handheld Master Gripper Design
4.1.1 Servo Motor Modification
4.1.2 Controller of Handheld Gripper
4.1.3 Potentiometer Interfacing
4.1.4 Motor Driving and Current Sensor Interfacing
4.1.5 IMU Sensor Interfacing
4.2 Robotic Manipulator and Slave Gripper Design
4.2.1 Robotic Manipulator Design
4.2.2 Force Sensing Gripper Design
4.2.3 Force Sensor Interfacing
4.2.4 Servo Motor Interfacing
4.3 Hand Motion Tracking
4.3.1 Microsoft Xbox Kinect Sensor
4.3.2 Servo Controller
4.3.3 Wi-Fi Module
Chapter 5: Software and Algorithm Implementation
5.1 Kinect Installation
5.2 Host Computer's Program
5.3 Skeleton Detection
5.4 Hand Position Tracking
5.4.1 Position Data Noise Elimination
5.5 Auto Mapping Function Generation
5.6 Servo Controller
5.7 Wi-Fi Module Configuration
5.8 Robotic Manipulator's Range of Operation and Coordinate System
5.9 Orientation Detection of the Handheld Gripper
5.9.1 Complementary Filter
Chapter 6: Results
6.1 Handheld Master Gripper Test Results
6.1.1 IMU Test Results
6.2 Hand Motion Tracking Results
Conclusion
References
Appendix A: Processing Main Program
Appendix B: Inverse Kinematic Model Code
Appendix C: Hand Tracking Code
LIST OF FIGURES
Figure 1.1: Kinematic Links of a Cartesian Robotic Manipulator
Figure 1.2: Spherical Robotic Manipulator
Figure 1.3: Cylindrical Robotic Manipulator
Figure 1.4: SCARA Robotic Manipulator
Figure 1.5: Articulated Robotic Arm
Figure 1.6: Stewart Platform
Figure 1.7: ASIMO by Honda
Figure 2.1: System Block Diagram
Figure 2.2: Computer Functionality and Communication
Figure 2.3: Handheld Master Gripper
Figure 2.4: Block Diagram of Handheld Master Gripper
Figure 2.5: Inside Image of Handheld Master Gripper
Figure 2.6: Block Diagram of Robotic Manipulator with Slave Gripper
Figure 2.8: Robotic Manipulator with Slave Gripper and Controller
Figure 3.1: Dimensions and Servo Motors of the AL5D Robotic Manipulator
Figure 3.2: Coordinate Frame Assignment
Figure 3.3: Shoulder and Forearm Links
Figure 3.4: Top View of the Robotic Manipulator
Figure 3.5: Side View of the Robotic Manipulator
Figure 3.6: Servo Motor Calibration
Figure 3.7: DC Motor Model
Figure 3.8: PID Current Controller
Figure 3.9: Step Current Input
Figure 3.10: Kp Tuning
Figure 3.11: Kd Tuning
Figure 3.12: Response with Tuned PID Controller
Figure 3.13: IMU Unit
Figure 3.14: Force Sensor Calibration
Figure 4.1: DC Motor Used for the Initial Design of the Master Gripper
Figure 4.2: Components of a Servo Motor
Figure 4.3: MG995 Gear Wheels
Figure 4.4: Modified Gear Assembly
Figure 4.5: DC Motor and Potentiometer
Figure 4.6: Arduino Nano Controller
Figure 4.7: Potentiometer Interfacing to the Arduino Nano
Figure 4.8: Current Controlled DC Motor of the Master Gripper
Figure 4.9: L298 Motor Driver Board
Figure 4.10: L298 Motor Driver Interfacing
Figure 4.11: ACS712 Based Current Sensing Module
Figure 4.12: Current Sensor Interfacing to the Microcontroller
Figure 4.13: MPU-6050 IMU Sensor
Figure 4.14: Assembled AL5D Robotic Manipulator with SSC-32 Servo Controller
Figure 4.15: Arduino Mega Development Board
Figure 4.16: Force Sensing Gripper Design Stages
Figure 4.17: Force Sensor Interfacing
Figure 4.18: HS-485HB Servo Motor
Figure 4.19: Microsoft's Xbox Kinect
Figure 4.20: Time of Flight Distance Measurement
Figure 4.21: Interfacing Kinect to the Computer
Figure 4.22: SSC-32 Servo Controller
Figure 4.23: SSC-32 Interfacing to the Controller
Figure 4.24: Wi-Fi Module Interfacing to the Controller
Figure 5.1: Kinect Programming Layers
Figure 5.2: OpenNI and NITE Logos
Figure 5.3: Host Computer Program's Flow Chart
Figure 5.4: Skeleton Tracking Initialization
Figure 5.5: Skeleton Tracking
Figure 5.6: Reference Coordinate Frame for Hand Tracking
Figure 5.7: Moving Average Filter
Figure 5.8: Auto Mapping Function Generation
Figure 5.9: Marking the Range of Hand Movements
Figure 5.10: Hand Position Tracking after Calibration
Figure 5.11: Servo Motor Input
Figure 5.12: Standard Servo Motor Control Signal
Figure 5.13: Comm Operator
Figure 5.14: Robotic Manipulator's Coordinate System
Figure 5.15: Robotic Manipulator Range of Operation Flowchart
Figure 5.16: Shoulder Servo Obstructed by the Spring at 165°
Figure 5.17: Complementary Filter Block Diagram in the Frequency Domain
Figure 6.1: Accelerometer Output
Figure 6.2: Gyroscope Output
Figure 6.3: Complementary Filter Output
Figure 6.4: Hand Motion Tracking in 2D Space
Figure 6.5: Hand Position along X Axis
Figure 6.7: Hand Position along Y Axis
LIST OF TABLES
Table 3.1: DH Parameter Table
ABSTRACT
TELEOPERATED HAND MOTION TRACKING ROBOTIC MANIPULATOR
WITH HAPTIC FEEDBACK
By
Hasala Ranmal Senevirathne
Master of Science in Electrical Engineering
The purpose of this project is to design and develop a remotely operated robotic
manipulator that imitates human hand motions captured through a 3D vision
sensor and an Inertial Measurement Unit (IMU). The gripper attached at the end of the
robotic manipulator allows the user to grip objects. Moreover, a force
feedback system at the user's end lets the user feel the force exerted on the
robot's gripper by the object being gripped. This force feedback is vital in handling
fragile objects. The main goal of this project is to make teleoperated robotic
manipulators more affordable and flexible for daily consumer applications. This working
model can be further developed for real-world applications such as hazardous material
handling, daily household activities, and assistance for people with disabilities.
CHAPTER 1
INTRODUCTION
This chapter presents the background and basic definitions of robotics, followed by
the objective of this project.
1.1 Robotics
The concept of robots dates back to classical times, but noticeable growth in robotics
emerged in the 20th century after the industrial revolution. Robotics is the area of
engineering that combines the Computer Science, Mechanical Engineering, and
Electrical Engineering disciplines. In a broader sense, robotics can be defined as
designing and developing machines that can replace human presence in order to
perform functions faster and more precisely while keeping humans away from danger.
Robots are employed in many fields: the earliest robots were employed in the
manufacturing industry, followed by military, medical, entertainment, and consumer
use. Depending on their functionality, there are many types of robots:

- Robotic Manipulators (Static)
- Wheeled Robots
- Swimming Robots
- Flying Robots (UAVs)
1.2 Robotic Manipulators
Robotic manipulators are one of the earliest and most widely used types of robots in
industry; there are millions of robotic arms employed in the manufacturing industry.
Depending on their mechanism, they are categorized into several groups [1], briefly
described as follows.
Cartesian Robots
These types of robots are mostly employed in assembly and welding operations in
industry. They consist of three prismatic joints, as shown in figure 1.1.
Figure 1.1: Kinematic Links of a Cartesian Robotic Manipulator
Spherical Robots
Figure 1.2 shows a spherical robotic manipulator. These robots consist of two rotary
joints and one prismatic joint [2]. They get their name because their joints form a
spherical coordinate system.
Figure 1.2: Spherical Robotic Manipulator
Cylindrical Robots
Figure 1.3: Cylindrical Robotic Manipulator
These robots are used for machine tool handling, spot welding, and assembly
purposes. The base joint is revolute, while the other joints are prismatic.
SCARA Robots
Figure 1.4: SCARA Robotic Manipulator
SCARA stands for Selective Compliance Assembly Robot Arm or Selective
Compliance Articulated Robot Arm. These robots are widely used for assembly
purposes. One of their advantages is a rigid Z axis. Figure 1.4 shows the kinematic
diagram of a SCARA robot.
Articulated Robots
These are the most popular type of robotic manipulators employed in industry. All
the joints of this type of robot are rotary. They are mainly used for spot welding, gas
welding, and machine tool handling. Compared to other types of robotic manipulators
they have a simple design, since motors can drive the rotary joints directly without
any other complex mechanism. Figure 1.5 shows an articulated robotic arm.
Figure 1.5: Articulated Robotic Arm
Parallel Robots
These robots consist of an end effector [3] which is controlled by several links. The
position and orientation of the end effector can be determined by considering one
link of each stage. The Stewart platform, shown in figure 1.6, is an example of a
parallel robot. Unwanted movements of the end effector, such as vibrations, are
constrained by its own links, so these robots provide greater stability for the end
effector. For this reason, this type of manipulator is used in applications that demand
high precision, such as optical alignment, flight simulators, and automobile
simulators.
Figure 1.6: Stewart Platform
Anthropomorphic Robots
These are the most advanced type of robots, which try to resemble a real human. One
such robot is ASIMO, designed by Honda. These robots are meant to adapt to many
tasks, rather than the single specific task seen in the aforementioned types of robots.
Figure 1.7: ASIMO by Honda
1.3 Objective
The objective of this project is to design a low-cost teleoperated robotic platform,
controlled through Wi-Fi / the Internet, which could be employed in daily consumer
applications.
CHAPTER 2
SYSTEM OVERVIEW
This chapter presents an overview of the system and its basic functionality, without
detailed description. The system is described in terms of its subsystems and how
they are coordinated to work together. The system block diagram is shown in
Figure 2.1.
2.1 System Block Diagram
Figure 2.1: System Block Diagram
The system can be divided into four subsystems, which can also be seen physically
in the system. They are:

- Hand Tracking Vision System
- Handheld Master Gripper
- Computer (provides data processing for hand tracking and coordinates all subsystems)
- Robotic Manipulator with Force Feedback Gripper
2.2 Functionality of the System
2.2.1 Hand Tracking Vision System
This consists of a Microsoft Xbox Kinect sensor [4], which can capture the
environment along with its depth information. The Kinect is connected to a computer
through USB (Universal Serial Bus) for data processing. Chapter 4 contains a more
detailed description of the Kinect sensor.
2.2.2 Host Computer
The main purpose of the computer is to process the data from the vision system and
provide the hand position. The need for a computer could have been eliminated if the
data from the vision system could be processed by a microcontroller, but the large
amount of data flowing from the vision system demands more processing power and
RAM (Random Access Memory) than a microcontroller can offer. Apart from hand
tracking, the computer coordinates all the other subsystems. It sends the position of
the hand to the remotely located robotic manipulator through Wi-Fi. It also reads the
gripper position of the handheld master gripper through Bluetooth and sends it to the
slave gripper located at the end of the robotic manipulator. Moreover, it reads the
feedback force exerted on the slave gripper through Wi-Fi and sends it to the master
gripper through Bluetooth. The computer is programmed in the programming
language Processing [5]. Figure 2.2 shows the functionality and communication of
the computer in detail.
Figure 2.2: Computer Functionality and Communication
2.2.3 Handheld Master Gripper
This is a virtual gripper that resembles the gripper located at the robotic
manipulator's end. Some parts of Lynxmotion's robot hand are used as the master
gripper, which is shown in Figure 2.3.
Figure 2.3: Handheld Master Gripper
The master gripper is expected to perform two tasks: tracking and sending the
position and orientation of the gripper to the computer through a Bluetooth interface,
and generating a feedback force based on the information received from the
computer. To perform these tasks there are two independent subsystems in the
master gripper. In terms of hardware it consists of a geared DC motor with
closed-loop [6] current control to generate the force feedback, and a rotary variable
resistor to measure the position of the gripper. Figure 2.4 shows the hardware block
diagram of the handheld master gripper and Figure 2.5 shows an image of its
interior.
Figure 2.4: Block Diagram of Handheld Master Gripper
Figure 2.5: Inside Image of Handheld Master Gripper
The handheld gripper is battery operated in order to give the user a natural interface
without wires attached to the hand. An MG995 metal gear servo motor was modified
so that its motor and potentiometer could be used independently. Both the DC motor
and the potentiometer are included in the same enclosure, making the system more
compact. The ATmega328-based Arduino Nano development board was chosen as
the controller for the handheld master gripper due to its small size. An L298 [7]
motor driver IC is used to drive the motor, and the potentiometer is read through the
built-in Analog to Digital Converter (ADC) [8] module of the microcontroller. An
ACS712 [9] current sensor is also used in the motor control system to achieve
closed-loop control. An HC-05 [10] Bluetooth module is integrated into the system
to provide communication with the computer.
2.2.4 Robotic Manipulator with the Slave Gripper
This unit is located at a remote place and is controlled through the internet. A Wi-Fi
module is used to connect it to the internet. Figure 2.6 shows the hardware block
diagram of the robotic manipulator and slave gripper.
Figure 2.6: Block Diagram of Robotic Manipulator with Slave Gripper
The robotic manipulator used here is Lynxmotion's AL5D, which is shipped as parts
that the end user assembles; the gripper is also included in the package. An
ATmega2560-based Arduino Mega 2560 [11] development board is used as the
controller for the systems at the robotic manipulator's end. A FlexiForce 1 lb force
sensor [12] is attached to the gripper to measure the force exerted by objects being
gripped; the force sensor is read through the microcontroller's built-in Analog to
Digital Converter module. An SSC-32 by Lynxmotion is used to control the five
servo motors of the robot arm, and the microcontroller commands it through an
RS232 [13] interface. Roving Networks' RN-XV [14] Wi-Fi module is used to
connect the robot to the internet and then to the computer. Figure 2.8 shows the
complete system of robotic manipulator, gripper, controller, and Wi-Fi module.
Figure 2.8: Robotic Manipulator with Slave Gripper and Controller
CHAPTER 3
MODELING AND CALIBRATION
Modeling physical systems mathematically provides greater flexibility in controlling
them. Mathematical modeling means predicting the system's behavior through
mathematical equations and models, in order to obtain a desired output by generating
a control input. The following sections present the mathematical modeling of each
subsystem.
3.1 Mathematical Modeling of the Robotic Arm
The Lynxmotion AL5D robot arm falls under the articulated robotic manipulator
category described in Chapter 1, since all of its joints are revolute. In practice,
dealing with the position of the robot in a Cartesian coordinate system makes more
sense and is widely preferred. Therefore a mathematical model that can convert joint
angles [15] to Cartesian coordinates and vice versa is needed in practical operation.
This mathematical modeling is known as kinematics [16], which is explained, and a
model developed, in the following sections.
3.1.1 Kinematics
Kinematics deals with the position, velocity, and acceleration of the robot without
considering the forces acting on the robotic manipulator. The objective of the
kinematic model developed here is to obtain the position of the robot's end effector
with respect to its base.

Link Description
There are many types of robots with combinations of many types of joints; for
example, there are revolute joints, prismatic joints, screw joints, cylindrical joints,
and so on. Among these, robotic manipulators with revolute joints are the most
popular, due to the simplicity of their mechanism. The electric motor is the most
convenient and popular way to convert electrical energy to mechanical energy, and
its output takes the form of mechanical rotation. So when DC motors are employed
in a robotic manipulator, the relative movement between two links becomes
rotational, which is known as a revolute joint. The AL5D robotic manipulator
consists of five servo motors with rotational motions. Figure 3.1 shows the servo
motors and dimensions of the Lynxmotion AL5D [17] robotic arm.
Figure 3.1: Dimensions and Servo Motors of the AL5D Robotic Manipulator
(labeled dimensions: L1 = 14.6 cm, H = 6.3 cm)
Denavit-Hartenberg Notation
Denavit-Hartenberg notation, or DH parameters [18], is one of the standard ways of
describing a robotic manipulator in terms of dimensions and angles. A coordinate
system is assigned to each of the joints of the robotic manipulator as shown in figure
3.2 (the frame of reference assigned at the base is not shown). Based on these frames,
four parameters are assigned to each link of the manipulator, containing the
information necessary to uniquely identify how it is connected to the adjoining link.
Thus the DH parameter table can be obtained, as shown in table 3.1.
Figure 3.2: Coordinate Frame Assignment
i | α(i-1) | a(i-1) | d(i) | θ(i)
--+--------+--------+------+-----
1 | 0      | 0      | 0    | θ1
2 | -90°   | 0      | 0    | θ2
3 | 0      | L1     | 0    | θ3
4 | -90°   | L2     | 0    | θ4

Table 3.1: DH Parameter Table
3.1.2 Forward Kinematic Model
Forward kinematics is the mathematical modeling of the robot arm that determines
the Cartesian coordinates from the joint angles. Compared to inverse kinematics,
forward kinematics is simple in terms of equations. To get the inverse kinematic
model, the forward kinematic model can be derived first and solved backward for the
joint angles, or the inverse kinematic model can be derived directly from geometry.
Equation 3.1 shows the general form of the forward kinematic model and equation
3.2 shows the general form of the inverse kinematic model. If the position of the
gripper is considered relative to the base coordinate system, without taking the
gripper's orientation into account, the robotic manipulator has three degrees of
freedom [19]: base servo rotation, shoulder servo rotation, and forearm servo
rotation. To begin with the forward kinematics, the base joint rotation is ignored and
only the shoulder and forearm rotations are considered, as depicted in figure 3.3, so
the links L1 and L2 form a planar robotic manipulator.
$x = f(\theta_0, \theta_1, \theta_2, \ldots)$
$y = f(\theta_0, \theta_1, \theta_2, \ldots)$   (Equation 3.1)
$z = f(\theta_0, \theta_1, \theta_2, \ldots)$

$\theta_0 = f(x, y, z)$
$\theta_1 = f(x, y, z)$   (Equation 3.2)
$\theta_2 = f(x, y, z)$

where the θ's are joint angles and x, y, z are Cartesian coordinates.

Figure 3.3: Shoulder and Forearm Links
From geometry, equations 3.3 and 3.4 can easily be derived. The axes are named x′
and y′ since they are not the real x and y axes relative to the base coordinate system.

$x' = l_1 \cos(\theta_1) + l_2 \cos(\theta_1 + \theta_2)$   (Equation 3.3)
$y' = l_1 \sin(\theta_1) + l_2 \sin(\theta_1 + \theta_2)$   (Equation 3.4)
The following equation derivations are based on the top and side views of the robot,
as shown in figures 3.4 and 3.5. Relative to the base coordinate system, the X, Y, and
Z coordinates can be derived as shown in equations 3.5, 3.6, and 3.7.

Figure 3.4: Top View of the Robotic Manipulator

Figure 3.5: Side View of the Robotic Manipulator
$x = \cos(\theta_0)\,(l_1 \cos(\theta_1) + l_2 \cos(\theta_1 + \theta_2))$   (Equation 3.5)
$y = \sin(\theta_0)\,(l_1 \cos(\theta_1) + l_2 \cos(\theta_1 + \theta_2))$   (Equation 3.6)
$z = l_1 \sin(\theta_1) + l_2 \sin(\theta_1 + \theta_2)$   (Equation 3.7)
Chapter 5 presents the software implementation and testing of the forward kinematic
model.
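As a concrete illustration, the forward kinematic model reduces to a few lines of code. The following C++ fragment is a minimal sketch, not the project's Processing implementation: the type and function names are assumptions, angles are in radians, and lengths are in the units of l1 and l2.

#include <cmath>

struct Point3 { double x, y, z; };

// Forward kinematics per equations 3.5-3.7: joint angles to Cartesian coordinates
Point3 forwardKinematics(double theta0, double theta1, double theta2,
                         double l1, double l2) {
    double r = l1 * cos(theta1) + l2 * cos(theta1 + theta2); // planar reach x'
    Point3 p;
    p.x = cos(theta0) * r;                                   // equation 3.5
    p.y = sin(theta0) * r;                                   // equation 3.6
    p.z = l1 * sin(theta1) + l2 * sin(theta1 + theta2);      // equation 3.7
    return p;
}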
3.1.3 Inverse Kinematic Model
For the robotic arm, the inverse kinematic model consists of equations 3.5, 3.6, and
3.7 solved for θ0, θ1, and θ2. This section presents the solution steps. From equations
3.3 and 3.4, equation 3.8 can be obtained [20].

$\cos(\theta_2) = \dfrac{x'^2 + y'^2 - L_1^2 - L_2^2}{2 L_1 L_2}$   (Equation 3.8)
So θ2 can easily be derived by taking the inverse cosine of equation 3.8. However, it
has been found that the inverse cosine function in software calculations is very
inaccurate for small angles, so in the next step it is converted to the more accurate
atan2 function [21]. θ2 can then be derived with atan2 as shown in equation 3.9, and
using this result θ1 can be derived as shown in equation 3.10.
$\theta_2 = \operatorname{atan2}\!\left(\sqrt{1 - \left(\dfrac{x'^2 + y'^2 - L_1^2 - L_2^2}{2 L_1 L_2}\right)^2},\ \dfrac{x'^2 + y'^2 - L_1^2 - L_2^2}{2 L_1 L_2}\right)$   (Equation 3.9)

$\theta_1 = \operatorname{atan2}(x', y') - \operatorname{atan2}(p, q)$   (Equation 3.10)

where
$p = L_1 + L_2 \cos(\theta_2)$
$q = L_2 \sin(\theta_2)$
Combining the equations obtained above, the joint angles can be calculated for any
given Cartesian coordinates.
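The solution steps above can be summarized in a short sketch. This C++ fragment is illustrative only; the project's actual inverse kinematic code is listed in Appendix B. The atan2 argument order follows equation 3.10 as written and may need to be swapped to atan2(y', x') − atan2(q, p) depending on the frame convention in use.

#include <cmath>

// Inverse kinematics per equations 3.8-3.10 for the planar pair of links;
// returns false when the target is out of reach
bool inverseKinematics(double x, double y, double L1, double L2,
                       double &theta1, double &theta2) {
    double c2 = (x * x + y * y - L1 * L1 - L2 * L2) / (2.0 * L1 * L2); // eq. 3.8
    if (c2 < -1.0 || c2 > 1.0) return false;      // target outside the workspace
    theta2 = atan2(sqrt(1.0 - c2 * c2), c2);      // eq. 3.9, atan2 instead of acos
    double p = L1 + L2 * cos(theta2);
    double q = L2 * sin(theta2);
    theta1 = atan2(x, y) - atan2(p, q);           // eq. 3.10 as written in the text
    return true;
}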
3.2 Servo Motor Control and Calibration
For the servo motors used in the base, shoulder, forearm, and wrist, the mapping
from angle to servo motor pulse width must be obtained in order to implement the
kinematic model in practice. Figure 3.6 shows the calibration of the base servo
motor. This calibration process should be done carefully, since a small error in an
angle can lead to a significant variation in the Cartesian coordinates.
Figure 3.6: Servo Motor Calibration
Given below is the code that maps the angles to the pulse widths of the servo motors,
where servo1 is the base servo, servo2 the shoulder servo, and servo3 the forearm
servo.

servo1 = int(-8.51*theta1) + 2222;  // base servo
servo2 = int(9.22*theta2) + 640;    // shoulder servo
servo3 = int(9.28*theta3) + 1015;   // forearm servo
3.3 Handheld Gripper Modeling
This section presents the mathematical modeling of the handheld gripper's force
feedback system and its orientation detection system. The electronic design of these
subsystems is presented in Chapter 4.
3.3.1 Force Feedback System Modeling
The force feedback system is designed to regenerate the force exerted by objects on
the slave gripper, so motor modeling is done for the DC motor used there. For DC
motors the torque is proportional to the current flowing through the motor, as shown
in equation 3.11.

$\tau = K_t I$   (Equation 3.11)

where τ is the torque, I is the current, and $K_t$ is the torque constant. So by
controlling the current, the feedback force can be controlled. In a DC motor,
however, the current cannot be controlled directly, but the voltage can be, so the
initial approach was to control the voltage to obtain the desired current.
20
La
ω
Ra
Ia
Va
J
Figure 3.7: DC Motor Model
Figure 3.7 shows the electrical model of a DC motor [22]. Using Kirchhoff's voltage
law, equation 3.12 for the motor can easily be derived.

$V_a = L_a \dfrac{dI_a}{dt} + R_a I_a + E_b$   (Equation 3.12)

where $V_a$ is the input voltage, $L_a$ the armature inductance, $R_a$ the armature
resistance, and $E_b$ the back EMF (electromotive force).
In equation 3.12 the back EMF voltage is proportional to the angular speed of the
motor, as shown in equation 3.13.

$E_b = K_e \omega$   (Equation 3.13)

where $K_e$ is the back EMF constant and ω is the angular speed of the motor. In this
application the back EMF can be neglected, since the maximum angular speed of the
master gripper is too low, relative to the input voltage, to generate a significant back
EMF. Ignoring the back EMF, equation 3.12 can be simplified as shown in equation
3.14.

$V_a = L_a \dfrac{dI_a}{dt} + R_a I_a$   (Equation 3.14)
Since the PWM (Pulse Width Modulation) [23] duty cycle can be modeled as a
percentage of the supply voltage, the input voltage to the motor can be controlled by
varying the PWM duty cycle. This makes the system modeling and implementation
simple, due to the small number of components. But this open-loop control strategy
could lead to a few issues: L and R are not known, since a data sheet for the motor is
not available, and R can vary with temperature changes as well. Moreover, depending
on the values of the armature inductance and resistance, the motor's time constant
could affect the time taken to generate a desired output torque, which would
eventually cause a delay between when an object is gripped by the slave gripper and
when the user feels it at the master gripper. Another issue is battery voltage
variation: since a battery is used and its voltage drops over time, the desired voltage
would not always be obtained. Considering the above issues, a closed-loop control
strategy was used in the feedback force generation system, employing a current
sensor to provide the current feedback.
Figure 3.8: PID Current Controller
Figure 3.8 shows the current controller's block diagram, where Iref is the reference
current, Iout is the actual current, and Ierror is the difference between the reference
and actual currents. A PID (Proportional Integral Derivative) controller was used in
the loop to get a faster response from the system. The next section presents the PID
parameter tuning.
3.3.2 PID Parameter Tuning
It is common to use the step response as a test bench for the performance of a control
system, since it contains high-frequency components, and a system tuned to perform
well under a step input usually works well under other inputs. In order to tune the
current controller of the handheld gripper, the current measurement read from the
current sensor was logged to the computer through the controller's RS232 port. For
the tuning process the data was sampled at a rate of 200 Hz and logged for a period
of 1 s, giving 200 samples. As the input, a 1 A step current reference was provided,
as shown in figure 3.9. The current sensor readings were plotted in MATLAB to
analyze the response and tune the PID parameters. First the derivative gain (Kd) and
integral gain (Ki) were kept at zero and the proportional gain was increased until an
overshoot before the settling time could be observed; figure 3.10 shows the response
when Kp was set to 420 with Ki and Kd at 0. The next step was adjusting the
derivative gain so that the overshoot and the oscillations after it were suppressed.
Figure 3.11 shows the response when Kp was 420 and Kd was 24; an overdamped
system can be observed. After the settling time the system still took considerable
time to reach the desired current; to eliminate this, the integral gain was adjusted,
and the optimal value for Ki was found to be 8. Figure 3.12 shows the system
response to a step input with the tuned PID parameters: Kp = 420, Kd = 24, Ki = 8.
Figure 3.9: Step Current Input
Figure 3.10: Kp Tuning
Figure 3.11: Kd Tuning
Figure 3.12: Response with Tuned PID controller
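The tuned loop can be sketched in Arduino-style C++ as below. This is a minimal sketch under stated assumptions, not the project firmware: the pin numbers, the ACS712 scaling, and the units of the error signal are assumptions; only the structure of the PID update with the gains found above is the point.

const float KP = 420.0, KD = 24.0, KI = 8.0; // gains tuned in this section
const float DT = 0.005;                      // assumed 5 ms control period (200 Hz)
float iRef = 1.0;                            // reference current in amperes
float integral = 0.0, prevError = 0.0;

float readCurrent() {
  // assumed ACS712 scaling: 2.5 V at zero current, 185 mV per ampere
  return (analogRead(A0) * 5.0 / 1023.0 - 2.5) / 0.185;
}

void setup() {
  pinMode(9, OUTPUT);                        // PWM to the motor driver (assumed pin)
}

void loop() {
  float error = iRef - readCurrent();
  integral += error * DT;
  float derivative = (error - prevError) / DT;
  prevError = error;
  float u = KP * error + KI * integral + KD * derivative;
  analogWrite(9, constrain((int)u, 0, 255)); // clamp to the 8-bit PWM range
  delay(5);
}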
3.3.3 Pitch, Yaw and Roll Detection of Master Gripper
Figure 3.13: IMU Unit
A complementary filter is employed to fuse the data acquired from the accelerometer
and gyroscope. This section describes the mathematical modeling performed to
calculate the pitch angle; Chapter 5 covers the implementation of the complementary
filter. In case the robot is extended to 6 DOF in the future, the yaw and roll of the
handheld gripper are also included in the modeling. For the accelerometer, the
components of the gravitational acceleration along the X, Y, and Z axes are measured
to get the pitch, yaw, and roll angles.
If Ax, Ay, and Az are the accelerometer readings along the X, Y, and Z axes, the
pitch angle can be derived from the accelerometer as shown in equation 3.15, and the
roll and yaw angles are given by equations 3.16 and 3.17 respectively.

$\mathrm{pitch} = \tan^{-1}\!\left(\dfrac{A_x}{\sqrt{A_y^2 + A_z^2}}\right)$   (Equation 3.15)

$\mathrm{roll} = \tan^{-1}\!\left(\dfrac{A_y}{\sqrt{A_x^2 + A_z^2}}\right)$   (Equation 3.16)

$\mathrm{yaw} = \tan^{-1}\!\left(\dfrac{A_z}{\sqrt{A_x^2 + A_y^2}}\right)$   (Equation 3.17)
To get the angles from the gyroscope, the outputs were integrated with respect to
time as shown in equations 3.18, 3.19, and 3.20. The time base was chosen as 5 ms.

$\mathrm{pitch} = \int_0^t \dfrac{d\theta_x}{dt}\, dt$   (Equation 3.18)

$\mathrm{roll} = \int_0^t \dfrac{d\theta_y}{dt}\, dt$   (Equation 3.19)

$\mathrm{yaw} = \int_0^t \dfrac{d\theta_z}{dt}\, dt$   (Equation 3.20)
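In code, the accelerometer angle of equation 3.15 and the gyroscope integration of equation 3.18 each reduce to a few lines. The following C++ fragment is a hedged sketch with the 5 ms time base: the names are assumptions, atan2 is used in place of tan⁻¹ for quadrant safety, and angles are in degrees.

#include <cmath>

const double DT = 0.005;                 // 5 ms time base

// Pitch from the accelerometer readings (equation 3.15)
double accelPitch(double ax, double ay, double az) {
    return atan2(ax, sqrt(ay * ay + az * az)) * 180.0 / M_PI;
}

// Pitch from the gyroscope by accumulating rate * timestep (equation 3.18)
double gyroPitch = 0.0;
void integrateGyroPitch(double rateX) {  // rateX: angular rate in degrees/second
    gyroPitch += rateX * DT;
}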
3.4 Force Sensor Calibration
Calibration is necessary in order to get a meaningful output from the force applied to
the sensor. First, the sensor must go through a conditioning step before calibration,
as the manufacturer specifies; this is known as the break-in step, in which a load of
110% of the maximum load is applied to the sensor. For the FlexiForce sensor this is
a force of 4.84 N, or approximately 500 g. The test weight was 200 g. First, 1/3 of the
test weight was placed on the sensor, for a time equal to the actual measurement time
of the system, which for the gripper can be approximated as one second. The plot in
figure 3.14 shows the calibration data of the force sensor. The sensor interfacing is
presented in Chapter 4.
Figure 3.14: Force Sensor Calibration
The calibration above shows that the voltage measurement is proportional to the
applied force. Finding a linear approximation is not important, since the gain can be
adjusted as needed in the handheld gripper so that the user feels the feedback force
properly.
CHAPTER 4
HARDWARE DESIGN AND IMPLEMENTATION
System modeling was used to identify the optimal system design. This chapter
describes the practical hardware design procedure followed to design and develop
the system. The software and algorithm design is presented in Chapter 5.
4.1 Handheld Master Gripper Design
4.1.1 Servo Motor Modification
As described in Chapter 2, the master gripper consists of two main subsystems: the
force feedback system and the gripper position tracker. Lynxmotion's robot hand,
which is used to build the handheld master gripper, is intended to work with servo
motors. The initial plan was to employ a gearless high-torque DC motor, as shown in
figure 4.1, to generate the feedback force, and to use a rotary variable resistor
attached to the revolute joint of the gripper to track its position. This initial design
was larger and added more weight to the overall gripper, reducing the naturalness of
the interaction for the user. The next idea was to use a servo motor with its feedback
loop removed, using its motor and potentiometer separately. Most servo motors
include a torque-enhancing gear mechanism, because they are driven by a small DC
motor, as shown in figure 4.2. The issue with this idea was the back-drivability of the
servo motor: the user should feel no feedback force when moving the master gripper
while the slave gripper is not gripping any object, but the gear mechanism exerted a
considerable amount of torque on the master gripper whenever the user tried to move
it. The potentiometer, however, was able to capture the gripper position. To deal with
the back-drivability issue, the gear mechanism had to be modified.
Figure 4.1 DC Motor Used for the Initial Design of the Master Gripper
Normally a servo motor consists of four gear wheels, as shown in figure 4.2. An
MG995 servo motor was modified so that the number of gear wheels was reduced to
two, which enhanced the back-drivability. The MG995 servo motor used in this
project is shown in figure 4.3 before modification. To generate the same torque, the
motor demanded more current than when all four gear wheels were present; it was
measured that generating the maximum torque needed required 1.8 A. The motor
driver therefore had to be changed so that it could handle a current of at least 2 A:
instead of the 1 A TB6612FNG motor driver IC intended earlier, the L298 motor
driver IC was chosen.
Figure 4.2 Components of a Servo Motor
Figure 4.3 MG995 Gear Wheels
Gear wheels 3 and 4, as depicted in figure 4.3, were removed, and the remaining
wheels were placed as shown in figure 4.4. The new gear reduction ratio was 1:45, a
reduction by a factor of 120. Presented next is a detailed description of the
microcontroller, the potentiometer interfacing, and the closed-loop control of the
motor.
Figure 4.4 Modified Gear Assembly
Once the gear modification was done, the electronics in the servo motor were
removed and the motor and potentiometer were interfaced to the controller as shown
in figure 4.5.
Figure 4.5 DC Motor and Potentiometer
4.1.2 Controller of Handheld Gripper
Figure 4.6: Arduino Nano Controller (4.3 cm × 1.5 cm)
The ATmega328-based Arduino Nano development board shown in figure 4.6 was
chosen as the controller of the handheld gripper due to its small size and its number
of built-in modules. Below are some of the specifications that led to the selection of
this development board.
Analog to Digital Converter Module:
The ATmega328 microcontroller has a built-in Analog to Digital Converter module
with 8 input channels. It has a resolution of 10 bits, which means it can distinguish
1024 (2^10) voltage levels.

I2C Module:
The ATmega328 has a built-in I2C module, which facilitates interfacing the
MPU-6050 inertial measurement unit; the MPU-6050 can only be read through an
I2C interface.

PWM Module:
The microcontroller has a built-in 8-bit PWM module, which is used for the current
control of the motor.
UART Module:
The built-in UART (Universal Asynchronous Receiver/Transmitter) enables the
controller to interface with the HC-05 Bluetooth module and communicate with the
computer.
4.1.3 Potentiometer Interfacing
The potentiometer of the modified servo motor is connected to the Analog to Digital
Converter (ADC) module of the microcontroller as shown in figure 4.7. The voltage
reference is 5 V, supplied from the microcontroller's 5 V output. The minimum
detectable angle of movement of the gripper is determined by the ADC resolution of
the microcontroller, as calculated below. The maximum rotation angle of the
potentiometer is 300°.

ADC resolution = 10 bits
Distinct levels = 2^10 = 1024
Minimum angle = 300°/1024 = 0.29°

The calculation shows that the minimum change in angle that can be detected by the
controller is 0.29°, which is acceptable in this application.
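The conversion from a raw ADC reading to a gripper angle follows directly from this calculation. The fragment below is an illustrative Arduino-style sketch of the scaling only, with an assumed input pin.

const float DEG_PER_COUNT = 300.0 / 1024.0;  // about 0.29 degrees per ADC step

float readGripperAngle() {
  int counts = analogRead(A1);    // assumed pin; 0..1023 over the 0-5 V range
  return counts * DEG_PER_COUNT;  // 0..300 degrees of potentiometer rotation
}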
Figure 4.7: Potentiometer Interfacing to the Arduino Nano
4.1.4 Motor Driving and Current Sensor Interfacing
The motor of the modified servo is current controlled to generate the required
torque, since for DC motors torque is proportional to current. The mathematical
modeling of the PID current controller was presented in Chapter 3; the block
diagram of the current control loop is shown in figure 4.8. The L298 motor driver,
shown in figure 4.9, is capable of handling a maximum current of 2 A, and its
interfacing to the microcontroller, shown in figure 4.10, is described below.

Figure 4.8: Current Controlled DC Motor of the Master Gripper
Figure 4.9 L298 Motor Driver Board
The PWM output of the microcontroller is connected to the Enable A pin of the
L298 motor driver. The IN1 and IN2 pins of the motor driver control the polarity of
the voltage applied to the motor: when IN1 is high and IN2 is low, a voltage is
applied to the motor from Out1 to Out2, and when IN1 is low and IN2 is high, the
voltage is applied from Out2 to Out1. When IN1 and IN2 are both high or both low,
there is no effective voltage across the motor, since Out1 and Out2 are at the same
potential (a short illustrative sketch of this drive logic follows figure 4.10). The
supply voltage for the motor driver is taken from the same battery that powers the
other modules of the system.
Figure 4.10: L298 Motor Driver Interfacing
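The drive logic described above can be sketched as follows. This Arduino-style C++ fragment is a minimal illustration: the pin assignments are assumptions, and a signed duty cycle simply encodes the two IN1/IN2 polarity cases.

const int IN1 = 7, IN2 = 8, ENA = 9;  // assumed wiring to the L298

void setup() {
  pinMode(IN1, OUTPUT);
  pinMode(IN2, OUTPUT);
}

// duty in -255..255; the sign selects the voltage polarity across the motor
void driveMotor(int duty) {
  if (duty >= 0) {
    digitalWrite(IN1, HIGH);   // voltage applied from Out1 to Out2
    digitalWrite(IN2, LOW);
  } else {
    digitalWrite(IN1, LOW);    // voltage applied from Out2 to Out1
    digitalWrite(IN2, HIGH);
  }
  analogWrite(ENA, abs(duty)); // PWM on Enable A sets the effective voltage
}

void loop() {
  driveMotor(128);             // example: half duty in the forward direction
}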
For the current measurement, Sparkfun's ACS712-based current sensor module was
employed. It is a precise, small current sensor with an analog output voltage
proportional to the current being measured. It is capable of measuring currents up to
5 A, and there are two variable resistors on board to adjust the gain and the DC
offset. Figure 4.11 shows an image of an ACS712-based current sensor.
Figure 4.11 ACS712 Based Current Sensing Module
Figure 4.12 shows the current sensor module interfacing to the controller board. Two
current sensor modules were used in order to measure the current flowing in both
directions, since each module is capable of measuring current flowing in only one
direction. A 1N4001 diode was used at each sensor to ensure that no current flows in
the reverse direction.

Figure 4.12: Current Sensor Interfacing to the Microcontroller

The closed-loop current control built around this sensor was presented in the
previous chapter.
4.1.5 IMU Sensor Interfacing
An MPU-6050 six-degree-of-freedom IMU (Inertial Measurement Unit) is used to
detect the pitch of the hand, in order to change the pitch of the slave gripper
accordingly. The MPU-6050 includes a 3-axis accelerometer and a 3-axis gyroscope.
A complementary filter, running on the microcontroller, is used to fuse the data in
order to get a better reading. The MPU-6050 is connected to the microcontroller
through the I2C interface. Figure 4.13 shows an image of the IMU sensor used in the
project.
Figure 4.13: MPU-6050 IMU Sensor
4.2 Robotic Manipulator and Slave Gripper Design
4.2.1 Robotic Manipulator Design
Since the robotic manipulator is already designed by Lynxmotion, only assembly
was required. The SSC-32 servo controller facilitates accurate control of the servo
motors. Figure 4.14 shows an image of the assembled robotic arm with the SSC-32
controller. An ATmega2560-based Arduino Mega development board is used at the
robotic manipulator's end; it is shown in figure 4.15.

Figure 4.14: Assembled AL5D Robotic Manipulator with SSC-32 Servo Controller
Figure 4.15: Arduino MEGA development Board
The controller commands the SSC-32 via the RS232 protocol through its built-in
UART module, the Wi-Fi module is controlled through another of its UART
modules, and its built-in Analog to Digital Converter module is used to read the
force sensor. Some of the specifications of the Arduino Mega board are listed below.

Microcontroller: ATmega2560
Input Voltage: 5 V
Clock Speed: 16 MHz
SRAM: 8 KB
EEPROM: 4 KB
4.2.2 Force Sensing Gripper Design
The force sensor was attached to the gripper using a slider mechanism, as shown in
Figure 4.16. This design ensures that the total force exerted by an object on the
gripper is sensed by the force sensor, since the frictional forces introduced by the
slider mechanism are comparatively negligible.
Figure 4.16: Force sensing Gripper Design Stages
4.2.3 Force Sensor Interfacing
Tekscan’s Flexiforce 1 lb force sensor is used as the force sensor, it is a force sensitive
resistor which changes its resistance according to the applied force. Figure 4.17 shows
an image of the force sensor and interfacing to the development board. 1 Mega Ohm
resistor is used in series with the sensor and 5V is applied to obtain a change in voltage
according to the change in resistance caused by the force.
38
Figure 4.17: Force Sensor Interfacing
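Reading the divider is a single analog conversion. The fragment below is an illustrative Arduino-style sketch with an assumed input pin: as the applied force lowers the sensor's resistance, the divider voltage changes, and a gain applied to this reading drives the master gripper's force command (per section 3.4, a linear fit is not required).

float readForceVoltage() {
  int counts = analogRead(A2);  // assumed input pin
  return counts * 5.0 / 1023.0; // divider voltage in volts
}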
4.2.4 Servo Motor Interfacing
Lynxmotio’s AL5D robot has 5 servomotors for base, shoulder, forearm, wrist and
gripper, their models are HS-485 B, HS-805BB, HS-755HB, HS-645MG, HS-422
respectively.
Figure 4.18 HS485 HB Servo Motor
A servo motor requires an input voltage in the range of 5-7V. In this project 6V
regulated power supply with a current output of 5A was used. A servo motor has three
39
terminal as shown in figure 4.18, Signal In (Yellow), Supply (Red) and Ground (Black).
Signal in terminal can be directly connected to the microcontroller without any
intermediate interfaces.
4.3 Hand Motion Tracking
4.3.1 Microsoft Xbox Kinect Sensor
Microsoft introduced a vision sensor called the Kinect with its popular game
console, the Xbox 360. It consists of an RGB camera and a time-of-flight based
depth sensor. It is manufactured by PrimeSense, who also provide the software
development kits for the sensor. Due to its flexibility of programming, it is widely
used in applications where three-dimensional information about the environment is
needed. Figure 4.19 shows the Kinect sensor used in this project; a brief explanation
of each of its components follows.
Figure 4.19: Microsoft's Xbox Kinect (RGB camera, IR projector, microphone
array, tilting motor, and monochrome CMOS sensor)
RGB Camera
This is an RGB video camera with a resolution of 640 x 480 at a maximum frame
rate of 30 frames per second. It can be used alone for general-purpose image
processing tasks.
Depth Sensing
In the Microsoft Kinect, depth sensing is achieved using the concept of time of flight
of light. The IR projector emits an IR (infrared) light pattern, and the built-in
processor measures the time taken by the IR pattern to arrive at the monochrome
CMOS sensor. Based on that time, the distance to a pixel in the environment can be
determined. Because of the time-of-flight measurement technology, the
measurements are not affected by changes in lighting level, as they are in stereo
vision systems based on two RGB cameras. The depth sensing also has a resolution
of 640 x 480 pixels at a rate of 30 frames per second.
Time of Flight Theory of Distance Measurement
Figure 4.20: Time of Flight Distance Measurement
Figure 4.20 shows a basic distance measuring system based on time of flight of light.
If the processor measures the time t taken for an IR pattern, emitted at a known angle
θ from the projector, to arrive at the CMOS sensor, the distance can be calculated
from θ, t, and the baseline d.
Interfacing Kinect to the Computer
The Kinect comes with a power supply adapter that powers the sensor and connects
it to the computer, as shown in figure 4.21.

Figure 4.21: Interfacing Kinect to the Computer
4.3.2 Servo Controller
Figure 4.22: SSC 32 Servo Controller
In this project Lynxmotion’s SSC 32 servo controller which is shown in figure 4.22 was
used to control the servo motors which is capable driving 32 servo motors. It has a
42
RS232 interfacing making it possible to directly command from controller. Arduino
Mega2560 controller’s second serial port was directly connected to the SSC32 board as
shown in figure 4.23.
Figure 4.23: SSC 32 interfacing to the Controller
4.3.3 Wi-Fi Module
Roving Network’s RN-XV Wi-Fi module was used for wireless communication. It also
has a RS232 module which was connected to the third serial port of the Arduino Mega
2560 as shown in figure 4.24.
Figure 4.24: Wi-Fi Module interfacing to the Controller
CHAPTER 5
SOFTWARE AND ALGORITHM IMPLEMENTATION
5.1 Kinect Installation
Figure 5.1: Kinect Programming Layers
Kinect interfacing to the computer can be divided into the software layers shown in
figure 5.1. The bottom layer is the device drivers; immediately above it is the
OpenNI SDK (Software Development Kit), which gives the user access to program
the Kinect.
Software Installation for Kinect
First, the driver for the Kinect should be installed, which is available from the
Microsoft Dev Center website. Then the 3D sensing frameworks OpenNI and NITE,
provided by PrimeSense, should be installed. OpenNI contains all the functions that
give access to the Kinect sensor, and these functions can be accessed through Visual
C++ and Java. Figure 5.2 shows the OpenNI and NITE logos.

Figure 5.2: OpenNI and NITE Logos
5.2 Host Computer’s Program
The program on the host computer has four main parts:

- User skeleton detection
- Auto calibration
- Hand position tracking
- Communication with the other components

The flow chart of the computer program is shown in figure 5.3. All communications
are performed as interrupts, which means that when the host computer receives data
it handles them as they arrive rather than constantly polling the other hardware. This
interrupt-driven method helps spare more processing power for the hand tracking
algorithm, making the overall system work more smoothly.
Figure 5.3: Host Computer Program's Flow Chart
The following sections present the algorithms and techniques used to implement
each component in software.
5.3 Skeleton Detection
As highlighted in previous sections, skeleton detection uses the built-in libraries of
PrimeSense's OpenNI SDK (Software Development Kit) and NITE. SimpleOpenNI,
which is used in Processing, is a wrapper library that gives access to the OpenNI and
NITE libraries. The intensity of the depth image represents depth: pure black pixels
represent the farthest points, whereas pure white pixels represent the closest points.
As the manufacturer specifies, the optimal depth sensing range is 1.2-3.5 m. To
initialize tracking, the pose should be changed gradually from pose 1 to pose 2, as
shown in figure 5.4.
Figure 5.4: Skeleton Tracking Initialization
Since the depth sensor is based on IR (infrared), skeleton tracking is not affected by
lighting conditions, and once a user is tracked it is not sensitive to changes in the
background such as moving objects. This enables the hand tracking system to be
used in dynamic environments. Figure 5.5 shows a test program for skeleton
tracking; it can identify the following joints:

- Neck
- Right Hand, Left Hand
- Right Shoulder, Left Shoulder
- Right Elbow, Left Elbow
- Right Hip, Left Hip
- Right Knee, Left Knee
- Torso (Middle of Torso)
- Right Foot, Left Foot
Figure 5.5: Skeleton Tracking
5.4 Hand Position Tracking
The initial approach for hand tracking was to use the absolute X, Y, Z coordinates of
the hand position as provided by the Kinect. This approach was error-prone, because
a slight change in the user's position could lead to a significant error in hand
tracking. So the hand position of the user was tracked with respect to another joint of
the body. Initially the torso was selected as the reference point, but its position
depends on the hip joints, which tend to change position often, so it was not viable
either. Finally, the middle point of the right and left shoulders was chosen as the
reference point, rather than a single shoulder: being the middle point, a relative
movement of one shoulder does not drift the reference point by a large amount.
Figure 5.6 shows the reference point in red, the left and right shoulders in green, and
the hand position in purple. The blue arrow represents the position vector of the hand
with respect to the reference point. The coordinate frame's (0, 0, 0) point is attached
to the reference point.
Figure 5.6: Reference Coordinate Frame for Handtracking
Processing’s built-in vector class called Pvector was used to perform the vector
calculations. Following lines of codes were used to calculate the middle position of the
reference point.
midpoint = PVector.add(leftshoulder, rightshoulder);  // sum of the shoulder vectors
midpoint = PVector.div(midpoint, 2);                  // midpoint between the shoulders
handtracking.getJointPositionSkeleton(userId, SimpleOpenNI.SKEL_RIGHT_HAND, r_hand_pos);
relative_position = PVector.sub(r_hand_pos, midpoint); // hand relative to the midpoint
The first and second lines calculate the vector of the middle point of the left and
right shoulders. The third line reads the position of the right hand using the
handtracking object of the SimpleOpenNI library. The fourth line calculates the
difference vector of the right hand with respect to the midpoint.
5.4.1 Position Data Noise Elimination
When the robot was controlled from the hand position obtained by the above
method, there was a significant amount of noise in the X, Y, and Z position data. The
first method used to eliminate the noise was an averaging filter, but it did not reduce
the noise to a satisfactory level. Then a moving average filter with a window size of
10 was used for the X, Y, and Z axes individually, as shown in Figure 5.7. Although
it introduces a small delay to the system, it does not affect the user's overall
experience.
Figure 5.7: Moving Average Filter
In general, a moving average filter can be defined mathematically as shown in
equation 5.1, where M = 10 in this case.

$y[n] = \dfrac{1}{M} \sum_{k=0}^{M-1} x[n-k]$   (Equation 5.1)
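A window-10 moving average needs only a small circular buffer. The following C++ sketch illustrates equation 5.1; the project applies the equivalent filter in Processing, one instance per axis, and the names here are assumptions.

const int M = 10;                // window size used in this project

struct MovingAverage {
  float buf[M] = {0};
  int idx = 0;
  float sum = 0;

  float update(float x) {
    sum -= buf[idx];             // drop the oldest sample from the running sum
    buf[idx] = x;                // overwrite it with the newest sample
    sum += x;
    idx = (idx + 1) % M;
    return sum / M;              // y[n] = (1/M) * sum of the last M samples
  }
};
// One MovingAverage instance is kept for each of the X, Y, and Z axes.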
5.5 Auto Mapping Function Generation
The position of the hand must be mapped to a position in the coordinate system of
the robotic manipulator. The range of hand movements depends on the user, hence
the need for an automatic mapping function. To address this, an automatic mapping
function generation algorithm was added before the hand tracking algorithm. Since
the mapping from hand position to manipulator position is linear, the extreme hand
positions are mapped to the extreme positions of the robotic manipulator, which
guarantees that the robotic manipulator works over its full range (a short illustrative
sketch follows figure 5.8). As shown in figure 5.8, while the program is running the
automatic mapping algorithm it displays the maximum and minimum position data
along each axis; the first row corresponds to the current X, Y, Z coordinates of the
hand. As an example, if the hand is moved to a position value higher than -655 along
the X axis, the maximum of the X axis becomes that value. This algorithm runs
immediately after the program identifies the skeleton of a user, and it gives the user
approximately 8 seconds to mark the extreme positions of his/her hand movements.
It notifies the user with an on-screen message when the calibration process is
completed.
Figure 5.8: Auto Mapping Function Generation
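The mapping itself is a linear interpolation between the recorded extremes. The sketch below is an illustrative C++ fragment, not the Processing implementation: observe() would be called during the 8-second calibration window and map() afterwards, and the names and example output range are assumptions.

struct AxisMap {
  float inMin = 1e9f, inMax = -1e9f;      // observed hand-position extremes
  float outMin = -40.0f, outMax = 40.0f;  // manipulator range (example values, cm)

  void observe(float v) {                 // called while the user marks extremes
    if (v < inMin) inMin = v;
    if (v > inMax) inMax = v;
  }
  float map(float v) const {              // linear mapping onto the robot's range
    return outMin + (v - inMin) * (outMax - outMin) / (inMax - inMin);
  }
};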
Figure 5.9 shows a user marking the extreme positions of the hand along each axis.
As an example, to define the X axis range, the user should move the hand along the
X axis over the preferred range of hand operation, so that the algorithm can identify
the maximum and minimum X coordinates.
Figure 5.9: Marking the Range of Hand movements
Figure 5.10 shows hand tracking after the auto mapping function generation is
completed. During this stage the host computer commands the robot's controller to
move the servos. The program displays the current coordinates in the upper left
corner.

Figure 5.10: Hand Position Tracking after Calibration
5.6 Servo Controller
All the servo motors used in this project are controlled by digital signals and have a
maximum rotation angle of 180°. As shown in figure 5.11, a servo motor is
controlled by pulses spaced at least 20 ms apart. The pulse width carries the position
information: a standard servo motor moves to 0° when a pulse width of 600 µs is
applied and to 180° when a pulse width of 2400 µs is applied, as shown in figure
5.12.
Figure 5.11: Servo Motor Input
The SSC-32 board, which is used to control the servo motors, performs this pulse
generation when commanded by the microcontroller.
Figure 5.12: Standard Servo Motor Control Signal
The SSC-32 is a servo motor controller manufactured by Lynxmotion. It is capable
of controlling up to 32 servo motors, and communication is done through RS232.
Data is sent as ASCII strings that contain the position data. To set the position of a
single servo motor, the string sent is as follows.

"#5 P1200 T800 <cr>"

The number after the '#' is the motor number, the number after P is the motor
position, and the number after T is the time allowed for the movement. <cr> is the
carriage return character that marks the end of the string. The corresponding Arduino
command is given below: servo1 is an integer value that contains the servo pulse
width, the '+' operator appends the integer value to the string, the second line sends
the carriage return character, and 'robot' is a serial port object pointed at the SSC-32
board.
servo1 = 1200;
robot.write("#0 P" + servo1 + " S300");  // move servo 0 to pulse width 1200 at speed 300
robot.write(13);                         // carriage return terminates the command
Moreover, the SSC-32 servo controller board includes an option to perform group
moves of servo motors. This makes all the motors take the same amount of time to
move from their current positions to their desired positions. For group moves the
following command is used.

"#0 P1000 #1 P800 #5 P1400 T1500 <cr>"

Each number after a '#' identifies a motor, the value after the following 'P' is that
motor's position, and the value after T is the time allowed for the whole movement.
The equivalent Arduino code is as follows.
servo1 = 1000;
servo2 = 800;
servo3 = 1400;
robot.write("#0 P" + servo1 + " #1 P" + servo2 + " #5 P" + servo3 + " T300");  // group move
robot.write(13);  // carriage return terminates the command
5.7 Wi-Fi Module Configuration
The Roving Networks RN-XV Wi-Fi module is configured from a computer over a
Wi-Fi connection. When it is powered on with pin 9 held high, it boots into Wi-Fi
ad-hoc mode. When scanned for from the computer it appears as WiFly-GS#-##,
where each # corresponds to a digit. Using the software Comm Operator, the Wi-Fi
module can be configured as necessary; figure 5.13 shows a screenshot of Comm
Operator connecting to the Wi-Fi module. The following steps explain how to
configure the module:

- Connect the computer to the Wi-Fi module through Comm Operator using the IP 1.2.3.4 and port 2000.
- Type "$$$" to enter command mode.
- Type "set wlan phrase <password of the Wi-Fi Network>" to specify the password of the intended Wi-Fi network.
- Type "join <Wi-Fi network Name>" to connect to the network; this restarts the module, and pin 9 of the module should then be set low.
Figure 5.13: Comm Operator
5.8 Robotic Manipulator’s Range of Operation and Coordinate System
The length of the first link of the robotic manipulator is 15.6 cm and the length of the second link (from the third joint to the middle of the gripper) is 26.5 cm, so the robotic manipulator has a maximum reach of about 40 cm. Along the X axis the robot has a range of −40 cm to 40 cm, but it cannot move to the negative side of the Y axis since the base servo motor is capable of only 180° of rotation; the Y axis therefore has a range of 0 cm to 40 cm. Z starts at the intersection point of the base servo's axis and the shoulder servo's axis. If the robot were not obstructed by the table it would have a range of −40 cm to 40 cm along the Z axis, so the Z range was set to −7 cm to 40 cm. Based on the above information, the inequalities in equation 5.2 were implemented in the code to make sure that inputs are within the manipulator's range of operation. Figure 5.14 shows the coordinate system assigned to the robotic manipulator.
Equation 5.2:
√(x² + y² + z²) ≤ 40
y ≥ 0
−7 ≤ z ≤ 40
where x, y, z are the X, Y, Z coordinates of the robot (in cm).
Figure 5.14: Robotic Manipulator’s Coordinate System
Based on the above information, the joint angles of the robot can be derived with the inverse kinematic model presented in Chapter 3. Once the joint angles are derived, the maximum and minimum possible angles of each servo motor should be considered before commanding the servos. As can be seen in figure 5.16, the spring used to stabilize the robot between the shoulder and forearm servos prevents the shoulder servo from exceeding 165°, so the inequality in equation 5.3 should also be implemented in the code.
Equation 5.3:
θ₂ ≤ 165°
All of the above condition checks are depicted in the flowchart of figure 5.15.
Figure 5.15: Robotic Manipulator Range of Operation Flowchart (get the X, Y, Z coordinates; if they satisfy equation 5.2, calculate the joint angles; if θ₂ ≤ 165°, command the servos)
Figure 5.16: Shoulder Servo Obstructed by the Spring at 165°
The above inequalities were implemented in Processing as follows.
if (sqrt(pow(xcor_abs, 2.0) + pow(ycor_abs, 2.0) + pow(zcor_abs, 2.0)) < (l1 + l2)) {
  // Angle calculation (inverse kinematics)
  if (theta2 < 165) {  // shoulder servo limit, equation 5.3
    // command servos
  }
}
5.9 Orientation Detection of the Handheld Gripper
5.9.1 Complementary Filter
As described in the previous chapter, a complementary filter is employed to fuse the accelerometer and gyroscope data. This section presents the theoretical background of the complementary filter, which was chosen based on the type of noise contained in the gyroscope and accelerometer outputs. The gyroscope is accurate for short-term measurements and can track sudden variations of the angles, but in long-term measurements a drift occurs due to the integration of its output. This implies that the degree of belief in the high-frequency portion of the gyroscope signal is higher than in its low-frequency portion. The accelerometer, on the other hand, provides the absolute angle and is not affected by long-term operation, but its short-term measurements contain noise, so the low-frequency components of the accelerometer have a higher degree of belief than its high-frequency components. Therefore the accelerometer data are sent through a low-pass filter and the gyroscope data are sent through a high-pass filter, as shown in figure 5.17.
Figure 5.17: Complementary Filter Block Diagram in Frequency Domain (the accelerometer angle θaccel passes through the low-pass filter G1(s); the gyroscope rate θ̇gyro is integrated through 1/s and passes through the high-pass filter G2(s); the two paths are summed to give θfiltered)
To obtain an accurate angle estimate from the filter, the condition in equation 5.4 must be satisfied.
Equation 5.4:
G1(s) + G2(s) = 1
In the time domain, equation 5.5 defines the complementary filter.
Equation 5.5:
θfiltered(n) = α · (θaccel(n−1) + θ̇gyro(n) · dt) + (1 − α) · θaccel(n)
where:
θfiltered(n) = filtered output
θaccel(n) = current accelerometer angle
θaccel(n−1) = previous accelerometer angle
θ̇gyro(n) = current gyroscope rate of change of angle
α = filter parameter, taken as approximately 0.98
A complementary filter implementation included in the MPU-6050 library was used in the handheld gripper to filter the data.
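For illustration, equation 5.5 can be written as a few lines of Processing. This is a minimal sketch with illustrative names, not the MPU-6050 library's API; prevAngle corresponds to the previous angle estimate in equation 5.5:

float alpha = 0.98;  // degree of belief in the gyroscope path

// One filter update per sample: the integrated gyroscope rate supplies the
// high-frequency content, the accelerometer angle the low-frequency content.
float complementaryUpdate(float prevAngle, float gyroRate, float accelAngle, float dt) {
  return alpha * (prevAngle + gyroRate * dt) + (1 - alpha) * accelAngle;
}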
CHAPTER 6
RESULTS
This chapter presents the test results of each sub-system.
6.1 Handheld Master Gripper Test Results
This section presents the test results of the handheld master gripper's three sub-units.
6.1.1 IMU Test Results
The MPU-6050 IMU sensor contains a built-in accelerometer and gyroscope. The gyroscope outputs the rate of change of angle, whereas the accelerometer provides the acceleration along the X, Y, and Z axes. The roll, pitch, and yaw angles can easily be calculated from the gyroscope by integrating its output, and they contain very little noise; the gyroscope's major drawback is the drift that occurs over time due to the inaccuracy of the integration. For the accelerometer, the angles are calculated by measuring the component of the gravitational acceleration along each axis, under the assumption that the accelerometer is static. This is reasonable when the accelerations caused by the hand movements are negligible compared to the gravitational acceleration. The major drawback of the accelerometer is the noise contained in its output, although it provides absolute values of the angles. The accuracy of the gyroscope and the absolute nature of the accelerometer are fused together using a complementary filter to obtain an accurate, low-noise measurement.
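As an illustrative sketch (hypothetical helper functions, not the code running on the gripper), the pitch angle can be derived from each sensor as follows:

// Pitch from the accelerometer: assumes the sensor is quasi-static, so that
// gravity dominates the measured acceleration (ax, ay, az in g).
float pitchFromAccel(float ax, float ay, float az) {
  return degrees(atan2(ax, sqrt(ay*ay + az*az)));
}

// Pitch from the gyroscope: integrate the angular rate (deg/s); this is the
// term that accumulates drift over time.
float pitchFromGyro(float prevPitch, float gyroRate, float dt) {
  return prevPitch + gyroRate * dt;
}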
Figure 6.1: Accelerometer Output
Figure 6.2: Gyroscope Output
Figures 6.1 and 6.2 show the pitch angles derived from the accelerometer and the gyroscope individually; 200 data samples were obtained at a 5 ms sampling interval. It can be observed from figure 6.1 that the accelerometer output contains noise components. When the slave gripper's pitch was controlled by the accelerometer alone, vibrations of the gripper were observed even when the handheld gripper was kept stationary. A low-pass filter can be used to eliminate this noise, but it reduces the system's response time. In figure 6.2 the blue line corresponds to the same movement as in figure 6.1, and the red line shows the same movement performed backward, as measured by the gyroscope; it can be seen that a drift develops over time. Figure 6.3 shows the accelerometer and gyroscope outputs fused through the complementary filter. The blue line corresponds to the same forward movement described above and the red line to the backward movement. It can be observed that there is no drift over time and the output contains a very low noise component.
Figure 6.3: Complementary Filter Output
6.2 Hand Motion Tracking Results
Since the results are clearest to present visually in 2D, the Z axis of the robot was kept constant and the hand was moved in the X-Y plane. Figure 6.4 shows the coordinates of the hand over time while a straight-line motion was followed (the projection onto the X-Y plane is shown), with the moving average filter applied.
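The moving average filter referred to here is the 10-sample average implemented in Appendix C; a compact sketch of the same idea, using a circular buffer instead of shifting the array, is:

float[] window = new float[10];  // last 10 samples of one coordinate
int idx = 0;

// Add a new sample and return the 10-point moving average.
float movingAverage(float sample) {
  window[idx] = sample;
  idx = (idx + 1) % window.length;
  float sum = 0;
  for (int q = 0; q < window.length; q++) sum += window[q];
  return sum / window.length;
}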
Figure 6.4: Hand Motion Tracking in 2D space
Figures 6.5 and 6.6 show the respective X and Y axis components of the motion performed in figure 6.4.
Figure 6.5: Hand Position along X axis
Figure 6.6: Hand Position along Y axis
CONCLUSION
The main objective of this project was to design and develop a low-cost teleoperated robotic manipulator that can replicate hand movements and provide haptic feedback. Each chapter discusses the design stages from modeling to real implementation of the system, and the appendices include the code written for the system. Although the robotic manipulator has mechanical instabilities, since it is not an industrial-grade arm, as a whole it can be concluded that this project provides a low-cost platform for teleoperated robotic manipulators for consumer applications. The following future improvements can be suggested:
• Using a mechanically stable robotic manipulator
• Reducing the size of the handheld gripper
• Attaching the robotic manipulator to a mobile robot
• Using a more powerful Wi-Fi module to increase the range
APPENDIX A
Processing Main Program
import SimpleOpenNI.*;
import processing.serial.*;
import g4p_controls.*;
SimpleOpenNI track;  // Kinect interface; referred to as 'track' throughout
String position,position1;
int servo1,servo2,servo3;
float theta2,theta3,theta1;
float xcor,ycor,k1,k2,theta1_r,theta2_r,theta3_r;
float l1=15;
float l2=26.5;
float gamma,l,first,second;
float d,zdash;
float xcor_abs,ycor_abs,zcor_abs;
float amount=10;
int j=0;
int counter=0;
float a1,b1,c1;
float a1max =-100000.0;
float a1min =100000.0;
float b1max =-100000.0;
float b1min =100000.0;
float c1max =-100000.0;
float c1min =100000.0;
int a=0;
int b=0;
int c=0;
int i =0;
float [] xavg={0,1,2,3,4,5,6,7,8,9};
float [] yavg={0,1,2,3,4,5,6,7,8,9};
float [] zavg={0,1,2,3,4,5,6,7,8,9};
int calibrationflag=0;
PVector leftHand = new PVector();
PVector rightHand = new PVector();
PVector leftshoulder = new PVector();
PVector rightshoulder = new PVector();
PVector midpoint = new PVector();
Serial myPort;
Serial robot;
int armAngle = 0;
int angleChange = 5;
final int ANGLE_LIMIT = 135;
short portIndex = 2;
int lf = 10;
String inString;
int calibrating;
67
float angle1;
float dt;
float temp;
float angle2;
float x_acc_g;  // accelerometer X reading received from the gripper (assumed to be updated elsewhere)
void setup() {
track = new SimpleOpenNI(this);
track.enableDepth();
track.enableUser(SimpleOpenNI.SKEL_PROFILE_ALL);
size(640, 480);
fill(255, 0, 0);
String portName = Serial.list()[2];  // gripper controller port (index is machine-specific)
println(Serial.list());
myPort = new Serial(this, portName, 19200);
myPort.clear();
myPort.bufferUntil(lf);
portName = Serial.list()[3];  // SSC-32 port (assumed index; the two devices cannot share one port)
robot = new Serial(this, portName, 9600);
robot.clear();
robot.bufferUntil(lf);
}
void draw() {
int distance = 50;
int x_rotation = 90;
background(255);
track.update();
image(track.depthImage(), 0, 0);
IntVector users = new IntVector();
track.getUsers(users);
if (users.size() > 0) {
int user = users.get(0);
if ( track.isTrackingSkeleton(user)) {
tracking(user);
}
}
myPort.write(123);
}
void serialEvent(Serial p) {
String incoming = myPort.readString();
try {
String[] dataStrings = split(incoming, '/');
String[] analog = split(incoming, ',');
if(analog[0].equals("a"))
{
angle2 = (float(analog[1])-600.0)/10.0;
int gripper = int(-4.54*(float(analog[1]) - 470) + 2250);
robot.write("#4 P"+gripper+" T100");
robot.write(13);
println(float(analog[1]));
}
if(analog[0].equals("p"))
68
{
angle2 = (float(analog[1])-600.0)/10.0;
}
int pitch= int(6.66*(x_acc_g+75.0)+800);
robot.write("#3 P"+pitch+" T100");
robot.write(13);
}
}
}
catch (Exception e) {
println("Caught Exception");
}
}
APPENDIX B
Inverse Kinematic Model Code
void calculate_joint_angles()
{
try {
theta1_r = atan2(ycor_abs,xcor_abs);
xcor = sqrt(pow(xcor_abs,2.0) + pow(ycor_abs,2.0));
ycor = zcor_abs ;
theta3_r = acos((pow(xcor,2.0) + pow(ycor,2.0) - pow(l1,2.0)-pow(l2,2.0)) /
(2*l1*l2));
k1= l1 + l2*cos(-theta3_r);
k2=l2*sin(-theta3_r);
theta2_r = atan2(ycor,xcor) - atan2(k2,k1);
theta1=degrees(theta1_r);
theta2=degrees(theta2_r);
theta3=degrees(theta3_r);
if ( sqrt(pow(xcor_abs,2.0) + pow(ycor_abs,2.0) + pow(zcor_abs,2.0)) < (l1+l2))
{
if( theta2 < 165 && theta3 < 160 )
{
servo1 = int(-8.51*theta1) + 2222 ;
servo3 = int(9.28*theta3) + 1015;
servo2 = int(9.22*theta2) + 640;
}
}
robot.write("#0 P"+servo1+" #1 P"+servo2+"#2 P"+servo3+"T300");
robot.write(13);
}
catch (Exception e){}
}
APPENDIX C
Hand Tracking Code
void tracking(int user) {
stroke(0);
strokeWeight(5);
track.getJointPositionSkeleton(user, SimpleOpenNI.SKEL_LEFT_SHOULDER,
leftshoulder);
track.getJointPositionSkeleton(user, SimpleOpenNI.SKEL_RIGHT_SHOULDER,
rightshoulder);
midpoint = PVector.add(leftshoulder, rightshoulder);
midpoint=PVector.div(midpoint, 2);
track.getJointPositionSkeleton(user, SimpleOpenNI.SKEL_LEFT_HAND, midpoint);  // midpoint now holds the left-hand position
midpoint=PVector.sub(midpoint, rightshoulder);
a = a + int(midpoint.x);
b = b + int(midpoint.y);
c = c + int(midpoint.z);
i++;
if(i==10)
{
a1=a/10;
b1=b/10;
c1=c/10;
i=0;
a=0;
b=0;
c=0;
}
textSize(32);
ycor_abs = (-c1-200)/20.0;
if(ycor_abs<6)
{ycor_abs=6;}
if(ycor_abs>10)
{ycor_abs=10;}
xcor_abs=(a1+ 650)/80.0;
xcor_abs=xcor_abs+5;
if(xcor_abs<6)
{xcor_abs=6;}
if(xcor_abs>10)
{xcor_abs=10;}
xcor_abs = 3*xcor_abs;
ycor_abs=3*ycor_abs;
if(calibrationflag==0)
{
text(a1, 10, 30);
text(b1, 10, 70);
text(c1, 10, 110);
}
if(counter < 300)
{
if(j > 20)
{
if(a1 > a1max)
{
a1max=a1;
}
if(a1 < a1min )
{
a1min=a1;
}
if(b1 > b1max)
{
b1max=b1;
}
if(b1 < b1min )
{
b1min=b1;
}
if(c1 > c1max)
{
c1max=c1;
}
if(c1 < c1min )
{
c1min=c1;
}
}
j++;
if(i>100)
{j=50;}
}
else
{
if(calibrationflag==0)
{
text("CALIBRATION DONE", 200, 200);
calibrationflag = 1;
}
}
counter++;
if(counter > 5000)
{counter=4000;}
println(counter);
if(calibrationflag==0)
{
text(a1min, 200, 30);
text(a1max, 400, 30);
text(b1min, 200, 70);
text(b1max, 400, 70);
text(c1min, 200, 110);
text(c1max, 400, 110);
}
xcor_abs = (-80.0/(a1max-a1min))*(a1-a1max) - 40.0;
zcor_abs = (40.0/(b1max-b1min))*(b1-b1min);
ycor_abs = (40.0/(c1max-c1min))*(c1-c1min);
for(int q = 9; q > 0;q--)
{
xavg[q-1]=xavg[q];
}
xavg[9] = xcor_abs;
xcor_abs = (xavg[0]+xavg[1]+xavg[2]+xavg[3]+xavg[4]+xavg[5]+xavg[6]+xavg[7]+xavg[8]+xavg[9])/10.0;
for(int q = 9; q > 0;q--)
{
yavg[q-1]=yavg[q];
}
yavg[9] = ycor_abs;
ycor_abs = (yavg[0]+yavg[1]+yavg[2]+yavg[3]+yavg[4]+yavg[5]+yavg[6]+yavg[7]+yavg[8]+yavg[9])/10.0;
for(int q = 9; q > 0;q--)
{
zavg[q-1]=zavg[q];
}
zavg[9] =zcor_abs;
zcor_abs = (zavg[0]+zavg[1]+zavg[2]+zavg[3]+zavg[4]+zavg[5]+zavg[6]+zavg[7]+zavg[8]+zavg[9])/10.0;
if(xcor_abs < -40)
{
xcor_abs = -40;
}
if(xcor_abs > 40.0)
{
xcor_abs = 40;
}
if(ycor_abs < 0)
{
ycor_abs = 0;
}
if(ycor_abs > 40.0)
{
ycor_abs = 40;
}
if(zcor_abs < -10)
{
zcor_abs = -10;
}
if(zcor_abs > 40.0)
{
zcor_abs = 40;
}
if(calibrationflag == 1)
{
calculate_joint_angles();
text(xcor_abs, 10, 30);
text(ycor_abs, 10, 70);
text(zcor_abs, 10, 110);
}
else
{
servo1 = int(-8.51*0) + 2222 ;
servo3 = int(9.28*90) + 1015;
servo2 = int(9.22*90) + 640;
robot.write("#0 P"+servo1+" #1 P"+servo2+"#2 P"+servo3+"T300");
robot.write(13);
}
noStroke();
}
}