
ME 135 Summer 2013

Group 1

Project Report

Xingchen Fan

Devin Matthews

Edoardo Colasante

Zuhra Abdurashidova


I.

Background & Motivation

The purpose of this project is to create a robot that can shoot a ping pong ball at a target. The original motivation was to create a beer pong shooter, a toy that people can play with at parties. The technical aspects behind it, including object tracking and motion control, could also be applied to other real-time aiming devices.

We use motors to implement two degrees of freedom: the horizontal motion is created by a Bell-Everman linear motor with PID control, and the pitch motion is generated by a 5V servo motor with PPM (15% to 30% duty cycle). A 12V solenoid triggers the gun and is powered by PWM (similar to PPM, but with a higher duty cycle). A Microsoft Kinect tracks an object of a specified color (a red cup in our tests) and sends position information to the real-time target through UDP and a network variable. The system operates on two platforms simultaneously.

Image processing is done on a PC, while the real-time control of actuators runs on a National Instruments sbRIO 9632, on which an FPGA configures digital input/output and a microcontroller processes data. All programming is in LabVIEW, except the interface with the Kinect, which is written in C.

II.

Anticipated Challenges

When we first came up with the idea, we thought of challenges in terms of both hardware and software. We solved some of them during the planning and design stage. However, others became more troublesome as we went further (see III. Actual Challenges).

Mechanical

From the beginning, we thought our project was fully achievable in terms of hardware. Therefore, we decided to build our own mechanical system, even though nobody in our group had experience designing and manufacturing a mechanical system from scratch, and two team members had only just completed machine shop training.

Initially, we were thinking of a turret with two degrees of freedom, pitch and yaw, controlled by two motors (Figure 1). The gun would be mounted at the top. In addition, we were thinking of another motor to trigger the gun. Two major challenges came to mind.


Figure 1 Turret design revision 1. Two cyan cylinders represent motors.

1.

Design and manufacturing of the turret

On the one hand, our orienting mechanism should be able to support the weight of the gun with its trigger mechanism, and possibly also the weight of one motor stacked on top of the other. On the other hand, the design should be light with low friction so it could rotate more easily. More importantly, due to limited time, we wanted to keep the turret design as simple as possible, so we could have the hardware ready and start testing early.

We quickly altered our idea so that we would use a linear stage in the lab as one degree of freedom to control the distance to the cup, and one motor to control the yaw angle of the gun. That would reduce the amount of mechanical work but maintain the complexity of control with the same number of degrees of freedom. Our second design is shown in Figure 2.

Figure 2 Turret design revision 2. The servo motor to control yaw is shown in blue. The motor for triggering is roughly positioned and shown in cyan. We were planning to make our own gun with aluminum tubes.


2.

Consistent firing

We decided early that we would only control the orientation of the gun, not the launch speed of the ping pong ball. Therefore, one of the major concerns about precision was inconsistent firing. Several factors could affect the ball's trajectory. First, the spring might be loaded inconsistently; the impulse from the released spring could then vary, resulting in an inconsistent launch speed. Second, the ball might interfere with the cartridge; the friction could add unexpected spin to the ball, which would make the trajectory unpredictable. Finally, we were also afraid of wind changing the trajectory of the ball.

Based on these concerns, we decided to buy a toy gun instead of making one ourselves. Store-bought toy guns have precisely molded plastic pieces that guide the loading and travel of the spring very well, which would have been difficult to achieve with our machining skills given the limited time and effort. Later we also chose to fire a solid wooden ball instead of a hollow plastic ping pong ball, which reduced the wind effect.

Software

We were aware that software should be the focus of this project. We lacked experience in interacting with hardware, real-time programming, and multitasking, so more software challenges came to mind. Fortunately, by using LabVIEW to write our host program, we were able to overcome some of them easily.

1.

Motor control

Initially, we were not familiar with the different types of motors. The only motor in our minds was a cylindrical DC motor (clear in our first turret design in Figure 1). We were thinking of algorithms to control the rotational position of the motor. As we later did with the linear stage, we could implement PID control on the voltage supplied to the motor. We would then need a way to sense motor position as feedback. All of this was new to us.

Later, by talking to more experienced people, we realized that a servo motor is built specifically for position control: it already has a built-in feedback control circuit, and the only thing we needed to do was input a reference position. We quickly switched to that option (already clear in our second turret design in Figure 2) and bought a servo motor. We successfully tested it with Arduino's built-in function, although we didn't know how to input a position at a lower level with the sbRIO until very late.

2.

Interfacing the joystick

Initially, we wanted to add a manual mode to our beer pong shooter, in which a joystick could be used to control the yaw and pitch of the gun. We had no idea how the joystick's axis information could be transferred to LabVIEW. This idea was temporarily dropped when we replaced one degree of freedom with the linear stage. Furthermore, the mouse control feature later compensated for this idea (see III. Actual Challenges - Software – 9. Testing without Kinect).

3.

Graphical user interface (GUI)

Before we got into LabVIEW, we didn't realize its power in constructing a GUI for the program. We were thinking of building our own GUI from scratch in C, where every button would take many lines of code and a long time to debug. The GUI became less challenging than we expected once we learned more about the LabVIEW front panel and its simple connection to variables in the block diagram.

4.

Image processing

Since the real-time control of actuators depends on target position data from the vision system, the Kinect's object-finding process should be as fast as possible. Also, the program should be able to identify the cup by its specified color, and ideally also exclude any other objects of the same color (see III. Actual Challenges – Software – 2. Vision system).

5.

Multitasking

Since most of us were used to sequential text-based programming languages, we were initially not sure how to run several tasks simultaneously. However, with loops running in parallel in LabVIEW, we could implement multitasking easily, e.g. running the FPGA read/write loop and the PID control loop simultaneously to control the linear stage.

III.

Actual Challenges

Mechanical

Although we kept our mechanical design simple, we still encountered many real-life hardware problems. We solved some of them mechanically, and others in software, as discussed in the Software section.

1.

Oversimplified design

That was when we ran into the "Hello street" issue. Due to limited time and our lack of confidence in our machining skills, we were thinking of removing one degree of freedom: the pitch control by servo. We would then only have the horizontal position control by the linear stage left, and the cup would have to be placed at a fixed distance from the stage. Since we already had the linear stage working, that would have significantly reduced the amount of work left.

A week before the due date, we easily installed the trigger mechanism and mounted the gun to the linear stage at a fixed pitch angle (Figure 3).


Figure 3 Oversimplified design with the gun at a fixed pitch angle

The professor encouraged us to add one more degree of freedom with the servo motor. We thought the mechanical work was done and felt reluctant to do so. However, we realized that with only one degree of freedom, there would be much less control and software involved; simply put, the project wouldn't be as cool as we wanted. Therefore, we decided to add a servo motor. We wanted the modification to be unintrusive, so that even if we messed up, we could still restore the setup in Figure 3. Team members debated the easiest and safest way to achieve that. Some wanted to use the servo to control the yaw angle, with the linear stage controlling the distance to the cup (like our turret design revision 2 in Figure 2), while others planned to use the servo to change the pitch angle, with the linear stage still adjusting the horizontal position. We chose the latter, since we felt our servo was barely powerful enough to rotate the gun-solenoid assembly, and the control logic would be more straightforward.

2.

Tilted gun

Our servo has plastic gears and a plastic axle, and we were afraid it would not be able to hold the weight of the gun and the solenoid (half a pound). A bearing would have solved this problem, but the setup would have been more complicated. After discussions with the machine shop staff, we planned to mount the gun-solenoid assembly directly onto the servo axle. Fortunately, the servo turned the assembly easily.

What we didn't expect was the tilting of the gun: the long cantilever extending out of the servo axle made the axle deflect. Instead of aiming directly forward, the gun aimed a few degrees to the left. The imperfection was amplified in the trajectory of the ball, which landed about 20 cm to the left of the target cup. The exact error also varied, since the gun became more and more tilted during operation. We threaded the axle and used a screw to tighten the connection, but that didn't improve much. A code solution to this mechanical imperfection was implemented (see III. Actual Challenges - Software – 8. Position correction).

Software

We encountered many software challenges, including challenges in programming itself and software solutions to problems that could not easily be resolved mechanically (see block diagrams in the LabVIEW project).

1.

Linear stage tuning

The linear stage, as one degree of freedom, should perform very well in terms of rise time (fast response), settling time (fast settling), oscillation (steady settling), and steady-state error (settling at the desired position).

Before any control algorithm worked, we had set a large time step (200 ms) for the timed loop in LabVIEW. It prevented us from even stabilizing the system, because the stage was constantly missing the reference position. The tuning process really started after we decreased the time step to 20 ms.

We first implemented a PD control to the system. The system became stable but with long settling time and large steady-state error. Then we added I control to our algorithm, which improved settling time and steady-state error, but also resulted in more oscillations and large overshoot. We played around with gains (weights of each action) to optimize the performance.

Since our cup rim is large compared to the ball diameter and we could move the cup slowly, we decided to minimize I control (0.02 I gain in contrast to 0.12 P gain). Moreover, the integral windup problem made the control trickier: integral action accumulated the large early errors and hence grew rapidly. When the linear stage moved close to the reference position, the large integral action drove it away until proportional action brought it back again. Therefore, we set a maximum position error (8 cm) below which I control started working. In addition, I control continued to act as long as the error was non-zero, so we had to set a dead band (0.5 cm) in order to really stop the linear stage. That also became our position tolerance for the linear stage.
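The anti-windup threshold and dead band described above can be sketched as a single control iteration. This is an illustrative Python sketch, not our LabVIEW block diagram; the gains and thresholds come from the text, while the D gain and the function structure are assumptions.

```python
# Illustrative sketch of one linear-stage PID iteration as described above.
# Gains and thresholds (0.12 P, 0.02 I, 8 cm I-enable, 0.5 cm dead band)
# come from the report; the D gain and function shape are hypothetical.
KP, KI, KD = 0.12, 0.02, 0.0
I_ENABLE_ERROR = 0.08   # I control only active below 8 cm error (m)
DEAD_BAND = 0.005       # 0.5 cm dead band: treat as "settled" (m)

def pid_step(ref, pos, prev_error, integral, dt):
    """One control iteration; returns (command, error, integral)."""
    error = ref - pos
    if abs(error) < DEAD_BAND:
        return 0.0, error, integral      # close enough: stop the stage
    if abs(error) < I_ENABLE_ERROR:
        integral += error * dt           # accumulate I only near the target
    derivative = (error - prev_error) / dt
    command = KP * error + KI * integral + KD * derivative
    return command, error, integral
```

With a large error the integrator stays frozen, so it cannot wind up while the stage travels, and the dead band prevents I action from endlessly nudging the stage around zero.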

2.

Vision system

In order to have the vision system run at maximum speed, the process was broken down into three different threads running concurrently. One thread gathered data from the Kinect sensor, namely the color image and the depth field image. Skeletal tracking was disabled to maximize the Kinect's frame rate. Another thread was devoted to drawing the window, including the Kinect images, and sending the UDP messages to the local LabVIEW VI. If the Kinect thread had new data, the images in the window thread would be updated; otherwise the previous Kinect images would be redrawn.


The final thread was responsible for finding the object in the Kinect images. The color to search for was set by a user mouse click in the drawn Kinect color image. This color, plus or minus a tolerance of 25 for each RGB channel, was then found in the color image, creating a mask. Initially, the centroid of the object was set to the coordinates of the mouse click.

Hard-coded values for the width and height to search (in real-world coordinates) were converted to pixel distances based on the distance of the initial centroid point. This is because at greater depths from the Kinect sensor, a unit of lateral distance spans fewer pixels than the same lateral distance at a closer depth. In order to search the same real-world area, then, different numbers of pixels need to be searched.

A new centroid was then determined by comparing the depth of the original centroid point to the depths of each positive point in the mask (plus or minus a certain tolerance), and averaging the locations of the pixels that fit both criteria. The average location in pixel coordinates was then converted back to real-world coordinates, giving the position at which the gun should aim.
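The color-plus-depth centroid search can be sketched as follows. This is a hedged Python illustration of the logic, not the actual C code: the image representation (dicts of pixels), the depth tolerance value, and the function names are ours.

```python
# Illustrative sketch of the centroid search described above.
# Images are represented as dicts mapping (x, y) pixels to values;
# the real implementation operated on Kinect image buffers in C.
COLOR_TOL = 25      # per-channel RGB tolerance, from the text
DEPTH_TOL = 100     # depth tolerance (sensor units); an assumed value

def color_match(pixel_rgb, target_rgb, tol=COLOR_TOL):
    return all(abs(p - t) <= tol for p, t in zip(pixel_rgb, target_rgb))

def find_centroid(color_img, depth_img, target_rgb, seed_xy):
    """Average the locations of pixels matching both color and depth."""
    seed_depth = depth_img[seed_xy]
    matches = [xy for xy, rgb in color_img.items()
               if color_match(rgb, target_rgb)
               and abs(depth_img[xy] - seed_depth) <= DEPTH_TOL]
    if not matches:
        return seed_xy                   # fall back to the clicked point
    cx = sum(x for x, _ in matches) / len(matches)
    cy = sum(y for _, y in matches) / len(matches)
    return (cx, cy)
```

The depth comparison is what excludes same-colored objects at other distances, as described above.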

The speed of the object-finding algorithm was not measured, but it was assumed to be faster than the process of gathering data from the Kinect, which ran at 60 fps.

3.

Device testing

Before putting everything into a single host VI, we tested our devices individually. We found this testing feature necessary when running the entire system, so we included a manual mode for every device: manually setting the reference position for the linear stage, manually changing the duty cycle for the servo motor, and manually clicking a fire button for the solenoid. We were therefore able to check that each device was working properly before automatic tracking and firing were switched on.

4.

Servo calibration

After we mounted the gun-solenoid assembly to the servo axle, the assembly's position relative to the axle was fixed. We could then convert the duty cycle value to the gun's pitch angle, which would be used in further manipulations. Assuming the relationship is linear, we calibrated it the same way temperature scales are defined: just as the freezing and boiling points of water serve as two fixed points on the Celsius scale, we used the duty cycle at zero pitch angle and at 90-degree pitch angle as our two fixed points, then interpolated between them. The resulting relationship comprised a scale factor (gradient) and a constant offset (axis intercept).
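The two-point calibration above amounts to a simple linear map. The sketch below is illustrative: the two measured duty-cycle values are hypothetical numbers within the 15%-30% PPM range mentioned earlier, since the report does not give the actual calibrated figures.

```python
# Two-point linear calibration between PPM duty cycle and pitch angle,
# as described above. The duty-cycle endpoint values are assumed.
DUTY_AT_0_DEG = 0.15     # measured duty cycle at zero pitch (assumed)
DUTY_AT_90_DEG = 0.30    # measured duty cycle at 90-degree pitch (assumed)

SCALE = (DUTY_AT_90_DEG - DUTY_AT_0_DEG) / 90.0   # gradient
OFFSET = DUTY_AT_0_DEG                            # axis intercept

def angle_to_duty(angle_deg):
    """Interpolate the duty cycle for a desired pitch angle."""
    return SCALE * angle_deg + OFFSET

def duty_to_angle(duty):
    """Invert the calibration: duty cycle back to pitch angle."""
    return (duty - OFFSET) / SCALE
```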

5.

Burning out pins?

In our original host VI, we were running several parallel timed loops to read and write a single FPGA target. The motivation was to better organize controls over different actuators.

During tests, the linear stage worked most of the time, while the servo motor never worked. We used a scope to probe the PPM output pin, which didn't show any signal. We concluded that we had burnt out the pin, probably by not wiring the ground properly. We checked our circuit and did more tests on other pins, but none of them worked. We thought we had burnt out all the pins, so we asked for more. In fact, we hadn't burnt out any pin.

We didn't realize that we should read from and write to the FPGA target in only one loop. With parallel loops running, one loop was overwriting the others; in our case, the stage drive loop was overwriting the servo drive loop, resulting in no output on the servo pin. We figured that out by stopping the execution of one loop with a diagram disable structure. It took a long time to diagnose the problem, but not long to solve it: we simply moved all FPGA references to a single timed loop and redid the wiring.

6.

Trajectory calculation

A subVI converts the distance to the cup into the pitch angle of the gun. The formula we used was very simplified: d = R_max * sin(2θ), where R_max is the maximum range the ball is able to reach (ideally achieved at a 45-degree launch angle), d is the measured distance to the cup, and θ is the desired launch angle.

There are two angles at which the ball can be launched in order to reach the cup; we chose the larger one, θ = 90° − ½ arcsin(d / R_max), since that makes it easier for the ball to fall into the cup. Note that this formula doesn't take any imperfection (air drag, spin, etc.) into account; those were supposed to be cancelled out by position corrections.
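The angle selection can be sketched in Python, assuming the idealized drag-free range relation d = R_max · sin(2θ) for a fixed launch speed; the function name and error handling are ours, not the subVI's.

```python
import math

def launch_angle(distance, max_range):
    """Larger of the two launch angles (degrees) that reach `distance`,
    from d = R_max * sin(2*theta); air drag and spin are ignored."""
    if not 0 < distance <= max_range:
        raise ValueError("target out of range")
    small = 0.5 * math.degrees(math.asin(distance / max_range))
    return 90.0 - small      # the steeper of the two solutions
```

At the maximum range the two solutions coincide at 45 degrees; closer targets give a steeper angle, which drops the ball more vertically into the cup.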

7.

Firing

Making the solenoid fire properly took a lot of effort. The solenoid needed a voltage pulse with high enough current to fire. The 12V power supply was fed into a Toshiba H-bridge, which was controlled by a PWM signal from the sbRIO. We turned on 100% duty cycle PWM for 200 ms so the solenoid would give a powerful push. Up to this point, manual firing functioned properly.

Then we wanted to automate the entire system by making the solenoid fire automatically when appropriate. First we determined a tolerance for the linear stage position and the gun's pitch angle; the solenoid would fire as long as the alignment was within this tolerance. After some tests, we realized that the pitch angle aligned immediately thanks to the fast response of the servo motor, but the linear stage might still be moving within the tolerance band (mainly because of the poorly tuned linear stage and non-rigid mechanical components). At that moment, the firing condition was already true, but firing would be unreliable. Thus, we built a half-second delay into the program, so the solenoid would wait for the system to settle down before firing.
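The firing condition with a settle delay can be sketched as follows. This is an illustrative Python sketch of the logic, not the LabVIEW code: the half-second delay is from the text, while the tolerance values and function shape are assumptions.

```python
# Sketch of the automatic firing condition described above: fire only
# after both axes have stayed within tolerance for a settle delay.
POS_TOL = 0.005      # linear stage tolerance (m); assumed from the dead band
ANGLE_TOL = 1.0      # pitch tolerance (degrees); an assumed value
SETTLE_DELAY = 0.5   # seconds the system must stay aligned, from the text

def fire_decision(pos_err, angle_err, aligned_since, now):
    """Return (fire?, updated aligned_since timestamp or None)."""
    aligned = abs(pos_err) <= POS_TOL and abs(angle_err) <= ANGLE_TOL
    if not aligned:
        return False, None               # misaligned: reset the settle timer
    if aligned_since is None:
        return False, now                # just aligned: start timing
    return (now - aligned_since >= SETTLE_DELAY), aligned_since
```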


8.

Position correction

The Kinect was mounted at constant offsets from the gun. It sent the x and y coordinates of the cup, from its point of view, to the host LabVIEW VI. In addition, more offsets were required due to the tilting of the gun. Therefore, the raw coordinates (x for distance parallel to the stage motion, y for distance perpendicular to it) had to be processed in order for the gun to position and aim correctly.

Initially we used constant offsets for x and y (with y subsequently converted to a duty cycle for the servo PPM). Keeping y constant, we noticed that moving the cup laterally (in x) caused the gun to miss the target, which meant a constant offset was not good enough to interpret the x position. We checked a few x values from the Kinect and noted how far off each shot was. We empirically concluded that a good function for interpreting Kinect coordinates was the following: use the constant offset to calculate the position the stage would move to, then multiply the result by 4/3. Both steps are easy to apply in code. This function worked very well, so we moved on to y-coordinate calibration.
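The empirical x correction above is a one-line function. The 4/3 scale factor is from the text, but the offset value below is purely illustrative, since the report does not give the actual number.

```python
# Sketch of the empirical x correction described above: constant offset,
# then a 4/3 scale factor. The offset value is an assumed placeholder.
X_OFFSET = 0.10    # Kinect-to-stage offset in metres (assumed)

def stage_reference(kinect_x):
    """Convert a Kinect x coordinate to a stage reference position."""
    return (kinect_x + X_OFFSET) * 4.0 / 3.0
```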

Unfortunately, during our trials to calibrate y, we discovered that the x offset function we had just programmed only worked for shots at that particular y distance. This meant both the x and y corrections had to be functions of both x and y. We devised both an empirical and an analytical method to program the offsets taking both x and y into account, but didn't have time to put them into practice (see IV. What we wish we would have done – Software – 2. More accurate position calibration).

9.

Testing without Kinect

Our development of the vision system was completely separate from the rest of the system. The Kinect was not available for testing until the last few days, meaning the target's position information was initially unavailable. Therefore, we devised another way to test the target-tracking abilities of the linear stage and the servo motor.

In order to test the two devices at the same time, we needed to arbitrarily change a pair of values (corresponding to the coordinates of the target). The position of the mouse on the computer screen was a good choice, because it conveniently provides information about two axes. We created a VI on the computer to send mouse data to the real-time target through a network variable. With some calibration, the mouse motion was able to simulate the cup motion, so the two-degree-of-freedom response could be tested without a vision system, though the position offsets were not taken into account in those tests. Moreover, clicking the middle button would trigger the fire action.

Later, when we integrated the Kinect into the system, we kept the mouse control feature, which acted as a manual aiming and firing mode that the user could play with.

10.

Blocked view of target


Another problem we had during tests was the loss of the target in the Kinect's view, due to either people blocking the view or the cup being knocked down. With the original programs, the system would behave unpredictably, and the powerful linear stage could be a source of danger.

The problem was solved in the following way: the Kinect program sends all-zero position information to the sbRIO if the cup disappears from its view, and the real-time host VI then returns the last position of the linear stage as the reference position in succeeding iterations, through a shift register.
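The hold-last-position behavior can be sketched in a few lines. This is an illustrative Python sketch: in the real system the LabVIEW shift register carries the last reference between loop iterations, a role played here by the `last_ref` argument.

```python
# Sketch of the target-loss handling described above: all-zero
# coordinates from the Kinect mean "target lost", and the host holds
# the last valid reference instead of chasing a bogus position.
def next_reference(kinect_xy, last_ref):
    """Pick the reference for this iteration of the control loop."""
    if kinect_xy == (0, 0):      # sentinel: cup lost from view
        return last_ref          # hold position, keep the stage safe
    return kinect_xy             # normal tracking
```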

IV.

What we wish we would have done

In general, we wish we had started the project earlier this summer and spent more time on it. That would not only have increased the quality of our mechanical system and software, but also allowed us to make the project more impressive.

Mechanical

Our biggest mechanical issue was the tilted gun. We would like to use bearings to support the weight of the gun-solenoid assembly, so that the servo motor's only task would be to rotate the assembly. We believe that even just replacing our current servo with a higher-quality one would improve the performance a lot. We actually received a servo with metal gears and a metal axle on the day of the demonstration, just in case our current servo broke.

Software

1.

Trajectory generation for linear stage control

We wish our linear stage performed better. The major issue came from the integral control: with I control only working within preset error bounds (between 0.5 cm and 8 cm), the stage still did not work well when there was a large change in reference. We would like to implement trajectory generation in our algorithm. It would break a large change in reference into smaller steps, so the linear stage could respond more smoothly. That would allow us to move the cup faster while the stage still tracked it steadily.
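The simplest form of trajectory generation is a rate-limited reference ramp, sketched below. This is a hedged illustration of the idea we wished to implement, not existing code; the step limit is an assumed value.

```python
# Sketch of rate-limited trajectory generation: cap how far the
# reference may move per control iteration, so a large step in the
# target position becomes a smooth ramp. The step limit is assumed.
MAX_STEP = 0.01      # maximum reference change per iteration (m)

def ramp_reference(current_ref, target_ref, max_step=MAX_STEP):
    """Move the commanded reference toward the target by at most max_step."""
    delta = target_ref - current_ref
    if abs(delta) <= max_step:
        return target_ref
    return current_ref + max_step * (1 if delta > 0 else -1)
```

Feeding the PID controller this ramped reference keeps the tracking error small at every iteration, which in turn keeps the integral term from winding up after a large jump.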

2.

More accurate position calibration

Empirical method:

Undo any previous functions on the x and y coordinates, keeping only constant offsets to the Kinect values. Perform many practice shots with the cup in different positions, recording the Kinect's x-y pairs and an approximate value for how far off the ball landed from the cup. Then plot the two sets of coordinates (Kinect (x, y) and (Kinect x + x error, Kinect y + y error)) and get a polynomial fit for both sets. Taking the difference between the two fit functions would give our correcting functions for x and y.

Analytical method:

Instead of applying a simple idealized formula (see III. Actual Challenges – Software – 6. Trajectory calculation), we could do a more detailed analysis of the positional relationship among the Kinect, the gun, and the cup, also taking into account the effect of a rotating gun. The functions for x and y would involve many trigonometric functions. With the constant offsets we had before, the non-linearity of these functions was likely causing shots to miss the target.

3.

More advanced object finding algorithm

With more time, the real-world area in which to search for the object would have been made dynamic; as it was, this distance was hard-coded. A more accurate algorithm would take into account the speed and acceleration of the object, making the search area predictive, so that there would be no chance of losing the object by having it move out of the search frame. In order to maintain performance, the resolution of the search area would have to be dropped for larger speeds. At the speeds we were attempting to track, however, this prediction was not necessary.
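The predictive search described above could center the next search window on a constant-acceleration extrapolation of the object's motion. This is a hypothetical sketch of the idea, with names and structure of our own:

```python
# Sketch of the predictive search center discussed above: estimate the
# object's next position from its current velocity and acceleration,
# then center the search window there instead of on the last centroid.
def predict_center(pos, vel, acc, dt):
    """Constant-acceleration prediction: p + v*dt + a*dt^2/2, per axis."""
    return tuple(p + v * dt + 0.5 * a * dt * dt
                 for p, v, a in zip(pos, vel, acc))
```

The velocity and acceleration would themselves be estimated from successive centroid positions, and the window size could shrink at high speeds to keep the per-frame pixel budget constant.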
