ECE 4006
CONVOYbots
Final Report
Anees Elhammali
Michael Malluck
John Parsons
Namrata Sopory
December 11, 2003
Georgia Institute of Technology
College of Engineering
School of Electrical and Computer Engineering
Executive Summary
In the military as well as in scientific expeditions, environments are often deemed too
hazardous for human presence. This project seeks to develop a remotely monitored,
unmanned convoy of robots that could potentially be used for transporting materials over
long distances in such circumstances. A base station will be used to remotely control and
guide the lead robot for the convoy. Visual feedback obtained from the robot will ease
this process. The lead robot will be made to communicate its path to slave robots over an
802.11b wireless connection. The slave robots will then follow, forming an unmanned
convoy. Unique to this project is the use of Arcom's Olympus development board with
Windows XP Embedded, which will function as the robot controller and interface to the
wireless link.
Table of Contents
1. Introduction
2. Design Alternatives and Tradeoffs
3. Marketing and Cost Analysis
4. Project Technical Details
5. Tasks and Schedule
6. Project Demonstration
7. Conclusion
8. Bibliography
9. Appendix A
10. Appendix B
11. Appendix C
12. Appendix D
13. Appendix E
14. Appendix F
Introduction
Today, numerous development boards with an array of embedded software
capabilities are available in the market. These boards can be interfaced with hardware to
achieve a wide range of functionality with varying costs. The aim of this project was to
use three Arcom Olympus boards (running the Windows XP Embedded operating
system), each mounted atop an Amigobot robot, to a.) control the movement of the robot,
b.) establish wireless links with other boards using a wireless game adapter featuring the
802.11b protocol, and c.) transmit the path traversed by a lead robot to slave robots,
causing them to follow, thus forming an unmanned convoy. The Olympus boards were
thus used as robot controllers. Figure 1 shows a block diagram of this project.
Convoys of this nature may find use in military or scientific expedition scenarios
where environments are deemed too harsh or dangerous for human presence.
Figure 1. Block Diagram for the Robot Convoy Project.
This project was divided into different phases. In each phase, the tasks to be
accomplished were modularized and assigned to one or more team members.
The first phase of the project was research and study. Substantial research and
study was undertaken by all team members a.) to determine the feasibility of each goal
b.) to identify technology, code and documentation that could be leveraged to accomplish
project goals, and c.) to allow team members to familiarize themselves with the language
that was used to code the project (Java). This phase was completed successfully.
The second crucial phase of the project involved the development of code. This
phase was modularized to address different aspects of the project. The first among these
modules was the development of a robot control module that was to run on the Olympus
board. The module was to interface the board with the robot and control its movement in
a given direction. To ensure that correct signaling protocols were used, the Amigobot
manual [1] and code written in previous design projects were studied. The second
important task was the development of a standard module to enable wireless
communication between two Olympus boards over the 802.11b protocol using the
wireless game adapter. To this end, the features of the game adapter and Ethernet
protocol requirements were analyzed. Some time was spent initially on setting up and
configuring the wireless router needed to allow the game adapters to communicate.
The third task in this phase involved determining an algorithm to guide the lead
robot. Numerous possible implementations were considered, and the most effective one
was selected. A control station was used to remotely control the lead robot based on this
algorithm. This called for the development of a graphical user interface that would run on
the control station. This stage of the project was also successfully accomplished, although
the time taken to complete this stage exceeded initial projections.
A fourth aspect of this phase was to get visual data from a CMUcam mounted on
the first robot and transmit this to the control station. This task was undertaken and
accomplished successfully.
Throughout the development process, code was tested and integrated in
increments. This saved the team the effort of attempting to integrate all the numerous
project components towards the end of the semester.
A significant amount of time and effort was spent on developing hardware to
mount the various components (board, network adapter and voltage regulators) on top of
the Amigobots. Of importance to note is the development of a power board by the team to
service the differing voltage needs of various hardware components.
Design Alternatives and Tradeoffs
Many factors influenced the choice of the hardware and software tools used in
this project, including compatibility of components (hardware and software), reusability,
and familiarity of the designers with the technology in question. Some design decisions
were influenced by the time constraints surrounding this project. A listing of all the
hardware and software used in this project may be found in Appendices B, D, E, and F.
It should be noted that the final design of the project was a modification of earlier
plans. The project had originally been designed after brainstorming for possible
functionality options, hardware requirements, and a proposed time line. Due to
circumstances arising during the semester, changes were made to algorithm designs to
enable the completion of the project.
The following is a discussion of the design alternatives for this project with
regards to hardware used, software used, networking options and protocols chosen, and
algorithm design options.
Hardware:
The listing of the hardware used can be found in Appendices E and F.
The Amigobot was chosen to act as the underbody for our robot because previous
groups had worked with this platform and shown that it would work in this application.
The work of previous groups had also generated source code to control
the Amigobot that could be adapted to our use. This would help to shorten the time
required to get this part of the design project functional.
The Amigobot incorporates a battery of sonar sensors that make it possible for the
unit to provide feedback information as well as propulsion. This allows the Amigobot to
be used in a number of real world applications where the terrain is uncertain, which is
what made it attractive for use in this automated convoy. Only a limited amount of
information can be sent back to the controller, so it is necessary for the robots to be able
to collect enough information to function by themselves.
Another attractive aspect of the Amigobot is its internal power supply. The
Amigobot contains two lead acid batteries that, in this application, were tapped to supply
power to the additional add-ons. Early on it was felt that this would be necessary: adding
separate batteries to the top of this or another robot would mean additional weight to be
moved by the robot, and thus a slower, less efficient setup.
Next, a suitable control card had to be selected to act as the brain of our
convoy. Money was received from Microsoft to buy several embedded Windows cards.
Because of the high cost of these cards, they had to be purchased with the understanding
that they would need to be flexible enough to use in future applications. Our application
required two COM ports per board, internet access, an OS capable of running Java, and a
processor that could execute Java code with adequate speed. The most reasonably
priced card that met our minimum requirements was the Olympus board made by
Arcom. It contained the largest number of COM ports (four total), on-board Ethernet
access, a PCI expansion slot for other possibilities, and an 800 MHz Celeron
processor. The board came with both Windows CE and Windows XP Embedded. XP was
used because it most easily integrates Java and allows for virtual desktop access, which
allowed the designers to easily run and modify code.
There were several design trade-offs to be considered with the power supply. The
design setup requires both 12 volts at 1 amp and 5 volts at 10 amps. The Amigobot's
battery supply can provide several amps at 12 volts, so a solution had to be found that
could provide 5 volts at 10 amps. There was discussion of using a 5 volt linear regulator,
but research showed that such regulators could not provide 10 amps at 5 volts and are
fairly inefficient, dissipating a lot of heat. In order to get the 5 volt supply it was
necessary to use a switching power supply. Datel was recommended to us as a reliable
supplier of such parts, and after looking through their catalog a supply was found at a
reasonable price.
Software:
With the exception of the GUI, all code for the project has been written in Java.
Alternative languages considered were C and C++. Java was chosen for the following
reasons:
1- The Java API has extensive support for networking applications (using the “sockets
API”). As networking and communications formed an integral part of the project, the
flexibility of the API was expected to be useful, as indeed it proved to be. Also, the
“javax.comm” API helped ease the process of interfacing the “Comm”
(communication) ports on the Olympus board with all supporting hardware.
2- Preexisting code for the CMU Camera, used to gather visual data, was written in
Java. Using C or C++ would have required a translation of this code; the code was
fairly complex, and that task would have required an exorbitant amount of time.
Although the preexisting Amigobot code was written in C, translating the parts of the
code needed into Java was a relatively simpler task.
3- Familiarity with the language: All team members had prior experience coding in
Java. Such was not the case with C and C++. Although it would have been possible
for all team members to come up to speed in this regard, the endeavor posed a steep
learning curve. Given the time constraints for the project, it was decided to use Java.
The tradeoff associated with using Java is that the Java runtime performs
"garbage collection" at certain intervals, which slows the real time response of
the system. Such is not the case with compiled C or C++ programs.
The J# environment in Microsoft's Visual Studio .NET 2003 was used to develop
the GUI for the project. J# is an extension of Java that allows for easy development of
user interfaces through a visual designer rather than requiring the user to hand code the GUI.
Network and Communications:
The nature of the targeted market required the project to be implemented with
network communication tools that fit the structure of the internet today. A real world
application of the project will almost certainly require that the convoy be remotely guided.
Keeping this potential requirement in mind, setting the robots up on a Wireless Local
Area Network (WLAN) was called for. One means of implementing such a network is
the IEEE 802.11b protocol. Alternatives included other IEEE 802.11 versions and
short-range, non-hop-dependent technology such as Bluetooth. The advantages of
802.11b over its alternatives are:
1- 802.11b is the most widely used wireless technology today, which allows for
integration of the convoy into larger networks.
2- 802.11b provides a bandwidth of 11 Mbps, which exceeds the network bandwidth
requirement for this project. The IEEE 802.11a and IEEE 802.11g protocols provide
much higher bandwidths, but in real world applications such large network
bandwidths may not be available for use. It thus made sense to implement the
802.11b protocol in this project.
3- Due to the widespread usage of 802.11b, the related hardware is cost
effective when compared to technology implementing the alternatives.
4- Because 802.11b is the most widely used wireless technology, designing the
product to work with 802.11b makes the product marketable.
IEEE 802.11b was thus the networking protocol of choice for this product.
The second aspect of the communications section of the project was the design of
the communication network between the robots comprising the convoy. These robots
were programmed to move and function as a convoy based on a guidance algorithm.
The original algorithm for this had Bot1 send its commands to both Bot2 and Bot3.
An alternative was to have Bot1 send its commands to Bot2 only, and have the
latter forward the commands to Bot3 after executing them itself. The first option was not
used in the final project, as that algorithm made it difficult to avoid collisions between
Bots 2 and 3. This was because Bot3, having no knowledge of Bot2's position, would
collide with it if Bot2 was stopped unexpectedly. The alternative not only allowed for
collision avoidance in this respect but also allowed for a distribution of the "path data
transmission" workload across all members of the convoy.
The wireless hardware used for this project is listed in the "Communications"
section of the appendix (Appendix B). Each robot in the convoy used a Linksys
Wireless-B Game Adapter to establish a link to the network. Because the Olympus boards
had configured Ethernet ports, using these game adapters reduced the network interface
design time dramatically, as designing software drivers was not necessary.
Visual Feedback:
To enable the end user to navigate the robots with ease, a CMU camera was
mounted atop the lead robot. Alternatives included a stand-alone, lightweight
video camera that could send a real time visual feed to the user. While the latter option
would have allowed for a relatively noise free image and real time viewing, the
CMUCam provided two main advantages:
-The extensive functionality of the camera: the CMU Camera has a color tracking
function, feedback from which can be used to make the robots follow a specific object.
Such functions could be used to develop the project further, for more sophisticated uses.
-Availability of code: Numerous projects have been done with this camera before.
There was therefore a lot of code already written from which to get ideas on how to
retrieve and process the pixel data from the camera.
A tradeoff of using the CMU camera, though, was the low feedback receipt rate.
Although the camera can capture up to 17 frames per second, image data is stored in large
arrays of pixels which take time to transmit. This resulted in a low refresh rate for the
image seen by the user; real time visual feedback was not possible.
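A rough calculation illustrates the bottleneck (the 80 x 143 pixel frame size and the
115200 baud serial rate are assumed CMUcam figures, not measurements taken in this
project):

    80 columns x 143 pixels x 3 bytes per pixel = 34,320 bytes per frame
    115200 baud (8 data bits plus start and stop bits) = 11,520 bytes per second
    34,320 / 11,520 = about 3 seconds per frame

Even before network transmission, then, a full frame takes on the order of seconds to
leave the camera.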
Marketing and Cost Analysis
Numerous applications exist for robots, especially in situations deemed hazardous
or too mundane for human beings. There is at present no physical product realization of a
robot convoy in the market. The following is a market and cost analysis for such a
product.
The markets for the application of single user, remotely controlled robot convoy
technology are immense. In the military, convoys of robots acting as "mules" can be used
to haul ammunition, medical supplies and food. Convoys of drone ambulances can be
used to load wounded soldiers and cart them to hospitals. Robot convoys can also be used
for transportation of equipment and food supplies during scientific expeditions. In both
these cases, robots will replace human beings in a hazardous environment as well as
spare them having to engage in mundane activities, and thus utilize their skills elsewhere.
On a smaller scale, convoys of small robots may be employed in large offices, hospitals
or hotels to assist custodial staff, or act as mail delivery servants. One day, such robots
may even be used to drive trucks or cars in convoys. In every situation, utilizing robots
instead of human beings implies a decrease in danger and effort and/or a possible
increase in efficiency in performing the required task.
It should be noted that the hardware selection for this project was influenced in
part by monetary, familiarity and time considerations. As such, each hardware component
may be replaced by an alternative. For example, given a tighter budget a smaller
embedded development board with less functionality than the Olympus board may be
used; or, to expand functionality to account for obstacle avoidance, GPS may be used in
conjunction with 802.11b as a networking technology between robots. Because the most
important aspect of the product, the software running on the robots, is coded in Java and
is portable, the product may be customized with relative ease. The following cost analysis
delineates the development cost of the product and the estimated list price for it.
The cost savings to a customer in the form of reduced workload would more than
offset the cost of startup equipment.
The total estimated development cost of our project is divided into two sections:
cost of parts, and initial labor (Table 1).
Table 1. The Estimated Initial Cost

Component                          Quantity   Cost per item   Total Cost
Amigobot Robot                     3          $1600           $4800
Arcom Olympus board                3          $1000           $3000
CMU Camera                         1          $150            $150
Linksys WGA11B wireless
  game adapter                     3          $50             $150
Linksys BEFW11S4 Wireless Router   1          $70             $70
Custom made power board
  (refer to Appendix C)            3          $29             $87
Total Cost of Hardware                                        $8257
Development Cost (Labor)           540 hrs    $52/hr          $28000
TOTAL COST                                                    ~$36,257
The total cost of parts for one fully functional robot was found to be on average
$2752, which drove the total cost of parts to $8257. A total of 540 person-hours of work
were put into the project by the team engineers. Assuming that an entry level engineer
earns approximately $100,000 a year ($52 per hour including benefits), the total
development cost is approximately $36,257 per robot convoy. This cost assumes that a
fully operational research and development lab is available to the engineers.
Were the “Convoy bots” to be marketed and sold as a product, Table 2 shows a
listing of development, maintenance, utilities and labor costs that would be incurred, and
charged to a customer.
Table 2. Sunk and Sustenance Costs for a Research and Development Laboratory

Item                                                Cost (in dollars per month)
Location (around 2000 sq feet)                      $2000
Equipment (computer hardware, software licenses,
  test equipment, and workshop tools)               $5000
Marketing (marketing personnel)                     $3000
Development (4 full time engineers)                 $33000
Utilities (including data connection, telephone
  lines, administrative assistance, etc.)           $2000
TOTAL COST                                          ~$45000 per month ($540,000 per year)
Once a product is developed and sold, support for the product would be free but
the customer would be charged for maintenance and consultation fees. The maintenance
fees for each product sold will be charged based on the cost of an engineer’s labor and
necessary equipment.
The bulk of the cost charged to a customer would be for consultation. Because of
the nature of the convoy solution, it can be customized to fit a large sector of
applications, and to customize the convoy for a specific application the customer would
be charged by the hour. According to Table 2, it costs $281 per lab-hour to run a
research lab with four full-time engineers and all the necessary hardware, software,
marketing and administrative personnel, and utilities. Assume 24 customizable convoy
solutions are sold per year, each requiring about 320 engineer-hours of work, roughly
half the time spent developing the first prototype of the solution. At 320 engineer-hours,
or 80 hours of lab time (two weeks of full time work put in by four engineers), per
project, this sums to revenue of $540,000 per year. The 24 projects would thus fill the
entire year of work, with the hardware cost charged to the customer on top of this, and
the resulting revenues would cover the yearly cost of running the research, development,
and sales of our product.
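Spelling out the arithmetic behind these figures (using only the numbers quoted above):
four engineers working roughly 40 hours a week supply about 640 engineer-hours, or 160
lab-hours, per month, so

    $45,000 per month / 160 lab-hours per month = about $281 per lab-hour
    80 lab-hours x $281 per hour = about $22,500 per project
    24 projects x $22,500 = $540,000 per year

which matches the yearly laboratory cost from Table 2.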
Project Technical Details
Figure 2 shows a sample of the final product of the Robot Convoy project.
Figure 2. The Lead Amigobot for the Robot Convoy Project.
The complete project features three Amigobots, each mounted with an Olympus
Embedded XP board, a Linksys Wireless–B game adapter and a power board developed
by the team. These robots are remotely controlled over a wireless 802.11b connection
(using the Linksys Wireless-B broadband router), by a user on a remote machine. A
graphical user interface (GUI) on the remote machine is used to start and guide the lead
robot (Bot1). When commanded to start, Bots 2 and 3 follow the path traversed by Bot 1
thus forming a convoy. A CMU Camera, mounted on the head of Bot 1, returns a visual
image of the area in front of Bot 1 to allow the convoy controller to navigate the convoy
easily. An in-depth technical description of the product follows:
Graphical User Interface:
The GUI developed for this project consisted of two parts: the front-end and the
back-end. The front-end of the GUI is shown in Figure 3. It represents the panel seen by
the user. The GUI is divided into three main sections. Section 1 contains the Start, Stop
and Direction Controls for Bot1. Section 2 contains the Start and Stop buttons for Bot2
(which as we will see later also control Bot3’s movements). The final section contains an
Emergency Stop button that stops all the robots, and a text box that prints the current
status of the application. A menu bar on the GUI allows the user to access ‘Help’ files
describing the objective of the ConvoyBot project as well as ‘How To’ files which
provide a step by step guideline for running and using the software.
Figure 3. GUI Front-End.
The back end of the GUI was divided into two threads. Both are started as soon as
the GUI application is launched. The first thread ‘Client_GUI.java’ sets up a network
connection between the GUI and Bot1 with the GUI acting as a ‘client’, while Bot1 runs
a ‘server’ program. This thread deals exclusively with sending path movement commands
from the GUI to Bot1. When the user clicks on the robot control buttons, a unique
number associated with each button is sent to the ‘Client_GUI’ program. This number is
then wrapped in a TCP packet and sent to Bot1. On Bot1, the number is unwrapped from
the TCP packet, and interpreted to cause the robot to respond.
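As a minimal sketch of this scheme (the class name, port number, and command code
below are illustrative, not taken from the project source; the IP address is Bot1's static
address from Appendix A):

import java.io.PrintWriter;
import java.net.Socket;

// Sketch of the GUI-side command sender.
public class CommandClientSketch {
    public static void main(String[] args) throws Exception {
        // Connect to Bot1's command server (port number illustrative).
        Socket bot1 = new Socket("192.168.1.150", 4444);
        PrintWriter out = new PrintWriter(bot1.getOutputStream(), true);
        out.println(1);  // each GUI button maps to a unique command number
        out.close();
        bot1.close();
    }
}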
The second thread running on the GUI backend is the ‘Client_Camera.java’ thread. This
is responsible solely for receiving image data in pixel arrays from Bot1, creating a
viewable image from this data (using the ‘CameraImage.java’ file), and displaying the
same to the screen. When the application is launched, the Client_Camera program is
automatically run, causing the “camera image feedback” window to display the image
caught by the CMU camera on Bot1. This image is refreshed whenever data arrives at the
client from Bot1.
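A hedged sketch of the image-building step follows (the real CameraImage.java may
differ; this version uses standard java.awt calls and assumes the pixel ordering described
in Appendix D):

import java.awt.image.BufferedImage;

// Sketch: build a displayable image from a packed r,g,b byte array
// sent column by column, top to bottom (see Appendix D).
public class CameraImageSketch {
    static BufferedImage toImage(byte[] pixels, int width, int height) {
        BufferedImage img = new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        int i = 0;
        for (int x = 0; x < width; x++) {       // columns, left to right
            for (int y = 0; y < height; y++) {  // pixels, top to bottom
                int r = pixels[i++] & 0xFF;
                int g = pixels[i++] & 0xFF;
                int b = pixels[i++] & 0xFF;
                img.setRGB(x, y, (r << 16) | (g << 8) | b);
            }
        }
        return img;
    }
}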
Network Connections and Robot Control Algorithm
All three robots were made to run their own servers with static IP addresses.
These servers keep listening until a connection is established on the specified port. Bot2
and Bot3 each have just one server, used to receive path and movement commands. Bot1,
unlike the others, has two; the second server on Bot1 sends the visual feed from
the CMUCam to the GUI.
Bot1 and Bot2 are also acting as clients to their followers. They forward
movement commands to their followers (Bots 2 and 3 respectively) only after they
themselves execute the commands. This was done to ensure that the convoy maintains its
order. The client on Bot1 connects to the server on Bot2, and Bot2's client subsequently
connects to Bot3. The alternative to this method was to have Bot1 as a point of contact for
all convoy members, and have it forward its executed path to both followers. This
approach had disadvantages. Firstly, the follower robots after Bot2 (in this case, Bot3)
have no knowledge of Bot2's location, which could cause a collision if Bot2 was stopped
for any reason. The second reason, briefly discussed in the design alternatives section, is
that if the convoy becomes larger (say 7-8 robots instead of 3) and Bot1 is distributing
commands to all convoy members, the network and processing workload for Bot1 will
increase dramatically, potentially jeopardizing its performance. The current
implementation of the algorithm ensures that the workload and the guidance algorithm
computation are evenly distributed among all convoy members. Figure 4 depicts this
algorithm diagrammatically.
Figure 4. Block Diagram of Code running on the GUI, Bot1, Bot2 and Bot3.
With regards to the code running on the robots, Bot1 launches and starts two
threads when it is activated. One thread communicates with the camera and the other
handles the robot’s movement. The two threads are completely independent of each
other, so they are both run as threads from a class called ‘Server.java’.
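In outline, the structure is as follows (a sketch only; the real Server.java also performs
the socket and serial-port setup described elsewhere in this report):

// Sketch of Server.java's structure: two independent threads.
public class ServerSketch {
    public static void main(String[] args) {
        new Thread(new CameraTask()).start();    // streams camera frames to the GUI
        new Thread(new MovementTask()).start();  // receives and executes move commands
    }
}

class CameraTask implements Runnable {
    public void run() { /* dump frames and send pixel arrays to the GUI */ }
}

class MovementTask implements Runnable {
    public void run() { /* listen for commands and drive the AmigoBot */ }
}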
The camera thread simply sends the ‘dump frame’ command to the camera and
retrieves the stream of pixel data, which it stores in an array. This array is sent to the user,
where it is interpreted and displayed in a window on the user’s desktop.
The movement thread listens to the GUI for commands from the user. Once a
command is received, the thread checks to see if the command is a ‘start’ or ‘stop’
command for Bot1. It then adjusts the robot’s movement status accordingly. A ‘boolean’
variable is set to true if the command was a ‘start’ and false if the command was a ‘stop’.
The variable is set to true by default, so a click of the start button on the GUI is not
necessary for the robot to move unless the stop button is clicked.
If the command is a movement command, the robot checks to see if its movement
‘boolean’ is true and executes the command if this is so. If the ‘boolean’ is false, the
command is ignored. This is because it is assumed that the user is starting and stopping
Bot1 based on visual feedback from the camera. If the user sees something in the
camera that requires stopping the robot, it would not be prudent to keep putting
commands in a queue to be executed blindly once the start button is clicked. If the user
wishes to have the robot take a certain sequence of actions, the start button can be
clicked, followed by the desired movement commands.
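A sketch of this gating logic (the command codes and method names are illustrative, not
the project's actual encoding):

// Sketch of the movement thread's command handling.
public class MovementGateSketch {
    static final int START = 100, STOP = 101;  // illustrative command codes
    private boolean moving = true;  // true by default, so Bot1 responds immediately

    void handle(int command) {
        if (command == START) {
            moving = true;
        } else if (command == STOP) {
            moving = false;
        } else if (moving) {
            // Movement commands are executed only while started;
            // while stopped they are discarded rather than queued.
            execute(command);
        }
    }

    void execute(int command) {
        System.out.println("driving AmigoBot: " + command);  // stand-in for the driver call
    }
}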
The second robot (Bot2) receives commands from Bot1 and begins storing them
in a vector until it receives the command to start moving. At this point, the second robot
starts a new thread to pull move commands out of the vector and send them to the
AmigoBot driver. Once the second robot has executed a move, it sends the command on
to the third robot, which executes the command immediately.
The collision prevention provision in Bot2 and Bot3's respective movement code
checks whether the robot can make its current move without colliding with the robot ahead.
Since the only movement commands are a forward move, rotate 90 degrees left and rotate
90 degrees right, the only command that can cause a collision is the forward move. Thus,
when Bot2 or Bot3 is about to move forward, it checks the remainder of the movement
vector for another forward move. Only if it finds another forward move does it execute
the current one. This simple algorithm works because Bot2 and Bot3 do not receive a
move command until the robot in front of them has executed that command, so a robot
can assume that if it has two or more forward move commands in its vector, it can safely
execute the first one.
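A sketch of that check over the movement vector (the move encoding is illustrative; the
real bot2Mover/bot3Mover code may differ in detail):

import java.util.Vector;

// Sketch of the forward-move collision check described above.
public class CollisionCheckSketch {
    static final String FORWARD = "forward";  // illustrative move encoding

    // A forward move is executed only if another forward move is queued
    // behind it, evidence that the robot ahead has already moved on.
    static boolean safeToMoveForward(Vector<String> moves, int current) {
        for (int i = current + 1; i < moves.size(); i++) {
            if (FORWARD.equals(moves.get(i))) {
                return true;
            }
        }
        return false;
    }
}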
This algorithm, however, will not catch more complicated collision scenarios. For
example, if Bot1 moves forward twice, then turns left twice to turn completely around,
and moves back towards its starting point, it will collide with Bot2. Bot2 will find three
forward moves in its vector and assume that the first two can be executed safely.
Responsibility to avoid these types of collisions, unfortunately, rests with the user. Since
there are infinitely many scenarios for one robot getting in another robot’s way, any
algorithm that tries to determine collision scenarios based on the moves in the vector
would be extremely complicated and would still invariably have some loopholes.
The only manageable way to solve this problem would be to have each robot have
knowledge of where the others are. Unfortunately, the built in coordinate system in the
AmigoBot resets itself after every move is made, making it useless for this application
without additional software to keep track of location information. Even this would not be
suitable for long convoy journeys because the robots can only estimate how far they have
traveled based on the number of times the wheels have rotated. If the wheels slip or if
the convoy travels on different surfaces with varying friction, the robots’ guesses of how
far they have traveled will have a significant error. If this project were implemented on a
large scale for outdoor use, global positioning satellites would need to be used, which
would allow for easier and more reliable tracking of the robots.
The implementation of Bot3 is different from the proposed implementation, which
called for each robot to essentially behave independently but follow the same path.
However, no solution could be found for the problem encountered when trying to forward
moves from robot two to robot three.
Bot2 has two threads running: one that listens to Bot1 for moves and another that
executes the moves. Neither could be used to forward commands in a way that would
allow for autonomous behavior of Bot3. The listening thread could not be used because
Bot3 would only receive move commands when input has been explicitly sent to it. This
causes problems if Bot3 is stopped for a while. When Bot3 is then started, it will only get
commands when more commands are explicitly sent to it. Using the movement thread
causes problems because the movement thread will not forward moves when it is put to
sleep. This also causes problems if Bot3 is started well after Bot2 because it will not get
the moves it needs to catch up to Bot2 if Bot2 is stopped.
Two methods were attempted to get around this, one using an additional thread on
Bot2 and one using both Bot1 and Bot2 to communicate with the third robot, requiring
that Bot3 have threads listening to each robot and a third executing the moves. If a new
thread is put on Bot2 for forwarding commands, it would not know which moves have
been executed by the movement thread, so it would not know when it should forward the
commands. The problem with Bot3 listening to the other two robots is that when the start
command is received by the new thread, it still must let the other threads know. The
thread listening to Bot2 cannot receive this or it may miss a command from robot two.
The start command cannot be sent to the movement thread because there is no guarantee
that it exists yet.
Hardware Details:
The project consists of five major hardware components.
The brain of each robot is the Arcom Olympus board, mounted atop each robot
(Figure 5).
Figure 5. The Olympus board [5].
It provides the processing power and serial ports necessary to integrate all of the
devices. The Olympus board comes with both Windows CE and Windows XP Embedded.
XP was used because it most easily integrates Java and allows for virtual desktop access,
which makes running and modifying the code trivial. This board requires a maximum of
eight amps at five volts and one amp at twelve volts to run properly; although, these
power requirements fluctuate due to the processor load and peripherals used with the
board.
The robot of choice to form the physical convoy is the Amigobot. The Amigobot
provides propulsion, power, and sensor data that can be analyzed via serial port
communication. The Amigobot contains two 12 volt lead acid batteries that provide
power for the other peripherals as well as the Amigobot. The power connection inside
the Amigobot had to be spliced into to make this possible.
The Amigobot itself is a dumb-terminal and plays no role in the algorithm
calculations. It merely follows the commands it is issued. The Amigobot talks to the
Olympus board via communication port one (COM1).
The convoy is provided vision via a CMUCam (Figure 6). This is mounted atop
Bot1. This camera captures images at regular intervals and sends them to the Olympus
board, again by serial communication. This camera requires a five volt DC source in
order to run.
Figure 6. The CMUCam that is mounted atop Bot1.
Communication between the individual bots and the user is provided by Linksys
wireless-B game adapters. These adapters interface the wired Ethernet protocol to the
wireless standard, which allows for greater hardware and software flexibility. The adapter
is not driver dependent: as long as the interfacing board provides Ethernet connectivity, the adapter
can be used to establish a wireless link. This adapter also requires five volts at one
amp in order to run.
In order to power these many peripherals a voltage converter board (VCB) was
created (Figure 7). Due to the small margin for error (a faulty power supply could lead to
the destruction or malfunction of the components), great care was used in its creation.
Figure 7. The Power Board Developed for the Project.
The VCB takes twelve volts from the Amigobot’s batteries and provides a 5 volt
source as well as a prototyping section for any additional modifications. The five volt
section incorporates a Datel 12 VDC to 5 VDC converter (part number LSN-5/10-D12). The
prototyping section was used to supply 12 volts directly to the output.
Noise was another consideration for this board. The DC motors the Amigobot
employs can generate a considerable amount of electrical noise that can be injected back
into the battery supply. To prevent this interference, decoupling capacitors were employed
across each terminal. For each terminal on the VCB, both a large 1000 μF electrolytic
capacitor (Digikey part number P5142-ND) and a smaller 0.1 μF ceramic capacitor
(Digikey part number 1210PHCT-ND) were used. These provide AC shorts that keep
noise from propagating through the circuit.
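The sizing follows from a capacitor's impedance falling with frequency, |Z| = 1/(2*pi*f*C).
As a rough textbook check (not a measurement from this project):

    1000 μF at 1 kHz:  |Z| = 1/(2 x pi x 1000 x 0.001) = about 0.16 Ω
    0.1 μF at 10 MHz:  |Z| = 1/(2 x pi x 10^7 x 10^-7) = about 0.16 Ω

so between them the two capacitors present a low-impedance path to ground for noise
over a wide frequency band.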
Smaller design considerations included a means to connect the wires to the boards
and a method to test their functionality. Two wire terminal blocks (Digikey part number
277-1236-ND) were used to make the wire connection easier, and LED indicators were
included to test the proper operation of the 5 volt and 12 volt board sections. A per-board
parts list and schematic can be found in Appendix E.
The physical design of the board also allows for the use of a potentiometer to
adjust the output voltage of the DC-DC converter. This was found to be unnecessary, so it
was not used in this project, but it has been left on the design board in case future projects
find it useful.
Testing:
Individual modules of the project were tested during the development process. As
individual modules were completed, they were integrated with other components and
tested further. The final product was tested in the following way:
Once set up, all robots were pinged from the control machine to ensure that a
wireless connection had been established successfully. A series of commands was then
sent to Bot1 and its movements were observed. The robot was found to respond well
to the commands sent to it. As all code components had been tested prior to integration
with the hardware, once the integration was accomplished, the robots were found to move
correctly. Only minor changes in the code were required after this. The robot convoy was
tested on both a carpeted floor as well as a smooth tiled floor. The convoy
responded well on both surfaces. In both cases, the formation of the convoy was
maintained well.
During the testing process it was found that:
- When given a number of sequential forward commands, Bots 2 and 3 deviated a
little from the linear path they were supposed to follow behind Bot1. This was attributed
to a.) the wheels of the robots slipping on the smooth floor, and b.) a slight error that
figures into the turns of the robots. Implementing an xy coordinate map as discussed
earlier would help address this issue.
- It was also found that when the Amigobots ran low on battery power, their
response to commands received became unpredictable at times.
Tasks and Schedule
Shown below is the Gantt chart for the project.
Figure 8. Gantt Chart showing projected and actual completion times.
Many of the tasks were compressed into the end of the semester, but fortunately
no major obstacles were encountered. This allowed for the demonstration to take place
just one business day late.
The biggest delay in the project was the late arrival of the hardware. The two
AmigoBots that had to be ordered did not arrive until November 19th. The last two
Arcom Olympus boards did not arrive until the 24th of November. This, as expected,
severely delayed the physical construction of the robots.
The most severe delay on the software side was the overhaul of the computer lab
in which the GUI/robot communication was being tested during the middle of the
semester. This forced Anees and Namrata to work around the times in which the lab was
out of service and to test whatever code they could without any hardware.
The tasks, with one exception, were completed in the order expected. In the first
half of the semester, the basic groundwork was laid with Anees, Michael, and John each
writing a driver to control one of the major components of the project. Namrata,
meanwhile, designed the GUI to be used for getting commands from the user. All of
these tasks were completed about one week later than expected, but this was due to
overly optimistic scheduling rather than any problems with development. Once the
Ethernet driver was completed, the camera driver had to be broken into two sections to
allow for visual feedback to be transmitted over the wireless link.
After the drivers were completed, Anees and Namrata began working on the
algorithm for the first robot and the user interface back end concurrently. These two tasks
were so closely related, especially given that Bot1 was not used as a master robot, that
they were developed at the same time. This timeframe was two weeks late for the Bot1
algorithm, but two weeks early for the user interface back end. The only major
development problem encountered here was attempting to get the image from the camera
to display in the GUI. Eventually, the image was just displayed in a separate window.
Anees and Namrata continued to attempt to get the image to display in the GUI until the
project was completed, but to no avail.
At this point, John and Anees wrote the algorithms for behavior of Bot2 and Bot3.
Adding Bot2 was not a problem and was accomplished within a week. As noted earlier,
however, adding Bot3 created a major problem given the originally desired functionality.
This problem resulted in the task taking 26 days rather than the expected 19.
When all the major materials had finally been collected, Michael began collecting
the minor pieces of hardware and building the robots. Despite the compressed
construction schedule, all of the robots were completed at about the same time as the
software, allowing for thorough testing. Due thorough testing of each piece of code as it
was written, there were no surprises during the final testing and the robots behaved as
expected once the software was downloaded onto them.
Project Demonstration
To demonstrate the functionality of the robot convoy project, the project was first
set up according to the steps delineated in Appendix A.
Once the hardware, network, robot software, and the GUI were set up, the three
Amigobots were placed in line, a foot apart, on a smooth floor in a long corridor. Two
runs of the demonstration were conducted to demonstrate all aspects of the
convoy's functionality. First, the robot control buttons on the GUI were clicked in the
following pattern:
Start Bot1, Forward, Forward, Right, Start Bot2, Forward, Forward, Right, Forward,
Left, Forward, Forward, Forward, Stop Bot1.
Figure 9. Convoy after Bot1 executes ‘Right’ command.
Figure 10. Bot2 Executing the ‘Right’ Command While Bot1 moves forward.
Figure 11. Bot2 Following Bot1 in a Convoy. Bot3 was about to start moving.
Conclusion
To improve on the functionality of the convoy, the robots need to be able to detect
obstacles, navigate around them, and get back into the convoy. This was not implemented
because the robots have no way of knowing where the others are. Without this knowledge
a complex algorithm would be needed that counts the number of left, right and forward
commands needed to get around the obstacle and calculates based on this what moves it
needs to make to get back on the path. Even then, no matter how many possibilities the
algorithm took into account, there would always be the possibility of an even more
complex scenario of obstacles that would throw the entire convoy off.
A much simpler and less vulnerable solution is to either use a GPS system or an
internal xy-coordinate system to keep track of the robots. If this project were
implemented for use with large, outdoor vehicles, then GPS would be the logical choice.
For the Amigobot implementation, however, GPS is not an option because it cannot be
used indoors.
The Amigobot has functionality for an xy-coordinate system, but it resets itself
after each move. Thus, it would require another class that takes the movement data and
keeps track of the robot relative to its starting position. With this information, a robot
could navigate around obstacles and use its coordinate system and that of the robot it is
following to get back into the convoy once it is done with the obstacle. Since each robot
would keep track of its position relative to its own starting position and the robots start
with one robot’s length between them, by attempting to get to the coordinates of the
previous robot, a robot would actually be getting back into the place it should be in the
convoy.
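A sketch of such a tracking class follows, assuming (as in this project) that the only
moves are a fixed-length forward step and 90-degree turns:

// Sketch: dead-reckoning position relative to the starting point.
public class PositionTrackerSketch {
    private int x = 0, y = 0;  // grid position, in forward-step units
    private int heading = 0;   // 0 = +y, 1 = +x, 2 = -y, 3 = -x

    void forward() {           // apply one fixed-length forward step
        switch (heading) {
            case 0: y++; break;
            case 1: x++; break;
            case 2: y--; break;
            case 3: x--; break;
        }
    }

    void turnRight() { heading = (heading + 1) % 4; }
    void turnLeft()  { heading = (heading + 3) % 4; }

    int getX() { return x; }
    int getY() { return y; }
}

Feeding each executed move through such a tracker would give every robot a position
relative to its own starting point, subject to the wheel-slip error discussed above.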
The final product met all project goals. Three Arcom Olympus boards (running
the Windows XP Embedded operating system) were successfully mounted atop three
Amigobots and were effectively interfaced with them. The Olympus boards were able to
control the movement of the robots. Wireless communications between the robots and a
control station were effectively established. A GUI running on the control station was
successfully used to control the movement of the lead robot in the convoy. Visual
feedback from a CMUCam mounted on the lead robot was effectively transmitted over
the wireless connection and displayed on the control station. Lastly, upon receiving
movement and path commands, the lead robot was able to effectively transmit its path to
robots aligned behind it. The ‘slave’ robots successfully followed the lead robot’s path,
thus forming a remotely guided, unmanned convoy in which only the lead robot is
directly controlled.
Bibliography
[1] ActivMedia Robotics, "AmigoBOT Technical Manual," 2000. Available HTTP:
http://www.ece.gatech.edu/research/labs/diglab/downloads/AmigoTech.pdf
[2] Department of Defense Transformation Project. Available HTTP:
http://www.defenselink.mil/specials/transform/
[3] G. Dudek, M. Jenkin, E. Milios, and D. Wilkes, "Experiments in sensing and
communication for robot convoy," International Conference on Intelligent Robots
and Systems (IROS), pp. 268-273, August 1995.
[4] M.J. Mataric, G.S. Sukhatme, and E.H. Ostergaard, "Multi-robot task allocation in
uncertain environments," Autonomous Robots, vol. 14, no. 2-3, pp. 255-263, May
2003.
[5] Arcom Corporation, "Olympus Embedded XP Data Sheet," 2003. Available
HTTP: http://www.arcom.com
Appendix A
1. Setup
1.1 Hardware setup
- Charge all Amigobots (a full charge gives around 20 min of run time
with all components connected).
- Turn on the power switches of the Amigobot, power board, Olympus
board, and CMU camera.
- Align all bots behind each other with a foot and a half of distance in between.
- For more detail refer to the hardware appendix.
1.2 Network setup
- The convoy can be operated on any wireless LAN. We used our own
LAN, which means we had a wireless Linksys router to distribute traffic.
- Each Bot needs to either obtain a dynamic IP address or be assigned a
static IP address. In our case, static IP addresses are used, as listed
in Table A1:

Table A1. Static IP Addresses

Component   Address
Bot1        192.168.1.150
Bot2        192.168.1.160
Bot3        192.168.1.170
- The remote host runs the GUI and obtains its IP address dynamically.
- For more detail refer to the communication appendix.
1.3 Software setup
- Each Bot needs the Java runtime installed; we used JRE 1.4.2 (later
versions have not been tested).
- The host computer needs the .NET runtime installed (Service Pack 1;
later versions have not been tested).
- Each Bot needs specific files to operate correctly, as listed in
Table A2. These are the class files, so the source code needs to be
compiled before putting it on the bots.
Table A2. Source Files Needed on Each Bot

Convoy Member: Bot1
Files: Server.class, Server_Thread.class, Camer.class, CameraImage.class,
ArrayPixels.class, serialComm.class, serialport.class, PixelData.class,
AmigoComm.class, AmigoStatus.class, InfoUpdater.class, Pulser.class,
Setup.bat, JavaRun.bat
File to run: Server.class

Convoy Member: Bot2
Files: bot2_server.class, bot2Mover.class, movedata.class, AmigoComm.class,
AmigoStatus.class, InfoUpdater.class, Pulser.class, Setup.bat, JavaRun.bat
File to run: bot2_server.class

Convoy Member: Bot3
Files: bot3_server.class, bot3Mover.class, movedata.class, AmigoComm.class,
AmigoStatus.class, InfoUpdater.class, Pulser.class, Setup.bat, JavaRun.bat
File to run: bot3_server.class

Convoy Member: GUI Host
Files: Form.jsl, Client_GUI.class, Camer_Client.class, ArrayPixels.class,
serialComm.class, serialport.class, PixelData.class
File to run: WindowsApplication.exe (to run, just click on it)
- Run the software in this order: Bot3, Bot2, Bot1, and finally the host GUI.
- To run code on a Bot, establish a remote desktop connection (Start >
Accessories > Communications, as shown in Figure A1) to each Olympus
board. This gives a window with the desktop view of Windows XP Embedded
on each board. The names of the boards on the remote desktop connection
are Olympus-bot1, Olympus-bot2, and Olympus-bot3 (these names were given
to each board as the computer name and could be changed).
- The login name and password are both Administrator.

Figure A1. Remote desktop connection box.
- To run the software on each robot (Setup.bat and JavaRun.bat are listed
in Table A2), proceed as shown in Figure A2:
o Access the appropriate folder on the bot.
o Command> Setup.bat
o Command> JavaRun.bat File.class

Figure A2. Bot3 command line for running bot3_server.class

- To run the GUI, just click the executable. For more detail refer to the
GUI appendix.
1.4 Driving the Convoy
- Refer to the GUI appendix for driving directions.
Appendix B
1. Communication
- The communication between robots is done through a LAN using the
IEEE 802.11b wireless protocol.
- We used a Linksys WAP11 wireless router, shown below in Figure B1, to
route traffic and distribute IP addresses, and Linksys WGA11B Game
Adapters, also shown in Figure B1, as the wireless interface to each
Olympus board.
- The Game Adapter was configured using the Linksys user guide at
ftp://ftp.linksys.com/pdf/wga11b_ug.pdf
- The Linksys router was configured using the Linksys user guide at
ftp://ftp.linksys.com/pdf/wap11v26ug.pdf
Figure B1. Wireless router and Game Adapter.
- Each Bot on the convoy forwards its executed commands and receives
commands from the preceding member. Each Bot creates a server socket
to listen on a designated port until a connection from the preceding
member is made. As an example, this is how Bot2 listens for Bot1's
commands on port 4444 (classes from java.net):

ServerSocket server = new ServerSocket(4444);  // listen on port 4444
Socket client_bot = server.accept();           // block until Bot1 connects

Bot1 will attempt to connect to Bot2 on IP address 192.168.1.160:

Socket bot2_client = new Socket("192.168.1.160", 4444);

Each Bot will either write to a socket stream or read from it to transmit
information over the network. This is done using a PrintWriter object to
write to a socket and a BufferedReader object to read from one (classes
from java.io). Bot2 reads from its socket as follows:

BufferedReader in = new BufferedReader(new InputStreamReader(client_bot.getInputStream()));
String inputLine = in.readLine();

Bot1, on the other hand, will write to the socket as follows:

PrintWriter out_bot2 = new PrintWriter(bot2_client.getOutputStream(), true);
out_bot2.println("any text you want");
This is how data is transmitted between Bots. Bot1 has an additional
server that sends visual feedback continuously to the GUI. The structure
of the communication is sketched in the flow chart of Figure B2. The port
numbers were chosen arbitrarily from the available ports. For more details
on the algorithm of data distribution between the convoy members, refer to
the technical detail section of this report.
Figure B2. Flow chart of data flow on the convoy network.
Appendix C
1. The Graphical User Interface
- The Graphical User Interface developed for this project is shown in Figure C1.
The labeled regions of the snapshot are the Bot 1 Controls, Robot Controls,
Bot 2 & 3 Controls, Emergency Stop, and Text Messages sections.

Figure C1. GUI snapshot.
- The GUI has 3 main sections:
1- Bot 1 Controls: This section contains 5 buttons:
Start: This button causes Bot1 to begin executing commands sent
to it.
Stop: This button causes Bot1 to stop moving. Bots 2 and 3 will
also stop after catching up to Bot1.
Front, Left and Right: These buttons comprise the Robot Controls
section of the GUI. While the Left and Right buttons only cause the robots
to turn left or right by 90 degrees, it is the Front button that actually causes
the robots to move forward.
2- Bot 2 & Bot 3 Controls: This section contains 2 buttons:
Start: This button causes Bot2 to begin executing commands sent
to it.
Stop: This button causes Bot2 to stop moving. Bot3 will also stop
after catching up to Bot2.
3- Emergency Stop and Text messages:
- A Stop All Robots button has been included in the GUI. This
functions as an emergency stop and causes all robots to stop immediately.
- A text message box at the bottom of the GUI displays the current status of the
application.
2. Getting Started: Running the GUI
- To launch the "ConvoyBotsGUI" user interface, allowing a user to control the
robots, the following steps must be followed:
1. On the machine in use, ensure that Microsoft Visual Studio .Net
2003 has been installed correctly.
2. Open Microsoft Visual Studio .Net 2003.
3. Select File->New->Project and create a new J# project.
4. Navigate to the folder in which the file “Form1.jsl” has been
stored. This file has been stored on the Source CD and may be
downloaded from there. Copy this file to the folder of the project
just created in Microsoft Visual Studio .NET 2003.
5. Once the file has been added to the project, click File->Open in the
Visual Studio environment and open the file.
6. Click Build->Build <Your Project Name>
7. This should compile the project.
8. A text message saying "Build: 1 Succeeded" should be displayed
in the output window.
9. To start the GUI, first ensure that all the robots in the convoy are
running. Then click Debug-> Start without Debugging. This will
run the GUI.
10. To exit the program, click the ‘x’ button on the right hand corner
of the GUI, or click File->Exit.
Appendix D
1. The CMU Camera communicates with the robot via COM port 2 (COM2) on the
Olympus board.
- The only command used in this project is the dump frame command, which
involves sending the string "\df" to the camera.
- The camera sends back the string "ack" to acknowledge the command, followed
by a stream of bytes for the pixels in the image.
- A '1' is sent for a new frame and a '2' is sent for a new column.
- The pixel data is sent pixel by pixel from top to bottom and column by column
from left to right. For each pixel, red, green and blue values are sent in that order
as bytes. (A sketch of reading this stream is given below.)
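A hedged sketch of reading that stream (the serial-port setup via javax.comm is omitted;
the framing follows the description above, and a real implementation must take care to
distinguish the marker bytes from pixel values):

import java.io.InputStream;
import java.io.OutputStream;

// Sketch of the dump-frame exchange over COM2 (port setup omitted).
public class DumpFrameSketch {
    static byte[] dumpFrame(InputStream in, OutputStream out,
                            int width, int height) throws Exception {
        out.write("\\df".getBytes());            // the dump frame command described above
        out.flush();
        for (int i = 0; i < 3; i++) in.read();   // consume the "ack" acknowledgement
        byte[] pixels = new byte[width * height * 3];
        int n = 0;
        in.read();                               // '1' marks the start of a new frame
        for (int col = 0; col < width; col++) {
            if (col > 0) in.read();              // '2' marks each new column
            for (int i = 0; i < height * 3; i++) {
                pixels[n++] = (byte) in.read();  // r, g, b for each pixel, top to bottom
            }
        }
        return pixels;
    }
}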
2. The camera appears in Figure D1.
Figure D1. CMU Camera
Appendix E
1. Parts list: totals per board are given in Table E1.

Table E1. Cost of Power Board

Part Name                       Part Number           Price per unit   Number per board
PCB board construction          N/A                   $8.94            1
1000 μF 16 V electrolytic
  capacitor                     Digikey P5142-ND      $0.144           6
0.1 μF 50 V ceramic capacitor   Digikey 1210PHCT-ND   $0.05783         6
2-wire terminal blocks          Digikey 277-1236-ND   $0.429           6
Slide switch                    Digikey EG1906        $0.31537         1
470 Ω resistor                  Digikey 470ETR-ND     $0.01495         1
100 Ω resistor                  Digikey 100ETR-ND     $0.01495         1
LED                             Digikey               $0.0615          2
12 VDC to 5 VDC converter       Datel LSN-5/10-D12    $15.60           1

Total price for one board: $28.78

2. Schematics of the power board are shown in Figure E1.

Figure E1. PCB Board construction schematics.
Appendix F
1. There are several specifics that are necessary to set up and operate the Convoybot.
- The first is the layout and position of the major components that make up the
Convoybot.
Figure F1. Convoy bot components (Amigobot setup).
- The four major components in Figure F1 are:
1. Olympus Board: The main processing unit.
2. Wireless Game Adapter: Provides communication between the units.
3. Power Supply Board: Provides power for the different units.
4. Master power switch: Controls power for all external components.
A more detailed view of the Olympus board shows where all of the peripherals
connect to the board.
Figure F2. Convoy Bot peripheral connections.
- The major connections in Figure F2 are:
1. Wireless Ethernet Adapter connection (Ethernet Port)
2. Amigobot Connection (Com1)
3. CMUCam Connection (Com2)
4. Power connection
- A close look at the Voltage Controller Card will reveal the underlying connections.
Figure F3. Voltage Controller Card.
- The connections in Figure F3 are as follows:
1. Olympus 5 volt power connector.
2. Wireless Ethernet Adapter. (5 Volts)
3. CMUCam connector. (5 Volts)
4. Olympus 12 volt power connector.
5. Unused 12 volt connector.
6. 12 volt battery input.