Final Report

The Intellimouse Explorer: An Intelligent Way to Solve a Maze
V. Fucsko, B. Houle, H. Mitro, M. Olson
Abstract – Micromouse competitions have been held around the world since the 1970s; the modern form of the competition originated in the 1980s. A modern competition maze consists of 16 by 16 cells. The "mouse", an autonomous robot, must find the center of the maze in the smallest time possible. The mouse must keep track of the path it has traveled and must not damage the maze at any time. The competition rules are based on the IEEE APEC Micromouse regulations.
I. Introduction
The Intellimouse team successfully designed and tested a working micromouse prototype that conforms to the IEEE APEC Micromouse contest rules. The design integrates a Handy Board microcontroller, two stepper motors, and twelve sensors to allow the micromouse to autonomously navigate a given maze. The design was completed well under budget (see Section VII) and was successfully demonstrated at a Northwest-area IEEE-sponsored competition. Analysis of the micromouse's hardware reliability and its product life cycle is included in this report. Additionally, we include a "Future Work" section detailing our recommendations and experiences for the benefit of any future teams that may continue the project.
The goal of the Intellimouse Explorer project was to design a self-contained, autonomous, electrically powered vehicle, called a "micromouse", to negotiate a path to the center of a maze. We built the micromouse in accordance with the IEEE APEC Micromouse Contest Rules.
A. Specifications
The micromouse must be self-contained and fit within a 25-cm square area. The maze comprises a 16 x 16 array of 18-cm square cells with white, infrared-reflective walls and a black, infrared-absorbing floor. The tops of the walls are red. The micromouse starts in a corner of the maze and searches for the goal in the center of the maze. After finding the center, the micromouse remembers the course it traveled. It then begins a "speed run" in which it returns to its starting point and navigates its way back to the center of the maze in the minimum amount of time. During this process, the micromouse operates without outside intervention.
II. Procedure
A. Microcontroller
The microcontroller is the brain of the micromouse; it controls everything the micromouse does while in the maze. Based on the inputs the microcontroller receives from the sensors, it calculates a path for the micromouse to navigate the maze. It also controls the drive motors to move the micromouse and, after reaching the goal, optimizes its path back to the starting square. The microcontroller board is low power, and the entire board is smaller than 9 x 12 cm. We also desired a microcontroller that is C-programmable and user-friendly. Based on these requirements, we decided to use the MIT Handy Board, a Motorola 68HC11-based microcontroller. We broke the work on the microcontroller down into three functions: path finding, movement, and sensing. The path-finding algorithm receives inputs from the sensors and calculates the best move; the movement algorithm then outputs commands to the motor drivers. The micromouse navigates the maze using the sensor wing for guidance; the sensor wing detects when the micromouse is off-center and makes minor corrective adjustments to the micromouse's heading.
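The flow chart in Appendix A (Figure 9) names the pieces that tie these three functions together. As a rough illustration only (not the actual Intellimouse code), the loop might be organized as follows, borrowing the flow chart's step names with placeholder declarations:

    /* Sketch of the sense -> decide -> move loop, following the
     * Figure 9 flow chart. Function bodies are placeholders, not the
     * actual Intellimouse implementation. */
    int  atGoal(void);          /* true when the mouse is in the goal cell    */
    void LookAround(void);      /* sensing: read wing and distance sensors    */
    int  ChooseDir(void);       /* path finding: pick a better neighbor, -1 if none */
    void Flood(void);           /* re-flood cell values with new wall info    */
    void TurnTo(int dir);       /* movement: rotate toward the chosen direction */
    void StepForward(void);     /* movement: advance one cell, staying centered */

    void run_maze(void)
    {
        int dir;
        while (!atGoal()) {
            LookAround();             /* update the wall map from the sensors */
            dir = ChooseDir();        /* lowest-valued open neighbor          */
            if (dir < 0) {            /* no better neighbor: re-flood first   */
                Flood();
                continue;
            }
            TurnTo(dir);
            StepForward();
        }
    }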
B. Drive System
The micromouse's drive system consists of two stepper motors, with ball casters for balance (as shown in Figure 1). The micromouse turns by rotating the motors in opposite directions, allowing for 360° turning within a single cell of the maze. Because of typical maze design, acceleration dynamics play a larger role in the micromouse's maze-solving time than its top speed. Based on a worst-case analysis of the maze with our time goal of 10 minutes for a run, the motors must accelerate the micromouse at 0.8 m/s² and drive the micromouse at an average speed of about 0.1 m/s. According to these criteria, we chose the ReliaPro 42BYG stepper motor. It is a high-torque stepper motor, which gives plenty of acceleration for dynamic operation.
The Handy Board integrates many functions that simplify the drive-system design. It contains an onboard motor controller and built-in libraries, which allowed us to connect the motors directly to the Handy Board without developing extra control algorithms. The motors are rated for 6-12 V at under 1.25 A each, so they can draw their power directly from the Handy Board's power supply.
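As a rough feasibility check of these figures (a back-of-the-envelope sketch, not the team's original worst-case analysis), an average speed of 0.1 m/s over a 10-minute run covers roughly 333 of the 18-cm cells, comfortably more than the 256 cells in the maze:

    /* Rough feasibility check (not the original worst-case analysis):
     * how many 18-cm cells can be covered in a 10-minute run at a
     * given average speed? */
    #include <stdio.h>

    int main(void)
    {
        double cell_m    = 0.18;    /* maze cell size: 18 cm           */
        double run_s     = 600.0;   /* 10-minute time goal, in seconds */
        double avg_speed = 0.1;     /* average-speed target, m/s       */
        double cells     = (avg_speed * run_s) / cell_m;

        /* ~333 cells: every one of the 256 maze cells could be visited
         * once with margin to spare, ignoring turning and sensing time. */
        printf("cells reachable in %.0f s at %.2f m/s: %.0f\n",
               run_s, avg_speed, cells);
        return 0;
    }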
C. Sensors
The micromouse uses two types of sensors, analog and digital. It is equipped with a row of eight Fairchild QRB1134 proximity sensors that overhang the tops of the maze walls in a "sensor wing" extending to both sides of the micromouse; these keep the micromouse centered within the cell and map the environment in adjacent cells. They are digital sensors, which output a one when they detect a wall and a zero otherwise. We use Sharp GP2D120 short-range distance sensors and Sharp GP2D12 long-range distance sensors to detect the length of the path in front of and to the sides of the micromouse. These infrared sensors are low-cost, high-resolution sensors and have been used effectively in several micromouse designs. They produce an output that varies with the distance to the nearest object within a range of 10 to 80 cm. Because the sensors are minimally affected by reflectivity and surrounding light, we can calibrate them to function predictably for varying maze and ambient light conditions.
Figure 1: Chassis Side View
III. Systems
A. Chassis Design
Figures 1 and 3 show the micromouse chassis design. We used the Handy Board's built-in plastic housing as the base of our chassis, attaching the motors and ball casters directly to it.
The chassis required two wheels with sufficient traction to avoid slipping and two ball casters for balance. The GMPW wheels are made to match almost any DC or stepper motor. The wheels are molded from ABS (acrylonitrile butadiene styrene) plastic and measure 58 mm across by 7.62 mm wide. Each wheel uses a rubber tire that increases the total wheel diameter to 60 mm and dramatically improves the wheel's grip. Two Tamiya, Inc. ball casters (shown in Figure 2) balance the micromouse. These casters attach directly to the micromouse housing, using standoffs to place them at the correct height.
Figure 2: Ball Caster
Figure 3 below shows the sensor mounting.
Figure 3: Sensor Position (bottom view; all dimensions in cm)
Figure 4: Sensor Wing Function (wing proximity sensors numbered 1-8)
The distance sensors attach to the front of the motors and to the sides of the ball casters, as shown in Figure 3. The spacing between adjacent proximity sensors is approximately 10 mm, and the wing is 24 cm long. The wing is composed of wooden dowels attached to the front of the chassis. The front sensor wing holds the eight QRB1134 infrared proximity sensors 25 mm in front of the microcontroller and 2 mm above the tops of the maze walls. The sensors form a line directly above a forward wall when the mouse sits in a cell. They minimize the micromouse's error during straight-line movement and detect dead-end paths.
B. Centering
The micromouse uses a very simple proportional-response control system to keep itself centered within a cell. When the micromouse is centered, sensors 2 and 7 (see Figure 4) see a wall. If the micromouse veers slightly from its course, sensor 3 or 6 sees a wall, indicating that the micromouse should make a minor adjustment toward the middle of the path. If the micromouse veers far enough from the path that sensor 4 or 5 sees a wall, the micromouse makes a larger turn toward the center.
This centering algorithm works very well for
dense mazes and on long straight paths. While
reducing the number of walls in the maze hinders
the response of the centering algorithm, the tests
indicate that the micromouse still centers
adequately in sparsely populated mazes.
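A minimal sketch of this proportional response is shown below. The sensor numbering follows Figure 4, but the helper functions, correction magnitudes, and the left/right assignment of sensors 3-6 are illustrative assumptions, not the actual Intellimouse code:

    /* Sketch of the proportional centering response described above.
     * wing_sensor(n) and adjust_heading(deg) are hypothetical helpers;
     * the correction sizes and which side sensors 3-6 sit on are
     * assumptions. Sensor numbering follows Figure 4 (1..8 across the wing). */
    int  wing_sensor(int n);            /* 1 if wing sensor n sees a wall below it      */
    void adjust_heading(int degrees);   /* positive steers one way, negative the other  */

    void keep_centered(void)
    {
        if (wing_sensor(2) && wing_sensor(7))
            return;                                   /* centered: no correction needed */

        if (wing_sensor(4) || wing_sensor(5)) {
            /* veered far off the path: make a large turn toward the center */
            adjust_heading(wing_sensor(4) ? 10 : -10);
        } else if (wing_sensor(3) || wing_sensor(6)) {
            /* slightly off center: make a minor corrective adjustment */
            adjust_heading(wing_sensor(3) ? 3 : -3);
        }
    }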
IV. Hardware Reliability
To determine the feasibility of mass-producing our micromouse design, we looked at its potential failure modes. After a detailed analysis of our design, we organized the failure types into three categories: Handy Board failures, electrical failures, and mechanical failures.
Component            Severity   Occurrence   Reparability   RPN
Handy Board Failures
  32K SRAM               8          2             3           48
  Mem. Latch             8          5             2           80
  Motorola 68HC11        8          1             4           32
  16x2 LCD               3          3             4           36
  Passive                5          3             1           15
  Active                 5          3             1           15
  Power Adp.             7          3             2           42
  LEDs                   2          4             1            8
  Motor Drivers          8          3             3           72
  Buttons                4          1             5           20
  Trace Failure          7          2             9          126
Electrical
  8 AA NiCad             8          5             2           80
  Sharp Sensor           4          7             1           28
  Prox Sensor            2          7             1           14
Mechanical
  GMPW Wheel             4          5             1           20
  Glue                   3          1             1            3
  Steppers               8          1             8           64
Table 1: Potential Failure Modes
We calculated the failure mode and effects analysis (FMEA) using three ratings on a 10-point scale, with higher scores indicating increased risk. The three ratings are severity, occurrence, and reparability. The severity rating is an estimate of the severity of the repercussions of a given failure. The occurrence rating is an estimate of how often a failure type might occur. The reparability (or detectability) rating is an estimate of the difficulty of compensating for a given part or failure type. The product of these three ratings provides a risk priority number (RPN), which prioritizes the sub-modules that need improvement.
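As a worked example of the RPN calculation, take the trace-failure row of Table 1: RPN = severity x occurrence x reparability = 7 x 2 x 9 = 126, the highest-risk entry in the table.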
Figure 5 shows the fault tree analysis of the Intellimouse Explorer. These values are estimates based upon a production volume of 10,000 units. The analysis shows that our product has a 4.47% failure rate, meaning that 447 out of 10,000 units would be defective.
Figure 5: Fault Tree Analysis

V. Project Life Cycle
The product life cycle analysis outlines our plan for the micromouse's release into the market after we complete the design phase.
A. Introduction to Market and Growth
Once the product is released, we will need to provide customer support. Since our design team is small, the most efficient way of supporting the users is to ensure that we supply adequate documentation to the user. Since the micromouse will most likely be used by high school students learning about the different components of robotics and by robot hobbyists who wish to start with and improve a base design, we will include a copy of Interactive C with the product. Interactive C gives the user the ability to modify certain aspects of the micromouse's code during runtime. We will also provide a troubleshooting guide, project ideas, the code we developed in testing, and our development tools.
B. Maturity
The market for the micromouse is very limited and would rapidly saturate. As sales of the current iteration of the micromouse begin to dwindle, we can make software and hardware improvements to the micromouse to produce an improved version for future sales. See the Future Work section (Section VIII) for details on possible improvements to the current design.
C. Decline
Eventually the market for the micromouse
will drop below the profitable level and the
product will be discontinued. To extend the
project’s life, we can continually introduce new
versions of the design.
D. Disposal
Although all of the components in the micromouse can be discarded when they fail, most of them can be recycled through various means. The battery pack is a set of NiMH (nickel metal hydride) rechargeable batteries shrink-sealed in a plastic coating. Disposal of NiMH batteries with normal waste is permitted, but many battery disposal facilities exist that will recycle the components of the batteries. The circuit board can also be recycled, but electronic component recycling in the United States is still a developing technology and may not be readily available in all areas.
VI. Test Results
We verified that the micromouse operates in accordance with the specifications using the procedures outlined below. During development, we tested each of the subsystems individually. After we completed the design, we verified the performance of the system as a whole. The desired characteristics and the guaranteed specifications of the design are summarized in Table 2 below.
Measured Characteristic                               Original Specifications   Test Results
Straight-Line Movement (w/out sensor feedback)
  (1) Variation in distance traveled (%)                        1                   3.3
  (2) Perpendicular offset (% of distance traveled)             3                  29
  (3) Velocity (m/sec)                                          0.1                 0.12
Turning
  (4) Variation in degrees turned (%)                           1                   0.8
  (5) Angular velocity (deg/sec)                              120                 130
Micromouse in Maze
  (6) Average steps before collision                      infinite                 75
  (7) Average steps to solve maze                             175                  60
  (8) Average time to solve maze (min)                        N/A*                  2.14
  (9) Maximum time to solve maze (min)                         10                   3.73
  (10) Average time per cell (sec)                            N/A*                  2.13
Micromouse in Software
  (11) Average steps to solve maze                            175                 133
  (12) Maximum steps to solve maze                            N/A*                578
  (13) Average turns to solve maze                            N/A*                129
  (14) Average time to solve maze (min)                       N/A*                  4.9
  (15) Mazes solved within time limit (%)                     100                  81
General
  (16) Max length (cm)                                         25                  14.5
  (17) Max width, including wings (cm)                         25                  24.5
  (18) Min battery life (min)                                  15                 > 60
  (19) Max code size (KB)**                                    16                  10.3
Table 2: Test Results

A. Software
The software consists of the hardware-interaction and maze-solving components. We designed a random maze generator and statistical testing functions to test various maze-solving algorithms. To test the algorithms thoroughly, the test mazes contain dead ends, stair-step diagonals, loops, and multiple possible paths to the center goal. We tested our maze-solving algorithms on over 100,000 mazes without failure. The average steps (11), maximum steps (12), and average number of turns (13) are shown in Table 2. In addition, we used the straight-line and angular velocity data to determine the distribution of the time required to solve a maze across many mazes. The average time to solve the maze (14) and the percent of all mazes that could be solved within the ten-minute time limit (15) are also reported in Table 2.
Figure 6: Maze Solver Performance (histogram of the percent of all mazes vs. time to solve, in minutes, for 10,000 dense random mazes, 10,000 less dense random mazes, and 44 competition mazes)
Figure 6 above shows the performance of
the solving algorithm on three different types of
mazes. The average solving time is 5 minutes per
maze. However, there are mazes that the
Intellimouse is unable to solve in less than 10
minutes due to the complexity of the maze and
its maximum velocity of about 0.1 m/s.
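The kind of statistical testing described in the Software section could be driven by a harness along the following lines; the maze-generator and solver calls are hypothetical stand-ins, not the team's actual simulator API:

    /* Sketch of a statistical test loop over many random mazes. The
     * generate_random_maze(), simulate_solver(), and steps_taken()
     * calls are hypothetical placeholders for the team's simulator. */
    #include <stdio.h>

    struct stats { long runs; long solved_in_time; double total_steps; };

    extern void   generate_random_maze(int dense);  /* hypothetical              */
    extern double simulate_solver(void);            /* returns minutes (hypoth.) */
    extern long   steps_taken(void);                /* hypothetical              */

    void run_trials(long n, int dense, struct stats *s)
    {
        long i;
        for (i = 0; i < n; i++) {
            double minutes;
            generate_random_maze(dense);
            minutes = simulate_solver();
            s->runs++;
            s->total_steps += steps_taken();
            if (minutes <= 10.0)              /* ten-minute competition limit */
                s->solved_in_time++;
        }
    }

    int main(void)
    {
        struct stats s = { 0, 0, 0.0 };
        run_trials(10000, 1, &s);             /* e.g. 10,000 dense random mazes */
        printf("avg steps %.1f, solved in time %.1f%%\n",
               s.total_steps / s.runs,
               100.0 * s.solved_in_time / s.runs);
        return 0;
    }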
B. FloodFilling
A modified floodfill algorithm is used to navigate the maze. Floodfill "floods" the maze, assigning to each square the number of moves it would take to reach the goal from that square. Because the value of each square indicates its distance from the goal, the mouse always moves to the neighbor with the lowest value. As the mouse discovers more of the maze, the values of the squares change to represent the new distances from the goal. An example is shown below in Figure 7.
Seen Maze 1 (flood values; X marks the mouse, goal = 0 at (4, 4)):

          x=0  x=1  x=2  x=3  x=4
    y=0    8    7    6    5    4
    y=1    7    6    5    4    3
    y=2    6    5    4    3    2
    y=3    5    X    3    2    1
    y=4    6    3    2    1    0

Seen Maze 2 (after the newly discovered wall has been flooded in):

          x=0  x=1  x=2  x=3  x=4
    y=0    8    7    6    5    4
    y=1    7    6    5    4    3
    y=2    6    5    4    3    2
    y=3    7    8    3    2    1
    y=4    8    X    2    1    0

Figure 7: Maze Solution
In this simplified example, the goal is the square located in the lower right of the maze at (4, 4). If the mouse is located at (1, 3) with the information shown in Seen Maze 1 of Figure 7, the mouse will move down to position (1, 4) because its value of 3 is the lowest among its neighbors [3, 5]. Once it moves to (1, 4), it sees the wall blocking its path to the goal. The grid is re-flooded to incorporate the new information, and the mouse again moves to the lowest-valued neighbor, the 8 at (1, 3). This process continues until the mouse reaches the goal.
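A sketch of the flood step illustrated above is given below. The value array, the wall test, and the 5 x 5 size (matching the simplified example; the real maze is 16 x 16) are illustrative assumptions, not the Intellimouse data structures:

    /* Sketch of the flood step from Figure 7: each square is assigned
     * its distance (in moves) to the goal, respecting known walls.
     * value[][] and wall_between() are hypothetical illustrations. The
     * recursion mirrors the approach described in the report (whose
     * memory cost is discussed under Future Work). */
    #define N       5
    #define UNKNOWN 255

    unsigned char value[N][N];
    int wall_between(int x, int y, int nx, int ny);   /* 1 if a known wall blocks the move */

    static void flood(int x, int y, unsigned char v)
    {
        static const int dx[4] = { 0, 1, 0, -1 };
        static const int dy[4] = { -1, 0, 1, 0 };
        int d;

        if (v >= value[y][x])
            return;                           /* already reached more cheaply */
        value[y][x] = v;

        for (d = 0; d < 4; d++) {
            int nx = x + dx[d], ny = y + dy[d];
            if (nx < 0 || nx >= N || ny < 0 || ny >= N)
                continue;
            if (wall_between(x, y, nx, ny))
                continue;
            flood(nx, ny, (unsigned char)(v + 1));
        }
    }

    void reflood(int goal_x, int goal_y)
    {
        int x, y;
        for (y = 0; y < N; y++)
            for (x = 0; x < N; x++)
                value[y][x] = UNKNOWN;
        flood(goal_x, goal_y, 0);             /* goal gets 0, its neighbors 1, ... */
    }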
1. Biasing
An issue that becomes apparent once the mouse is running is what to do when two neighbors have the same value. The order in which equal values are chosen can have a significant impact on the time to solve certain mazes. The biasing system we used is illustrated in Figure 8.
Equal values are chosen based on the quadrant in which the mouse resides. These priorities are represented by ordered matrices. For example, the matrix for quadrant one is [3, 1, 0, 2]. The numbers represent directions in clockwise order, starting with zero being up, so the preferred directions are [left, right, up, down]. From Figure 8 we can see that the mouse favors an overall counterclockwise rotation about the maze. There is no definitive reason for the counterclockwise biasing; a clockwise bias would work just as well on random mazes. The maze we originally programmed for, however, favored a counterclockwise bias.
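A sketch of this tie-breaking scheme follows. The direction encoding (0 = up, 1 = right, 2 = down, 3 = left) and the quadrant-one priority [3, 1, 0, 2] come from the text above; the remaining priority rows and the helper functions are placeholders, not the actual code:

    /* Sketch of quadrant-based tie-breaking among equal flood values.
     * Only the quadrant-one priority [3, 1, 0, 2] (left, right, up,
     * down) is given in the report; the other rows are placeholders.
     * neighbor_value() and current_quadrant() are hypothetical helpers. */
    int neighbor_value(int x, int y, int dir);  /* flood value in that direction, -1 if walled */
    int current_quadrant(int x, int y);         /* 0..3 for quadrants I..IV                    */

    static const int priority[4][4] = {
        { 3, 1, 0, 2 },   /* quadrant I (from the report)  */
        { 1, 3, 0, 2 },   /* quadrants II-IV: placeholders */
        { 3, 1, 2, 0 },
        { 1, 3, 2, 0 },
    };

    int choose_direction(int x, int y)
    {
        const int *order = priority[current_quadrant(x, y)];
        int best_dir = -1, best_val = 9999;
        int i;

        for (i = 0; i < 4; i++) {           /* scan in priority order so a tie   */
            int dir = order[i];             /* resolves to the earlier, preferred */
            int v   = neighbor_value(x, y, dir);   /* direction                   */
            if (v >= 0 && v < best_val) {   /* strictly less keeps the first winner on ties */
                best_val = v;
                best_dir = dir;
            }
        }
        return best_dir;
    }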
C. Drive Systems
The drive system consists of the motors and wheels; its purpose is to move the micromouse in the desired direction. Differences in the straight-line distance traveled for a given number of steps were measured. The mouse was directed to travel 674 steps (288 cm), which is equal to the full length of one side of the maze. This operation was repeated 10 times, and the actual straight-line distance traveled (in the intended direction) was measured for each iteration. The average distance traveled, as well as the maximum percent deviation from the average, recorded in (1), was calculated. The centering error of the mouse was also measured as the ratio of the distance traveled perpendicular to the straight-line motion to the straight-line distance traveled (2). The mouse traveled almost 30% of its forward distance to the right, due to the placement of the front ball caster on the right side of the mouse. However, this error is corrected by feedback from the sensors in the complete system. The actual distance traveled by the mouse (calculated from the straight-line and perpendicular distances using the Pythagorean theorem) was used to calculate the average velocity of the mouse (3). The amount of turning variation (4) and the angular velocity (5) were measured using the same process described above for straight-line motion, with the mouse directed to turn 360° for each iteration. Small errors (less than 3%) are inherent in the stepper motors and in the small variation in the size of the wheels, and can be detected by the sensors and corrected. Larger errors are most likely due to a misalignment of the wheels: a wheel may not be drilled perpendicularly, or it may not fit tightly on the shaft of the motor.
D. Sensors
The micromouse uses three types of sensors: QRB1134 proximity sensors, Sharp GP2D120 short-range distance sensors, and Sharp GP2D12 long-range distance sensors. We measured the calibration curves of distance-to-wall vs. microcontroller output for the Sharp analog distance sensors; these curves can be found in Appendix C. The proximity sensors are digital and have an optimum detection distance of 0.2 inches. When attached to the sensor wing, a proximity sensor returns a value of one if there is a wall directly underneath it.
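One common way to use such a calibration curve at run time is a small lookup table with linear interpolation, sketched below; the table entries are hypothetical placeholders, not the measured values from Appendix C:

    /* Sketch of turning a Sharp distance-sensor reading into centimeters
     * with a calibration lookup table (piecewise-linear interpolation).
     * The table entries are hypothetical placeholders; real values would
     * come from the measured curves in Appendix C. */
    struct cal_point { int reading; float cm; };

    /* Ordered by decreasing reading: the output falls as distance grows. */
    static const struct cal_point cal[] = {
        { 120, 10.0f }, { 80, 20.0f }, { 50, 35.0f }, { 30, 60.0f }, { 20, 80.0f }
    };
    #define NCAL (sizeof(cal) / sizeof(cal[0]))

    float reading_to_cm(int reading)
    {
        int i;
        if (reading >= cal[0].reading)        return cal[0].cm;        /* closer than table */
        if (reading <= cal[NCAL - 1].reading) return cal[NCAL - 1].cm; /* farther than table */

        for (i = 0; i < (int)NCAL - 1; i++) {
            if (reading <= cal[i].reading && reading >= cal[i + 1].reading) {
                float t = (float)(cal[i].reading - reading) /
                          (float)(cal[i].reading - cal[i + 1].reading);
                return cal[i].cm + t * (cal[i + 1].cm - cal[i].cm);
            }
        }
        return cal[NCAL - 1].cm;              /* not reached */
    }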
E. Battery
The micromouse contains eight nickel metal hydride batteries that must provide power to all of the components of the micromouse for at least the full ten-minute competition time. To test the battery life, we created a test program that read the sensors, performed calculations, and moved the motors. The micromouse ran for over an hour, at which point we artificially stopped the test. We determined that the batteries are capable of providing far more than the required power to the micromouse (18).
F. Complete System
The micromouse was tested in 14 maze configurations. These mazes contain all of the desired characteristics described in the Software section above. We used these tests to determine the following:
• The number of unrecoverable collisions (6)
• The average time to solve the maze (8)
• The maximum time to solve the maze (9)
• The average time per cell (10)
VII. Budget
The Intellimouse team went well under budget,
as shown in Table 3.
Component                Estimated Cost   Actual Cost   Difference
MIT Handy Board               $350          $305.00       $45.00
Stepper Motors                 $45           $43.64        $1.36
Sensors                        $60          $102.75      ($42.75)
Mouse Chassis                  $20            $2.55       $17.45
12 Pack NiMH                   $35            $0.00       $35.00
Battery Charger                $20            $0.00       $20.00
Maze                           $60          $170.31     ($110.31)
Passive Components             $79            $0.00       $79.00
Active Components             $100            $0.00      $100.00
Printed Circuit Board         $200            $0.00      $200.00
Report Copies                  $25           $47.40      ($22.40)
Wheels & Ball Casters           $0           $25.00      ($25.00)
Memory Latch                    $0.45          $0.45       $0.00
32K SRAM Memory                 $5.49          $5.49       $0.00
TOTAL                       $1,000          $702.59
Remaining Budget                                          $297.35
Table 3: Intellimouse Budget
VIII. Future Work
While our design functions within specifications, it still has many avenues for improvement. For example, the sensors tend to report phantom walls when the micromouse is off-center. To minimize this error, we could use a "vote check" sensing algorithm. Vote checking samples the sensors many times per cycle, or checks the sensors against each other, to determine the presence of a wall. Any discrepancy in the sensor data causes the micromouse to throw out the erroneous data and recheck the area for a wall. This method can increase the time it takes to sense walls; however, it would vastly improve the accuracy of the micromouse.
The Handy Board has a small amount of memory, which limits our ability to solve the maze. We found that if the micromouse encounters a very long dead-end path, the floodfill algorithm recursively searches down that path and runs out of memory. Using a larger memory pool would fix this problem.
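Another standard way around the same limitation, sketched below, is to flood with an explicit queue (breadth-first) so that memory use stays fixed regardless of dead-end length; this is an illustration only, not the Intellimouse implementation, and value[][] / wall_between() are hypothetical:

    /* Sketch of a queue-based (breadth-first) flood with fixed memory
     * use, as an alternative to deep recursion. Not the Intellimouse
     * code; value[][] and wall_between() are hypothetical. */
    #define SIZE    16
    #define UNKNOWN 255

    unsigned char value[SIZE][SIZE];
    int wall_between(int x, int y, int nx, int ny);

    void flood_bfs(int goal_x, int goal_y)
    {
        /* Each cell enters the queue at most once: 256 slots suffice. */
        unsigned char qx[SIZE * SIZE], qy[SIZE * SIZE];
        int head = 0, tail = 0, x, y, d;
        static const int dx[4] = { 0, 1, 0, -1 };
        static const int dy[4] = { -1, 0, 1, 0 };

        for (y = 0; y < SIZE; y++)
            for (x = 0; x < SIZE; x++)
                value[y][x] = UNKNOWN;

        value[goal_y][goal_x] = 0;
        qx[tail] = (unsigned char)goal_x; qy[tail] = (unsigned char)goal_y; tail++;

        while (head < tail) {
            x = qx[head]; y = qy[head]; head++;
            for (d = 0; d < 4; d++) {
                int nx = x + dx[d], ny = y + dy[d];
                if (nx < 0 || nx >= SIZE || ny < 0 || ny >= SIZE) continue;
                if (wall_between(x, y, nx, ny)) continue;
                if (value[ny][nx] != UNKNOWN) continue;
                value[ny][nx] = (unsigned char)(value[y][x] + 1);
                qx[tail] = (unsigned char)nx; qy[tail] = (unsigned char)ny; tail++;
            }
        }
    }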
Our final problem comes from our physical design and connections. Currently, we connect the sensors to the Handy Board using wire bundles. These connections are very unstable and susceptible to noise. Designing a PCB sensor wing would not only improve the connections and the physical reliability of the sensor wing dramatically, but would also allow us to use much smaller sensors.
The following ideas would require a significant rework of the micromouse design, but could considerably improve the micromouse:
• Bidirectional control – Allow the micromouse to run forward or backward, rather than turning 180°.
• Four drive wheels – Four-wheel drive would dramatically improve the micromouse's acceleration and traction.
• DC motors – While DC motors are more difficult to control than stepper motors, they provide much higher speeds.
IX. Conclusion
Although we ran into our fair share of
difficulties ranging from complete hardware
failure to small software bugs, we feel that we
successfully designed and tested a working
micromouse prototype that conforms to the IEEE
APEC Micromouse contest rules.
X. Bibliography
[1] "Applied Power Electronics Conference and Exposition (APEC) Micromouse Contest Rules," [Online document], 1996 Mar 7, [cited 2005 Mar], Available HTTP: http://www.apecconf.org/APEC_micromouse_Contest_Rules.html
[2] Douglas W. Jones, "Control of Stepping Motors," [Online], 2004 Sep 21, [cited 2005 Mar], Available HTTP: http://www.cs.uiowa.edu/~jones/step/
[3] Pete Harrison, "Micromouse Information Centre," [Online], 2004 Mar 4, [cited 2005 Mar], Available HTTP: http://micromouse.cannock.ac.uk/
[4] Fred Martin, "The Handy Board Technical Reference," [Online manual], 2000 Nov 15, [cited 2005 Mar], Available PDF: http://handyboard.com/techdocs/hbmanual.pdf
APPENDIX A: An Intelligent Design to Solve a Maze
Figure 9: Floodfill Flow Chart (nodes: Start, LookAround, ChooseDir, Better Neighbor?, Flood(Position), TurnTo, StepForward, Offcenter?, Turn, Full square?, Goal?, End)
Figure 10: Intellimouse Control Algorithm (flow chart of one movement step: while Moves < 43, check for a wall in front (Analog(4) > 100) and for left/right drift using the digital wing sensors (Digital(7), Digital(8), Digital(10), Digital(12), Digital(13)), issuing corrective calls such as Turn(1, 1), Turn(0, 1), and Turn(0, 5), then update the position)
Figure 11: Floodfill Algorithm Performance (percent of all mazes vs. time to solve, in minutes, for 10,000 dense random mazes, 10,000 less dense random mazes, and 44 competition mazes)
APPENDIX B: A New Approach to Maze Solution
Modified Floodfill Algorithm with Bias
A second modified floodfill algorithm has also been implemented. Using this algorithm, the mouse is biased to move toward the center along the longest straight-line path possible, unless the next cell in that direction has been visited too many times. In that case, the mouse prefers to move away from the goal to explore other possible paths. At each intersection, the algorithm floods all the known paths to eliminate dead ends, identify whether a known path to the center exists, and decide the next move. Figure 12 below shows a flow chart of the algorithm.
Figure 12: Modified Floodfill with Bias (flow chart: Start, Initialize; while the goal is not found, flood the possible paths and decide a move; if no path is found, treat the cell as a dead end and go back the way the mouse came; otherwise go forward until the next intersection; Finish when the goal is reached)
With information from the distance and adjacent-wall sensors, the mouse is able to identify dead ends without visiting all of the cells. However, this algorithm relies heavily on accurate sensor data. The mouse will never revisit a cell that it believes to be a dead end, so it may not have an opportunity to correct a wall that has been misidentified. The efficiency of the algorithm decreases drastically when less sensor information is available. In addition, this algorithm requires more memory for the storage of maze information, because it relies on two arrays to store horizontal and vertical walls.
This algorithm solves all mazes in software and has also been implemented to run on the mouse; however, a few glitches exist. Because Interactive C does not accept arrays with more than 16 columns, the dimensions of one of the maze wall arrays had to be inverted, and all the indexes in the code were inverted with it. This caused other problems that we have not yet been able to identify. The mouse is able to solve simple mazes using this algorithm; nonetheless, it often returns an array-out-of-bounds runtime error.
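For illustration only (not the team's actual declarations), the two wall arrays described above might look like the following; note how one of them naturally wants 17 columns, which is exactly the Interactive C limitation that forced the inverted dimensions:

    /* Illustration of storing walls in two arrays, as described above:
     * one for horizontal walls (above/below cells) and one for vertical
     * walls (left/right of cells). Sizes and helpers are illustrative,
     * not the actual Intellimouse declarations. */
    #define SIZE 16

    /* horiz[y][x] == 1: wall between cell (x, y-1) and cell (x, y) */
    int horiz[SIZE + 1][SIZE];
    /* vert[y][x] == 1: wall between cell (x-1, y) and cell (x, y).
     * As declared here it has 17 columns, which runs into the
     * Interactive C 16-column limit mentioned above -- hence the
     * inverted dimensions in the actual mouse code. */
    int vert[SIZE][SIZE + 1];

    int wall_north(int x, int y) { return horiz[y][x]; }
    int wall_south(int x, int y) { return horiz[y + 1][x]; }
    int wall_west(int x, int y)  { return vert[y][x]; }
    int wall_east(int x, int y)  { return vert[y][x + 1]; }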
APPENDIX C: Sensor Measurements
Figure 13: Sensor Calibration Curve (Sharp GP2D12 output voltage, Vout (V), vs. distance (cm))
Figure 14: Microcontroller & Sensor Interface (Sharp GP2D12: microcontroller output value vs. distance (cm))
APPENDIX D: Chassis and Sensor Diagrams
Figure 15: Chassis and Sensor Layout (bottom view; all dimensions in cm)
Figure 16: Chassis schematic
APPENDIX E: Hardware Analysis
Figure 17: Intellimouse Fault Tree
APPENDIX F: Intellimouse Budget
Table 1: Intellimouse Budget
Component                              Estimated Amount   Actual Cost   Difference
Microprocessor (Motorola 68HC11)            $350            $305.00       $45.00
Stepper Motors                               $45             $43.64        $1.36
Sensors (Sharp GP2D12/GP2D02)                $60            $102.75      ($42.75)
Mouse Chassis                                $20              $2.55       $17.45
12 Pack NiMH Rechargeable Batteries          $35              $0.00       $35.00
Battery Charger                              $20              $0.00       $20.00
Maze                                         $60            $170.31     ($110.31)
Passive Components                           $79              $0.00       $79.00
Active Components                           $100              $0.00      $100.00
Printed Circuit Board                       $200              $0.00      $200.00
Posters & Report Copies                      $25             $47.40      ($22.40)
Wheels & Ball Casters                         $0             $25.00      ($25.00)
Memory Latch                                  $0.45            $0.45       $0.00
32K SRAM Memory                               $5.49            $5.49       $0.00
TOTAL                                     $1,000            $702.59
Remaining Budget                                                          $297.35
APPENDIX G: Test Results
Table 2: Intellimouse Test Results

Measured Characteristic                               Original Specifications   Test Results
Straight-Line Movement (w/out sensor feedback)
  (1) Variation in distance traveled (%)                        1                   3.3
  (2) Perpendicular offset (% of distance traveled)             3                  29
  (3) Velocity (m/sec)                                          0.1                 0.12
Turning
  (4) Variation in degrees turned (%)                           1                   0.8
  (5) Angular velocity (degrees/sec)                          120                 130
Micromouse in Maze
  (6) Average number of steps before collision            infinite                 75
  (7) Average steps to solve maze                             175                  60
  (8) Average time to solve maze (min)                        N/A*                  2.14
  (9) Maximum time to solve maze (min)                         10                   3.73
  (10) Average time per cell (sec)                            N/A*                  2.13
Micromouse in Software
  (11) Average steps to solve maze                            175                 133
  (12) Maximum steps to solve maze                            N/A*                578
  (13) Average turns to solve maze                            N/A*                129
  (14) Average time to solve maze (min)                       N/A*                  4.9
  (15) Mazes solved within time limit (%)                     100                  81
General
  (16) Max Length (cm)                                         25                  14.5
  (17) Max Width, including wings (cm)                         25                  24.5
  (18) Min Battery Life (min)                                  15                 > 60
  (19) Max Code Size (KB)**                                    16                  10.3
APPENDIX H: Test Plan
The following document can be used to verify that the Intellimouse Explorer
complies with the specifications listed in the table below.
Measured Characteristic                               Original Specifications   Guaranteed Values
Straight-Line Movement
  2.1 Variation in forward distance traveled                   1%                  < 5%
  2.2 Perpendicular offset (% of distance traveled)            3%                  < 30% (note 1)
  2.3 Velocity                                                 0.1 m/s             0.1 m/s
Turning
  3.1 Variation in degrees turned                              1%                  1%
  3.2 Angular velocity                                       120 deg/s           > 120 deg/s
Solving
  4.1 Number of steps before collision                    infinite                varies (note 2)
  4.2 Average steps to solve maze                            175                 133
  4.3 Time to solve maze                                    < 10 min             < 10 min (note 3)
General
  5.1 Length                                                < 25 cm              < 15 cm
  5.2 Width                                                 < 25 cm              < 25 cm
  5.3 Battery Life                                          > 15 min              60 min (note 4)
  5.4 Code Size                                             < 16 KB              < 16 KB

Notes:
1. The perpendicular offset is corrected for during normal operation using feedback from the sensor wings.
2. The number of steps before collision varies widely. The mouse can run the maze one time and complete it without crashing; on a second run, however, the mouse may collide with a wall a few steps into the maze.
3. The maximum speed of the mouse limits its ability to complete some mazes within the 10-minute specification. Using the measured speed values of the mouse, we calculate that it can solve 80% of mazes within the 10-minute limit.
4. During testing we never fully tested the life of the battery. After one hour the test was ended because the mouse had far exceeded the maximum amount of time allowed in the maze.
1. Preparation
In order to perform most of these tests, the mouse must be loaded with the test
software package included on the CD. Please refer to the user guide and follow the steps
in Section 2 to download test.c to the micromouse. For help navigating the test package’s
menu system, see Section 5 of the user guide.
Throughout this document references are made to sections in the User Guide. Before
beginning this test plan, either read and understand the user guide, or have it handy for
reference.
2. Straight-Line Movement
2.1 Variation in forward distance traveled
1. Place the mouse somewhere with plenty of room to move forward. The mouse
will attempt to move 288cm forward.
2. Mark the mouse’s starting position.
3. Navigate the menu system on the mouse and select “Move Forward.”
4. Once the mouse completes its move, measure the distance traveled from the
starting position.
5. Calculate the percent error in forward motion using 288cm as the actual value.
2.2 Perpendicular offset
1. Place the mouse somewhere with plenty of room to move forward. The mouse
will attempt to move 288cm forward.
2. Place a long, straight object parallel to the mouse, touching the outside of its left wheel. This object will be used as the reference for measuring the offset. (The mouse pulls to the right by default.)
3. Mark the mouse’s starting position.
4. Instruct the mouse to “Move Forward.”
5. Once the mouse completes its move, measure both the distance traveled parallel
to the straight object from step 2, and the perpendicular distance from that object.
6. Divide the perpendicular distance by the parallel distance to determine the
perpendicular offset.
2.3 Velocity
1. Follow the steps in 2.1, but clock the time the mouse takes to complete the
process.
2. Divide the distance traveled by the time in seconds to complete the process.
3. Turning
3.1 Variation in degrees turned
1. Place the mouse on a piece of paper.
2. Mark the initial position of both wheels of the mouse on the paper. Somehow
differentiate the two wheels.
3. Instruct the mouse to “Turn Right.” (Be sure that the paper is held firmly in
place)
4. Once the mouse completes its turn, mark the new position of the wheels onto the
paper.
5. Measure the angle between the marks for each wheel. Calculate the percent error
using an actual value of 90°.
6. Take the average of both wheels to determine the variation in degrees turned.
3.2 Angular velocity
1. Place the mouse somewhere where it can turn in a complete circle.
2. Instruct the mouse to “Turn 360”
3. Divide 360° by the time taken to complete the maneuver.
4. Solving
4.1 Number of steps before collision
The number of steps before collision is not a reliably measurable specification; repeated testing does not produce a consistent result. The mouse either completes the maze unhindered or crashes at some random point in the maze. Collisions are caused by incorrect turn values, which in turn are caused by the path-correction algorithms: if the mouse corrects its path just before turning, it will not make a correct turn and runs the risk of crashing before it can compensate.
4.2 Average steps to solve maze
There are two methods by which the average number of steps to solve a maze can be calculated. First, the mouse can be run in a multitude of mazes; the final step value displayed on the LCD at completion is recorded, and an average over all runs on all mazes is calculated. The second method uses the random maze simulator. For help running the maze simulator, see Section 4 of the user guide.
1. Launch the floodfill simulator.
2. Instruct the simulator to make multiple runs.
3. Specify the number of mazes to run. (The higher the number of mazes, the more
the result will converge to the true average)
4. When the simulator completes, divide the total number of steps by the number of
runs.
4.3 Time to solve maze
The best way to calculate the time to solve a maze is to time the mouse as it solves
the maze, but a way to calculate an approximate average for all mazes follows:
1. Calculate the Velocity as in 2.3.
2. Divide 18 cm (one maze cell) by the velocity in cm/s. (This is the time in seconds to move one step.)
3. Calculate the average number of steps to solve a maze as in 4.2.
4. Multiply the number of steps from 3 by the seconds/step value from 2.
5. General
5.1 Length
The length of the mouse is the distance from the back to the front of the sensor wing.
5.2 Width
The width of the mouse is simply the width of the sensor wing (the widest part of the
mouse).
5.3 Battery Life
1. Place the mouse somewhere with plenty of room.
2. Instruct the mouse to “Test Battery.”
3. Time until the mouse stops working.
5.4 Code Size
Code size is shown when downloading code to the mouse from within Interactive
C. See Section 2 of the user guide for instructions to download code to the micromouse.
APPENDIX I: Intellimouse Explorer User's Guide
1. Basic Operation
When powered on, the mouse displays a menu on the LCD screen. Using the control knob on the microcontroller, the user can select from three options: testanalogs, testdigitals, and floodfill. Once the desired option is displayed on the screen, press the Start button on the microcontroller to select it.
NOTE: If the mouse does not show this menu upon boot, download floodfill.c to the mouse (see Section 2).
1.1 Test Analogs – Shows the current value for the selected analog sensor.
• The control knob selects which sensor to display values for.
1.2 Test Digitals – Shows the current value for the selected digital sensor.
• The control knob selects which sensor to display values for.
• The display can also be set to display all connected digital sensors.
1.3 Floodfill – This setting instructs the mouse to solve a maze using floodfill.
• The mouse must be placed in a corner of the maze with its right side running
parallel to the outside wall of the maze.
• For optimal results the mouse must be centered in the starting square.
2. Downloading Code to the Mouse
2.1. First time
1. Install Interactive C. The latest version of Interactive C (5.0.0009) is included on
this CD in the Interactive C folder.
2. Connect the serial connector from the charging board to the back of the computer.
2.2. Downloading Code
1. Launch Interactive C.
2. Follow the connect steps for “Handyboard v1.2”
3. Open the C file to load to the mouse (floodfill.c or test.c) in Interactive C. (Both
files are located on the CD under the Code folder)
4. Press the Download button.
NOTE: If the mouse's battery has died, firmware will need to be downloaded to the mouse first. This process is explained in the instructions for step 2.
3. Modifying the Code
Comments exist throughout the code to explain what the different variables and processes actually do. Some items of particular interest are:
• iGoalX & iGoalY – These tell the mouse where the goal is. Changing these values will move the goal to the specified X, Y position.
NOTE: Coordinates in the mouse's memory are tracked starting from the upper left corner at (0, 0); X increases to the right and Y increases down.
• iMaxX & iMaxY – The limits of the maze; helpful for testing on smaller mazes.
• QuadrantI-IV – Biasing for the different corners of the maze. Each define consists of four two-bit numbers combined. The numbers indicate the priority for choosing directions in case of a tie. Two older biases are commented out in the code; uncomment them and comment out the current ones to try them out. (Quadrants are based on normal math quadrants: I is upper left, II upper right, etc.)
• iMaxDepth – Maximum recursion depth for flooding. This limits the recursion to prevent memory overflow. Increasing the value allows the mouse to flood deeper dead ends but increases the risk of memory crashes.
• A hard-coded 7x7 maze is commented out in the InitStart function. This maze can be used for debugging the actual mouse code in Interactive C. If you uncomment it, be sure to change iMaxX & iMaxY to 7.
• Under the LookAround function there is a commented-out section of code that floods the entire maze every step. This will decrease the average number of steps to the goal but drastically increases thinking time; try it out.
• MMMotors.c contains all of the code that actually drives the motors. Large changes to this file will be required if different motors are used.
• test.c contains simple test functions for implementing the test plan.
4. Simulators
Included on this CD are two simulators for testing the mouse's code on random or loaded mazes. One simulator uses the floodfill algorithm implemented in the final mouse; the other uses a logic-based solving algorithm we experimented with during development. Source code for both simulators is included, along with some example mazes.
4.1 Operating the Simulators
Both simulators are operated the same way. Once one of the simulators is launched, it displays the following:
'l' to load a maze, else randomly generate one:
Entering l and pressing Enter allows you to specify a filename for a maze to load. Some example mazes are included on the CD; they are the txt files in the Simulators folder. A hidden command also exists at this menu: if you type "m" (without the quotes) and press Enter, then type "m" and press Enter again, the simulator will ask you to specify a number of runs. Enter an integer and the simulator will run on that many randomly generated mazes.
Once a maze has been loaded or generated, you can step through the maze by pressing the Enter key. The floodfill simulator shows both the maze the mouse has seen (with flood values in the squares) and the actual maze (with the path shown as the latest step value at each square). The logic solver shows only the actual maze with the path. Another hidden command exists immediately after generating a random maze: if you type "s" and then press Enter, the maze will immediately solve to completion.
NOTE: The simulators have nothing to stop them from closing once the program completes. It is therefore recommended to launch them from a command prompt so that the user can see the final results without the window closing.
5. Testing
The micromouse can alternatively be loaded with a package of test routines. The test functions use the same functions to drive the mouse as floodfill.c, except that the sensors are disabled. This section describes what each function in test.c does and how to operate it. When powered on with test.c, the mouse displays a menu system similar to the one in Section 1. The control knob on the microcontroller selects the test to perform. Once the desired option is displayed on the screen, press the Start button on the microcontroller to begin the test.
NOTE: All of the test functions have a half-second delay before they begin, allowing time for the user to remove their hand from the Start button.
5.1 Move Forward – The mouse moves forward 16 steps (288 cm).
• Adequate space should be provided in front of the mouse so that it does not crash.
• Space should also be left to the right of the mouse. By default the mouse swings to the right by as much as 33% of its forward motion.
5.2 Turn Right – The mouse turns 90° to the right.
• Adequate space should be provided for the sensor wing to swing to the right.
5.3 Turn 360 – The mouse makes a 360° turn to the right.
• Adequate space should be provided for the sensor wing to swing in a complete circle.
5.4 Test Battery – The mouse moves in a box pattern to the right indefinitely.
• Adequate space should be provided for the mouse to move in a box.
NOTE: Over time, errors will build up in the mouse's turning. Space should be provided to allow for these errors.