Junk Bot Report

Junk Bot
Group 3: Yannick Djoumbi, Krystal Reinwand, Derek Wedley
Design Intent and Constraints_____________________________________________________
The design intent for the junk bot is to roam around the given space on the board, detecting cans and pushing them out of the arena while playing a tone. The bot should also be able to detect the black border of the arena and then turn around. In our final junk bot, the bot plays a tone while there is a can in front of the sensor. It then proceeds straight forward until the can is pushed out of the arena and the second light sensor reads the black line, telling it to stop and turn around.
Final Concept___________________________________________________________________
The final concept for our bot is based on the hunt bot. It uses two motors to move forward, backward, left, and right, with a third wheel on the back to help it pivot. During the hunting part of the code, the bot turns for a random time between 250 ms and 750 ms, changing its direction either left or right. After the bot completes its turn, it proceeds forward for a random time between 1 and 3.5 seconds. However, when the bot detects a can, the hunt task is stopped and the Can_Found task is activated. In this task, the bot moves forward at 85% speed (slightly faster than in the hunt task) until it reaches the black border. Once it reaches the black border, the Can_Found task is stopped and the line_detect task tells the bot to turn around. The bot then resumes the hunt task after the line_detect task is complete.
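The hunt-phase timing comes from NXC's Random(n), which returns a value from 0 to n-1. The following is a minimal, self-contained sketch of that movement pattern (essentially the MoveHunt routine from the final code, isolated so the time ranges are easy to see):

// Hunt-phase timing in isolation (mirrors MoveHunt in the final code).
task main()
{
    while (true)
    {
        OnFwd(OUT_AC, 65);          // both motors forward at 65%
        Wait(1000 + Random(2500));  // forward for 1000 to 3499 ms (about 1 to 3.5 s)
        if (Random(2) == 1)         // pick a turn direction at random
            OnRev(OUT_A, 65);       // pivot one way...
        else
            OnRev(OUT_C, 65);       // ...or the other
        Wait(250 + Random(500));    // turn for 250 to 749 ms
    }
}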
In order to capture the cans and remove them from the arena, we created a V-shaped entrapping device. With this capture device, the junk bot can potentially capture more than one can at a time. We also positioned the capture device close to the ground so it did not hit the cans up high; whenever the cans were hit high rather than low, they would fall over. This was a challenge we struggled with in the beginning, but once we fixed the placement of the capture device, everything fell into place.
First Concept___________________________________________________________________
[Figure: first concept design]
Final Concept___________________________________________________________________
[Figure: final concept design]
Software Architecture____________________________________________________________
//finalbot.nxt
#define LEFT OUT_A      // left motor in output A
#define RIGHT OUT_C     // right motor in output C
#define REYE SENSOR_1   // right light sensor in input 1 (faces the ground)
#define LEYE SENSOR_3   // left light sensor in input 3 (faces the cans)
#define THRESH 40       // light threshold of 40

int dir = 0;            // which event was detected: 1 = border, 2 = can
int complete = 0;       // set to 1 when a reaction task finishes

inline void MoveHunt()
{
    OnFwd(OUT_AC, 65);          // both motors forward at 65%
    Wait(1000 + Random(2500));  // move forward for a random 1 to 3.5 seconds
    if (Random(2) == 1)         // randomly pick a turn direction
        OnRev(OUT_A, 65);       // reverse output A
    else
    {
        OnRev(OUT_C, 65);       // or reverse output C
    }
    Wait(250 + Random(500));    // turn for a random 250 to 750 ms
}

task hunt()                     // hunt for objects
{
    while (true)                // repeat forever
    {
        MoveHunt();             // call the inline MoveHunt routine
    }
}

task line_detect()              // task for detecting the line
{
    if (dir == 1)               // when dir is 1, do:
    {
        Off(OUT_AC);            // both motors off
        Wait(500);              // wait for half a second
        OnRev(OUT_AC, 50);      // both motors reverse at half speed
        Wait(1000);             // reverse for one second
        OnFwd(OUT_A, 50);       // left wheel forward to pivot
        Wait(350 + Random(650)); // turn for a random 350 to 1000 ms
        Off(OUT_AC);            // both motors off
        complete = 1;           // signal that the turn is complete
    }
}

task Can_Found()                // task for when a can is found
{
    if (dir == 2)               // when dir is 2, do:
    {
        PlayToneEx(350, 500, 4, FALSE); // play a tone
        OnFwd(OUT_AC, 85);      // both motors forward at 85%
        complete = 1;           // signal completion
    }
}

task feedback()
{
    while (true)                // repeat forever
    {
        if (SENSOR_1 < THRESH)  // downward sensor reads below the threshold: black border
        {
            dir = 1;            // give dir a value of 1
            stop hunt;          // stop task hunt
            start line_detect;  // start task line_detect
            until (complete == 1); // wait until the turn is complete
            stop line_detect;   // stop task line_detect
            complete = 0;       // reset complete to 0
            start hunt;         // restart task hunt
        }
        if (SENSOR_3 > THRESH)  // forward sensor reads above the threshold: reflective can
        {
            dir = 2;            // give dir a value of 2
            stop hunt;          // stop task hunt
            start Can_Found;    // start task Can_Found
            start line_detect;  // start task line_detect
            until (complete == 1); // wait until Can_Found signals completion
            stop Can_Found;     // stop task Can_Found
            stop line_detect;   // stop task line_detect
            complete = 0;       // reset complete to 0
            start hunt;         // restart task hunt
        }
    }
}

task main()                     // main task
{
    SetSensorLight(IN_1);       // light sensor in input 1
    SetSensorLight(IN_3);       // light sensor in input 3
    start feedback;             // start task feedback
    start hunt;                 // start task hunt
}
Final Code_____________________________________________________________________
In the code, we start the feedback task and the hunt task simultaneously. In the hunt task, the bot moves forward for a random time between 1 and 3.5 seconds. The bot then hunts by using an if/else statement that tells it to turn left or right based on a random value, turning for a random time between 250 ms and 750 ms.
The feedback task runs at the same time as the hunt task. In the feedback task, we give the variable dir a value, as we did with the nurse bot; dir takes one of two values, 1 or 2. dir is set to 1 when sensor 1, the light sensor facing the ground, reads less than the threshold. In that case we stop the hunt task and start the line_detect task until complete == 1 (complete is another variable, initially set to 0). Once the turn is complete, the line_detect task is stopped, complete is reset to 0, and the hunt task is restarted. dir is set to 2 when sensor 3, the light sensor facing the reflective cans, reads greater than the threshold. In that case we stop the hunt task and start the Can_Found and line_detect tasks until complete == 1. We then stop the Can_Found and line_detect tasks, reset complete to 0, and restart the hunt task.
The whole program revolves around the feedback task. If this single task does not work, the whole program does not work: none of the other tasks would ever be started, rendering them useless.
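Stripped down to the border-detection branch, that hand-off pattern looks like the self-contained sketch below (the task bodies are simplified stand-ins; the variable and task names match the final code):

// Condensed version of the stop/start/until hand-off used by feedback.
#define THRESH 40           // same light threshold as the final code

int dir = 0;                // 1 = black border detected
int complete = 0;           // set to 1 when the reaction task finishes

task hunt()
{
    while (true)
    {
        OnFwd(OUT_AC, 65);  // roam forward (simplified stand-in for MoveHunt)
        Wait(500);
    }
}

task line_detect()
{
    if (dir == 1)
    {
        Off(OUT_AC);        // stop, back up, and pivot away from the border
        OnRev(OUT_AC, 50);
        Wait(1000);
        OnFwd(OUT_A, 50);
        Wait(500);
        Off(OUT_AC);
        complete = 1;       // flag the turn as finished
    }
}

task feedback()
{
    while (true)
    {
        if (SENSOR_1 < THRESH)     // downward sensor sees the black border
        {
            dir = 1;
            stop hunt;             // suspend roaming
            start line_detect;     // run the reaction task
            until (complete == 1); // wait for it to flag completion
            stop line_detect;
            complete = 0;          // reset the flag
            start hunt;            // resume roaming
        }
    }
}

task main()
{
    SetSensorLight(IN_1);          // downward-facing light sensor
    start feedback;
    start hunt;
}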
Final Thoughts__________________________________________________________________
All in all, the bot performed as expected. We incorporated almost everything we learned from previous bots, such as hunting and detecting different thresholds; in fact, some code was borrowed and adapted to fit the constraints of this project. If we were to change one thing, it would have been interesting to incorporate the ultrasonic sensor. The ultrasonic sensor seemed to be able to recognize the cans from a greater distance than the light sensor; with the light sensor, the bot had to “rake” the cans into the center, where the sensor could read their reflective material.
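For reference, the sketch below shows one way the ultrasonic sensor could be read in NXC. It is only an illustration, not part of our final code: the port (IN_4) and the 25 cm trigger distance are assumptions that would need testing.

// Hypothetical ultrasonic-based can detection (illustration only).
#define EYE IN_4                   // assumed port for the ultrasonic sensor

task main()
{
    SetSensorLowspeed(EYE);        // configure the port for the ultrasonic sensor
    while (true)
    {
        if (SensorUS(EYE) < 25)    // distance in cm; 25 cm is an example threshold
        {
            PlayToneEx(350, 500, 4, FALSE); // same tone as the final code
            OnFwd(OUT_AC, 85);     // push toward the border
        }
        else
        {
            OnFwd(OUT_AC, 65);     // keep roaming
        }
        Wait(50);                  // small polling delay
    }
}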