Poster: Phones and Robots: Brains and Brawn
Avraham Klausner
Ari Trachtenberg
David Starobinski
Boston University
8 Saint Mary’s St
Boston, MA 02215
United States
aviclaws@gmail.com
trachten@bu.edu
staro@bu.edu
Abstract
Our project demonstrates the capabilities of a symbiotic phone-robot hybrid device, wherein the robot provides gross motor control and the phone provides fine course corrections and sensing capability. The crude robot generates movement subject to mechanical wheel asymmetries and non-linear motor effects; the inexpensive phone provides a variety of on-board sensors and a reasonably powerful CPU and memory. We demonstrate the utility of the combined device by producing a reasonably accurate autonomous signal map of an untrained floor plan in our building.
1 Introduction
Signal mapping is an essential but tedious task for analyzing and improving the availability of the growing network of wireless technologies. As our everyday lives become more dependent upon wireless electronics, improvements in signal strength are increasingly needed. Though many devices have been created to improve signal reception at a given location in a network, the important question of where to place these devices is usually answered only after performing thorough signal mapping.
Signal mapping comes in many forms and serves many purposes. One of the better-known projects was undertaken by Google to improve its online maps: the “Google car” is driven around with a roof-mounted camera that collects pictures for Street View. Other, more closely related works use signal strength data collected from multiple smartphones [1] or mobile robots [3] to locate base stations or dead spots in a network. Yet others utilize a sensor-embedded network to act as a navigation system for a robot [2].
In this paper, a new efficient technique for autonomous signal mapping using a combination of a VEX robot and a smartphone is described. The combination of the sensing capabilities of the phone and the mobility of the robot creates a small sensor network, allowing the robot to navigate autonomously through a building while mapping out the signal strength.

Copyright is held by the author/owner(s).
SenSys’11, November 1–4, 2011, Seattle, WA, USA.
ACM 978-1-4503-0718-5/11/11

Figure 1. Phone-Robot Hybrid: 1 - VEX Microcontroller, 2 - Start button, 3 - Ultrasonic Range Finder, 4 - Front bumpers, 5 - Light Sensor, 6 - LEDs used for debugging, 7 - iPhone 4 and Motorola Droid.
2 System Design
2.1 Components
Our automated signal mapper uses an iPhone, a Motorola Droid, and a VEX robot in tandem to navigate through an unknown area and collect signal data. The VEX robot is equipped with multiple sensors (Fig. 1) and a VEX Microcontroller that allows the sensors to interact with our code. The robot’s two front bumpers notify the robot when it has reached a dead end. The ultrasonic range finder allows the robot to notice a possible left turn, and it aids with straight movement (a serious challenge for the crude device!). The light sensor on the robot receives feedback from the iPhone in the form of messages encoded by flashing the iPhone’s screen. A simple application was developed that utilizes the iPhone’s internal gyroscope and accelerometer to create an accurate map of exactly where the robot has traveled. Finally, due to the lack of a public API for collecting signal strength on the iPhone, a Motorola Droid is used to log signal strength data.
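To give a rough sense of the mapping step, heading and distance estimates can be integrated into a 2-D path. The sketch below is illustrative Python, not the actual iPhone application; the function name and the (heading, distance) segment format are assumptions:

```python
import math

def integrate_path(segments):
    """Integrate (heading_degrees, distance) segments into 2-D waypoints.

    `segments` is a hypothetical log of straight runs: the heading would
    come from the phone's gyroscope, the distance from speed * elapsed time.
    """
    x, y = 0.0, 0.0
    waypoints = [(x, y)]
    for heading_deg, distance in segments:
        theta = math.radians(heading_deg)
        x += distance * math.cos(theta)
        y += distance * math.sin(theta)
        # Round to millimeters to keep the map readable despite float noise.
        waypoints.append((round(x, 3), round(y, 3)))
    return waypoints

# Example: travel 2 m straight, turn left, travel 1 m.
path = integrate_path([(0, 2.0), (90, 1.0)])
```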
2.2 Functionality
Our hybrid device can be placed on any smooth, clean surface without jagged walls, and it maps out the floor plan. The robot performs an engineered depth-first traversal of its floor space. At its core, the robot attempts to travel straight until a left turn is available. If there is no left turn, the robot continues straight until it senses a wall. Once at the wall, the robot makes a right turn and again follows the simple guideline of traveling straight until a left turn or a wall. If the robot cannot continue straight or turn right, it travels back to the last left turn and continues straight from there. Each turn and its timestamp is logged on a stack in the robot’s program and is deleted once the robot has traversed those steps backwards to the last left turn.
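The turn log described above behaves like a stack. The following Python sketch is only an illustration of that logic (the actual program runs on the VEX Microcontroller; the class and method names are hypothetical):

```python
class TurnLog:
    """Stack of (direction, timestamp) turns, popped during backtracking."""

    def __init__(self):
        self._stack = []

    def log_turn(self, direction, timestamp):
        # Record each turn as the robot makes it.
        self._stack.append((direction, timestamp))

    def backtrack_to_last_left(self):
        """Pop turns up to and including the most recent left turn.

        Returns the popped turns newest-first, i.e. the sequence the robot
        must retrace backwards; each entry is deleted as it is replayed.
        """
        retrace = []
        while self._stack:
            entry = self._stack.pop()
            retrace.append(entry)
            if entry[0] == "left":
                break
        return retrace
```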
While implementing this algorithm, the robot can get stuck in a loop circling an object or path. For example, if the floor plan has a crate in the middle of a corridor, the robot will eventually get stuck making left turns around the crate. To correct for this error, the iPhone uses its map to sense when the robot is stuck in a loop and flashes its screen appropriately; the robot’s light sensor detects and decodes this flashing, interpreting it as a signal to go back to the last left turn.
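One plausible way for the iPhone’s map to detect such a loop is to quantize the traveled path onto a coarse grid and flag any revisited cell. This Python sketch is an assumption about the method, with an illustrative cell size chosen to tolerate small dead-reckoning drift:

```python
def detect_loop(waypoints, cell_size=0.5):
    """Return True if the path re-enters a grid cell it already crossed.

    `cell_size` (meters) is an assumed tuning parameter: coarse cells
    absorb small position drift between two passes over the same spot.
    """
    seen = set()
    prev = None
    for x, y in waypoints:
        cell = (round(x / cell_size), round(y / cell_size))
        if cell == prev:
            continue  # still inside the same cell; not a revisit
        if cell in seen:
            return True
        seen.add(cell)
        prev = cell
    return False
```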
2.3 Challenges
Through many trials conducted on this hybrid device, certain challenges surfaced. These problems stem from mechanical errors in the structural and hardware design of the robot. Using the sensor network created by the hybrid device, some of these errors were minimized.
First, the robot struggled to travel completely straight, mainly due to its mechanical asymmetry: one wheel wobbled or slipped more frequently than the other, causing the robot to veer off path. In addition, the motors heat up as the robot travels longer distances, resulting in less accurate speeds and likely contributing to this error. Together, these issues made the robot inconsistent in executing exact 90-degree turns.
To correct for the robot’s veering motion, the ultrasonic range finder (URF) is used in a sensor feedback loop to adjust the speed of each motor. Additionally, the URF corrects for the robot’s failure to perform exact 90-degree turns by straightening the robot’s path to follow a wall. This correction relies on the robot’s proximity to a wall; in large open areas, the URF fails to correct for the robot’s veering because the sensor’s maximum range is only three meters. In such cases, the iPhone’s compass is used to help the robot stay on path. Again, the compass is used in a sensor feedback loop to adjust the speed of each motor: messages sent from the iPhone, in the form of screen flashes, are detected by the robot’s light sensor and decoded to adjust the motors appropriately. The robot’s program does not rely solely on the compass because the iPhone’s magnetometer is affected by fields generated by the moving motors.
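Both feedback loops amount to proportional control of the differential motor speeds. The Python sketch below is a minimal illustration of that idea, with hypothetical gain and speed values, not the parameters used on the actual robot:

```python
def motor_speeds(target, measured, base_speed=60, gain=2.0, max_speed=100):
    """Proportional correction: speed up one side to steer back on course.

    `target`/`measured` stand in for either wall distances from the range
    finder or headings from the phone's compass; the gain and the 0-100
    speed scale are illustrative assumptions.
    """
    error = target - measured
    correction = gain * error
    # Clamp each side so a large error cannot command an invalid speed.
    left = max(0, min(max_speed, base_speed - correction))
    right = max(0, min(max_speed, base_speed + correction))
    return left, right

# On course: both motors run at the base speed.
# Drifted (measured 25 vs. target 30): the sides diverge to steer back.
```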
3 Results
The hybrid signal mapper was tested in an office building. It successfully collected and mapped the signal for a floor plan of the building that was not known to the system.
Figure 2. Signal Map: Red = Strong Signal, Yellow = Medium Signal, Blue = Weak Signal.
Ten trials were run and averaged to obtain accurate readings of the signal strength (in units of dBm). Fig. 2 is a sample output of the iPhone’s map with the superimposed averaged signal reception data. The color scheme depicts the change in signal, and the red boxes with numbers represent the ordered path traveled by the robot.
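Since dBm is a logarithmic unit, one plausible way to average repeated readings at a point is in the linear milliwatt domain. The exact averaging method used in the trials is not specified here, so this Python sketch is only illustrative:

```python
import math

def average_dbm(readings_dbm):
    """Average RSSI readings in the linear (mW) domain, return dBm.

    Converting out of the logarithmic scale before averaging weights
    each reading by actual received power rather than by decibel value.
    """
    milliwatts = [10 ** (r / 10.0) for r in readings_dbm]
    mean_mw = sum(milliwatts) / len(milliwatts)
    return 10.0 * math.log10(mean_mw)
```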
4 Conclusion
In conclusion, the phone-robot hybrid device successfully created and utilized a sensor network to improve the functionality of the robot. The hybrid device effectively mapped the signal reception of a previously unknown floor plan. Signal mapping is just one of many possible applications for this hybrid device. More broadly, this project has illustrated how creating a small sensor network from a hybrid device and an everyday smartphone can provide valuable autonomous applications.
5 Acknowledgments
This work was supported, in part, by NSF grant CCF-091682 and the Undergraduate Research Opportunities Program (UROP) at Boston University.
6 References
[1] Z. Zhang, X. Zhou, W. Zhang, Y. Zhang, G. Wang, B. Y. Zhao, and H. Zheng. I Am the Antenna: Accurate Outdoor AP Location using Smartphones. In MobiCom’11, September 2011.
[2] M. A. Batalin, G. S. Sukhatme, and M. Hattig. Mobile Robot Navigation using a Sensor Network. In Robotics and Automation, 2004. Proceedings. ICRA ’04. 2004 IEEE International Conference on, 1:636–641, April 2004.
[3] J. Fink and V. Kumar. Online Methods for Radio Signal Mapping with Mobile Robots. In Robotics and Automation (ICRA), 2010 IEEE International Conference on, July 2010.