
OCCUPANCY GRID-BASED SLAM USING A
MOBILE ROBOT WITH A RING OF EIGHT SONAR
TRANSDUCERS
George Terzakis 1, Sanja Dogramadzi 2
1 University of Plymouth, School of Computing and Mathematics, Plymouth, UK
  georgios.terzakis@plymouth.ac.uk
2 University of the West of England, Department of Engineering Design and Mathematics, UK
  sanja.dogramadzi@uwe.ac.uk
The accuracy with which a mobile robot can estimate the properties of its surrounding environment, and its ability to navigate successfully through the explored space, are the main factors that determine its autonomy and efficiency with respect to the goals of the application. This paper focuses on the implementation of a SLAM framework comprising a planner, a percept and a displacement/angular error estimator, using a regular occupancy grid as the spatial memory representation.
The overall process is arbitrated by a finite state machine (FSM) that invokes the components of the framework according to the deliberative SENSE-PLAN-ACT paradigm described by R. Murphy [1], with the occasional exception of an interstitial correction process based on the angular and displacement error estimated during the execution of the planned tasks. The FSM uses a global task queue to execute the tasks produced by the planner during its previous call.
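A minimal Python sketch of this arbitration loop is given below; the component interfaces (sense, plan, execute, correct) and the correction criterion are assumptions made for illustration, not the exact API of the implemented system.

from collections import deque

# Hypothetical sketch of the arbitration loop: the FSM cycles through
# SENSE-PLAN-ACT and occasionally interposes a correction step when the
# estimated displacement/angular error calls for it.
class SlamFsm:
    def __init__(self, percept, planner, controller, error_estimator):
        self.percept = percept            # updates the occupancy grid
        self.planner = planner            # produces a list of motion tasks
        self.controller = controller      # executes a single motion primitive
        self.error_estimator = error_estimator
        self.task_queue = deque()         # global task queue

    def step(self):
        if not self.task_queue:
            self.percept.sense()                          # SENSE
            self.task_queue.extend(self.planner.plan())   # PLAN
            return
        task = self.task_queue.popleft()
        self.controller.execute(task)                     # ACT
        err = self.error_estimator.estimate()
        if err.needs_correction():                        # interstitial correction
            self.controller.correct(err)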
The robot uses a sonar ring of eight transducers equally spaced around its circumference, and four DC motors connected in pairs to provide differential drive. The available moves are forward-backward linear motion by a distance equal to the circumference of the circle in which the robot can be inscribed, and rotations by 90° to the left or to the right. Accordingly, the grid cells are square, with side length equal to the diameter of the robot's circumcircle.
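The geometry and the resulting move set can be summarized in a few lines of Python; the radius value is a placeholder, not a figure taken from the paper.

# Grid geometry and move set; ROBOT_RADIUS is a placeholder value.
ROBOT_RADIUS = 0.15                      # metres (assumed for illustration)
CELL_SIDE = 2 * ROBOT_RADIUS             # cell side = diameter of the circumcircle

GO_FORWARD, GO_BACKWARD, ROTATE_LEFT, ROTATE_RIGHT = range(4)

def apply_rotation(heading_deg, primitive):
    # Headings are kept as multiples of 90 degrees.
    if primitive == ROTATE_LEFT:
        return (heading_deg + 90) % 360
    if primitive == ROTATE_RIGHT:
        return (heading_deg - 90) % 360
    return heading_deg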
The map is stored in spatial memory as a 2D array of characters whose entries can be one of the following: 'B' for BLOCK, ' ' for VOID and 'X' for UNKNOWN. The percept decides whether a cell is occupied (BLOCK) or empty (VOID). The UNKNOWN entry simply denotes that the cell has not yet been inspected; the A* heuristic of the planner uses these entries to plan a series of moves towards the nearest unexplored cell.
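A minimal sketch of this representation is given below; the breadth-first search stands in for the planner's A*-based search and is included only to illustrate how the UNKNOWN entries drive exploration.

from collections import deque

BLOCK, VOID, UNKNOWN = 'B', ' ', 'X'

# Spatial memory: a 2D array of characters, initialized to UNKNOWN until a
# cell is inspected by the percept.
def make_grid(rows, cols):
    return [[UNKNOWN] * cols for _ in range(rows)]

# Breadth-first search for the nearest UNKNOWN cell reachable without crossing
# BLOCK cells; an illustrative substitute for the paper's A* heuristic.
def nearest_unexplored(grid, start):
    rows, cols = len(grid), len(grid[0])
    queue, seen = deque([start]), {start}
    while queue:
        r, c = queue.popleft()
        if grid[r][c] == UNKNOWN:
            return (r, c)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and (nr, nc) not in seen \
                    and grid[nr][nc] != BLOCK:
                seen.add((nr, nc))
                queue.append((nr, nc))
    return None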
The perception mechanism treats the map as a 0-order Markov random field [2], which implies that each cell is "tied" to a probability value that is recursively updated using Bayes's rule upon each inspection. The robot's impression of the state of a neighboring cell is based on whether the sonar readings provide minimal "assurance" that the robot can fit in that particular cell (VOID). The robot uses the three transducers facing the cell under inspection much as a person would use their hands to fumble about in the dark to decide whether the space in a particular direction is empty; in that sense, the three transducers give the percept an impression of whether the robot can fit breadthwise in the examined area. These conditions are then used to obtain likelihood values for the sonar measurements given the state of the cell. The probability of the cell is then updated using Bayes's rule, and the new state is determined by whether the updated probability of the cell being BLOCK is greater than the corresponding probability of it being VOID [3]. The measurement likelihood probability density functions (PDFs) are chosen as an increasing (BLOCK) and a decreasing (VOID) quadratic, in order to depict the increasing/decreasing likelihood of obtaining greater/smaller measurements given that the cell is VOID/BLOCK.
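The update step can be illustrated with the short Python sketch below. The quadratic likelihood shapes and the 0.5 decision threshold are assumptions made for the example (the paper does not give the coefficients), and the assignment of the increasing and decreasing quadratics to the two states is only one plausible reading of the description above.

# Recursive Bayes update of a single cell's occupancy probability.
def likelihood_block(z, z_max):
    t = min(z, z_max) / z_max
    return 1.0 - t ** 2          # assumed shape: small readings favour BLOCK

def likelihood_void(z, z_max):
    t = min(z, z_max) / z_max
    return t ** 2                # assumed shape: large readings favour VOID

def update_cell_probability(p_block, z, z_max):
    lb = likelihood_block(z, z_max)
    lv = likelihood_void(z, z_max)
    num = lb * p_block
    den = num + lv * (1.0 - p_block)
    return num / den if den > 0 else p_block

# The cell is labelled BLOCK when P(BLOCK) exceeds P(VOID), i.e. when the
# updated probability is greater than 0.5.
def classify(p_block):
    return 'B' if p_block > 0.5 else ' '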
During regular motion, the robot executes a set of motion primitives: GO_FORWARD, GO_BACKWARD, ROTATE_LEFT and ROTATE_RIGHT. The rotation primitives involve a 90° rotation which, in theory, should not affect the robot's position. In practice, due to several factors (e.g., wheel slip and gyro integration errors), each rotation introduces a slight displacement (up to 5 cm along both axes) and a loss of orientation (up to 10°). To estimate these discrepancies, a Kalman filter [4] is used in which displacement corresponds to process noise and misalignment is treated as measurement noise. The displacement is computed from the difference between the state estimates before and after a 90° turn. The average angular error is calculated as the inverse cosine of the ratio between the sum of the state estimate and the robot's radius and the corresponding sum for the raw transducer measurement, averaged over the cases in which the measurement is greater than the state estimate.
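A one-dimensional sketch of this estimator is given below. The noise variances are placeholders, and the angular-error formula follows one reading of the description above, with the robot's radius added to both the state estimate and the raw measurement.

import math

# 1-D Kalman filter sketch for a single sonar range: displacement introduced
# by a 90° turn is modelled as process noise q, misalignment as measurement
# noise r. The values of q and r are placeholders.
def kalman_update(x_est, p_est, z, q=0.0025, r=0.01):
    p_pred = p_est + q                 # predict (static state, process noise only)
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_est + k * (z - x_est)    # correct with the raw transducer reading
    p_new = (1.0 - k) * p_pred
    return x_new, p_new

# Average angular error after a turn: for each transducer whose raw reading z
# exceeds the state estimate x, take acos((x + radius) / (z + radius)) and
# average the results.
def average_angular_error(estimates, readings, robot_radius):
    angles = [math.acos((x + robot_radius) / (z + robot_radius))
              for x, z in zip(estimates, readings) if z > x]
    return sum(angles) / len(angles) if angles else 0.0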
Despite the relatively small number of sensors and several issues related to the equipment employed (e.g., low-budget components such as the serial wireless transceivers, batteries, gyros and motor driver), the robot can successfully map spaces that are relatively large compared to its own size without significantly losing track of its position and orientation.
References
1. Murphy, R.: Introduction to AI Robotics, pp. 42-44. MIT Press, Cambridge (2001)
2. Elfes, A.: Using Occupancy Grids for Mobile Robot Perception and Navigation. Carnegie Mellon University, Pittsburgh (1989)
3. Theodoridis, S., Koutroumbas, K.: Pattern Recognition, 4th edn., pp. 14-16. Elsevier Inc., San Diego (2009)
4. Welch, G., Bishop, G.: An Introduction to the Kalman Filter. University of North Carolina, Chapel Hill (2006)