Map Construction and Localization Using Lego Mindstorms NXT

Journal of Automation, Mobile Robotics & Intelligent Systems, VOLUME 7, N° 3, 2013

Jasmeet Singh, Punam Bedi

Submitted: 7th August 2012; accepted: 17th May 2013
Abstract:
Maps are very useful for understanding unknown places before visiting them, as maps represent spatial relationships between various objects in a region. Using robots for map construction is an important field these days, as robots can reach places which may be inaccessible to human beings. This paper presents a method to use the data obtained from a single ultrasonic sensor mounted on a robot to construct a map and localize the robot within that map. A map of the previously unknown environment is created with the help of a mobile robot, built using Lego Mindstorms NXT assembled in a modified TriBot configuration. The robot is equipped with an ultrasonic sensor and is controlled from a computer system running a MATLAB program, which communicates with the NXT over a USB or Bluetooth connection and performs complex calculations that are not possible for the NXT itself. After the map construction, the robot finds its position in the map by using a particle filter. Implementation has been done in the MATLAB programming environment using the RWTH – Mindstorms NXT Toolbox and has been successfully tested for map construction of a room and localization within that room with the use of a TriBot.

Keywords: Lego Mindstorms NXT, TriBot, ultrasonic sensor, map construction, localization, particle filter, MATLAB, mobile robot, RWTH – Mindstorms NXT Toolbox
1. Introduction

Maps are used in a variety of applications for our day-to-day needs. These maps can be considered as macro-scopes which are used primarily to represent spatial relationships between various objects in a region, on a smaller scale. Constructing a geometrically consistent map is a tedious process that requires determining spatial relationships between various objects, which in turn requires a lot of measurements. The presented work describes a method to construct a map using a Lego Mindstorms NXT based TriBot (referred to here-on as simply NXT or TriBot), which has been successfully tested for creating an indoor map of a room.

This paper is divided into two main sections: Map Construction and Localization, and the Kidnapped Robot Problem. The map construction and localization section describes how the robot is able to take readings of its surroundings, move in the environment, and also how the constructed global map is displayed based on the readings obtained by the ultrasonic sensor of the TriBot. The latter section describes how the robot's position is found within the constructed map, after it has been placed at a random position within the mapped environment, by using the readings obtained from its sensors and its knowledge of the map.

Previous work in localization of robots has been carried out with the use of more than one ultrasonic sensor at once [1–3]. This requires the robot to know which sensor returned the reading, and hence the robot needs to distinguish between several ultrasonic sensors that were mounted on it. With the use of multiple sensors, the cost of the robot increases as well. The work carried out in this paper requires the use of only a single ultrasonic sensor for localization, placed at the origin of the robot reference frame, thereby eliminating additional computations for finding the relative positions of all the ultrasonic sensors with respect to the reference frame, as in [1, 3]. When using particle filters for solving global localization [1, 4], the map given as an input to the particle filter algorithm was not built by the robot itself [1] and was constructed by the user via other techniques. Here, however, the TriBot itself constructs the global map that is later used in providing a solution to the kidnapped robot problem [4] described in the localization section.

Mapping with ultrasonic sensors has previously been reliant on the use of multiple sensors in a ring formation [5]. With the use of a single ultrasonic sensor it becomes possible to successfully map the environment surrounding the NXT, while keeping the computational complexity to a minimum. It also helps in specifying the number of readings that must be received from the surroundings for each position of the NXT, without having to change the number of ultrasonic sensors in the robot's hardware.

For global map construction and localization, the concept of occupancy grids has also been previously employed in [6, 7] which, though effective, depends on the size of the grid cells. A robot, when moved to a frontier, is able to see the unexplored area of the environment. By constantly moving to successive frontiers the robot is able to increase its knowledge of the surrounding environment. Frontier based exploration has been coupled with occupancy grids for map construction and localization in [6], but in the implementation of such a system multiple laser range finders, sonar sensors and infrared sensors were used to make the resulting map as accurate as possible.
The amount of computation and complexity increases with multiple sensors, but with the use of a single ultrasonic sensor the amount of computation is low. In this work a frontier-based exploration strategy has been employed without using occupancy grids, and only the readings obtained at the current position of the robot are used for deciding the next frontier for the robot.

In other works of map construction and localization, Simultaneous Localization and Mapping (SLAM) has been used to localize the robot's position as it moves within the environment, while mapping is being performed [3, 4, 8]. In the current system, each robot position is relative to the origin position of the map and the robot has no knowledge of a prior map. It is also possible to implement map construction with a swarm of robots in the same environment [8] to speed up the process, but the implementation has been limited to a single robot so as to ensure that a single robot is able to perform the task of map creation efficiently. It is also easier for the computer to keep track of a single robot.
2. Background

2.1. Environment
The environment considered for the implementation is partially observable, with a single robot working in a continuous state space. The actual environment is an empty room created by temporary partitions that is available specifically to the robot and has no human interference, for the purposes of the given experiments.
2.2. Assumptions Taken
— The walls of the boundary of the room and the objects within the room are immovable.
— Friction between the floor of the room and the wheels of the robot is just enough so that, during the execution of a turn at a position, the stationary wheel of the robot does not slip.
— For map construction, the TriBot's first location is assumed to be (0, 0) with a 0 degree orientation.
2.3. Software
MATLAB is used for the purposes of moving the robot, taking readings of the local surrounding environment, performing calculations for map creation and localization, and displaying the created global map.
The RWTH – Mindstorms NXT Toolbox [9], version 4.04 for MATLAB, establishes a connection from the program, running on the computer, to the TriBot. With the use of this toolbox it is possible to control all the motors and sensors attached to the TriBot. Hence, it is possible to move the robot by controlling its wheeled motors, use the ultrasonic sensor to detect distances from objects, and also rotate the motor on which the ultrasonic sensor is mounted, through the MATLAB program.
For controlling the wheeled motors and determining the robot's movements, it is important to find the number of degrees the motors must rotate to be able to travel a certain distance.
The diameter of the wheels used is 5.5 cm and hence, for a 360 degree revolution of the wheels, the robot moves forward by 5.5*π centimeters. The degree of revolution for the wheeled motors is calculated according to the distance in centimeters that the robot is required to move.
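As an illustration of this conversion, the following MATLAB helper computes the motor rotation for a requested travel distance (a minimal sketch; the function name and the rounding choice are ours, only the 5.5 cm wheel diameter comes from the text):

    % Convert a travel distance in centimeters into the number of degrees
    % the wheeled motors must rotate, for a wheel diameter of 5.5 cm.
    function motorDegrees = distanceToMotorDegrees(distanceCm)
        wheelDiameterCm    = 5.5;                    % TriBot wheel diameter
        wheelCircumference = pi * wheelDiameterCm;   % distance covered per 360 degree revolution
        motorDegrees       = round(360 * distanceCm / wheelCircumference);
    end

For example, distanceToMotorDegrees(1) returns 21, which matches the tacho limit of approximately 21 encoder counts per centimeter quoted in Section 2.4.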
To use the ultrasonic sensor it is necessary to first establish a serial connection to the NXT brick and to the port to which the ultrasonic sensor is attached. By using this port, the distance readings recorded by the sensor are returned to the MATLAB program via the USB or Bluetooth connection.
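A minimal sketch of this set-up with the RWTH – Mindstorms NXT Toolbox is shown below; the calls (COM_OpenNXT, COM_SetDefaultNXT, OpenUltrasonic, GetUltrasonic, CloseSensor, COM_CloseNXT) are the toolbox's documented functions, while the choice of SENSOR_4 as the sensor port and the error handling are our assumptions, not taken from the paper:

    % Open a connection to the NXT brick and take one ultrasonic reading (cm).
    handle = COM_OpenNXT();              % opens a USB connection (a Bluetooth ini file can be passed instead)
    COM_SetDefaultNXT(handle);           % make this brick the default for later calls

    OpenUltrasonic(SENSOR_4);            % assumed port for the ultrasonic sensor
    distanceCm = GetUltrasonic(SENSOR_4);
    if distanceCm == 255
        % No echo was received (see Section 2.4); the reading is discarded.
    end
    CloseSensor(SENSOR_4);
    COM_CloseNXT(handle);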
2.4. Sensor and Motors
The ultrasonic sensor used with the TriBot measures the time-of-flight of a burst of high-frequency sound waves. The value returned by the sensor is a single number representing the straight line distance of an object from the sensor. These values are in centimeters, and are read by the MATLAB program. The ultrasonic sensor is able to detect distances ranging from 4 centimeters to 2.3 meters with an average precision of 3 centimeters, and it returns better readings when the object to be detected has a hard surface. If a wave hits the surface at a wide angle from the surface normal [5] it may not return to the sensor, in which case a value of 255 is received from the sensor, indicating an error. The wave may also bounce off a separate surface and create a ghost image.

The interactive servo motors of the TriBot are equipped with an integrated encoder which records a count value of 360 for a single circular revolution, i.e., when the motor rotates by 360 degrees, a count of 360 is recorded. This encoder has an accuracy of 1 degree. The motor is able to return this count to the program via the serial connection. These motors have a default maximum speed of 1000 encoder counts per second, which means the robot can travel in a straight line at approximately 48 centimeters per second. For the purpose of experimentation, the wheeled motors have a fixed speed of 800 encoder counts per second (approximately 38.4 centimeters/second) and maintain a constant torque during motion. The program can also set the tacho limit of the motor, which specifies the integer number of encoder counts (or degrees) that a motor executes before coming to an abrupt stop when this limit is reached. In order for the NXT to move one centimeter in the forward direction, both the motors must move in the same direction by a tacho limit of approximately 21 encoder counts. The two wheels are 11 centimeters apart and so, to turn the robot by 1 degree while moving one wheel and keeping the other wheel stationary, the moving motor must be given a tacho limit of 4. The motor on which the ultrasonic sensor is mounted rotates at a speed of approximately 400 counts per second. So, a 360 degree rotation of the ultrasonic sensor with detection of distances after every 6 degrees takes 30 seconds, on average.
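The figures above reduce the robot's two motion primitives to a choice of tacho limit; a minimal sketch is given below (the helper names are ours, and the constants are the approximate values of 21 counts per centimeter and 4 counts per degree quoted in this section):

    % Tacho limit for driving straight: both wheels, same direction.
    function tacho = forwardTachoLimit(distanceCm)
        tacho = round(21 * distanceCm);    % ~21 encoder counts per centimeter
    end

    % Tacho limit for turning in place: one wheel driven, the other held still.
    function tacho = turnTachoLimit(angleDeg)
        tacho = round(4 * abs(angleDeg));  % ~4 encoder counts per degree of turn
    end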
2.5. Robot Used
The robot used for the purposes of experimentation is based on the Lego Mindstorms NXT TriBot, with a few modifications. The TriBot is equipped with one ultrasonic sensor, mounted on an interactive motor. It is possible to create a ring of 60 sensor readings in one complete circular turn by rotating the sensor anticlockwise by 360 degrees and taking readings after every 6 degree interval. The ultrasonic sensor has been placed at the origin of the robot reference frame [2].
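The ring of readings described here can be sketched as a simple loop over the sensor motor (a minimal sketch; it assumes the sensor motor is on port A and the ultrasonic sensor on SENSOR_4, and uses the toolbox's NXTMotor class, none of which is specified in the paper):

    % Take 60 ultrasonic readings in one full turn of the sensor (6 degree steps).
    scan = zeros(60, 2);                                 % [distanceCm, angleDeg]
    step = NXTMotor('A', 'Power', 30, 'TachoLimit', 6);  % rotate the sensor by 6 degrees
                                                         % (positive power assumed anticlockwise)
    for k = 1:60
        scan(k, :) = [GetUltrasonic(SENSOR_4), (k - 1) * 6];
        step.SendToNXT();                                % advance to the next angle
        step.WaitFor();                                  % wait until the rotation finishes
    end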
Fig. 1. Top view of the Lego Mindstorms NXT TriBot based robot with an interactive servo motor and the ultrasonic sensor attached on top
Figure 2 shows the front view of the TriBot used. The wheeled motors, on either side of the robot, are synchronized so that if one wheel moves forward the other also moves forward by the same degree and at the same speed. Hence both the wheels move in unison and it is not necessary to control the power and tacho limit for each motor separately. This is important when the robot moves forward or backward to ensure that it follows a straight line and does not drive in a curve. It is also possible to control a single wheeled motor separately when the robot executes a turn.

Figure 3 shows the schematic diagram of the TriBot. The ultrasonic sensor is represented by a yellow triangle, and the orange circle in the center of the robot represents the motor on which the sensor is mounted. The intersecting perpendicular lines within the robot represent the robot reference frame. The origin of the robot reference frame is at the point where these lines intersect. The two wheels on either side of the robot and a swiveling wheel are also shown in the figure using round edged rectangles.
Fig. 3. Schematic representation of the Lego Mindstorms NXT TriBot based robot depicting the robot reference frame
3. Map Construction and Localization

The map construction process involves recording and displaying the robot's surrounding environment. It is implemented as a continuous cycle of two steps, namely sensing and motion, as shown in Figure 4.
Fig. 2. Front view of the Lego Mindstorms NXT TriBot based robot with the ultrasonic sensor pointing forward
The central component of the TriBot is the NXT brick, which is essential for establishing a connection with the computer and controlling the sensors and motors. Figure 1 shows the TriBot used for experimentation purposes.
Fig. 4. Map construction process used by the TriBot in implementation
The robot is able to navigate around in an area and
record the features of that space. It initially starts at
the origin position around which the global map is
constructed. In this system, the origin of the robot
reference frame is initially assumed to coincide with
the origin of the map and the Cartesian coordinates of
the robot in the map are (0, 0) with an orientation of 0
degree with respect to the X-axis. A position of the robot in the map is a single set of values (its X and Y coordinates and its orientation) with respect to the origin of the map. These positions are recorded as distinct rows in a matrix in the MATLAB program, where the origin of the map is saved as the first position
of the robot. The ultrasonic sensor mounted on the
robot is initially assumed to be pointing in the 0 degree direction with respect to the X-axis of the map.
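The bookkeeping described above amounts to a small pose matrix; a minimal sketch of the convention (with our variable names) is:

    % Each row is one robot position: [x, y, orientationDeg] relative to the map origin.
    positions = [0, 0, 0];     % the origin of the map is saved as the first position
    % After each move, the newly reached pose is appended as another row:
    % positions(end+1, :) = [newX, newY, newThetaDeg];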
3.1. Sensing
In this step the robot senses its current local surroundings by using the ultrasonic sensor. The TriBot
takes a set of 120 ultrasonic readings of the area
around its current position by rotating the motor by a
6 degree interval in the counter-clockwise direction,
for a total of 720 degrees, instead of a revolution of
360 degrees, which decreases the probability that the
recorded readings are representative of a ghost image. The values thus obtained are relative to the origin
of the robot reference frame and denote the robot’s
immediate neighborhood.
Angles, at which these readings are recorded, are
also saved along with the distance values which depict
the recorded straight line distance between the robot
and an object at that angle. The motor is rotated till
the sensor points at an angle of 714 (or 354) degrees
with respect to the positive X-axis. These 120 readings, saved in a two-column matrix, represent the
local map of the robot at that position and the columns in the matrix represent the radius and angle of
the Polar coordinates of the objects from the current
position of the robot.
After obtaining a set of readings, every distance reading of 255 is deleted together with its corresponding angle, so that these readings are not considered for map construction. The sensor is then rotated back to the 0 degree
position, as it was before the readings were taken. To
display the map, Polar coordinates of the objects taken with respect to the robot reference frame are converted into Cartesian coordinates. The map is initially
empty and the set of readings taken at the origin of
the global map, are then added to the map.
For all the next sets of readings representing the
TriBot’s local map, at some known position relative to
the origin of the map, the calculated Cartesian coordinates of the readings are shifted by the X and Y values
of the TriBot’s current position before being added to
the existing map. The robot’s current position, with
respect to the global origin, is also plotted onto the
map. This process continues till the robot keeps exploring the room, and the then existing map is treated
as the final map of the room.
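The conversion of a local scan into global map points, as described in the last two paragraphs, can be sketched in MATLAB as follows (a minimal sketch; the function and variable names are ours, and the robot's orientation relative to the global X-axis is assumed to be added to the reading angles):

    % localScan: two-column matrix of [radiusCm, angleDeg] readings (error
    %            readings of 255 already removed, as described in Section 3.1).
    % robotPose: [x, y, thetaDeg] of the robot relative to the map origin.
    function points = scanToGlobalPoints(localScan, robotPose)
        r = localScan(:, 1);
        a = localScan(:, 2) + robotPose(3);        % reading angle in the map frame

        % Polar -> Cartesian, shifted by the robot's current X and Y position.
        points = [robotPose(1) + r .* cosd(a), ...
                  robotPose(2) + r .* sind(a)];
    end

The rows returned by this helper are simply appended to the matrix of map points plotted so far.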
The robot stops at each new position and records
the local map readings of that position, a process
which takes 60 seconds to obtain 120 readings. The
error in the readings obtained is dependent on the
ultrasonic sensor and so an average error of 3 centimeters is possible in the readings, which is a fairly
acceptable value for this kind of sensor.
3.2. Motion
After sensing the environment surrounding the
robot, the distance in centimeters and the direction
of the next frontier from the robot’s current position
are computed and the robot then proceeds to move
toward this new frontier location.
The robot moves to a new frontier location so
that it is possible to detect the objects in the previously unexplored area of the room. To ensure that
the TriBot is able to explore the complete room efficiently, the 120 distance readings obtained for the
current position of the robot are sorted and one of
the four farthest values of the recorded distances is
randomly selected. Random selection is chosen, as it
is possible that the distances recorded are erroneous
readings due to a ghost image.
The TriBot turns according to the angle of the direction of the farthest point selected with respect to
the robot reference frame and it is then moved half
the distance to that point. The distance is halved to
counteract the possibility that the point selected
represents a ghost image. Every new position of the
robot depends only on the relative displacement
from the previous position which in turn depends on
the readings taken at that position. The robot's relative displacement from the previous position to the
new position is stored in a matrix. The relative orientation from previous position is the angle which the
robot turned, found during the new frontier location
calculation. The X and Y coordinates representing
the new position achieved by displacement from
previous position are found by converting the distance moved and the angle rotated with respect to
the previous position into Cartesian values. The current position of the robot relative to the origin of the
map is calculated by taking the arithmetic sum of all
previous relative displacements and adding it to the
displacement from last known position to the current position. Hence the robot’s position is localized
by calculating the sum of all the displacements.
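A compact sketch of the frontier selection and pose update described above is given below (the helper name and the use of randi are ours; the rule of picking one of the four farthest readings at random and halving the distance, and the cumulative summation of displacements, come from the text):

    % localScan: [radiusCm, angleDeg] readings with 255 error values removed.
    % pose:      current [x, y, thetaDeg] relative to the map origin.
    function [pose, driveCm, turnDeg] = moveToNextFrontier(localScan, pose)
        [~, order] = sort(localScan(:, 1), 'descend');
        pick       = order(randi(min(4, numel(order))));  % one of the four farthest readings

        turnDeg = localScan(pick, 2);      % direction of the chosen frontier
        driveCm = localScan(pick, 1) / 2;  % move only half the distance (ghost-image guard)

        % Accumulate the relative displacement: the new position is the sum of
        % all previous displacements plus this one.
        pose(3) = pose(3) + turnDeg;
        pose(1) = pose(1) + driveCm * cosd(pose(3));
        pose(2) = pose(2) + driveCm * sind(pose(3));
    end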
The sensing and motion steps are repeated continuously, many times, to construct the final map by
the robot through exploration of the environment
and to find the position of the robot after every relative displacement. The readings taken at each position contribute to the calculation of the next frontier
for the robot and the robot has no knowledge of the
readings taken previously. The robot also has orientation information relative to the X-axis of the global
map, though it may have approximately 0.25 degrees
of error from the previous position of the robot.
All the Cartesian coordinates of the objects observed and the robot positions are stored in separate
matrices which are later stored as files, so that it
becomes possible to recreate the map after the robot
has finished collecting the readings from the environment. A single reading, consisting of the X and Y coordinate values of a point in the room, takes roughly 16 bytes of file space. So, for 120 readings taken at 100 positions, for a total of 12000 records, the file size will be approximately 188 kBytes (12000 × 16 bytes ≈ 192,000 bytes ≈ 188 kB).
4. Kidnapped Robot Problem

The algorithm implemented and discussed here is used by the TriBot to find its approximate location in an existing map by taking readings of its surroundings. During the map construction process, the robot finds its position through the relative displacements from the origin of the map. In a situation where the robot only knows the coordinates of the objects of the map and does not know its position relative to the origin, the robot's position is localized within the environment by taking measurements from its immediate surroundings and by moving it within the area for which the global map is available. Figure 5 depicts these two steps.

Fig. 5. Localization process used by the TriBot
The positions of the particles, which represent guesses of the robot's possible position within the map, are taken as the initial belief for the particles. The particles then update their belief of the robot's position by using the measurements taken by the robot's ultrasonic sensor, which leads to the survival of only a few of the particles with probable initial beliefs. This selection of the particles with probable beliefs is repeated continuously, by moving the robot and the particles by some distance and taking subsequent measurements, till all the particles are clustered around a single position within the map which can safely predict the robot's approximate location.

4.1. Particle Filters
Particle filters are used in this work as they are easier to program for continuous value problems, as compared to Kalman Filters [10, 11]. Particle filters also work better than Histogram Filters [11] in higher-dimensional spaces. Table 1 compares the Histogram, Kalman, and Particle filters on the basis of two characteristics.
Table 1. Comparisons between Histogram, Kalman and Particle Filters

Filter Name    State-Space    Belief
Histogram      Discrete       Multi-Modal
Kalman         Continuous     Uni-Modal
Particle       Continuous     Multi-Modal
4.2. Localization using Particle Filters
After satisfactory map construction of the room, the robot is kidnapped, i.e., removed from the environment and then placed randomly somewhere inside the area represented by the global map. The current position of the robot within the map is unknown and is determined by finding distances to nearby objects with the use of the ultrasonic sensor, and calculating a good posterior distribution of where the robot is based on those readings.
Each particle represents a possible position and orientation of the robot in the map. A set of hundreds of such (N) particles together comprises an approximate representation of the posterior distribution of the robot [11]. In the beginning, all the N particles are uniformly scattered, and the filter allows them to survive in proportion to how consistent these particles are with the sensor readings, such that the particles which are more consistent with the sensor measurements are considered fitter and are more likely to survive. Measurements of the robot's surroundings are recorded and applied to all the particles individually. As a result, the closer the particle is to the correct position, the more likely it is that the measurements belong to that particle, and the higher the likelihood of its survival. The particles which are highly consistent with the measurements form clusters around a single position in the map which approximately represents the position of the robot as it localizes itself within the map.

Each particle has its own importance weight and survives at random, though the probability of its survival is proportional to its importance weight. The larger the weight of the particle, the more likely it is that it represents the robot's position and the more likely that it survives. After resampling, i.e., randomly drawing N new particles from the old ones in proportion with their importance weights, with replacement, the smaller weighted particles representing less probable robot positions are likely to be discarded.
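A minimal MATLAB sketch of this resampling step is shown below (our illustration of drawing N new particles with replacement, in proportion to their importance weights; randsample is from the Statistics Toolbox, and an explicit resampling-wheel loop could be used instead):

    % P: 1xN array of particles, W: 1xN vector of importance weights.
    W   = W / sum(W);                                % normalize the weights
    idx = randsample(numel(P), numel(P), true, W);   % weighted draw with replacement
    P   = P(idx);                                    % low-weight particles tend to disappear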
4.3.
Noise
positions
4.3.
Noise
4.3.
Noiseare likely to be discarded.
Three
of noise
values [11]
are considered
dur4.3.Three
Noisetypes
types
Three
types of
of noise
noise values
values [11]
[11] are
are considered
considered durduring Three
the localization
of
the
robot,
as
follows:
types of noise
are considered during the localization
of thevalues
robot,[11]
as follows:
—
noise of
– the
Therobot,
errorasvalue,
in centimeters,
ing Forward
the localization
follows:
—
Forward
noise
– Theinto
error
value,
in centimeters,
that
must
be
taken
account
when
the
robot
must
be
taken
account
the
— that
Forward
noise
– Theinto
error
value,when
in centimeters,
that
must
be
taken
into
account
when
the robot
robot
moves
in
the
forward
direction.
moves
in
direction.
that
must
beforward
taken into
account when the robot
moves
in the
the
forward
direction.
— Turn
noise
The error
value, in degrees, which
in the––forward
direction.
— moves
Turn
noise
The
error
value, in when
degrees,
which
must
be
taken
into
consideration
the
robot
be
taken
into
consideration
the
robot
— must
Turn
noise
– The
error
value, in when
degrees,
which
must
be
taken
into
consideration
when
the
robot
executes
a
right
or
left
turn
on
its
current
position.
executes
a
right
or
left
turn
on
its
current
position.
must
be ataken
whenposition.
the robot
executes
right into
or leftconsideration
turn on its current
— Sense
noise
– The
error
value,
centimeters,
that
a right
or left
turn
on itsin
position.
— executes
Sense
noise
– The
error
value,
incurrent
centimeters,
that
must
be
taken
into
account
due
to
the
ultrasonic
be
taken
into
account
due
the
ultrasonic
— must
Sense
noise
– The
error
value,
in to
centimeters,
that
must
be
taken
into
account
due
to
the
ultrasonic
sensor’s
inability to
measure exact
distances.
sensor’s
to
distances.
must
be inability
taken into
account exact
due to
the ultrasonic
sensor’s
inability
to measure
measure
exact
distances.
4.4.
Particles
inability to measure exact distances.
4.4.
Particles
4.4.sensor’s
Particles
To create the particles, a class is created in MATLAB, where an object of that class represents a single particle. Each particle created using this class has six attributes, namely – X-coordinate value, Y-coordinate value, Orientation, Forward noise, Turn noise, Sense noise. The position and orientation of a particle are randomly initialized on creation of the particle. The forward, turn and sense noise values for each particle are initially set to 0.5, by default.
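A minimal MATLAB sketch of such a class is given below; the class and property names are illustrative, and the map extents xMax and yMax are assumed inputs rather than part of the original implementation:

    classdef Particle
        % Sketch of a particle: a hypothesised robot pose plus its noise terms.
        properties
            x                     % X-coordinate value
            y                     % Y-coordinate value
            orientation           % heading, in radians
            forwardNoise = 0.5;   % default noise values, as described above
            turnNoise    = 0.5;
            senseNoise   = 0.5;
        end
        methods
            function obj = Particle(xMax, yMax)
                % Position and orientation are randomised on creation.
                obj.x = rand() * xMax;
                obj.y = rand() * yMax;
                obj.orientation = rand() * 2 * pi;
            end
        end
    end

A particle is then created with, for example, p = Particle(400, 300) for a map of roughly 400 cm by 300 cm, and the 0.5 defaults can afterwards be overwritten with the measured noise values.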
4.5. Particle Filter Algorithm
Firstly, 250 particles are created using the class implemented in MATLAB, and their noise attribute values are set according to the measured precision of the TriBot's wheeled motors and ultrasonic sensor. The forward, turn, and sense noise values of the TriBot are calculated using the accuracy of the ultrasonic sensor and the motors, and are set to 0.05 centimeters, 0.25 degrees and 3 centimeters respectively for the current implementation.
At the current unknown position of the robot, a reading is taken by the ultrasonic sensor at a certain angle, between 0 and 45 degrees, with respect to the robot reference frame. Seven more such readings are taken at intervals of 45 degrees each, which gives a set of eight measurements in total for the robot's current position. For each particle, these angles are used to find the distances between the particle's position and the objects in the map at these angles, with respect to the particle's orientation. The probability that a particle represents the approximate location of the robot is computed using the equation:
p = \frac{e^{-(\mu - x)^2 / (2\sigma^2)}}{\sqrt{2\pi\sigma^2}} \quad \forall\,\alpha \qquad (1)
Where µ is the distance of the particle from an object, x is the observed distance measurement at that
angle with respect to the robot reference frame, σ is the
sense noise and α is the set of all angles at which the
eight readings were recorded. This probability, p, is
eventually treated as the importance weight for that
particle.
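Equation (1) amounts to evaluating a Gaussian centred on the particle-to-object distance. A one-function MATLAB sketch (the function name is ours) is:

    function p = measurementProb(mu, x, sigma)
        % Gaussian likelihood of observing distance x when the distance from
        % the particle to the map object is mu; sigma is the sense noise.
        p = exp(-(mu - x)^2 / (2 * sigma^2)) / sqrt(2 * pi * sigma^2);
    end

Since the weight W(j) is multiplied by p for each of the eight readings, only particles that explain all readings reasonably well retain a large weight.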
The particle filter implemented in the current system is shown in Algorithm 1. RoomX and RoomY are arrays holding the X and Y coordinates of the room boundary and objects, recorded during map construction. Line 2 creates a set of particles P by creating objects of the class and initializes the respective attribute values for each particle. P is a collection of one-dimensional arrays, viz. Px, Py, Porientation, Psense, Pturn and Pforward, of size N each, where each array represents an attribute. Array Px records the X-coordinate value for all the N particles in sequence, Py stores the Y-coordinate value for all the particles, and so on. W is an array of size N that sequentially stores the importance weights of all the particles in P. Line 3 sets the noise values of all the particles.
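Under this array-per-attribute layout, lines 2 and 3 of Algorithm 1 could be sketched as follows; roomWidth and roomHeight are assumed map extents, and the noise values are those quoted above for the TriBot:

    roomWidth  = 400;                      % hypothetical map extent in cm
    roomHeight = 300;                      % hypothetical map extent in cm
    N = 250;                               % number of particles
    Px = rand(1, N) * roomWidth;           % X-coordinate of every particle
    Py = rand(1, N) * roomHeight;          % Y-coordinate of every particle
    Porientation = rand(1, N) * 2 * pi;    % heading, in radians
    Pforward = 0.05 * ones(1, N);          % forward noise (cm)
    Pturn    = 0.25 * ones(1, N);          % turn noise (degrees)
    Psense   = 3    * ones(1, N);          % sense noise (cm)
    W = ones(1, N);                        % importance weights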
The constant t is used to define the number of times that the algorithm must be run, and a value of 10 is fixed in the implementation. The initially empty array, select, is used to save the indices of the room coordinates that satisfy the measurement reading, Z, when applied to a particle. The initial value of the variable angle can be set manually between 0 and π/4 radians; a value of π/6 radians has been chosen in the implementation.
In line 7, the robot takes a distance reading of its surroundings at the specified angle. The function polar
takes the arrays of Cartesian coordinates of the room
and converts them to Polar coordinates, with respect to
the origin of the global map; where R represents the
array of distances and Ɵ represents the array of angles
of those points, in radians. The size function returns the
number of rows in an array. The values distX and distY represent the distances of a particle's X and Y coordinates from the coordinates of the room object observed at a certain angle by the robot. These values are used to find the distance of the particle from the coordinates in the map, which is then used in lines 29 and 30 to find the importance weight of the particle.
Line 33 applies the resampling algorithm which randomly draws N new particles from the old ones, with
replacement, according to their importance weights. Line 37 implements the movement of the TriBot and all
the particles, by turning them by 20 degrees anticlockwise and moving them forward by a distance of
6 centimeters. A random Gaussian distribution value
with zero mean, and the standard deviation set as the
forward or turn noise is added when the particles are
moved forward, or rotated.
Algorithm 1: Particle Filter Algorithm for Localization.
Input :
  RoomX, RoomY : Coordinates of room
  N : Number of particles
1    begin
2      Create N uniformly distributed particles P, and initialize attributes Px, Py, Porientation, Psense, Pturn, and Pforward.
3      Set Noise values of Psense, Pturn, Pforward.
4      for t ← 1 to 10 do
5        angle ← π/6 ;
6        for i ← 1 to 8 do
7          Z ← Distance reading with TriBot(angle) ;
8          select ← [ ] ;
9          (R, Ɵ) ← polar(RoomX, RoomY) ;
10         for j ← 1 to N do
11           W(j) ← 1.0 ;
12           for k ← 1 to N do
13             dcos ← R(k) * cos(Ɵ(k)) − Px(j) ;
14             dsin ← R(k) * sin(Ɵ(k)) − Py(j) ;
15             d ← √(dcos² + dsin²) ;
16             if dcos ≅ 0 do
17               r ← arcsin(dsin / d) ;
18             else
19               r ← arccos(dcos / d) ;
20             end if (line 16)
21             if r ≅ Porientation(j) + angle do
22               Add k to select array.
23             end if (line 21)
24           end for (line 12)
25           for i ← 1 to size(select) do
26             distX ← |Px(j) − RoomX(select(i))| ;
27             distY ← |Py(j) − RoomY(select(i))| ;
28             dist ← √(distX² + distY²) ;
29             p ← exp(−(dist − Z)² / (2 * Psense(j)²)) / √(2π * Psense(j)²) ;
30             W(j) ← W(j) * p ;
31           end for (line 25)
32         end for (line 10)
33         P2 ← Resampling Wheel Algorithm (W, N).
34         P ← P2 ;
35         angle ← angle + π/4 ;
36       end for (line 6)
37       Move the robot and particles.
38     end for (line 4)
39   end
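Continuing the array sketch from Section 4.5, line 37 of Algorithm 1 (moving the TriBot and the particles) might look as below; the cosine/sine propagation of Px and Py is our assumption about how positions are updated:

    % Turn 20 degrees anticlockwise and move 6 cm forward, applying the same
    % motion to every particle with zero-mean Gaussian noise whose standard
    % deviation is that particle's turn or forward noise.
    turnDeg = 20;
    stepCm  = 6;
    Porientation = Porientation + (turnDeg + Pturn .* randn(size(Pturn))) * pi/180;
    step = stepCm + Pforward .* randn(size(Pforward));
    Px = Px + step .* cos(Porientation);
    Py = Py + step .* sin(Porientation);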
For resampling the particles on the basis of their importance weights, a particle that has a higher importance weight must be sampled or picked more times than a particle that has a low importance weight. This is achieved through the resampling wheel algorithm [11] described in Algorithm 2. First, an initial integer guess of a particle number is taken from the uniform interval (1, N) at random and labeled as index. A value beta, initialized as zero, is added to a real value drawn uniformly from the interval (0, 2*maxw) to give a new value of beta, where maxw is the maximum importance weight in W. If the weight of the index particle is smaller than the current value of beta, then the weight of the particle is subtracted from beta and index is incremented by one. Otherwise, if the beta value does not exceed the weight of the index particle, then that particle is chosen to survive.

In the given algorithm, Ʋ1 represents a function that returns a uniformly drawn integer value and Ʋ2 returns a uniform real value from the intervals specified. The max function returns the maximum value among all the elements of an array. P2 is, initially, an empty set of arrays and stores the attribute values of the particles selected for survival after resampling.
Algorithm 2. Resampling Wheel Algorithm.
Input :
  W : Array of importance weights of particles
  N : Number of particles
Output : P2 : Particles after resampling
1    begin
2      index ← Ʋ1(1, N) ;
3      beta ← 0 ;
4      P2 ← [ ] ;
5      maxw ← max(W) ;
6      for i ← 1 to N do
7        beta ← beta + Ʋ2(0, 2*maxw) ;
8        while W(index) < beta do
9          beta ← beta − W(index) ;
10         index ← index + 1 ;
11       end while
12       Add the index particle to P2.
13     end for
14   end
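A compact MATLAB sketch of Algorithm 2, using the array layout from earlier, is given below; the wrap-around of index past N is our addition, as the listing leaves that detail implicit:

    function idx = resamplingWheel(W, N)
        % Indices of N particles drawn with replacement, in proportion to W.
        idx = zeros(1, N);
        index = randi(N);        % Ʋ1(1, N): uniform integer starting guess
        beta = 0;
        maxw = max(W);
        for i = 1:N
            beta = beta + 2 * maxw * rand();   % Ʋ2(0, 2*maxw)
            while W(index) < beta
                beta = beta - W(index);
                index = mod(index, N) + 1;     % wrap past N (our assumption)
            end
            idx(i) = index;
        end
    end

The surviving set P2 is then assembled by indexing each attribute array, for example P2x = Px(idx) and P2y = Py(idx).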
If the beta value is small, there is a high possibility of a particle with a large importance weight being picked more than once during resampling. After resampling, N new particles are obtained, each having new position and orientation values that have been derived from old particles with high importance weights.
At the end of the process, an approximate location of the robot is found by analyzing the X and Y coordinates of the locations where most of the particles are clustered in the global map.
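Assuming the surviving particles have converged to a single cluster, one simple way to read off that location (a sketch; the exact clustering step is not specified in the text) is to average the particle coordinates:

    % Approximate robot pose after the final resampling step.
    xEst = mean(Px);
    yEst = mean(Py);
    thetaEst = atan2(mean(sin(Porientation)), mean(cos(Porientation)));  % circular mean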
5. Experimental Results
Figure 6 shows the area under consideration for the purposes of experimentation. The cross mark shows the robot's starting position for map construction. The light-gray colored portion in the figure is beyond the environment under observation.

Fig. 6. Room used for Map Construction and Localization by the TriBot
5.1. Map Construction
Figure 7 shows the two-dimensional maps created after taking readings on the first two positions. All the objects recorded by the TriBot are displayed as a point cloud of black dots, as the ultrasonic sensor returns a single value of the straight-line distance of an object from the ultrasonic sensor. In the figure, the positions of the robot where the readings were taken are displayed by magenta colored dots. Figure 7a shows the local map created at the origin position by the TriBot during map construction. Figure 7b shows the existing map after adding the readings taken at the second position to the map created in Figure 7a.
Fig. 7. (a) Map obtained after taking readings at the origin position. (b) Map obtained after taking readings at the second position and adding them to the existing map

Figure 8 shows the map created after one scanning of the room. In the map constructed, the complete room is scanned only once; for better results of map construction and localization, it is possible to let the TriBot scan the room continuously till a highly detailed map is constructed. Noisy readings, representing ghost images, can be seen in the map, which lie either in the region beyond the room boundaries or within the empty area of the original room shown in Figure 6.

Fig. 8. Map constructed by the TriBot after one scanning of the room, where the boundaries of the room can be easily identified

Figure 9 shows the map constructed in Figure 8, superimposed onto the room shown in Figure 6.

Fig. 9. Image of the room under consideration and the map created by the TriBot superimposed onto it

5.2. Kidnapped Robot Problem
Figure 10 shows the particles initially distributed, almost uniformly. The green triangles represent the particles created initially. The blue asterisks represent the boundary of the room created during map construction.

Fig. 10. Particles created for localization of the robot and to solve the kidnapped robot problem

Figure 11 shows how the green colored particles have clustered on the right hand side of the map and predict the approximate position of the TriBot.

Fig. 11. Particles finally localized to the robot's approximate position within the map

In later iterations of the algorithm, the particles move within the map as the robot moves in the actual environment. Figure 12 shows how the particles, after executing a left turn and moving forwards, move further upwards in the map as the robot moves.
Fig. 12. Particles moving with the robot's actual movement. (a) The cloud of green particles is near the right hand side boundary. (b) The particles move upwards and left, further away from the boundary
6. Conclusion
Efforts have been made in this paper to create a map of an environment using a Lego Mindstorms NXT TriBot. This work may find applications in creating maps of unknown environments that are inaccessible for humans, by deploying robots into them, or in applications of home security systems to detect anomalies when the map changes in the absence of human interference. The MATLAB programming environment with the RWTH – Mindstorms NXT Toolbox has been used for the purpose of implementation and testing of the system with the TriBot. The implemented system has been successfully tested with an indoor map creation of a room and for subsequently solving the kidnapped robot problem with particle filter localization of the TriBot within that room. The robot in this work uses a single ultrasonic sensor, making it highly cost effective.

A better representation of the map can be created by making the environment multi-agent with the use of several similar Lego Mindstorms NXT TriBots that are controlled via MATLAB and by overlapping the individual maps created by them all. It is also possible to reduce the number of particles used in the implementation and achieve almost similar results as the ones shown in this paper.

AUTHORS
Jasmeet Singh*, Punam Bedi – Department of Computer Science, Faculty of Mathematical Sciences, New Academic Block, University of Delhi, Delhi-110007, India, jasmeetsingh89@ymail.com, pbedi@cs.du.ac.in

*Corresponding author

References
[1] Burguera A., González Y., Oliver G., “Mobile Robot Localization Using Particle Filters and Sonar Sensors”, Advances in Sonar Technology, In-Tech: Vienna, Austria, 2009, Chapter 10, pp. 213–232.
[2] Adiprawita W., Ahmad A. S., Sembiring J., Trilaksono B. R., “New Resampling Algorithm for Particle Filter Localization for Mobile Robot with 3 Ultrasonic Sonar Sensors”, In: Proceedings of the International Conference on Electrical Engineering and Informatics, Bandung, Indonesia, July 17–19, 2011, pp. 1431–1436.
[3] Burguera A., González Y., Oliver G., “Sonar Sensor Models and Their Application to Mobile Robot Localization”, Sensors, vol. 9, 2009, pp. 10217–10243.
[4] Thrun S., “Particle Filters in Robotics”, In: Proceedings of the 18th Annual Conference on Uncertainty in Artificial Intelligence (UAI), Edmonton, Alberta, Canada, August 1–4, 2002, pp. 511–518.
[5] Howell J., Donald B. R., “Practical Mobile Robot Self-Localization”, In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), San Francisco, CA, USA, April 24–28, 2000, vol. 4, pp. 3485–3492.
[6] Yamauchi B., Schultz A., Adams W., “Mobile Robot Exploration and Map-Building with Continuous Localization”, In: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Leuven, Belgium, May 16–20, 1998, vol. 4, pp. 3715–3720.
[7] Varveropoulos V., “Robot Localization and Map Construction Using Sonar Data”, The Rossum Project: 2000. Available online: http://www.rossum.sourceforge.net/papers/Localization (accessed on 17 January 2012).
[8] Howard A., “Multi-robot Simultaneous Localization and Mapping using Particle Filters”, Int. J. Robot. Res., vol. 25, 2006, pp. 1243–1256.
[9] RWTH – Mindstorms NXT Toolbox, RWTH Aachen University, 2010. Available online: http://www.mindstorms.rwth-aachen.de/trac/wiki (accessed on 18 August 2011).
[10] Fox D., Burgard W., Dellaert F., Thrun S., “Monte Carlo Localization: Efficient Position Estimation for Mobile Robots”, In: Proceedings of the Sixteenth National Conference on Artificial Intelligence, Orlando, FL, USA, July 18–22, 1999, pp. 343–349.
[11] Artificial Intelligence (CS373) Programming a Robotic Car, Udacity, 2012. Available online: http://www.udacity.com/overview/Course/cs373 (accessed on 26 February 2012).