Slides on motion and perception

Probabilistic Robotics:
Motion and Sensing
Sebastian Thrun & Alex Teichman
Stanford Artificial Intelligence Lab
Slide credits: Wolfram Burgard, Dieter Fox, Cyrill Stachniss, Giorgio
Grisetti, Maren Bennewitz, Christian Plagemann, Dirk Haehnel, Mike
Montemerlo, Nick Roy, Kai Arras, Patrick Pfaff and others
4-1
Bayes Filters
Bel(x_t) = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t−1}) Bel(x_{t−1}) dx_{t−1}
4-2
Bayes Filters
Sensing: the term P(z_t | x_t)
Action: the term P(x_t | u_t, x_{t−1})
Bel(x_t) = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t−1}) Bel(x_{t−1}) dx_{t−1}
4-3
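As an illustration of this recursion, the sketch below implements one prediction/correction step of the Bayes filter on a discrete 1-D state grid. It is a minimal sketch only: the function names and the idea of passing the motion and sensor models as callables are assumptions, not part of the slides.

```python
import numpy as np

def bayes_filter_step(bel, u, z, p_motion, p_sensor):
    """One prediction + correction step of the discrete Bayes filter.

    bel:      belief over N discrete states, shape (N,)
    u, z:     current control and measurement
    p_motion: p_motion(x_next, u, x_prev) -> transition probability
    p_sensor: p_sensor(z, x) -> measurement likelihood
    """
    n = len(bel)
    # Prediction: sum (the discrete integral) over the previous state.
    bel_bar = np.array([sum(p_motion(x, u, xp) * bel[xp] for xp in range(n))
                        for x in range(n)])
    # Correction: weight by the measurement likelihood, then normalize (eta).
    bel_new = np.array([p_sensor(z, x) * bel_bar[x] for x in range(n)])
    return bel_new / bel_new.sum()
```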
Probabilistic Motion Models
4-4
Bayes Filters
Action: p(x’ | u, x)
Bel(x_t) = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t−1}) Bel(x_{t−1}) dx_{t−1}
4-5
Robot Motion
Robot motion is inherently uncertain.
How can we model this uncertainty?
4-6
Probabilistic Motion Models
• To implement the Bayes filter, we need the transition model p(x | x', u).
• The term p(x | x', u) specifies the posterior probability that action u carries the robot from x' to x.
• In this section we specify how p(x | x', u) can be modeled based on the motion equations.
4-7
Locomotion of Wheeled Robots
Locomotion (Oxford Dict.):
Power of motion from place to place
(Figure: wheel with x, y, z axes, the roll about its axle, and the direction of motion)
• Differential drive (AmigoBot, Pioneer 2-DX)
• Car drive (Ackerman steering)
• Synchronous drive (B21)
• Mecanum wheels, XR4000
4-8
Instantaneous Center of Curvature
ICC
For rolling motion to occur, each wheel has to move
along its y-axis
4-9
Differential Drive
(Figure: robot at pose (x, y, θ) with wheel velocities v_l, v_r, wheel separation l, turning about the ICC at distance R)

ICC = [x − R sin θ, y + R cos θ]

ω (R + l/2) = v_r
ω (R − l/2) = v_l

R = (l/2) · (v_l + v_r) / (v_r − v_l)
ω = (v_r − v_l) / l
4-10
Differential Drive: Forward Kinematics

[x'; y'; θ'] = [cos(ωδt), −sin(ωδt), 0; sin(ωδt), cos(ωδt), 0; 0, 0, 1] · [x − ICC_x; y − ICC_y; θ] + [ICC_x; ICC_y; ωδt]

(Figure: pose P(t) rotated about the ICC at distance R by ωδt to P(t+δt))

x(t) = ∫₀ᵗ v(t') cos[θ(t')] dt'
y(t) = ∫₀ᵗ v(t') sin[θ(t')] dt'
θ(t) = ∫₀ᵗ ω(t') dt'
4-11
Differential Drive: Forward Kinematics

[x'; y'; θ'] = [cos(ωδt), −sin(ωδt), 0; sin(ωδt), cos(ωδt), 0; 0, 0, 1] · [x − ICC_x; y − ICC_y; θ] + [ICC_x; ICC_y; ωδt]

(Figure: pose P(t) rotated about the ICC at distance R by ωδt to P(t+δt))

x(t) = ½ ∫₀ᵗ [v_r(t') + v_l(t')] cos[θ(t')] dt'
y(t) = ½ ∫₀ᵗ [v_r(t') + v_l(t')] sin[θ(t')] dt'
θ(t) = (1/l) ∫₀ᵗ [v_r(t') − v_l(t')] dt'
4-12
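The forward kinematics above can be turned into a short simulation step. The sketch below is one possible Python rendering under the assumption of constant wheel speeds during the interval δt; the function name and arguments are illustrative only.

```python
import numpy as np

def diff_drive_step(x, y, theta, v_l, v_r, l, dt):
    """Propagate a differential-drive pose by dt given wheel speeds v_l, v_r
    and wheel separation l, by rotating the pose about the ICC."""
    if np.isclose(v_l, v_r):
        # Straight-line motion: the ICC is at infinity.
        v = 0.5 * (v_l + v_r)
        return x + v * dt * np.cos(theta), y + v * dt * np.sin(theta), theta
    omega = (v_r - v_l) / l
    R = (l / 2.0) * (v_l + v_r) / (v_r - v_l)
    icc_x = x - R * np.sin(theta)     # ICC = [x - R sin(theta), y + R cos(theta)]
    icc_y = y + R * np.cos(theta)
    c, s = np.cos(omega * dt), np.sin(omega * dt)
    x_new = c * (x - icc_x) - s * (y - icc_y) + icc_x
    y_new = s * (x - icc_x) + c * (y - icc_y) + icc_y
    return x_new, y_new, theta + omega * dt
```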
Mecanum Wheels
v_y = (v_0 + v_1 + v_2 + v_3) / 4
v_x = (v_0 − v_1 + v_2 − v_3) / 4
v_θ = (v_0 + v_1 − v_2 − v_3) / 4
v_error = (v_0 − v_1 − v_2 + v_3) / 4
4-13
Example
4-14
XR4000 Drive
x(t) = ∫₀ᵗ v(t') cos[θ(t')] dt'
y(t) = ∫₀ᵗ v(t') sin[θ(t')] dt'
θ(t) = ∫₀ᵗ ω(t') dt'

(Figure: robot pose (x, y, θ) with per-wheel velocities v_i(t), steering rates ω_i(t), and the ICC)
4-15
XR4000
[courtesy of Oliver Brock & Oussama Khatib]
4-16
Tracked Vehicle: Urban Robot
4-17
Tracked Vehicle: OmniTread
[courtesy of Johann Borenstein]
4-18
Odometry
4-19
Reasons for Motion Errors
• ideal case
• bump
• different wheel diameters
• carpet
• and many more …
4-20
Coordinate Systems
• In general, the configuration of a robot can be described by six parameters:
three-dimensional Cartesian coordinates plus the three Euler angles roll, pitch, and yaw.
• Throughout this section, we consider robots operating on a planar surface.
The state space of such systems is three-dimensional: (x, y, θ).
4-21
Typical Motion Models
• In practice, one often finds two types of
motion models:
• Odometry-based
• Velocity-based (dead reckoning)
• Odometry-based models are used when
systems are equipped with wheel encoders.
• Velocity-based models have to be applied
when no wheel encoders are available.
• They calculate the new pose based on the
velocities and the time elapsed.
4-22
Example Wheel Encoders
These modules require
+5V and GND to power
them, and provide a 0 to
5V output. They provide
+5V output when they
"see" white, and a 0V
output when they "see"
black.
These disks are
manufactured out of high
quality laminated color
plastic to offer a very crisp
black to white transition.
This enables a wheel
encoder sensor to easily
see the transitions.
Source: http://www.active-robots.com/
4-23
Dead Reckoning
• Derived from “deduced reckoning.”
• Mathematical procedure for
determining the present location of a
vehicle.
• Achieved by calculating the current
pose of the vehicle based on its
velocities and the time elapsed.
4-24
Odometry Model
Robot moves from ⟨x, y, θ⟩ to ⟨x', y', θ'⟩.
Odometry information u = ⟨δ_rot1, δ_rot2, δ_trans⟩.

δ_trans = √((x' − x)² + (y' − y)²)
δ_rot1 = atan2(y' − y, x' − x) − θ
δ_rot2 = θ' − θ − δ_rot1

(Figure: initial pose ⟨x, y, θ⟩, first rotation δ_rot1, translation δ_trans, second rotation δ_rot2, final pose ⟨x', y', θ'⟩)
The atan2 Function
Extends the inverse tangent and correctly
copes with the signs of x and y.
4-26
Noise Model for Odometry
• The measured motion is given by the
true motion corrupted with noise.
δ̂_rot1 = δ_rot1 + ε_{α1|δ_rot1| + α2|δ_trans|}
δ̂_trans = δ_trans + ε_{α3|δ_trans| + α4(|δ_rot1| + |δ_rot2|)}
δ̂_rot2 = δ_rot2 + ε_{α1|δ_rot2| + α2|δ_trans|}
Typical Distributions for
Probabilistic Motion Models
Normal distribution:

ε_{σ²}(x) = 1/√(2πσ²) · exp(−x² / (2σ²))

Triangular distribution:

ε_{σ²}(x) = 0 if |x| > √(6σ²), otherwise (√(6σ²) − |x|) / (6σ²)
Calculating the Probability
(zero-centered)
• For a normal distribution
1. Algorithm prob_normal_distribution(a, b):
2. return 1/√(2πb) · exp(−a² / (2b))
• For a triangular distribution
1. Algorithm prob_triangular_distribution(a, b):
2. return max{0, (√(6b) − |a|) / (6b)}
4-29
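A small Python sketch of the two zero-centered densities is given below, assuming the parameter b plays the role of the variance σ² from the previous slide; the function names follow the algorithm headers above.

```python
import math

def prob_normal_distribution(a, b):
    # Zero-centered normal density with variance b, evaluated at a.
    return math.exp(-0.5 * a * a / b) / math.sqrt(2.0 * math.pi * b)

def prob_triangular_distribution(a, b):
    # Zero-centered triangular density with variance b, evaluated at a.
    return max(0.0, (math.sqrt(6.0 * b) - abs(a)) / (6.0 * b))
```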
Calculating the Posterior
Given x, x’, and u
1.
Algorithm motion_model_odometry(x,x’,u)
2.
d trans  ( x ' x ) 2  ( y ' y ) 2
d rot 1  atan2( y' y, x ' x )  q
d rot 2  q 'q  d rot 1
3.
4.
5.
6.
7.
8.
9.
10.
11.
odometry values (u)
dˆtrans  ( x' x) 2  ( y ' y ) 2
values of interest (x,x’)
dˆrot 1  atan2( y' y, x'x) q
dˆrot 2  q 'q  dˆrot 1
p1  prob(d rot1  dˆrot1,1 | dˆrot1 | 2dˆtrans )
p2  prob(d trans  dˆtrans ,3dˆtrans  4 (| dˆrot1 |  | dˆrot2 |))
p  prob(d  dˆ , | dˆ |  dˆ )
3
rot 2
return p1 · p2 · p3
rot 2
1
rot 2
2 trans
4-30
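One possible Python rendering of this algorithm is sketched below. It assumes the odometry reading u is given as the pair of raw odometry poses, that alpha1..alpha4 are noise parameters that would have to be learned, and that prob is one of the zero-centered densities from the earlier slide; these names and the calling convention are illustrative, not prescribed by the slides.

```python
import math

def motion_model_odometry(x, x_prime, u, alpha, prob):
    (x0, y0, th0), (x1, y1, th1) = x, x_prime      # values of interest (x, x')
    (xb0, yb0, thb0), (xb1, yb1, thb1) = u         # odometry pose pair
    a1, a2, a3, a4 = alpha

    # Deltas between the poses of interest.
    d_trans = math.hypot(x1 - x0, y1 - y0)
    d_rot1 = math.atan2(y1 - y0, x1 - x0) - th0
    d_rot2 = th1 - th0 - d_rot1

    # Deltas reported by the odometry.
    dh_trans = math.hypot(xb1 - xb0, yb1 - yb0)
    dh_rot1 = math.atan2(yb1 - yb0, xb1 - xb0) - thb0
    dh_rot2 = thb1 - thb0 - dh_rot1

    p1 = prob(d_rot1 - dh_rot1, a1 * abs(dh_rot1) + a2 * dh_trans)
    p2 = prob(d_trans - dh_trans, a3 * dh_trans + a4 * (abs(dh_rot1) + abs(dh_rot2)))
    p3 = prob(d_rot2 - dh_rot2, a1 * abs(dh_rot2) + a2 * dh_trans)
    return p1 * p2 * p3
```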
Examples
4-31
Application
• Repeated application of the motion model for short movements.
• Typical banana-shaped distributions obtained for the 2D projection of the 3D posterior.

(Figure: p(x | u, x') for two motion commands u starting at pose x')
Sample-based Density Representation
Sample-based Density Representation
4-34
How to Sample from Normal or
Triangular Distributions?
• Sampling from a normal distribution
1. Algorithm sample_normal_distribution(b):
2. return ½ Σ_{i=1..12} rand(−b, b)
• Sampling from a triangular distribution
1. Algorithm sample_triangular_distribution(b):
2. return (√6 / 2) · [rand(−b, b) + rand(−b, b)]
4-35
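The two samplers translate directly into Python; in this sketch b is treated as the scale (standard deviation) of the resulting zero-mean noise, which is one common reading of the parameter.

```python
import math
import random

def sample_normal_distribution(b):
    # Sum of 12 uniform samples approximates a zero-mean Gaussian with std b.
    return 0.5 * sum(random.uniform(-b, b) for _ in range(12))

def sample_triangular_distribution(b):
    # Sum of two uniforms is triangular; the factor rescales it to std b.
    return (math.sqrt(6.0) / 2.0) * (random.uniform(-b, b) + random.uniform(-b, b))
```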
Normally Distributed Samples
10⁶ samples
4-36
For Triangular Distribution
10³ samples
10⁴ samples
10⁵ samples
10⁶ samples
4-37
Sample Odometry Motion Model
1. Algorithm sample_motion_model(u, x):
   with u = ⟨δ_rot1, δ_rot2, δ_trans⟩ and x = ⟨x, y, θ⟩
2. δ̂_rot1 = δ_rot1 + sample(α1|δ_rot1| + α2 δ_trans)
3. δ̂_trans = δ_trans + sample(α3 δ_trans + α4(|δ_rot1| + |δ_rot2|))
4. δ̂_rot2 = δ_rot2 + sample(α1|δ_rot2| + α2 δ_trans)
5. x' = x + δ̂_trans cos(θ + δ̂_rot1)
6. y' = y + δ̂_trans sin(θ + δ̂_rot1)
7. θ' = θ + δ̂_rot1 + δ̂_rot2
8. Return ⟨x', y', θ'⟩

Here sample(·) is, e.g., sample_normal_distribution.
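Below is a possible Python sketch of this sampling algorithm, reusing the samplers from the previous slide; alpha1..alpha4 are assumed noise parameters and sample can be, e.g., sample_normal_distribution.

```python
import math

def sample_motion_model(u, x, alpha, sample):
    d_rot1, d_rot2, d_trans = u
    x0, y0, th0 = x
    a1, a2, a3, a4 = alpha

    # Corrupt the measured odometry deltas with sampled noise.
    dh_rot1 = d_rot1 + sample(a1 * abs(d_rot1) + a2 * d_trans)
    dh_trans = d_trans + sample(a3 * d_trans + a4 * (abs(d_rot1) + abs(d_rot2)))
    dh_rot2 = d_rot2 + sample(a1 * abs(d_rot2) + a2 * d_trans)

    # Apply the noisy rotation-translation-rotation to the pose.
    x1 = x0 + dh_trans * math.cos(th0 + dh_rot1)
    y1 = y0 + dh_trans * math.sin(th0 + dh_rot1)
    th1 = th0 + dh_rot1 + dh_rot2
    return x1, y1, th1
```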
Sampling from Our Motion
Model
Start
Examples
Map-Consistent Motion Model
(Figures: samples from p(x | u, x') and from p(x | u, x', m))

Approximation:

p(x | u, x', m) ≈ η · p(x | m) · p(x | u, x')
Summary
• We discussed motion models for odometry-based
motion (see book for velocity-based systems)
• We discussed ways to calculate the posterior
probability p(x| x’, u).
• We also described how to sample from p(x| x’, u).
• Typically the calculations are done in fixed time
intervals Δt.
• In practice, the parameters of the models have to
be learned.
• We also discussed an extended motion model that
takes the map into account.
4-42
Probabilistic Sensor Models
4-43
Bayes Filters
Sensing: p(z | x)
Bel(x_t) = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t−1}) Bel(x_{t−1}) dx_{t−1}
4-44
Sensors for Mobile Robots
• Contact sensors: Bumpers, Whiskers
• Internal sensors
  • Accelerometers (spring-mounted masses)
  • Gyroscopes (spinning mass, laser light)
  • Compasses, inclinometers (earth magnetic field, gravity)
• Proximity sensors
  • Sonar (time of flight)
  • Radar (phase and frequency)
  • Laser range-finders (triangulation, tof, phase)
  • Infrared (intensity)
  • Stereo
• Visual sensors: Cameras, Stereo
• Satellite-based sensors: GPS
4-45
Ultrasound Sensors
• Emit an ultrasound signal
• Wait until they receive the echo
• Time of flight sensor
Polaroid 6500
4-46
Time of Flight sensors
(Figure: emitter and object)

d = v · t / 2

v: speed of the signal
t: time elapsed between broadcast of the signal and reception of the echo.
4-47
Properties of Ultrasounds
• Signal profile [Polaroid]
4-48
Sources of Error
• Opening angle
• Crosstalk
• Specular reflection
4-49
Typical Ultrasound Scan
4-50
Laser Range Scanner
4-51
Properties
• High precision
• Wide field of view
• Safety-certified for collision
detection
4-52
Robots Equipped with Laser
Scanners
Zora:
Groundhog:
Herbert:
4-53
Typical Scans
4-54
Proximity Sensors
• The central task is to determine P(z|x), i.e., the probability of a measurement z given that the robot is at position x.
• Question: Where do the probabilities come from?
• Approach: Let's try to explain a measurement.
4-55
Beam-based Sensor Model
• Scan z consists of K measurements: z = {z_1, z_2, ..., z_K}
• Individual measurements are independent given the robot position:

P(z | x, m) = ∏_{k=1..K} P(z_k | x, m)
4-56
Beam-based Sensor Model
P(z | x, m) = ∏_{k=1..K} P(z_k | x, m)
4-57
Typical Measurement Errors of
Range Measurements
1. Beams reflected by
obstacles
2. Beams reflected by
persons / caused
by crosstalk
3. Random
measurements
4. Maximum range
measurements
4-58
Proximity Measurement
• Measurement can be caused by …
  • a known obstacle.
  • cross-talk.
  • an unexpected obstacle (people, furniture, …).
  • missing all obstacles (total reflection, glass, …).
• Noise is due to uncertainty …
  • in measuring distance to a known obstacle.
  • in position of known obstacles.
  • in position of additional obstacles.
  • whether an obstacle is missed.
4-59
Beam-based Proximity Model
Measurement noise (Gaussian around the expected distance z_exp, defined on [0, z_max]):

P_hit(z | x, m) = η · 1/√(2πb) · exp(−½ (z − z_exp)² / b)

Unexpected obstacles (exponential, cut off at the expected distance):

P_unexp(z | x, m) = η λ e^(−λz)  if z < z_exp,  0 otherwise
4-60
Beam-based Proximity Model
Random measurement (uniform over [0, z_max]):

P_rand(z | x, m) = 1 / z_max

Max range (narrow uniform spike of width z_small at z_max):

P_max(z | x, m) = 1 / z_small
4-61
Resulting Mixture Density

P(z | x, m) = (α_hit, α_unexp, α_max, α_rand)ᵀ · (P_hit(z | x, m), P_unexp(z | x, m), P_max(z | x, m), P_rand(z | x, m))

How can we determine the model parameters?
4-62
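The mixture can be evaluated per beam as in the sketch below. The default parameter values (b, lam, z_small) and the omitted normalizers are assumptions; in practice the mixing weights and noise parameters are learned from data, as discussed on the following slides.

```python
import math

def beam_model(z, z_exp, z_max, weights, b=0.05, lam=1.0, z_small=0.01):
    """Mixture likelihood of one range reading z given the expected distance z_exp."""
    w_hit, w_unexp, w_max, w_rand = weights
    p_hit = math.exp(-0.5 * (z - z_exp) ** 2 / b) / math.sqrt(2.0 * math.pi * b)
    p_unexp = lam * math.exp(-lam * z) if z < z_exp else 0.0
    p_rand = 1.0 / z_max
    p_max = 1.0 / z_small if z >= z_max else 0.0
    return w_hit * p_hit + w_unexp * p_unexp + w_max * p_max + w_rand * p_rand
```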
Raw Sensor Data
Measured distances for expected distance of 300 cm.
Sonar
Laser
4-63
Approximation
• Maximize the log likelihood of the data P(z | z_exp).
• Search the space of n−1 parameters:
  • Hill climbing
  • Gradient descent
  • Genetic algorithms
  • …
• Deterministically compute the n-th parameter to satisfy the normalization constraint.
4-64
Approximation Results
Laser
Sonar
300cm
400cm
4-65
Example
z
P(z|x,m)
4-66
Discrete Model of Proximity Sensors
• Instead of densities, consider discrete steps along the sensor beam.
• Consider dependencies between different cases.

(Figures: discrete measurement cases for the laser sensor and the sonar sensor)
4-67
Approximation Results
Laser
Sonar
4-68
Influence of Angle to Obstacle
"sonar-0"
0.25
0.2
0.15
0.1
0.05
0
0
10 20
30
20
30 40
10
50 60
70 0
40
50
60
70
4-69
Influence of Angle to Obstacle
"sonar-1"
0.3
0.25
0.2
0.15
0.1
0.05
0
0
10 20
30
20
30 40
10
50 60
70 0
40
50
60
70
4-70
Influence of Angle to Obstacle
"sonar-2"
0.3
0.25
0.2
0.15
0.1
0.05
0
0
10 20
30
20
30 40
10
50 60
70 0
40
50
60
70
4-71
Influence of Angle to Obstacle
"sonar-3"
0.25
0.2
0.15
0.1
0.05
0
0
10 20
30
20
30 40
10
50 60
70 0
40
50
60
70
4-72
Summary Beam-based Model
• Assumes independence between beams.
• Justification?
• Overconfident!
• Models physical causes for measurements.
• Mixture of densities for these causes.
• Assumes independence between causes. Problem?
• Implementation
• Learn parameters based on real data.
• Different models should be learned for different angles at
which the sensor beam hits the obstacle.
• Determine expected distances by ray-tracing.
• Expected distances can be pre-processed.
4-73
Scan-based Model
• Beam-based model is …
• not smooth for small obstacles and at
edges.
• not very efficient.
• Idea: Instead of following along the
beam, just check the end point.
4-74
Scan-based Model
• Probability is a mixture of …
• a Gaussian distribution with mean at
distance to closest obstacle,
• a uniform distribution for random
measurements, and
• a small uniform distribution for max
range measurements.
• Again, independence between
different components is assumed.
4-75
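A sketch of this end-point ("likelihood field") evaluation is given below. The helper dist_to_closest_obstacle, the mixing weights, and sigma are assumptions; the distance lookup would normally come from a precomputed distance transform of the map.

```python
import math

def likelihood_field_model(z, z_max, x, y, theta, beam_angle,
                           dist_to_closest_obstacle,
                           w_hit=0.8, w_rand=0.15, w_max=0.05, sigma=0.1):
    if z >= z_max:
        return w_max            # max-range readings get their own (uniform) mass
    # Project the beam end point into the map frame.
    ex = x + z * math.cos(theta + beam_angle)
    ey = y + z * math.sin(theta + beam_angle)
    d = dist_to_closest_obstacle(ex, ey)
    # Gaussian in the distance to the closest obstacle, plus a random term.
    p_hit = math.exp(-0.5 * d * d / sigma ** 2) / (sigma * math.sqrt(2.0 * math.pi))
    return w_hit * p_hit + w_rand * (1.0 / z_max)
```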
Example
Likelihood field
Map m
P(z|x,m)
4-76
San Jose Tech Museum
Occupancy grid map
Likelihood field
4-77
Scan Matching
• Extract the likelihood field from one scan and
use it to match a different scan.
4-78
Scan Matching
• Extract likelihood field from first scan
and use it to match second scan.
~0.01 sec
4-79
Properties of Scan-based Model
• Highly efficient, uses 2D tables only.
• Smooth w.r.t. small changes in robot
position.
• Allows gradient descent, scan matching.
• Ignores physical properties of beams.
• Will it work for ultrasound sensors?
4-80
Landmarks
• Active beacons (e.g., radio, GPS)
• Passive (e.g., visual, retro-reflective)
• Standard approach is triangulation
• Sensor provides
• distance, or
• bearing, or
• distance and bearing.
4-81
Distance and Bearing
4-82
Probabilistic Model
1. Algorithm landmark_detection_model(z, x, m):
   z = ⟨i, d, α⟩, x = ⟨x, y, θ⟩
2. d̂ = √((m_x(i) − x)² + (m_y(i) − y)²)
3. α̂ = atan2(m_y(i) − y, m_x(i) − x) − θ
4. p_det = prob(d̂ − d, ε_d) · prob(α̂ − α, ε_α)
5. Return z_det · p_det + z_fp · P_uniform(z | x, m)
4-83
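A possible Python sketch of the landmark model follows. The map m is assumed to map landmark ids to coordinates, prob is one of the zero-centered densities from the motion-model slides, and eps_d, eps_alpha, z_det, z_fp, z_range are parameters that would have to be chosen or learned.

```python
import math

def landmark_detection_model(z, x, m, eps_d, eps_alpha, z_det, z_fp, prob, z_range=10.0):
    i, d, alpha = z                         # landmark id, measured range and bearing
    rx, ry, rtheta = x
    mx, my = m[i]

    d_hat = math.hypot(mx - rx, my - ry)
    alpha_hat = math.atan2(my - ry, mx - rx) - rtheta

    p_det = prob(d_hat - d, eps_d) * prob(alpha_hat - alpha, eps_alpha)
    p_uniform = 1.0 / (z_range * 2.0 * math.pi)   # uniform over range x bearing (assumed)
    return z_det * p_det + z_fp * p_uniform
```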
Distributions
4-84
Distances Only
No Uncertainty

Given landmarks P1 = (0, 0) and P2 = (a, 0) and measured distances d1, d2:

x = (a² + d1² − d2²) / (2a)
y = ± √(d1² − x²)

(Figure: landmarks P1, P2 separated by a, ranges d1, d2, and the solution X')
4-85
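For this special case the position follows in closed form; the sketch below returns both mirror-symmetric solutions and assumes the landmark layout of the slide (P1 at the origin, P2 at (a, 0)).

```python
import math

def locate_from_distances(a, d1, d2):
    x = (a * a + d1 * d1 - d2 * d2) / (2.0 * a)
    y_sq = d1 * d1 - x * x
    if y_sq < 0.0:
        raise ValueError("distances are inconsistent with the landmark separation")
    y = math.sqrt(y_sq)
    return (x, y), (x, -y)
```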
Bearings Only
No Uncertainty

(Figure: three landmarks P1, P2, P3 observed at ranges z1, z2, z3; γ is the bearing difference between P1 and P2, β between P2 and P3; D1, D2, D3 are the distances between the landmarks)

The known landmark separations D1, D2, D3 and the measured bearing differences γ, β give three equations for the three unknown ranges z1, z2, z3.

Law of cosines:

D1² = z1² + z2² − 2 z1 z2 cos(γ)
D2² = z2² + z3² − 2 z2 z3 cos(β)
D3² = z1² + z3² − 2 z1 z3 cos(γ + β)
4-86
Bearings Only With Uncertainty
P3
P2
P2
P1
P1
Most approaches attempt to find the mean of the estimate.
4-87
Summary of Sensor Models
• Explicitly modeling uncertainty in sensing is key to
robustness.
• In many cases, good models can be found by the following
approach:
1. Determine parametric model of noise free measurement.
2. Analyze sources of noise.
3. Add adequate noise to the parameters (possibly mixing in densities
for noise).
4. Learn (and verify) parameters by fitting model to data.
5. Likelihood of measurement is given by “probabilistically
comparing” the actual with the expected measurement.
• This holds for motion models as well.
• It is extremely important to be aware of the underlying
assumptions!
4-88