Lecture 9:
Behavior Languages
CS 344R: Robotics
Benjamin Kuipers
Alternative Approaches To
Sequencers
• Roger Brockett, MDL
– Hristu-Varsakelis & Andersson, MDLe.
• Jim Firby, RAPS
• … there are others …
• The right answer is not completely clear.
Motion Description Languages
• Problem: Describe continuous motion in a
complex environment as a finite set of
symbolic elements.
– Applicability = sequencing
– Termination = condition or time-out.
• Roger Brockett defined MDL.
– Extended to MDLe by Manikonda,
Krishnaprasad, and Hendler.
This is an instance of
our framework for control laws
• A local control law is a triple ⟨A, Hi, Ωi⟩:
– Applicability predicate A(y)
– Control policy u = Hi(y)
– Termination predicate Ωi(y)
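As a concrete illustration (not from the lecture), the triple can be written directly as three functions of the observation y. The "go to goal" law, the observation field names, and the thresholds below are assumptions for the sketch:

from dataclasses import dataclass
from typing import Callable

@dataclass
class LocalControlLaw:
    applicable: Callable      # A(y): is the law applicable in the current situation?
    policy: Callable          # u = H(y): map observations to a motor command
    terminated: Callable      # Omega(y): has the law run to completion (or failed)?

# Illustrative instance: drive toward a goal while it is visible, stop when close.
go_to_goal = LocalControlLaw(
    applicable=lambda y: y["goal_visible"],
    policy=lambda y: (0.3, 0.5 * y["goal_bearing"]),   # (v, omega) command
    terminated=lambda y: y["goal_range"] < 0.1,
)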
The Kinetic State Machine
• The MDLe state evolution model is:
xÝ f (x)  G(x)U(t, x)
y  h(x)
– This is an instance of our general model xÝ F(x,u)
• There is also:
– a set of timers Ti;
– a set of boolean features ξi(y)
• U(t, x) is a general control law which can be suspended by the timer Ti or the interrupt ξi(y).
The Kinetic State Machine
Q: What is the role of G(x)?
• In the state evolution model
xÝ f (x)  G(x)U(t, x)
y  h(x)
• x is in Rⁿ. The motor vector U(t, x) is in Rᵏ.
• G is an n×k matrix whose columns gi are vector fields in Rⁿ.
– Each column represents the effect on x of one
component of the motor vector.
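To make the role of G(x) concrete, here is a sketch (an assumption, not from the lecture) of ẋ = f(x) + G(x)U for a unicycle-type robot with state x = (x, y, θ), zero drift, and the two columns of G giving the effect of the forward-velocity and turn-rate components of U:

import numpy as np

def f(x):
    # Drift term: a unicycle does not move unless commanded.
    return np.zeros(3)

def G(x):
    # Columns g1, g2 are vector fields on the state (x, y, theta):
    # g1 = effect of forward velocity, g2 = effect of turn rate.
    theta = x[2]
    return np.array([[np.cos(theta), 0.0],
                     [np.sin(theta), 0.0],
                     [0.0,           1.0]])

def h(x):
    # Full-state observation for this sketch: y = x.
    return x

def step(x, U, dt=0.05):
    # One Euler step of  xdot = f(x) + G(x) U.
    return x + dt * (f(x) + G(x) @ U)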
MDL Programs
• The simplest MDL program is an atom
σ = (U, ξ, T)
• To run an atom,
– apply U to the kinetic state machine model,
– until the interrupt function ξ(y) goes false, or
– until T units of time elapse.
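A minimal sketch of how an atom (U, ξ, T) might be executed against the kinetic state machine sketched above; the function signature and the fixed time step are assumptions, not part of MDLe itself:

def run_atom(x, U, xi, T, h, step, dt=0.05):
    # Apply control law U until the interrupt xi(y) goes false or T seconds elapse.
    t = 0.0
    while t < T:
        y = h(x)                  # observations from the current state
        if not xi(y):             # interrupt fired
            break
        x = step(x, U(t, y), dt)  # advance the kinetic state machine
        t += dt
    return x, t                   # final state and time actually used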
Compose Atoms to Behaviors
• Given atoms
σ1 = (U1, ξ1, T1)
σ2 = (U2, ξ2, T2)
• Define the behavior
b = ((σ1, σ2), ξb, Tb)
• Which means to do the atoms sequentially
until the interrupt ξb or the time-out Tb occurs.
• Behaviors nest recursively to make plans.
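Continuing the sketch above (again an assumption about how the semantics could be coded, reusing run_atom), a behavior runs its atoms in order while honoring its own interrupt ξb and time-out Tb:

def run_behavior(x, atoms, xi_b, T_b, h, step, dt=0.05):
    # Run atoms (U, xi, T) in sequence, subject to the behavior-level
    # interrupt xi_b and time-out T_b.  For simplicity, xi_b is checked
    # only between atoms here; MDLe would let it cut an atom short.
    remaining = T_b
    for (U, xi, T) in atoms:
        if remaining <= 0 or not xi_b(h(x)):
            break
        x, used = run_atom(x, U, xi, min(T, remaining), h, step, dt)
        remaining -= used
    return x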
Example Interrupts
• (bumper)
• (wait T)
• (atIsection b)
– b specifies 4 bits: whether an obstacle is required on each side
(front, left, back, right).
– Interrupt occurs when a location of that
structure is detected.
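These interrupts are just boolean functions of the observations. A hedged sketch of how the ones above might look in code; the sensor field names are invented for illustration, and each is written as an "event detected" predicate (MDLe's ξ is the negation, since the atom runs while ξ(y) is true):

def bumper(y):
    return y["bumper_pressed"]

def at_isection(b):
    # b is 4 bits (front, left, back, right): the required obstacle pattern.
    return lambda y: tuple(y["obstacles"]) == tuple(b)

def either(*events):
    # Combine events, e.g. (bumper OR atIsection(b)).
    return lambda y: any(e(y) for e in events)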
Example Atoms
• (Atom interrupt_condition control_law)
• (Atom (wait T) (rotate ω))
• (Atom (bumper OR atIsection(b)) (go v, ω))
• (Atom (wait T) (goAvoid θ, kf, kt))
• (Atom (ri(t)==rj(t)) (align ri rj))
• Select ideas from here for your controllers.
Environment Model
• A graph of local maps.
– We will study local metrical maps later.
– Likewise topological maps.
• Edges in the graph represent behaviors.
• Compact and effective:
– Local metrical maps are reliable.
– Describe geometry only where necessary.
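One way to picture this model: places (local maps) as graph nodes, each directed edge labeled by the behavior that travels it. A minimal sketch with invented place names and behavior labels:

environment = {
    "doorway":   {"hallway": "follow_hall"},
    "hallway":   {"doorway": "enter_door", "lab_bench": "go_to_bench"},
    "lab_bench": {"hallway": "leave_bench"},
}

def plan(route):
    # A plan is the chain of edge behaviors along a route through the place graph.
    return [environment[a][b] for a, b in zip(route, route[1:])]

print(plan(["doorway", "hallway", "lab_bench"]))   # ['follow_hall', 'go_to_bench']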
Experiment
• They built a model of three places in their laboratory.
• They demonstrated MDLe plans for travel between pairs of places.
Limitations
• Simple sequential FSM model.
– No parallelism or combination of control laws.
– No success/failure exits from control laws.
– Much must be packed into the interrupt conditions.
• Limited evaluation:
– No exploration or learning.
– No test of reliability.
Next: Observers
• Probabilistic estimates of the true state,
given the observations.
• Basic concepts:
– Probability distribution; Gaussian model
– Expectations
Estimates and Uncertainty
• Conditional probability density function
Gaussian (Normal) Distribution
• Completely described by N(μ, σ)
– Mean μ
– Standard deviation σ, variance σ²
p(x) = (1/(σ√(2π))) exp(−(x − μ)² / (2σ²))
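A quick numerical check of the density formula (a sketch; the particular μ and σ are arbitrary):

import numpy as np

def gaussian_pdf(x, mu, sigma):
    # N(mu, sigma) density: exp(-(x - mu)^2 / (2 sigma^2)) / (sigma sqrt(2 pi))
    return np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

print(gaussian_pdf(0.0, mu=0.0, sigma=1.0))   # peak of N(0,1) is 1/sqrt(2 pi) ~ 0.3989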
The Central Limit Theorem
• The sum of many random variables
– with the same mean, but
– with arbitrary conditional density functions,
converges to a Gaussian density function.
• If a model omits many small unmodeled
effects, then the resulting error should
converge to a Gaussian density function.
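A small numerical illustration of this claim (a sketch, using uniform noise terms to stand in for the small unmodeled effects):

import numpy as np

rng = np.random.default_rng(0)

# Sum 50 small, decidedly non-Gaussian (uniform) effects in each of 10000 trials.
errors = rng.uniform(-0.1, 0.1, size=(10000, 50)).sum(axis=1)

# A histogram of `errors` is close to a Gaussian with these moments:
print(errors.mean(), errors.std())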
Expectations
• Let x be a random variable.
• The expected value E[x] is the mean:
E[x] = ∫ x p(x) dx = x̄ ≈ (1/N) Σi xi
– The probability-weighted mean of all possible
values. The sample mean approaches it.
• Expected value of a vector x is by component:
E[x] = x̄ = [x̄1, …, x̄n]ᵀ
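Numerically, the sample mean of draws from p(x) approaches E[x], component by component for a vector. A sketch (the true mean [1.0, 3.0] is arbitrary):

import numpy as np

rng = np.random.default_rng(1)
# 100000 samples of a 2-vector with true mean [1.0, 3.0]
x = rng.normal(loc=[1.0, 3.0], scale=[1.0, 2.0], size=(100000, 2))

print(x.mean(axis=0))   # per-component sample mean, close to [1.0, 3.0]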
Variance and Covariance
• The variance is E[(x − E[x])²]
σ² = E[(x − x̄)²] ≈ (1/N) Σi (xi − x̄)²
• Covariance matrix is E[(x − E[x])(x − E[x])ᵀ]
Cij ≈ (1/N) Σk (xik − x̄i)(xjk − x̄j)
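The sample covariance matrix can be computed directly from the formula, or with numpy (a sketch; note that np.cov divides by N−1 unless bias=True, versus N in the formula above):

import numpy as np

rng = np.random.default_rng(2)
samples = rng.normal(size=(100, 2))        # N = 100 samples of a 2-vector (rows)

centered = samples - samples.mean(axis=0)  # subtract the per-component mean x-bar
C = centered.T @ centered / len(samples)   # C_ij = (1/N) sum_k (x_ik - xbar_i)(x_jk - xbar_j)

print(C)
print(np.cov(samples, rowvar=False, bias=True))   # same result via numpy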
Covariance Matrix
• Along the diagonal, Cii are variances.
• Off-diagonal Cij are essentially correlations.
C =
[ C1,1 = σ1²   C1,2         …   C1,N        ]
[ C2,1         C2,2 = σ2²   …               ]
[ ⋮                         ⋱               ]
[ CN,1         …                CN,N = σN²  ]
Independent Variation
• x and y are Gaussian random variables (N = 100 samples).
• Generated with σx = 1, σy = 3.
• Covariance matrix:
Cxy = [ 0.90  0.44 ]
      [ 0.44  8.82 ]
Dependent Variation
• c and d are random variables.
• Generated with c = x + y, d = x − y.
• Covariance matrix:
Ccd = [ 10.62  −7.93 ]
      [ −7.93   8.84 ]