Vision-Guided Humanoid Footstep Planning for Dynamic Environments

P. Michel, J. Chestnutt, J. Kuffner, T. Kanade
Carnegie Mellon University – Robotics Institute
Humanoids 2005
Objective
• The paper presents a vision-based footstep
planning system that computes the best partial
footstep path within a time-limited search
horizon, according to problem-specific cost
metrics and heuristics.
Related Work
• Reliable, stable gait generation and feedback
– Emphasis on pre-generating walking trajectories
– Online trajectory generation
– Dynamic balance
– No accounting for obstacles!
• Little work has focused on developing global
navigation autonomy for biped robots
Related Work
• Obstacle avoidance and local planning based
on visual feedback have been studied in
humans
• Several reactive perception-based obstacle
avoidance techniques for bipeds have been
developed
• Environment mapping; obstacle detection;
color-based segmentation
CMU’s Honda ASIMO Humanoid
Sensing and the Environment
• ASIMO robot
• Global sensing
– Overhead camera computes the position of the
robot, the desired goal location, and obstacles
– All processing is done in real time
Sensing and the Environment: Color Segmentation
• Colored markers
– Bright pink: planar obstacles on the floor
– Light blue: desired goal location
– Yellow and green: identify robot’s location and
orientation
– Dark blue: 4 square delimiters to define a rectangular
area within which the robot operates
• Color segmentation performed directly on YUV
stream generated by camera
– Avoids processing overhead
Sensing and the Environment: Color Segmentation
• Color thresholds obtained by sampling pixel values
offline for each marker
– Produces a series of binary masks indicating the
presence or absence of marker pixels
– Noise eliminated by erosion/dilation
– Connected-components labeling groups the
remaining pixels into blobs
• Moments calculated for each color blob
– Centroid, area, major/minor axes, orientation on the floor
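The moment computation above can be sketched with NumPy. The threshold ranges and array shapes below are illustrative placeholders (the paper samples them offline per marker), and the erosion/dilation noise-removal step is omitted for brevity:

```python
import numpy as np

def segment_and_moments(u, v, u_range, v_range):
    """Threshold a UV chroma pair into a binary mask and compute blob moments.

    u, v: 2-D arrays of chroma values for one frame.
    u_range, v_range: (lo, hi) thresholds sampled offline for one marker
    color (hypothetical values for illustration).
    Returns the blob centroid (row, col), its area, and the orientation of
    its principal axis.
    """
    mask = ((u >= u_range[0]) & (u <= u_range[1]) &
            (v >= v_range[0]) & (v <= v_range[1]))
    area = int(mask.sum())
    rows, cols = np.nonzero(mask)
    cy, cx = rows.mean(), cols.mean()            # centroid from first moments
    # Second central moments give the blob's principal-axis orientation.
    mu20 = ((cols - cx) ** 2).sum()
    mu02 = ((rows - cy) ** 2).sum()
    mu11 = ((cols - cx) * (rows - cy)).sum()
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return (cy, cx), area, theta
```

In the full pipeline this runs once per marker color, so each blob's centroid and orientation are available to the world-coordinate conversion that follows.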
Sensing and the Environment: Converting to World Coordinates
• Assumes the physical distance between the 4
delimiters that outline the robot's walking area
is known
• Scaling converts between the pixel coordinates
of each blob's centroid and the corresponding
real-world distances
• Robot orientation determined from the angle the
line connecting the backpack markers forms with
the horizontal
• Footstep planning requires the precise location of
the robot's feet
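The scaling and orientation steps above reduce to a few lines; a minimal sketch, assuming the pixel bounding box of the four delimiters is known and using illustrative names:

```python
import math

def pixel_to_world(px, py, delimiters_px, area_width_m, area_height_m):
    """Convert a pixel coordinate to world coordinates by uniform scaling.

    delimiters_px: pixel bounding box (x0, y0, x1, y1) spanned by the four
    dark-blue delimiters; the physical width/height of the walking area are
    assumed known, as in the paper.
    """
    x0, y0, x1, y1 = delimiters_px
    wx = (px - x0) / (x1 - x0) * area_width_m
    wy = (py - y0) / (y1 - y0) * area_height_m
    return wx, wy

def robot_orientation(marker_a, marker_b):
    """Heading angle (radians from the horizontal axis) of the line
    connecting the two backpack markers."""
    (x1, y1), (x2, y2) = marker_a, marker_b
    return math.atan2(y2 - y1, x2 - x1)
```

Pure scaling is only valid when the overhead camera looks straight down at a planar floor; a tilted camera would need a full homography instead.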
Sensing and the Environment: Building the Environment Map
• Environment represented as a 2D grid of binary cells
– Each cell's value indicates whether the terrain is
obstacle-free or partially/totally occupied by an obstacle
– Yields a bitmap representation of free space and obstacles
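A minimal sketch of building such a bitmap, assuming circular obstacle footprints for illustration (the paper projects arbitrary planar obstacle blobs onto the grid):

```python
def build_map(width, height, obstacles):
    """Rasterize obstacles into a binary occupancy grid.

    obstacles: list of (cx, cy, r) circles in cell units (an illustrative
    simplification). A cell is marked occupied (1) if its center lies
    inside any obstacle; all other cells are free (0).
    """
    grid = [[0] * width for _ in range(height)]
    for cx, cy, r in obstacles:
        for y in range(height):
            for x in range(width):
                if (x - cx) ** 2 + (y - cy) ** 2 <= r * r:
                    grid[y][x] = 1
    return grid
```

The planner only ever reads this grid, so the map can be rebuilt from fresh vision data on every replanning cycle without touching the planner itself.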
Footstep Planning
• Goal: find a sequence of actions, as close to
optimal as possible, that causes the robot to
reach the goal location while avoiding
obstacles in the environment
Footstep Planning: Basic Algorithm
• Planner algorithm
– Input: environment map E, initial and goal robot
states, a mapping of the possible actions available
in each state, and an action-effect mapping
– Returns: a sequence of footstep actions once a
path to the goal is found
– The planner computes the cost of each candidate
footstep location using 3 metrics:
• A location cost determining whether the candidate
location is “safe” in the environment
• A step cost that prefers ‘easy’ stepping actions
• An estimated cost-to-go approximating the candidate’s
proximity to the goal, computed with a standard
mobile-robot planner
Footstep Planning: Basic Algorithm
• A* search is performed over possible sequences
of walking actions
– until a path is found, OR
– until a specified computation-time limit is exceeded
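The search above can be sketched as a small A* loop. This is a simplification, not the paper's planner: states are grid cells rather than full foot poses, the action set is a hypothetical handful of fixed step displacements, and returning the best partial path on timeout is reduced to returning None:

```python
import heapq
import math

def plan_footsteps(grid, start, goal, max_expansions=10000):
    """A* over a simplified footstep model with the paper's three metrics:
    a location cost (infinite on occupied cells, so they are pruned), a
    per-step cost preferring short 'easy' steps, and a Euclidean
    cost-to-go heuristic standing in for the mobile-robot estimate.
    """
    actions = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, 1), (1, -1)]
    h = lambda s: math.dist(s, goal)
    open_set = [(h(start), 0.0, start, [start])]
    closed = set()
    while open_set and max_expansions > 0:
        _, g, s, path = heapq.heappop(open_set)
        if s == goal:
            return path
        if s in closed:
            continue
        closed.add(s)
        max_expansions -= 1
        for dx, dy in actions:
            nx, ny = s[0] + dx, s[1] + dy
            if not (0 <= ny < len(grid) and 0 <= nx < len(grid[0])):
                continue
            if grid[ny][nx]:                  # location cost: must be safe
                continue
            step = math.hypot(dx, dy)         # step cost: prefer easy steps
            heapq.heappush(open_set,
                           (g + step + h((nx, ny)),   # f = g + cost-to-go
                            g + step, (nx, ny), path + [(nx, ny)]))
    return None  # budget exhausted (the paper returns the best partial path)
```

The expansion budget plays the role of the computation-time limit: when it runs out, the anytime planner in the paper would hand back its best partial plan instead of giving up.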
Footstep Planning: Plan Reuse
• At each step: plan a path toward the goal
– ASIMO takes the first step and then replans for
the next step
– Previous computations are reused via forward
search
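The resulting sense-plan-act loop is simple to express. A sketch with a hypothetical callback interface (the actual system reuses prior search effort between replans, which is omitted here):

```python
def navigate(sense, plan, execute_step, at_goal, max_cycles=100):
    """Closed-loop navigation: sense, plan toward the goal, execute only
    the first step of the plan, then replan from the new state.

    sense() -> (robot_state, world); plan(world, state) -> path or None;
    execute_step(next_state) commands one step; at_goal(state) -> bool.
    All four callbacks are illustrative placeholders.
    """
    for _ in range(max_cycles):
        state, world = sense()
        if at_goal(state):
            return True
        path = plan(world, state)
        if not path or len(path) < 2:
            return False                 # no plan found this cycle
        execute_step(path[1])            # take only the first step, replan
    return False
```

Executing only the first step before replanning is what lets the system absorb obstacle motion: every step is taken against the freshest available map.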
Evaluation: Vision-Planner Integration
Fig. 5. System components. The navigation client gathers updated sensing
data from vision, requests a new footstep plan to be computed, and sends
the commands required to execute the path to the robot.
• Vision server: returns the floor map and the robot and goal positions
(x, y, θ) on request
• Footstep planning server: returns a footstep plan on request
• Navigation client: communicates with the two servers and wirelessly
sends footstep commands of the form walk(x, y, θ) to the robot
Evaluation: Obstacle Avoidance – Unpredictably Moving Obstacles
Discussion
• Approach to autonomous humanoid walking in
the presence of dynamically moving obstacles
– Combines sensing, planning and execution in a
closed loop
• Ongoing work:
– A more realistic estimate of the floor directly
surrounding the robot's feet
– On-body vision to satisfy the real-time constraints
of the sensing loop