Sensors in Robotics
Li Guan (lguan@cs.unc.edu)
Fall ’06 COMP 790-072 Robotics, Computer Science Dept., UNC-Chapel Hill

(Title-slide figures from Roland Siegwart: sensors for mobile robotics, feature extraction; the Savannah River Site nuclear surveillance robot.)
Classification of Sensors

What to measure:
- Proprioceptive sensors measure values internal to the system (robot), e.g. motor speed, wheel load, heading of the robot, battery status.
- Exteroceptive sensors acquire information from the robot's environment, e.g. distances to objects, intensity of the ambient light, unique features.
How to measure:
- Passive sensors measure energy coming from the environment.
- Active sensors emit their own energy and measure the reaction: better performance, but some influence on the environment.
Outline
- Recent Vision Sensors
- Sensor Fusion Framework
- Multiple Sensor Cooperation
A Taxonomy
(Figure from Marc Pollefeys, COMP790-089 3D Photography.)
A Taxonomy (cont.)
(Figure from Marc Pollefeys, COMP790-089 3D Photography.)
Projector as camera
Multi-Stripe Triangulation
- To go faster, project multiple stripes.
- But which stripe is which?
- Answer #1: assume surface continuity, e.g. Eyetronics' ShapeCam.
Multi-Stripe Triangulation
- To go faster, project multiple stripes.
- But which stripe is which?
- Answer #2: colored stripes (or dots).
Multi-Stripe Triangulation
- To go faster, project multiple stripes.
- But which stripe is which?
- Answer #3: time-coded stripes.
Time-Coded Light Patterns
- Assign each stripe a unique illumination code over time [Posdamer 82].
(Figure: the projected binary patterns, laid out along time and space axes.)
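To make the idea concrete, here is a minimal Python sketch of time-coded stripe patterns. It uses a binary-reflected Gray code, a common choice for such codes ([Posdamer 82] used plain binary); the encoding, function names, and stripe count are illustrative assumptions, not taken from the slides.

    import numpy as np

    def gray_code_patterns(num_stripes: int) -> np.ndarray:
        """pattern[f, s] is True iff stripe s is lit in projected frame f."""
        num_frames = int(np.ceil(np.log2(num_stripes)))
        codes = np.arange(num_stripes)
        gray = codes ^ (codes >> 1)                # binary-reflected Gray code
        frames = [(gray >> f) & 1 for f in range(num_frames)]
        return np.array(frames, dtype=bool)

    def decode_stripe(bits_over_time) -> int:
        """Recover a stripe index from the on/off sequence seen at one pixel."""
        g = sum(int(b) << f for f, b in enumerate(bits_over_time))
        b, shift = g, 1
        while (g >> shift) > 0:                    # invert the Gray code
            b ^= g >> shift
            shift += 1
        return b

    patterns = gray_code_patterns(64)              # 6 frames encode 64 stripes
    assert decode_stripe(patterns[:, 37]) == 37    # the pixel decodes back to 37

Each pixel's on/off history across the frames identifies which stripe illuminated it, which is what makes the stripe-to-stripe correspondence unambiguous.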
Direct 3D Depth Sensor
- Basic idea: send out a pulse of light (usually laser) and time how long it takes to return:

  $d = \frac{1}{2}\, c\, t$

- Pulsed laser: measure the elapsed time directly, resolving picoseconds.
- Phase shift measurement to produce a range estimate.
- Energy integration.
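A tiny worked example of the relation above (the echo time is an illustrative value, not from the slides):

    # Worked example of d = (1/2) * c * t for a pulsed time-of-flight sensor.
    C = 299_792_458.0                  # speed of light, m/s

    def pulse_distance(round_trip_s: float) -> float:
        """Distance implied by the round-trip time of a light pulse."""
        return 0.5 * C * round_trip_s

    print(pulse_distance(66.7e-9))     # a ~66.7 ns echo -> roughly 10 m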
Pulsed Time of Flight
- Advantages:
  - Large working volume (up to 100 m).
- Disadvantages:
  - Not-so-great accuracy (at best ~5 mm).
    - Requires getting timing to ~30 picoseconds.
    - Does not scale with working volume.
- Often used for scanning buildings, rooms, archeological sites, etc.
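As a back-of-the-envelope check connecting those two figures (my arithmetic, not from the slides): a range error $\Delta d$ corresponds to a round-trip timing error

$$\Delta t = \frac{2\,\Delta d}{c} = \frac{2 \times 0.005\,\mathrm{m}}{3 \times 10^{8}\,\mathrm{m/s}} \approx 33\,\mathrm{ps},$$

so ~5 mm accuracy indeed demands timing at the ~30 ps level.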
Phase Shift Measurement
Phase Shift Measurement (cont.)
Note the ambiguity in the measured phase!
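A minimal sketch of how a measured phase maps to range and why it is ambiguous, assuming sinusoidal amplitude modulation at a single frequency; the 10 MHz value and the function name are illustrative choices, not from the slides.

    import math

    C = 299_792_458.0                    # speed of light, m/s
    F_MOD = 10e6                         # assumed modulation frequency, Hz

    def range_from_phase(phase_rad: float) -> float:
        """Range implied by a measured phase shift in [0, 2*pi)."""
        return C * phase_rad / (4 * math.pi * F_MOD)

    ambiguity = C / (2 * F_MOD)          # ~15 m: targets this far apart
    print(ambiguity)                     # produce the same measured phase
    print(range_from_phase(math.pi))     # pi of phase -> ~7.5 m (or 22.5 m, ...)

Because the phase wraps at $2\pi$, every range a multiple of the ambiguity interval away yields the same reading; practical systems resolve this with multiple modulation frequencies or coarse prior knowledge.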
Direct Integration: Canesta 3D Camera
- 2D array of time-of-flight sensors.
- Jitter is too big on a single measurement, but averages out over many (10,000 measurements → 100× improvement).
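The 100× figure is just the $1/\sqrt{N}$ behavior of averaging uncorrelated noise, since $\sqrt{10{,}000} = 100$. A small simulation sketch (all values illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    true_depth = 2.0                     # meters
    jitter_std = 0.05                    # per-measurement noise, 5 cm

    single = true_depth + rng.normal(0.0, jitter_std)
    averaged = np.mean(true_depth + rng.normal(0.0, jitter_std, size=10_000))

    print(abs(single - true_depth))      # error on the order of centimeters
    print(abs(averaged - true_depth))    # error on the order of half a millimeter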
Other Vision Sensors
- Omni-directional Camera
Other Vision Sensors (cont.)
- Depth from Focus/Defocus
Outline
- Recent Vision Sensors
- Sensor Fusion Framework
- Multiple Sensor Cooperation
Sensor Errors
- Systematic errors (deterministic errors):
  - caused by factors that can (in theory) be modeled, and hence predicted;
  - e.g. calibration of a laser sensor, or of the distortion caused by the optics of a camera.
- Random errors (non-deterministic errors):
  - no prediction possible;
  - however, they can be described probabilistically;
  - e.g. hue instability of a camera, black-level noise of a camera.
Probabilistic Sensor Fusion
- Given the sensor models
  $P_{S_1}(\mathrm{Output}_1 \mid \mathrm{Input}),\ P_{S_2}(\mathrm{Output}_2 \mid \mathrm{Input}),\ P_{S_3}(\mathrm{Output}_3 \mid \mathrm{Input}),\ \ldots$
- Bayesian inference:

$$P(\mathrm{Input} \mid \mathrm{Output}_1, \ldots, \mathrm{Output}_n) = \frac{\prod_{i=1}^{n} P_{S_i}(\mathrm{Output}_i \mid \mathrm{Input})}{\sum_{k=1}^{|\mathcal{S}|} \prod_{i=1}^{n} P_{S_i}(\mathrm{Output}_i \mid \mathrm{Input}_k)}$$

  where $\mathrm{Input}_k \in \mathcal{S}$, the input status space, and $k = 1, \ldots, |\mathcal{S}|$.
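A direct transcription of this formula into Python over a discrete input space. The two-state space, likelihood tables, and sensor readings are illustrative; note the formula as written implies a uniform prior over the input states.

    import numpy as np

    states = ["free", "occupied"]            # the Input status space

    # One table per sensor: row = input state, column = sensor output value.
    sensor_models = [
        np.array([[0.9, 0.1],                # P_S1(output | free)
                  [0.2, 0.8]]),              # P_S1(output | occupied)
        np.array([[0.7, 0.3],
                  [0.4, 0.6]]),
    ]
    outputs = [1, 1]                         # each sensor reported output 1

    def fuse(models, outputs):
        """P(Input | Output_1, ..., Output_n) over the discrete state space."""
        posterior = np.ones(len(states))
        for model, out in zip(models, outputs):
            posterior *= model[:, out]       # numerator: product of likelihoods
        return posterior / posterior.sum()   # denominator: sum over Input_k

    print(dict(zip(states, fuse(sensor_models, outputs))))
    # -> belief of roughly 0.94 that the state is "occupied"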
Sensor Fusion Example: Probabilistic Visual Hull
Jean-Sebastien Franco et al., ICCV '05
- Multiple camera sensors
- Inward looking
- Reconstruct the environment
(Figures from http://graphics.csail.mit.edu/~wojciech/vh/reduction.html)
Fusion of Multi-View Silhouette Cues Using a Space Occupancy Grid (ICCV '05)
- Unreliable silhouettes: do not make a hard decision about their location.
- Instead, do sensor fusion: use all image information simultaneously.
Bayesian Formulation
- Idea: we wish to find the content of the scene from images, as a probability grid.
- Modeling the forward problem (explaining image observations given the grid state) is easy. It can be accounted for in a sensor model.
- Bayesian inference enables the formulation of our initial inverse problem from the sensor model.
- Simplification for tractability: independent analysis and processing of voxels.
Modeling
- Sensor model:

$$P(I \mid G_X, \ldots) = \sum_{F} P(I \mid F, B, \ldots)\, P(F \mid G_X, \ldots)$$

  where
  - $I$: color information in the images
  - $B$: background color models
  - $F$: silhouette detection variable (0 or 1), hidden
  - $G_X$: occupancy at voxel $X$ (0 or 1)
- Inference over the grid:

$$P(G_X \mid I, \ldots) = \frac{\prod_{\mathrm{img,pixel}} P(I \mid G_X, \ldots)}{\sum_{G_X} \prod_{\mathrm{img,pixel}} P(I \mid G_X, \ldots)}$$
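A per-voxel sketch of this inference in Python, with voxels processed independently as in the tractability simplification above. The per-view likelihood numbers are illustrative stand-ins for the full $P(I \mid F, B)$ color model of the paper.

    import numpy as np

    def voxel_occupancy(view_likelihoods):
        """view_likelihoods: one (P(I | G_X=0), P(I | G_X=1)) pair per view,
        for the pixel that voxel X projects to in that view."""
        p = np.ones(2)
        for lik in view_likelihoods:
            p *= lik                       # product over images/pixels
        return p[1] / p.sum()              # normalize over G_X in {0, 1}

    # A voxel that looks like silhouette in three views, ambiguous in one:
    print(voxel_occupancy([(0.1, 0.9), (0.2, 0.8), (0.1, 0.9), (0.5, 0.5)]))
    # -> P(G_X = 1 | I) of about 0.997

Because no view ever casts a hard yes/no vote, one ambiguous or occluded view lowers confidence smoothly instead of carving the voxel away, which is the point of fusing silhouette cues probabilistically.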
Visualization
Further, We Can Infer Occlusion
- Foreground object inference is robust to partial occlusion by static occluders.
- This enables detection of discrepancies between the foreground volume and where its silhouette is actually observed.
- Example: Old Well dataset with 9 cameras, frame #118, voxels > 90%.
Occlusion Inference Example
9 views, 30 fps, 720×480, calibrated, about 1.2 min.
Current Result
Binary occluder (demo video).
Other References
- M. A. Abidi and R. C. Gonzalez, Data Fusion in Robotics and Machine Intelligence, Academic Press, 1992.
- P. K. Allen, Robotic Object Recognition Using Vision and Touch, Kluwer Academic Publishers, 1987.
- A. I. Hernandez, G. Carrault, F. Mora, L. Thoraval, G. Passariello, and J. M. Schleich, "Multisensor fusion for atrial and ventricular activity detection in coronary care monitoring," IEEE Transactions on Biomedical Engineering, vol. 46, no. 10, pp. 1186–1190, 1999.
- A. Hernandez, O. Basset, I. Magnin, A. Bremond, and G. Gimenez, "Fusion of ultrasonic and radiographic images of the breast," in Proc. IEEE Ultrasonics Symposium, pp. 1437–1440, San Antonio, TX, USA, 1996.
Outline
- Recent Vision Sensors
- Sensor Fusion Framework
- Multiple Sensor Cooperation
Sensor Communication
- Different types of sensors/drivers:
  - image sensors: camera, MRI, radar, ...
  - sound sensors: microphones, hydrophones, seismic sensors
  - temperature sensors: thermometers
  - motion sensors: radar gun, speedometer, tachometer, odometer, turn coordinator
  - ...
- Sensor data transmission:
  - size
  - format
  - frequency
- SensorTalk (Honda Research Institute) '05
A Counterpart: RoboTalk
(Figures: a mobile robot with pan-tilt camera, copyright Lucasfilm Ltd., and the Honda ASIMO humanoid robot.)
Allen Y. Yang, Hector Gonzalez-Banos, Victor Ng-Thow-Hing, James Davis, "RoboTalk: controlling arms, bases and androids through a single motion interface," IEEE Int. Conf. on Advanced Robotics (ICAR), 2005.
Robot? Sensor?
- A PTZ (Pan/Tilt/Zoom) camera: movable on its horizontal (pan), vertical (tilt), and focal-length (zoom) axes.
- The Mars rover: a specialized sensing robot.
Why not just SensorTalk/RoboTalk?
- Robot:
  - QoS: high
  - Throughput: low
- Sensor:
  - QoS: low
  - Throughput: may be huge!
Objective of SensorTalk
- Variety of sensors:
  - different requirements (output frequency)
  - different input/output
- High re-usability of driver and application code (cross-platform)
- Multi-user access to the sensor
- Build sensors from simpler sensors
- Work together with RoboTalk:
  - think of a sensor as a robot: a pan-tilt-zoom camera
  - think of a robot as a sensor: NASA Mars Exploration Rover, ASIMO, ...
Objective
- A communication tool:
  - coordinate different types of sensors
  - facilitate different types of applications
- A protocol:
  - a set of rules for writing the drivers & applications
  - a set of methods to support multiple clients (e.g. write-locking)
  - a set of modes to transmit output data
Basic Idea
- A model of a sensor
Model of a Sensor
- A service with parameters:
  - static parameters (input signal, output signal)
  - tunable parameters
- A client can query all parameters.
- A client can change tunable parameters that are not locked.
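A minimal Python sketch of this sensor model; the class, method names, and locking behavior are hypothetical illustrations under the stated assumptions, not the actual SensorTalk API.

    class SensorService:
        def __init__(self, static_params, tunable_params):
            self.static = dict(static_params)     # e.g. input/output signal
            self.tunable = dict(tunable_params)   # e.g. gain, frequency
            self.locks = {}                       # parameter name -> client ID

        def query(self):
            """Any client may read every parameter."""
            return {**self.static, **self.tunable}

        def lock(self, client_id, name):
            """First client to ask holds the write lock on a parameter."""
            self.locks.setdefault(name, client_id)

        def set_param(self, client_id, name, value):
            """Change a tunable parameter unless another client has locked it."""
            if name not in self.tunable:
                raise KeyError(f"{name} is not tunable")
            holder = self.locks.get(name)
            if holder is not None and holder != client_id:
                raise PermissionError(f"{name} is locked by client {holder}")
            self.tunable[name] = value

    heat = SensorService({"output_signal": "double"},
                         {"unit": "K", "frequency_hz": 10.0})
    heat.lock("A", "frequency_hz")
    heat.set_param("A", "frequency_hz", 25.0)     # allowed: "A" holds the lock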
Example #1: Heat Sensor
- Parameters:
  - output format (integer, double)
  - output value unit (Kelvin, °C)
  - gain
  - publishing frequency (1 Hz ~ 29.99 Hz)
  - resolution of output value
  - ...
Example #2: Camera
- Parameters:
  - output format (RGB, JPG)
  - image resolution (1024×768 pixels)
  - projection matrix (3×4 double matrix)
  - focal length
  - radial distortion correction map (1024×768×2 double array)
  - publishing frequency (1 Hz ~ 100 Hz)
  - ...
Example #3: Visual Hull Sensor
- Parameters:
  - number of camera views
  - parameters related to each camera
  - projection matrix of every view
  - output format
  - volume resolution
  - publishing frequency (1 Hz ~ 60 Hz)
  - ...
SensorTalk Design
- Serve multiple users:
  - one base frequency
  - per-client choice of transmission mode (sketched below):
    - DIRECT MODE
    - CONTINUOUS MODE
    - BATCH MODE
  - per-client publishing rate
  - per-client frame compression
  - locking of parameters
  - Read Output Frame / Stop Read Output Frame
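A sketch of the three transmission modes as a client-side enum. The DIRECT and CONTINUOUS semantics follow the scenario slides below; the BATCH comment is my reading, and the names and values are illustrative rather than the actual wire format.

    from enum import Enum

    class TransmitMode(Enum):
        DIRECT = 1       # client pulls a single frame per request
        CONTINUOUS = 2   # server pushes frames at the client's requested rate
        BATCH = 3        # server buffers frames and delivers them in blocks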
SensorTalk Scenario
- Server and client come up.
- Client: Subscribe. Server: create a client structure, return the client ID.
- Client: Ask for description. Server: return the description.
- Client: Control parameter "A". Server: call the function to change "A", return the new "A".
SensorTalk Scenario (cont.)
- Client: Get 1 frame (DIRECT). Server: get 1 frame from the driver, return the frame. Client: process the frame.
- Client: Get frames (CONTINUOUS). Server: repeatedly get 1 frame from the driver and return it.
SensorTalk Scenario (cont.)
- Client: Stop stream (CONTINUOUS). Server: stop getting frames, return SUCCESS.
- Client: Release. Server: delete the client structure with that ID.
- Client: Disconnect, close program. Server: wait for other connections.
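The whole scenario, transcribed as a runnable message trace in Python. The strings mirror the slides; the trace format itself is illustrative, not the actual SensorTalk wire protocol.

    SESSION = [
        ("client", "Subscribe"),
        ("server", "Create client structure, return client ID"),
        ("client", "Ask for description"),
        ("server", "Return description"),
        ("client", 'Control parameter "A"'),
        ("server", 'Change "A", return new "A"'),
        ("client", "Get 1 frame (DIRECT)"),
        ("server", "Return the frame"),
        ("client", "Get frames (CONTINUOUS)"),
        ("server", "Return frames until told to stop"),
        ("client", "Stop stream"),
        ("server", "Return SUCCESS"),
        ("client", "Release"),
        ("server", "Delete client structure"),
        ("client", "Disconnect"),
        ("server", "Wait for other connections"),
    ]

    for side, message in SESSION:
        print(f"{side:>6}: {message}")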
Demo
- 2 virtual cameras
- 1 "Visual Hull" sensor
- Dataset from http://www.mpi-sb.mpg.de/departments/irg3/kungfu/
- A demo video
Conclusion
- Recent Vision Sensors
- Sensor Fusion Framework
  - more in SLAM
- Multiple Sensor Cooperation
  - more in multiple-robot coordination

1st Summer School on Perception and Sensor Fusion in Mobile Robotics, September 11~16, 2006, Fermo, Italy: http://psfmr.univpm.it/2005/material.htm

Thanks, any questions?