Structured Light

--- some recent progress
Bo Fu
University of Kentucky
Paper List
 1. Structured Light 3D Scanning in the Presence of Global Illumination. (CVPR ’11)
    [systematic errors introduced by global illumination]
    Extension: A Practical Approach to 3D Scanning in the Presence of Inter-reflections,
    Subsurface Scattering and Defocus. (IJCV ’12)
 2. Implementing High Resolution Structured Light by Exploiting Projector Blur. (WACV ’12)
    [increases depth resolution]
 3. Vision Processing for Real-time 3D Data Acquisition Based on Coded Structured Light.
    (IEEE Trans. on Image Processing 2008)
    [reduces acquisition time]
Structured Light 3D Scanning in the Presence of
Global Illumination
Mohit Gupta, et al.
Carnegie Mellon University
Introduction
 Motivation
One important assumption of most structured light techniques does
not always hold: scene points receive illumination only directly from the
light source.
Introduction
 Main idea
Design patterns that modulate global illumination and prevent the
errors at capture time itself.
(Example codes: conventional Gray code, maximum min-SW Gray code, XOR-02)
Errors due to Global Illumination
Short-range effects: sub-surface scattering, defocus
Long-range effects: inter-reflection, diffusion
1. Inter-reflection error (example and solution figures)
2. Sub-surface scattering error (example and solution figures)
Error formulation
For a scene point S_i (directly lit under pattern P), considering global illumination but no defocus:
    L_i = L_i^d + β L_i^g          (irradiance under pattern P)
    L̄_i = (1 − β) L_i^g            (irradiance under the inverse pattern P̄)
L_i^d : direct component
L_i^g : global component
β : fraction of the global component received under pattern P
Error formulation
For a scene point S_i (directly lit under pattern P), with global illumination and projector defocus:
    L_i = α L_i^d + β L_i^g
    L̄_i = (1 − α) L_i^d + (1 − β) L_i^g
α : fraction of the direct component received under pattern P (projector defocus)
Error formulation
 Correct binarization of S_i requires:
    L_i > L̄_i   ⟺   α L_i^d + β L_i^g > (1 − α) L_i^d + (1 − β) L_i^g
Note: without global illumination (L_i^g = 0) and without defocus (α = 1), this
condition automatically holds.
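A minimal sketch of this binarization model (all numeric values are hypothetical; this is an illustration of the condition above, not code from the paper):

```python
# Minimal sketch of the binarization model above (hypothetical values, not code
# from the paper): a camera pixel is classified as directly lit under pattern P
# if its intensity under P exceeds its intensity under the inverse pattern.

def intensities(L_d, L_g, alpha, beta):
    """Irradiances under pattern P and inverse pattern P-bar for a scene point
    that is directly lit under P."""
    L = alpha * L_d + beta * L_g                  # under P
    L_bar = (1 - alpha) * L_d + (1 - beta) * L_g  # under P-bar
    return L, L_bar

def binarize(L_d, L_g, alpha, beta):
    """True iff the correct-binarization condition L > L-bar holds."""
    L, L_bar = intensities(L_d, L_g, alpha, beta)
    return L > L_bar

# Ideal case: no global illumination (L_g = 0) and no defocus (alpha = 1):
# the condition holds automatically.
print(binarize(L_d=1.0, L_g=0.0, alpha=1.0, beta=0.5))  # True
```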
Error formulation
 Long-range effects: diffuse and specular inter-reflections
    β ≈ 0 ⟹ L_i ≈ L_i^d and L̄_i ≈ L_i^g; if L_i^d < L_i^g, then L_i < L̄_i (binarization error)
Consequence: low-frequency decode errors. Since the low-frequency patterns correspond to the
higher-order bits, this results in a large error in the recovered shape.
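A tiny, self-contained illustration of this failure mode with hypothetical numbers (β ≈ 0 and a global component that exceeds the direct component):

```python
# Hypothetical illustration of a long-range (inter-reflection) error: for a
# low-frequency pattern, beta ~ 0, and if the global component exceeds the
# direct component the binarization flips.

def binarize(L_d, L_g, alpha, beta):
    return alpha * L_d + beta * L_g > (1 - alpha) * L_d + (1 - beta) * L_g

# The point is directly lit under P (ground truth: True), but L_d < L_g.
print(binarize(L_d=0.3, L_g=0.7, alpha=1.0, beta=0.0))  # False -> decode error
```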
Error formulation
 Short-range effects: sub-surface scattering and defocus
    strong sub-surface scattering: L_i^d ≈ 0
    defocus on high-frequency patterns: α ≈ 0.5 ⟹ α L_i^d − (1 − α) L_i^d is small
    the global component comes from a local, half-lit neighborhood: β ≈ 0.5
    ⟹ L_i − L̄_i is small, so binarization of the finest (high-frequency) patterns is unreliable
Consequence: loss of depth resolution
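A matching illustration of the short-range case (hypothetical numbers): the decision margin L_i − L̄_i nearly vanishes when α ≈ β ≈ 0.5, which is what makes the finest patterns undecodable:

```python
# Hypothetical illustration of a short-range error: sub-surface scattering
# suppresses the direct component and defocus drives alpha toward 0.5, so the
# decision margin for the finest patterns nearly vanishes.

def margin(L_d, L_g, alpha, beta):
    """Signed difference L - L_bar used for binarization."""
    return (2 * alpha - 1) * L_d + (2 * beta - 1) * L_g

print(margin(L_d=1.0, L_g=0.1, alpha=1.0, beta=0.5))    # 1.0   -> easy to binarize
print(margin(L_d=0.05, L_g=0.8, alpha=0.55, beta=0.5))  # 0.005 -> unreliable
```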
Pattern for error prevention
 How can patterns be designed to prevent both long-range and short-range effects?
Patterns with only high frequencies prevent long-range effects.
Patterns with only low frequencies prevent short-range effects.
It is possible to design codes consisting only of high-frequency patterns; for short-range
effects, codes with a large minimum stripe width can be designed (see the sketch of the
frequency trade-off below).
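To illustrate the trade-off, here is a small sketch (assuming a hypothetical 64-column projector; not code from the paper) that generates the conventional Gray-code bit-planes and reports each plane's minimum stripe width: the higher-order planes are low frequency (vulnerable to long-range effects), while the lowest-order plane has stripes only 1-2 columns wide (vulnerable to short-range effects).

```python
# Sketch: conventional Gray-code column patterns for a hypothetical 64-column
# projector, and the minimum stripe width of each bit-plane.
# (Illustration only; not code from the paper.)

NUM_COLS = 64
NUM_BITS = NUM_COLS.bit_length() - 1  # 6 bit-planes

def gray(n):
    return n ^ (n >> 1)

# patterns[b][c] = value (0/1) projected at column c by bit-plane b
patterns = [[(gray(c) >> b) & 1 for c in range(NUM_COLS)]
            for b in range(NUM_BITS - 1, -1, -1)]  # most-significant plane first

def min_stripe_width(row):
    """Length of the shortest run of equal values in one bit-plane."""
    runs, run = [], 1
    for a, b in zip(row, row[1:]):
        if a == b:
            run += 1
        else:
            runs.append(run)
            run = 1
    runs.append(run)
    return min(runs)

for b, row in enumerate(patterns):
    print(f"bit-plane {b}: min stripe width = {min_stripe_width(row)}")
# Higher-order planes have wide stripes (low frequency); the last plane's
# stripes are only 1-2 columns wide (high frequency).
```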
Pattern for error prevention
 1. For long-range effects
Model the binarization process as a scene-dependent function from the set of binary
projected patterns ℙ to the set of binary classifications of the captured image 𝔹.
Since low-frequency patterns are not robust to long-range effects, each low-frequency
pattern is decomposed, via logical XOR with a high-frequency base pattern, into two
high-frequency patterns; XOR-ing the binarizations of the projected patterns recovers the
binarization of the original pattern (see the sketch below).
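A sketch of the XOR-02 idea under the same hypothetical 64-column setup (an illustration of the construction, not the authors' code): every Gray-code plane except the finest is XOR-ed with the finest plane before projection, so all projected patterns are high frequency; XOR-ing the binarized images with the binarized base recovers the original planes.

```python
# Sketch of the XOR-02 construction (illustration under a hypothetical
# 64-column Gray code; not code from the paper). Idea: project only
# high-frequency patterns; recover each original low-frequency bit-plane by
# XOR-ing binarized images.

NUM_COLS = 64
NUM_BITS = 6

def gray(n):
    return n ^ (n >> 1)

# Conventional Gray-code bit-planes, most-significant first.
gray_planes = [[(gray(c) >> b) & 1 for c in range(NUM_COLS)]
               for b in range(NUM_BITS - 1, -1, -1)]

base = gray_planes[-1]  # the finest plane serves as the XOR-02 base pattern

# Projected patterns: the base itself, plus every other plane XOR-ed with it.
# All projected patterns are high frequency, suppressing long-range effects.
projected = [base] + [[p ^ b for p, b in zip(plane, base)]
                      for plane in gray_planes[:-1]]

# Decoding (assuming ideal binarization here): XOR each binarized image with
# the binarized base image to recover the original Gray-code bit-planes.
decoded_base = projected[0]
decoded = [[v ^ b for v, b in zip(img, decoded_base)] for img in projected[1:]]
decoded.append(decoded_base)

assert decoded == gray_planes  # original code recovered
print("recovered all", len(decoded), "bit-planes")
```

XOR-04 is built the same way, using the second-finest plane (stripe width 4) as the base pattern.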
Pattern for error prevention
 1. For long-range effects
(figure: a Gray code pattern XOR-ed with the high-frequency base pattern yields a high-frequency projected pattern)
Pattern for error prevention
 2. For short-range effects
Design codes with a large minimum stripe width (min-SW); such codes are well studied in
combinatorial mathematics [1].
[1] L. Goddyn, P. Gvozdjak. Binary Gray Codes with Long Bit Runs. The Electronic Journal of Combinatorics, 2003.
Pattern for error prevention
 3. Ensemble of codes for general scenes
Use four codes optimized for different effects:
    XOR-04 and XOR-02 for long-range effects
    Gray codes with maximum min-SW and conventional Gray codes for short-range effects
Rule: if the depth values returned by any two codes agree within a small threshold, that
value is returned as the true depth (see the consensus sketch below).
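A minimal sketch of this consensus rule (hypothetical depth values; merging the agreeing pair by averaging is a simplification for illustration, not necessarily the paper's exact rule):

```python
# Sketch of the ensemble consensus rule (hypothetical depths; not code from
# the paper): accept a pixel's depth if any two of the four codes agree
# within a small threshold.

from itertools import combinations

def consensus_depth(depths, threshold=1.0):
    """depths: the four depth estimates (XOR-04, XOR-02, max min-SW Gray,
    conventional Gray) for one pixel. Returns an agreed depth or None."""
    for a, b in combinations(depths, 2):
        if abs(a - b) < threshold:
            return 0.5 * (a + b)   # one simple way to merge the agreeing pair
    return None                    # mark as error (could be filled in later)

print(consensus_depth([412.0, 455.3, 412.4, 431.9]))  # ~412.2: two codes agree
print(consensus_depth([402.0, 455.3, 489.4, 431.9]))  # None: no agreement
```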
Experiment
 Please refer to the paper (IJCV preferred)
Implementing High Resolution Structured Light by
Exploiting Projector Blur
Camillo Taylor
University of Pennsylvania
Introduction
 Motivation
1. With standard structured light decoding schemes, one is limited by
the resolution of the projector. The quantization of the projector
ultimately limits the accuracy of the reconstruction.
Introduction
 Motivation
2. There is a growing disparity between the resolution of the image sensor and the
resolution of the projector (e.g., a 1600×1200 image sensor vs. a 640×480 projector).
Introduction
 Main idea
By exploiting the blur induced by the optics of the projector, subpixel correspondences
between the camera frame and the projector frame can be established.
Main point of comparison: Li Zhang, Shree Nayar. Projection Defocus Analysis for Scene
Capture and Image Display. (TOG ’06)
Approach
The effective irradiance that a pixel in the projector contributes to a point in the scene is related to the displacement between
the projection of that scene point on the projector frame and the center of the pixel. This falloff is modeled as a Gaussian
I: observed scene radiance measured at a pixel in the camera
f: BRDF at the scene point
E: irradiance supplied to the corresponding scene point by the projector
Approach
k: stripe index
σ: width of the blur kernel at the scene point
δ: projection of the scene point in the projector frame (with subpixel precision)
(figure: diagram illustrating k and δ)
Approach
I0: scene irradiance due to ambient illumination (known)
δ : floating point offset between -0.5 and 7.5
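A rough sketch of how such a model can yield subpixel correspondences (all parameter values and the stripe patterns below are hypothetical assumptions, not the paper's implementation): the effective irradiance a blurred binary stripe pattern delivers to a scene point is modeled as a sum of Gaussian-weighted column contributions centered at the subpixel offset δ, and δ is recovered from several measurements by a brute-force least-squares search.

```python
# Rough sketch (not the paper's implementation): model the effective irradiance
# a blurred binary stripe pattern delivers to a scene point whose projection
# falls at subpixel position delta in the projector frame, then recover delta
# from the measured intensities by least squares over a candidate grid.
# All numbers below are hypothetical.

import math

SIGMA = 0.8   # assumed blur kernel width (projector pixels)
PERIOD = 8    # stripe period, matching the [-0.5, 7.5] offset range

def gaussian_cdf(x, sigma):
    return 0.5 * (1.0 + math.erf(x / (sigma * math.sqrt(2.0))))

def effective_irradiance(pattern, delta, sigma=SIGMA):
    """Sum of Gaussian-weighted contributions of the lit columns.
    pattern[c] is 1 if projector column c is lit, 0 otherwise."""
    total = 0.0
    for c, lit in enumerate(pattern):
        if lit:
            # integral of a unit Gaussian centered at delta over column [c, c+1)
            total += gaussian_cdf(c + 1 - delta, sigma) - gaussian_cdf(c - delta, sigma)
    return total

# A few shifted binary stripe patterns over one 8-column period (hypothetical).
patterns = [[1 if (c + s) % PERIOD < PERIOD // 2 else 0 for c in range(PERIOD)]
            for s in range(4)]

true_delta = 3.37
observed = [effective_irradiance(p, true_delta) + 0.002 for p in patterns]  # small perturbation

# Brute-force least-squares search for the subpixel offset in [-0.5, 7.5].
candidates = [-0.5 + 0.01 * i for i in range(801)]
best = min(candidates,
           key=lambda d: sum((effective_irradiance(p, d) - o) ** 2
                             for p, o in zip(patterns, observed)))
print(f"true delta = {true_delta}, estimated delta = {best:.2f}")
```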
Experiment
For more results, please refer to the original paper.
Vision Processing for Real-time 3D Data Acquisition
Based on Coded Structured Light
S. Y. Chen, et al.
City University of Hong Kong
Motivation
 Conventional structured light systems cannot be applied to moving surfaces, since
multiple patterns must be projected.
Main idea
 A single coded grid pattern for 3D reconstruction, combined with fast matching strategies.
Thank you