Camera Calibration
With 3D, 2D, 1D and No Apparatus
Zhengyou Zhang
Microsoft Research
zhang@microsoft.com
http://research.microsoft.com/~zhang
Outline
• Introduction
• Camera Modeling
• Camera Calibration
– With 3D pattern (3D)
– With planar pattern (2D)
– With a wand (1D)
– Without any apparatus (0D)
• Conclusion
• References
Motivations
• Use a camera as a quantitative sensor
• Essential for many applications to recover
quantitative information about the observed
scene (3D Euclidean structure in particular)
Problem Statement
• Determine the parameters of the transformation
between an object in 3D space and the 2D
image observed by the camera
– from visual information (images)
• The transformation includes
– Extrinsic parameters: location (translation) and
orientation (rotation) of the camera
– Intrinsic parameters: characteristics of the camera,
such as focal length, scale factors, and the retinal
location of the optical center
Camera Modeling
Modeling Cameras

A 3D point \tilde{M} = (x, y, z, 1)^T is projected to the image point \tilde{m} = (u, v, 1)^T by

    s\,\tilde{m} = A\,[R \; t]\,\tilde{M},
    \quad A = \begin{bmatrix} \alpha & \gamma & u_0 \\ 0 & \beta & v_0 \\ 0 & 0 & 1 \end{bmatrix}

where s is an arbitrary scale factor, (u_0, v_0) is the principal point, and C denotes the camera center.

– Pinhole model
– Extrinsic parameters: (R, t)
– Intrinsic parameters: A
– Perspective projection matrix: P = A\,[R \; t]
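As a concrete reading of the model above, here is a minimal NumPy sketch (not from the slides) that projects a 3D point through assumed intrinsics A and extrinsics (R, t); all numeric values are illustrative.

```python
import numpy as np

# Assumed intrinsic matrix A = [[alpha, gamma, u0], [0, beta, v0], [0, 0, 1]]
A = np.array([[1000.0,    0.0, 320.0],
              [   0.0, 1000.0, 240.0],
              [   0.0,    0.0,   1.0]])

# Assumed extrinsics: identity rotation, camera translated along its optical axis
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])

def project(M, A, R, t):
    """Pinhole projection s*m~ = A [R t] M~; returns pixel coordinates (u, v)."""
    M_cam = R @ M + t        # 3D point expressed in camera coordinates
    m = A @ M_cam            # homogeneous image point (s*u, s*v, s)
    return m[:2] / m[2]      # divide out the arbitrary scale s

print(project(np.array([0.1, -0.2, 1.0]), A, R, t))  # an arbitrary 3D point
```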
Calibration With 3D Apparatus
3D approach
3D Apparatus: Example 1
• Using a predefined 3D object with at least two planes
From 3D structure + 3D-2D correspondences
To intrinsic + extrinsic parameters (i.e., the projection matrix)
3D Apparatus: Example 2
• A movable plane with
known displacement
(Tsai’s technique)
Calibration Parameters
• Parameters to be calibrated: 11
– Five intrinsic parameters
• α: size of the focal length in horizontal pixels (horizontal scale factor)
• β: size of the focal length in vertical pixels (vertical scale factor)
• u0 and v0: coordinates of the principal point of the camera, i.e., the intersection between the optical axis and the image plane
• γ: skew, reflecting the angle between the retinal axes; the pixel grid may not be exactly orthogonal, but in practice that angle is very close to 90°
(To be continued)
Calibration Parameters (cont’d)
– Six extrinsic parameters:
• R: rotation matrix, 3 degrees of freedom
• t: translation vector, 3 degrees of freedom
• Perspective projection matrix
– 12 elements, but is defined up to a scale factor.
This also leads to 11 parameters.
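Because the perspective projection matrix has 12 elements but is defined only up to scale, it can be estimated linearly from six or more 3D-2D correspondences. The sketch below is a generic direct linear transform (DLT) solved by SVD, offered as one standard way to do this rather than the slides' exact algorithm; the input arrays are assumed.

```python
import numpy as np

def estimate_projection_matrix(X, x):
    """Estimate the 3x4 projection matrix P (up to scale) from 3D points X (Nx3)
    and their image points x (Nx2): each correspondence gives two homogeneous
    linear equations in the 12 entries of P."""
    rows = []
    for (Xw, Yw, Zw), (u, v) in zip(X, x):
        Xh = np.array([Xw, Yw, Zw, 1.0])
        rows.append(np.concatenate([Xh, np.zeros(4), -u * Xh]))
        rows.append(np.concatenate([np.zeros(4), Xh, -v * Xh]))
    # The solution is the right singular vector with the smallest singular value.
    # (For real data, normalizing the coordinates first improves conditioning.)
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 4)

# Usage with hypothetical data: P = estimate_projection_matrix(points_3d, points_2d)
```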
Method 1: Use detected points
• Main steps:
• Observations:
Method 1: Linear Technique
Recovering Intrinsic and Extrinsic
Parameters from Projection Matrix
• Intrinsic parameters
• Extrinsic parameters
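The slides' formulas are not reproduced here; one common way to recover A, R and t from P ∝ A[R t] is an RQ decomposition of the left 3×3 block of P, sketched below. This is a generic recipe, not necessarily the slides' derivation; it assumes P is known only up to a (possibly negative) scale.

```python
import numpy as np
from scipy.linalg import rq  # RQ decomposition: M = (upper triangular) @ (orthogonal)

def decompose_projection_matrix(P):
    """Split P ~ A [R t] into an upper-triangular A with positive diagonal,
    a rotation R, and a translation t."""
    A, R = rq(P[:, :3])
    S = np.diag(np.sign(np.diag(A)))   # force positive diagonal on A...
    A, R = A @ S, S @ R                # ...absorbing the signs into R (S @ S = I)
    t = np.linalg.solve(A, P[:, 3])
    if np.linalg.det(R) < 0:           # P is only defined up to a signed scale
        R, t = -R, -t
    return A / A[2, 2], R, t           # normalize so that A[2, 2] = 1
```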
Method 1: Nonlinear Technique
Method 2: Use image intensity directly
Method 2: Use edge points on
the calibration pattern
Method 2: Result
Taking into Account Lens Distortion
Radial & Decentering Distortion
Radial Distortion
Calibration with a Plane
2D approach
Plane-Based Calibration Technique
• Flexible (simple calibration object & easy setup)
– Print a pattern on a paper
– Attach the paper to a planar surface (use one plane only)
– Show the plane freely a few times to the camera
• Robust
– validated with extensive simulations & real data
Camera Model

    s\,\tilde{m} = A\,[R \; t]\,\tilde{M},
    \quad A = \begin{bmatrix} \alpha & \gamma & u_0 \\ 0 & \beta & v_0 \\ 0 & 0 & 1 \end{bmatrix}

with \tilde{m} = (u, v, 1)^T, \tilde{M} = (x, y, z, 1)^T, principal point (u_0, v_0), extrinsic parameters (R, t), and camera center C.
Plane projection
• For convenience, assume the plane at z = 0.
[Figure: model plane at z = 0 with model point M, its image m, principal point (u_0, v_0), and camera center C]

• The relation between image points and model points is then given by

    s\,\tilde{m} = H\,\tilde{M},
    \quad \text{with} \quad H = A\,[r_1 \; r_2 \; t]

where \tilde{M} = (x, y, 1)^T now holds the plane coordinates.
What do we get from one image?
Estimate H, which is defined up to a scale factor.

Let H = [h_1 \; h_2 \; h_3]; then [h_1 \; h_2 \; h_3] = \lambda\,A\,[r_1 \; r_2 \; t].

Since r_1 and r_2 are orthonormal, this yields

    h_1^T A^{-T} A^{-1} h_2 = 0
    h_1^T A^{-T} A^{-1} h_1 = h_2^T A^{-T} A^{-1} h_2

We obtain two equations in A.
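A quick numerical sanity check of these two constraints (arbitrary assumed values for A; r1 and r2 taken as columns of a random orthonormal matrix):

```python
import numpy as np

A = np.array([[1200.0,    0.5, 300.0],   # assumed intrinsics
              [   0.0, 1180.0, 260.0],
              [   0.0,    0.0,   1.0]])
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))  # columns are orthonormal
r1, r2 = Q[:, 0], Q[:, 1]
t = np.array([0.2, -0.1, 4.0])                    # assumed plane translation

H = A @ np.column_stack([r1, r2, t])              # H = A [r1 r2 t]
B = np.linalg.inv(A).T @ np.linalg.inv(A)         # B = A^-T A^-1
h1, h2 = H[:, 0], H[:, 1]

print(h1 @ B @ h2)                 # ~0: orthogonality of r1 and r2
print(h1 @ B @ h1 - h2 @ B @ h2)   # ~0: r1 and r2 have equal (unit) norm
```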
Absolute Conic
• The absolute conic is a conic in P^3
• Interpretation
– An imaginary circle of radius i (= √−1) on the plane at infinity
• Important property
– Invariance under rigid transformations
Invariance of the Absolute Conic
Image of the Absolute Conic: A^{-T} A^{-1}
Geometric interpretation
• The model plane (spanned by r_1 and r_2) intersects the plane at infinity in a line containing the circular points r_1 \pm i r_2.
• Their images are a(h_1 \pm i h_2) for some scalar a; a point x on the absolute conic satisfies x^T x = 0, so its image m satisfies m^T A^{-T} A^{-1} m = 0.
• Hence

    (h_1 \pm i h_2)^T A^{-T} A^{-1} (h_1 \pm i h_2) = 0,

whose real and imaginary parts are exactly the two constraints above.
Linear Equations
• Let

    B = A^{-T} A^{-1} = \begin{bmatrix} B_{11} & B_{12} & B_{13} \\ B_{12} & B_{22} & B_{23} \\ B_{13} & B_{23} & B_{33} \end{bmatrix}  (symmetric)

• Define b = (B_{11}, B_{12}, B_{22}, B_{13}, B_{23}, B_{33})^T (up to a scale factor)
• Rewrite

    h_1^T A^{-T} A^{-1} h_2 = 0
    h_1^T A^{-T} A^{-1} h_1 = h_2^T A^{-T} A^{-1} h_2

  as linear equations in b:

    M\,b = 0
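A minimal sketch of how these linear equations can be assembled from the estimated homographies and solved for b (up to scale) via SVD; function and variable names are mine:

```python
import numpy as np

def v_ij(H, i, j):
    """Row vector v such that h_i^T B h_j = v . b,
    with b = (B11, B12, B22, B13, B23, B33)."""
    hi, hj = H[:, i], H[:, j]
    return np.array([hi[0]*hj[0],
                     hi[0]*hj[1] + hi[1]*hj[0],
                     hi[1]*hj[1],
                     hi[2]*hj[0] + hi[0]*hj[2],
                     hi[2]*hj[1] + hi[1]*hj[2],
                     hi[2]*hj[2]])

def solve_b(homographies):
    """Stack two constraint rows per homography and take the right singular
    vector with the smallest singular value as the solution of the
    homogeneous system."""
    rows = []
    for H in homographies:
        rows.append(v_ij(H, 0, 1))                  # h1^T B h2 = 0
        rows.append(v_ij(H, 0, 0) - v_ij(H, 1, 1))  # h1^T B h1 = h2^T B h2
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1]
```

From b one rebuilds the symmetric matrix B and extracts the five intrinsic parameters in closed form, as the following slides outline.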
What do we get from 2 images?
• If we impose γ = 0, which is usually the case
with modern cameras, we can solve for all the
other camera intrinsic parameters.
How about more images?
Better! More constraints than unknowns.
Analytical solution followed by nonlinear optimization
Lens Distortion
• Use only first two terms of radial distortion: k1
and k2
• Assume the radial distortion center coincides
with the principal point
• Linear estimation if fixing A
• Alternately estimate (k1, k2) and A
• Complete maximum likelihood estimation
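A sketch of the two-term radial distortion model described above, applied to ideal (distortion-free) normalized image coordinates and assuming the distortion center coincides with the principal point:

```python
def apply_radial_distortion(x, y, k1, k2):
    """Two-term radial distortion of ideal normalized coordinates (x, y):
    the point is scaled by 1 + k1*r^2 + k2*r^4, where r^2 = x^2 + y^2."""
    r2 = x * x + y * y
    factor = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * factor, y * factor
```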
Solution
• Show the plane under n orientations (n > 1)
• Estimate the n homography matrices
(analytic solution followed by MLE)
• Solve analytically the 6 intermediate
parameters (defined up to a scale factor)
• Extract the five intrinsic parameters
• Compute the extrinsic parameters
• Estimate the radial distortion parameters
• Refine all parameters with MLE
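As a practical illustration of these steps, OpenCV's calibration routine follows a plane-based approach in the same spirit (closed-form initialization followed by nonlinear refinement, including radial distortion). A sketch assuming checkerboard images in a hypothetical calib_images/ folder with a 9×6 inner-corner pattern:

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)  # assumed number of inner corners per row and column
# Model points on the plane z = 0 (unit spacing; the physical scale is arbitrary)
model = np.zeros((pattern[0] * pattern[1], 3), np.float32)
model[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_pts, img_pts = [], []
for path in glob.glob("calib_images/*.png"):       # assumed image location
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(model)
        img_pts.append(corners)

# Returns the RMS reprojection error, the intrinsic matrix, the distortion
# coefficients (k1, k2, p1, p2, k3), and per-view extrinsics (rvecs, tvecs).
rms, A, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
print(rms)
print(A)
```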
Experimental results
Extracted corner points
Result (1)
Result (2)
Correction of Radial Distortion
Original
Corrected image
Errors vs. Noise Levels in data
Errors vs. Number of Planes
Errors vs. Angle of the plane
Errors vs. Noise in model points
Errors vs. Spherical non-planarity
Errors vs. Cylindrical non-planarity
Application to object modeling
3D Reconstruction
Microsoft Research Media Viewer
Summary
• We have developed a flexible and robust
technique for camera calibration.
• Analytical solution exists.
• MLE improves the analytical solution.
• We need at least two images if γ = 0.
• We can use as many images of the plane as
possible to improve the accuracy.
• It really works!
Binary executable is available from
http://research.microsoft.com/~zhang
Calibration with a Wand
1D Approach
The Missing Dimension: 1D
• 1D Objects (Wand):
Points aligned on a line
• Conclusions:
– Impossible with a free-moving wand
– Possible if one point is fixed
Free-Moving Wand: No Way!
          # unknowns                           # knowns
# pts     fixed   per image   Total           per image                 Total
  2         5         5       5 + 5N          4                         4N
  3         5         5       5 + 5N          6 - 1 (collinearity)      5N
  M         5         5       5 + 5N          2M - 1 - 2(M - 3)         5N

Notes: 2 points define the wand (known length); extra points add only
collinearity + cross-ratio constraints.

#unknowns (5 + 5N) > #knowns (5N), so a free-moving wand cannot calibrate the camera.
Wand with a Fixed Point
          # unknowns                           # knowns
# pts     fixed   per image   Total           fixed   per image                    Total
  2        5 + 3      2       8 + 2N            2     2                            2 + 2N
  3        5 + 3      2       8 + 2N            2     4 - 1 (collinearity)         2 + 3N
  M        5 + 3      2       8 + 2N            2     2(M - 1) - 1 - 2(M - 3)      2 + 3N

Notes: 1 point defines the wand (known length); extra points add only
collinearity + cross-ratio constraints.

If M >= 3 and N >= 6, then #knowns >= #unknowns.
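A small script to double-check the counting in the table above for M = 3 points (8 + 2N unknowns versus 2 + 3N knowns), confirming that N = 6 images is the break-even point:

```python
def counts(M, N):
    """Unknowns vs. knowns for a wand with one fixed point and M >= 3
    collinear points, observed in N images."""
    unknowns = (5 + 3) + 2 * N                        # intrinsics + fixed point, plus 2 per image
    knowns = 2 + (2 * (M - 1) - 1 - 2 * (M - 3)) * N  # = 2 + 3N for any M >= 3
    return unknowns, knowns

for N in (5, 6, 7):
    u, k = counts(3, N)
    print(f"N={N}: unknowns={u}, knowns={k}, solvable={k >= u}")
```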
Notation
• Pinhole model
    s\,\tilde{m} = A\,[R \; t]\,\tilde{M},
    \quad A = \begin{bmatrix} \alpha & \gamma & u_0 \\ 0 & \beta & v_0 \\ 0 & 0 & 1 \end{bmatrix},
    \quad \tilde{m} = (u, v, 1)^T, \; \tilde{M} = (x, y, z, 1)^T
• All 3D points are defined in the camera coordinate system, so R = I and t = 0.
Basic Configuration
[Figure: basic configuration. Wand with fixed point A and free points B, C; their image points a, b, c; camera center O]
Basic equations
• 3D: equations (1) and (2), relating the wand points A, B, C
• 2D: equations (3), (4) and (5), relating A, B, C to their images a, b, c
Some Derivations
(3)+(4)+(5) → (1)
→ (2)
Calibration Equation
• The last equation is equivalent to a single calibration equation (*),
involving the depth of the fixed point, the image of the absolute conic, and known quantities.
Closed-Form Solution
• Let x collect the unknowns of (*); equation (*) then becomes linear in x.
• We can solve for x if 6 or more observations are available.
Nonlinear Optimization
• Refine the closed-form solution by minimizing the sum of squared distances
between the observed image points and φ(A, M),
where φ(A, M) denotes the projection of M onto the image.
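A minimal sketch of such a refinement using SciPy's least_squares; the project() function and the packing of the parameter vector are hypothetical placeholders standing in for the actual projection model of the wand setup:

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(params, observed_pts, project):
    """Stacked reprojection errors: predicted image points (from the current
    parameter estimate) minus the observed image points."""
    predicted = project(params)           # hypothetical projection model
    return (predicted - observed_pts).ravel()

# Usage with hypothetical inputs:
# refined = least_squares(residuals, x0=initial_params,
#                         args=(observed_pts, project), method="lm")
```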
Computer Simulations
α = β = 1000, γ = 0, u0=320, v0=240. image res.: 640x480
Wand length: 70cm. Fixed point: (0,35,150)
100 randomly selected orientations
120 independent trials at each noise level
Closed-form solution
Computer Simulations (cont’d)
Nonlinear optimization
50% less error than closed-form solution
Real Data
• Wand: 3 beads, 28cm long.
• 150 frames.
Frame 10
Frame 60
Frame 90
Frame 140
Real Data (cont’d)
• Calibration result: differs by about 2% from the plane-based result
• Two sources of major errors
– The fixed point is not perfectly fixed
– Bead positions on the wand are set by visual inspection
Summary
• Investigated camera calibration with 1D objects
• Not possible with free-moving wand
• Possible with one fixed point
– At least 3 points
– At least 6 images
– Closed-form solution
– Nonlinear optimization reduces errors by half
• Validated with both simulation and real data
• Extension for a wand with more than 3 points
• Future work: Multiple fixed points
– How to increase the calibration volume
Calibration Without Apparatus
0D approach
(Self-calibration)
Self-calibration
• Move the camera in a static environment
– match feature points across images
– make use of rigidity constraint
• The Automatic Way
From Projective To Euclidean
High-Level Observation
• Rigidity constraint:
– The Fundamental matrix between two views yields
two relations between the internal parameters of the
camera if those are constant
(Faugeras-Maybank, 1993, Luong 1993, Zeller 1995)
• Self-calibration: Compute the internal parameters
– Nonlinear optimization with three or more views
• The quality of the results can be checked by
measuring Euclidean invariants in the scene.
– Angles, ratios of lengths, etc.
Absolute Conic
• The absolute conic is a conic in P^3
• Interpretation
– An imaginary circle of radius i (= √−1) on the plane at infinity
• Important property
Invariance under rigid transformation
Invariance of the Absolute Conic
Image of the Absolute Conic: A^{-T} A^{-1}
Dual Conic & Kruppa Coefficients
Epipolar transformation
Kruppa Equations (1)
Kruppa Equations (2)
Kruppa Equations (3)
Self-Calibration from 3 images
• Assumption: They are taken by the same
camera with constant intrinsic parameters
• Unknowns: 5
• Equations:
– 2 for each image pair
– We have 3 image pairs
– 2×3 = 6 equations > 5 unknowns
Self-Calibration: More
• With more images and some constraints (e.g.,
rectangular pixels), we can deal with self-calibration
under varying intrinsic parameters
• Stereo setup can provide more constraints
• Bundle adjustment to achieve higher accuracy
Self-calibration From Two Images
With Known u0, v0 and γ
Example 1: The Arcades Square
Example 2: The INRIA Library
Example 3: The Valbonne Church
Stereo Example: 2 Image Pairs
t1
t2
Stereo Example: Initial 3D Reconstruction
Stereo Example: Final 3D Reconstruction
Stereo Example: Quantitative Result
Conclusions
• A number of calibration techniques are available
– Using 3D, 2D, 1D or 0D objects
• Each technique has its own advantages and
shortcomings
– Which to use depends on the task
• More recent work also exists
– E.g., using spheres (another 2D approach)
Acknowledgments
• Part of the data and results were provided by
the INRIA Robotvis group
• Part of the data and results were collected at
Microsoft Research
References
• D.C. Brown. Close-range camera calibration. Photogrammetric Engineering, 37, 1971.
• O. Faugeras. Three-Dimensional Computer Vision: A Geometric Viewpoint. MIT Press, Boston, 1993.
• Q.-T. Luong and O. Faugeras. Self-calibration of a moving camera from point correspondences and fundamental matrices. The International Journal of Computer Vision, 22(3):261-289, 1997.
• L. Robert. Camera calibration without feature extraction. CVGIP: Image Understanding, 63(2):314-325, 1995.
• R.Y. Tsai. An efficient and accurate calibration technique for 3D machine vision. In Proc. IEEE CVPR'86, pages 364-374, 1986.
• R.Y. Tsai. Synopsis of recent progress on camera calibration for 3D machine vision. Robotics Review, 1:147-159, 1989.
• Z. Zhang. Flexible camera calibration by viewing a plane from unknown orientations. In Proc. International Conference on Computer Vision (ICCV'99), pages 666-673, 1999.
• Z. Zhang. Camera calibration with 1D objects. In Proc. ECCV, 2002.