Parameter Estimation, Part 2

Parameter estimation
Invariance to transforms?
Suppose x′ = Hx, and the coordinates are transformed by x̃ = T x, x̃′ = T′ x′.
Is the corresponding homography H̃ = T′ H T⁻¹ ?
Check: x̃′ = H̃ x̃  ⇒  T′ x′ = H̃ T x  ⇒  x′ = T′⁻¹ H̃ T x
Will the estimation result change?
For which algorithms? For which transformations?
Non-invariance of DLT
Given xi ↔ xi′ and H computed by the DLT, and transformed points x̃i = T xi, x̃i′ = T′ xi′:
does the DLT algorithm applied to x̃i ↔ x̃i′ yield H̃ = T′ H T⁻¹ ?
This is hard to answer for general T and T′.
But for similarity transforms the answer is NO.
Conclusion: the DLT is NOT invariant to similarity transforms.
However, it can be shown that the geometric error IS invariant to similarity transforms.
Normalizing transformations
• Since the DLT is not invariant, what is a good choice of coordinates?
e.g.:
• Translate the centroid to the origin
• Scale so that the average distance to the origin is √2
• Apply this independently to both images
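A minimal numpy sketch of this normalization, assuming points are given as an n×2 array; the function name normalization_transform is illustrative, not from the slides.

```python
import numpy as np

def normalization_transform(pts):
    """Similarity transform T mapping an (n, 2) point array so that the
    centroid goes to the origin and the mean distance to the origin is sqrt(2)."""
    centroid = pts.mean(axis=0)
    mean_dist = np.mean(np.linalg.norm(pts - centroid, axis=1))
    s = np.sqrt(2) / mean_dist
    return np.array([[s, 0, -s * centroid[0]],
                     [0, s, -s * centroid[1]],
                     [0, 0, 1.0]])
```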
Importance of normalization
Each correspondence xi ↔ xi′ contributes two rows to the DLT matrix:

  [ 0   0   0   −xi  −yi  −1   yi′xi   yi′yi   yi′ ] ( h¹ )
  [ xi  yi  1    0    0    0  −xi′xi  −xi′yi  −xi′ ] ( h² ) = 0
                                                     ( h³ )

Typical magnitudes of the entries:
  ~10²  ~10²  1   ~10²  ~10²  1   ~10⁴  ~10⁴  ~10²
orders of magnitude difference!
[Figure: estimated H without normalization (left) and with normalization (right). H is taken to be the identity, Gaussian noise of standard deviation 0.1 is added to each point, and H is then computed.]
Normalized DLT algorithm
Objective
Given n≥4 2D to 2D point correspondences {xi↔xi’},
determine the 2D homography matrix H such that xi’=Hxi
Algorithm
 xi
(i) Normalize points ~
x i  Tnormx i , ~
xi  Tnorm
(ii) Apply DLT algorithm to ~
xi  ~
xi ,
~
(iii) Denormalize solution H  T-1 H
T
norm
norm
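A sketch of the normalized DLT following these three steps, reusing the normalization_transform helper above; the DLT itself is the standard SVD solution of Ah = 0, and the function names are illustrative.

```python
import numpy as np

def dlt_homography(x, xp):
    """Basic DLT: x, xp are (n, 2) arrays of corresponding points, n >= 4."""
    A = []
    for (xi, yi), (xpi, ypi) in zip(x, xp):
        A.append([0, 0, 0, -xi, -yi, -1, ypi * xi, ypi * yi, ypi])
        A.append([xi, yi, 1, 0, 0, 0, -xpi * xi, -xpi * yi, -xpi])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 3)   # h = right singular vector of the smallest singular value

def apply_homography(H, pts):
    """Apply a 3x3 transform to (n, 2) inhomogeneous points."""
    ph = np.c_[pts, np.ones(len(pts))] @ H.T
    return ph[:, :2] / ph[:, 2:]

def normalized_dlt(x, xp):
    """(i) normalize, (ii) DLT on normalized points, (iii) denormalize."""
    T, Tp = normalization_transform(x), normalization_transform(xp)
    H_tilde = dlt_homography(apply_homography(T, x), apply_homography(Tp, xp))
    H = np.linalg.inv(Tp) @ H_tilde @ T   # H = T'_norm^-1 H~ T_norm
    return H / H[2, 2]
```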
Iterative minimization methods
Required to minimize geometric error
(i) Often slower than DLT
(ii) Require initialization
(iii) No guaranteed convergence, local minima
(iv) Stopping criterion required
Initialization
• Typically, use the linear (DLT) solution
• If there are outliers, use a robust algorithm
• Alternatively, sample the parameter space
Iterative methods
Many algorithms exist
• Newton’s method
• Levenberg-Marquardt
• Powell’s method
• Simplex method
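As one concrete possibility, here is a sketch of Levenberg-Marquardt refinement of H using scipy.optimize.least_squares, initialized from the linear solution; the choice of the symmetric transfer error as the geometric cost and the function names are assumptions of this sketch, not prescribed by the slides.

```python
import numpy as np
from scipy.optimize import least_squares

def refine_homography(H0, x, xp):
    """Levenberg-Marquardt refinement of a homography.
    H0: 3x3 initial estimate (e.g. from the normalized DLT);
    x, xp: (n, 2) corresponding points. Minimizes the symmetric transfer error."""
    def transfer(H, pts):
        ph = np.c_[pts, np.ones(len(pts))] @ H.T
        return ph[:, :2] / ph[:, 2:]

    def residuals(h):
        H = h.reshape(3, 3)
        fwd = transfer(H, x) - xp                   # error in the second image
        bwd = transfer(np.linalg.inv(H), xp) - x    # error in the first image
        return np.concatenate([fwd.ravel(), bwd.ravel()])

    result = least_squares(residuals, H0.ravel(), method='lm')  # LM needs >= 9 residuals (n >= 3)
    H = result.x.reshape(3, 3)
    return H / H[2, 2]
```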
Robust estimation
• What if set of matches contains gross outliers?
[Figure: line fit by RANSAC compared with least squares. Filled black circles = inliers, empty circles = outliers.]
RANSAC
Objective
Robust fit of model to data set S which contains outliers
Algorithm
(i) Randomly select a sample of s data points from S and
instantiate the model from this subset.
(ii) Determine the set of data points Si which are within a
distance threshold t of the model. The set Si is the
consensus set of samples and defines the inliers of S.
(iii) If the size of Si is greater than some threshold T, re-estimate the model using all the points in Si and terminate
(iv) If the size of Si is less than T, select a new subset and
repeat the above.
(v) After N trials the largest consensus set Si is selected, and
the model is re-estimated using all the points in the
subset Si
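A minimal sketch of this loop as a generic function; fit_model, distance, and the argument names are illustrative placeholders for the model-specific pieces (e.g. the normalized DLT and a reprojection distance for H).

```python
import numpy as np

def ransac(data, fit_model, distance, s, t, N=1000, T=None, rng=None):
    """Generic RANSAC following steps (i)-(v) above.
    data: (n, ...) array; fit_model(points) -> model;
    distance(model, data) -> per-point distances; s: minimal sample size;
    t: inlier distance threshold; T: optional early-termination consensus size."""
    rng = rng or np.random.default_rng()
    best_model, best_inliers = None, np.zeros(len(data), dtype=bool)
    for _ in range(N):                                               # (iv)/(v): N trials
        sample = data[rng.choice(len(data), size=s, replace=False)]  # (i) random minimal sample
        model = fit_model(sample)
        inliers = distance(model, data) < t                          # (ii) consensus set
        if inliers.sum() > best_inliers.sum():
            best_model, best_inliers = model, inliers
        if T is not None and inliers.sum() > T:                      # (iii) early termination
            break
    if best_inliers.sum() >= s:                                      # (v) re-estimate from the
        best_model = fit_model(data[best_inliers])                   #     largest consensus set
    return best_model, best_inliers
```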
Distance threshold
Choose t so that the probability that an inlier is accepted is α (e.g. 0.95)
• Often chosen empirically
• For zero-mean Gaussian noise with standard deviation σ, the squared
  distance d² follows a χ²m distribution with m = codimension of the model
  (dimension + codimension = dimension of the space)

  Codimension   Model   t²
  1             l, F    3.84 σ²
  2             H, P    5.99 σ²
  3             T       7.81 σ²
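The t² values are the 95% points of the χ²m distribution scaled by σ²; a quick check, assuming scipy is available:

```python
from scipy.stats import chi2

sigma = 1.0                                     # assumed noise standard deviation
for m, model in [(1, "l, F"), (2, "H, P"), (3, "T")]:
    t2 = chi2.ppf(0.95, df=m) * sigma**2        # 95% point of chi^2_m, scaled by sigma^2
    print(f"codimension {m} ({model}): t^2 = {t2:.2f}")   # 3.84, 5.99, 7.81
```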
How many samples?
Choose N so that, with probability p, at least one random sample of s
points is free from outliers.
e.g. p = 0.99; e = proportion of outliers in the entire data set

  (1 − (1 − e)^s)^N = 1 − p
  N = log(1 − p) / log(1 − (1 − e)^s)

Required N for p = 0.99, by sample size s and proportion of outliers e:

  s     5%   10%   20%   25%   30%   40%   50%
  2      2     3     5     6     7    11    17
  3      3     4     7     9    11    19    35
  4      3     5     9    13    17    34    72
  5      4     6    12    17    26    57   146
  6      4     7    16    24    37    97   293
  7      4     8    20    33    54   163   588
  8      5     9    26    44    78   272  1177
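The formula as a short function; it reproduces the table (e.g. s = 4, e = 0.5 gives N = 72):

```python
import numpy as np

def num_samples(p, e, s):
    """Number of RANSAC samples N so that, with probability p, at least one
    sample of s points is outlier-free, given outlier proportion e."""
    return int(np.ceil(np.log(1 - p) / np.log(1 - (1 - e) ** s)))

print(num_samples(p=0.99, e=0.5, s=4))   # 72, as in the table
```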
Acceptable consensus set?
• Typically, terminate when the size of the consensus set reaches the
  expected number of inliers T = (1 − e)·n,
  where n = size of the data set and e = expected proportion of outliers
Adaptively determining
the number of samples
e is often unknown a priori, so pick a worst case,
e.g. 50%, and adapt if more inliers are found,
e.g. 80% inliers would yield e = 0.2
• N = ∞, sample_count = 0
• While N > sample_count, repeat:
  • Choose a sample and count the number of inliers
  • Set e = 1 − (number of inliers)/(total number of points)
  • Recompute N from e:  N = log(1 − p) / log(1 − (1 − e)^s)
  • Increment sample_count by 1
• Terminate
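A sketch of this adaptive loop; run_one_sample is a placeholder for one RANSAC trial returning its inlier count, and, as a small practical variant of the pseudo-code, the largest inlier count seen so far is kept so that N only decreases.

```python
import numpy as np

def adaptive_ransac_trials(run_one_sample, n_points, s, p=0.99):
    """Adaptively determine the number of RANSAC samples, as in the
    pseudo-code above. run_one_sample() performs one trial (sample, fit,
    count inliers) and returns the inlier count."""
    N, sample_count, best_inliers = np.inf, 0, 0
    while N > sample_count:
        best_inliers = max(best_inliers, run_one_sample())
        sample_count += 1
        if best_inliers == 0:
            continue                           # no inliers yet: keep N = infinity
        e = 1.0 - best_inliers / n_points      # current outlier-proportion estimate
        if e == 0.0:
            break                              # every point is an inlier: done
        N = np.log(1 - p) / np.log(1 - (1 - e) ** s)
    return sample_count
```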
Automatic computation of H
Objective
Compute homography between two images
Algorithm
(i) Interest points: Compute interest points in each image
(ii) Putative correspondences: Compute a set of interest
point matches based on some similarity measure
(iii) RANSAC robust estimation: Repeat for N samples
(a) Select 4 correspondences and compute H
(b) Calculate the distance d for each putative match
(c) Compute the number of inliers consistent with H (d<t)
Choose H with most inliers
(iv) Optimal estimation: re-estimate H from all inliers by
minimizing ML cost function with Levenberg-Marquardt
(v) Guided matching: Determine more matches using
prediction by computed H
Optionally iterate last two steps until convergence
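A compact sketch of this pipeline with OpenCV; ORB features and brute-force Hamming matching stand in for "some similarity measure", the image filenames are placeholders, and cv2.findHomography with the RANSAC flag covers the robust estimation, with its inlier refinement roughly playing the role of step (iv). Guided matching and the final iteration are not shown.

```python
import cv2
import numpy as np

# (i) Interest points (about 500 per image)
img1 = cv2.imread("img1.png", cv2.IMREAD_GRAYSCALE)   # placeholder filenames
img2 = cv2.imread("img2.png", cv2.IMREAD_GRAYSCALE)
orb = cv2.ORB_create(nfeatures=500)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# (ii) Putative correspondences from descriptor similarity
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = matcher.match(des1, des2)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# (iii)-(iv) Robust estimation of H with RANSAC, refined on the inliers
H, inlier_mask = cv2.findHomography(pts1, pts2, cv2.RANSAC, ransacReprojThreshold=3.0)
print("inliers:", int(inlier_mask.sum()), "of", len(matches))
```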
Example: robust computation
[Figures: interest points (500 per image); putative correspondences (268), of which 117 are outliers; inliers (151) after RANSAC; final inliers (262) after MLE and guided matching.]