Primitives detection
Problem:
In automated analysis of digital images, a sub-problem that often arises is detecting simple shapes such as straight lines, circles, or ellipses.
Applications and
Algorithms in CV
Tutorial 3:
Grouping & Fitting introduction
Primitives detection
Solution?
Edge detection
Finding straight lines
An edge detector can be used as a pre-processing stage to obtain points (pixels) that lie on the desired curve in image space.
Due to imperfections in either the image data or the edge detector, there may be missing points on the desired curves, as well as spatial deviations between the ideal line/circle/ellipse and the noisy edge points produced by the detector.
It is often non-trivial to group the extracted edge features into an appropriate set of lines, circles, or ellipses.
Grouping & Fitting methods
2 approaches for grouping & fitting
1. Global optimization / Search for parameters
Least squares fit
Total least squares fit
Robust least squares
Iterative closest point (ICP)
Etc.
2. Hypothesize and test
Hough transform
Generalized Hough transform
RANSAC
Etc.
Grouping & Fitting methods
1. Global optimization / Search for parameters
Least squares fit
Least square line fitting
Data: (x_1, y_1), …, (x_n, y_n)
Line equation: y_i = m x_i + b
Find (m, b) to minimize:

E = \sum_{i=1}^{n} (y_i - m x_i - b)^2

In matrix form:

E = \left\| \begin{bmatrix} x_1 & 1 \\ \vdots & \vdots \\ x_n & 1 \end{bmatrix} \begin{bmatrix} m \\ b \end{bmatrix} - \begin{bmatrix} y_1 \\ \vdots \\ y_n \end{bmatrix} \right\|^2 = \| A p - y \|^2 = y^T y - 2 (A p)^T y + (A p)^T (A p)

Setting the derivative to zero:

\frac{dE}{dp} = 2 A^T A p - 2 A^T y = 0 \;\Rightarrow\; A^T A p = A^T y \;\Rightarrow\; p = (A^T A)^{-1} A^T y

Matlab: p = A \ y;
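The derivation above can be sketched with NumPy, where `np.linalg.lstsq` plays the role of Matlab's `p = A \ y` (the data here is made up for illustration):

```python
import numpy as np

# Synthetic data (assumed for illustration): noisy points near y = 2x + 1
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)

# Build A = [x_i, 1] so that A p ≈ y with p = (m, b)
A = np.column_stack([x, np.ones_like(x)])

# Solves the normal equations A^T A p = A^T y (NumPy's analogue of p = A \ y)
p, *_ = np.linalg.lstsq(A, y, rcond=None)
m, b = p
```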
Least square line fitting
Question: will least squares work in every case?
• No: it fails completely for vertical lines (the slope m is unbounded)
Question: is LSQ invariant to rotation?
• No: the same edge yields different parameters, (m_1, b_1) in one orientation and (m_2, b_2) in another
Grouping & Fitting methods
2 approaches for grouping & fitting
1. Global optimization / Search for parameters
Total least squares fit
Total least squares

Distance from a point (x_i, y_i) to the line ax + by + c = 0:

d = \frac{|a x_i + b y_i + c|}{\sqrt{a^2 + b^2}}

Find (a, b, c) to minimize the sum of squared perpendicular distances, with the normal \hat{n} = (a, b) constrained to unit length, a^2 + b^2 = 1:

E = \sum_{i=1}^{n} (a x_i + b y_i + c)^2
Total least square
Setting the derivative with respect to c to zero:

\frac{\partial E}{\partial c} = \sum_{i=1}^{n} 2 (a x_i + b y_i + c) = 0 \;\Rightarrow\; c = -\frac{a}{n} \sum_{i=1}^{n} x_i - \frac{b}{n} \sum_{i=1}^{n} y_i = -a \bar{x} - b \bar{y}

Substituting back:

E = \sum_{i=1}^{n} \big( a (x_i - \bar{x}) + b (y_i - \bar{y}) \big)^2 = \left\| \begin{bmatrix} x_1 - \bar{x} & y_1 - \bar{y} \\ \vdots & \vdots \\ x_n - \bar{x} & y_n - \bar{y} \end{bmatrix} \begin{bmatrix} a \\ b \end{bmatrix} \right\|^2 = p^T A^T A p

minimize p^T A^T A p s.t. p^T p = 1

*Solution is the eigenvector corresponding to the smallest eigenvalue of A^T A
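A minimal NumPy sketch of this eigenvector solution (the data is assumed; note that a vertical line, where ordinary LSQ fails, is handled without trouble):

```python
import numpy as np

# Assumed data: points near the vertical line x = 3 (ordinary LSQ fails here)
rng = np.random.default_rng(1)
y = np.linspace(0.0, 5.0, 40)
x = 3.0 + rng.normal(scale=0.05, size=y.size)

# Centering the data implicitly solves dE/dc = 0, i.e. c = -a*xbar - b*ybar
A = np.column_stack([x - x.mean(), y - y.mean()])

# (a, b) is the eigenvector of A^T A with the smallest eigenvalue
eigvals, eigvecs = np.linalg.eigh(A.T @ A)   # eigh: ascending eigenvalues
a, b = eigvecs[:, 0]
c = -a * x.mean() - b * y.mean()
# Fitted line a*x + b*y + c = 0: here |a| is near 1 and b near 0, i.e. x = 3
```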
LSQ & TLSQ analytic solutions

Recap: two common optimization problems

1. minimize \| A x - b \|^2 (the least squares solution to A x = b):

x = (A^T A)^{-1} A^T b
x = A \ b (Matlab)

2. minimize x^T A^T A x s.t. x^T x = 1:

[v, \lambda] = eig(A^T A), with \lambda_1 \ge \dots \ge \lambda_n: x = v_n (the eigenvector of the smallest eigenvalue)
Least square line fitting
(Figure: least squares fit to the red points)
Least square line fitting:: robustness to noise
(Figure: least squares fit with an outlier)
Squared error heavily penalizes outliers.
Least square line fitting:: conclusions
Good
• Clearly specified objective
• Optimization is easy (for least squares)
Bad
• Not appropriate for non-convex objectives
– May get stuck in local minima
• Sensitive to outliers
– Bad matches, extra points
• Doesn’t allow you to get multiple good fits
– Detecting multiple objects, lines, etc.
Grouping & Fitting methods
2 approaches for grouping & fitting
2. Hypothesize and test
Hough transform
Hough transform
The purpose of the Hough transform is to address the problem of primitives detection by performing an explicit voting procedure over a set of parameterized image objects
Hough transform :: Detecting straight lines
A line y = mx + b can be described as the set of all points (x_1, y_1), (x_2, y_2), … lying on it.
Instead, a straight line can be represented as a single point (b, m) in the parameter space.
Question: is (m, b) (slope and intercept) really the best parameter space?
Hough transform:: Line parameterization
The Hough transform is computed in polar coordinates.
The parameter r represents the distance between the line and the origin, and θ is the angle of the vector from the origin to the closest point on the line.
Hough transform:: Line parameterization
Using this parameterization, the equation of the line can be written as:

x \cos\theta + y \sin\theta = r

which can be rearranged to:

y = -\frac{\cos\theta}{\sin\theta} x + \frac{r}{\sin\theta}
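A quick numerical check of this parameterization (the particular θ and r are arbitrary): every point on the line whose closest point to the origin lies at distance r in direction θ satisfies x cos θ + y sin θ = r.

```python
import math

# Arbitrary line: theta = 60 degrees, r = 2. Its closest point to the origin
# is (r cos t, r sin t), and the line's direction is (-sin t, cos t).
theta, r = math.radians(60.0), 2.0
residuals = []
for t in (-3.0, 0.0, 5.0):   # walk along the line
    x = r * math.cos(theta) - t * math.sin(theta)
    y = r * math.sin(theta) + t * math.cos(theta)
    residuals.append(abs(x * math.cos(theta) + y * math.sin(theta) - r))
# every residual is (numerically) zero
```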
Hough transform:: Example
Question : Is there a straight line connecting the 3 dots?
Hough transform:: Example
Hough solution:
1. For each data point, plot a number of lines going through it, at different angles θ (shown as solid lines)
2. For each solid line from step 1, calculate r (shown as dashed lines)
3. For each data point, organize the results in a table
Hough transform:: Example
Hough Solution :
4. Plot the (θ, r) pairs obtained for each data point as a curve in Hough space
5. An intersection of all the curves indicates a line passing through all the data points (remember: each point in Hough space represents a line in the original space)
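The voting procedure in the steps above can be sketched as follows (a deliberately tiny, assumed setup with 1-degree angular bins and integer r bins; not a library implementation):

```python
import numpy as np

# Three collinear points on y = x; they should all vote for the same (r, theta) bin
points = [(1, 1), (2, 2), (3, 3)]
thetas = np.deg2rad(np.arange(0, 180))   # theta grid, 1-degree resolution
r_max = 10                               # assumed bound on |r|
acc = np.zeros((2 * r_max + 1, thetas.size), dtype=int)  # accumulator

for x, y in points:
    # For every theta, the line through (x, y) has r = x cos(theta) + y sin(theta)
    r = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
    acc[r + r_max, np.arange(thetas.size)] += 1          # cast the votes

# The strongest bin collects one vote per collinear point
r_idx, t_idx = np.unravel_index(acc.argmax(), acc.shape)
best_r = r_idx - r_max   # for y = x this is r = 0 (theta near 135 degrees)
```

With such a coarse grid, several neighboring θ bins can tie at the maximum; a real implementation would use finer bins and non-maximum suppression.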
Hough transform:: Example
(Figures: detected features and the corresponding Hough-space votes)
Hough transform:: other shapes
Hough transform:: effect of noise
(Figures: features and the corresponding votes)
Hough transform:: random points
(Figures: features and the corresponding votes)
Hough transform:: dealing with noise
Grid resolution tradeoff:
• Too coarse: large vote counts are obtained when too many different lines fall into a single bucket
• Too fine: lines are missed because points that are not exactly collinear cast votes for different buckets
Possible remedies:
• Increment neighboring bins, or apply smoothing in the accumulator array
• Try to get rid of irrelevant features: take only edge points with significant gradient magnitude
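The last remedy, keeping only edge points with significant gradient magnitude, can be sketched with NumPy (the toy image and the 0.3 threshold are assumed):

```python
import numpy as np

# Assumed toy image: a vertical step edge starting at column 16
img = np.zeros((32, 32))
img[:, 16:] = 1.0

# Gradient magnitude per pixel (np.gradient uses central differences)
gy, gx = np.gradient(img)
mag = np.hypot(gx, gy)

# Only pixels above an assumed threshold are allowed to vote
strong = np.argwhere(mag > 0.3)   # here: the two columns straddling the edge
```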
Hough transform:: conclusions
Good
1. Robust to outliers: each point votes separately
2. Fairly efficient (often faster than trying all sets of parameters)
3. Provides multiple good fits
Bad
1. Some sensitivity to noise
2. Bin size trades off between noise tolerance, precision, and speed/memory
3. Not suitable for more than a few parameters (grid size grows exponentially)
RANSAC (RANdom SAmple Consensus)
A technique to estimate the parameters of a model by random sampling of the observed data.
RANSAC:: algorithm
Algorithm:
1. Sample (randomly) the minimum number of points s required to fit the model
2. Solve for the model parameters using the samples
3. Score by the fraction of inliers within a preset threshold of the model
Repeat steps 1-3 until the best model is found with high confidence
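The three steps can be sketched for line fitting (the data, iteration count, and 0.5 inlier threshold are all assumed for illustration):

```python
import numpy as np

# Assumed data: 100 points on y = 1.5x + 2, with every 10th turned into an outlier
rng = np.random.default_rng(2)
x = np.linspace(0.0, 10.0, 100)
y = 1.5 * x + 2.0
y[::10] += rng.uniform(10.0, 20.0, size=10)
pts = np.column_stack([x, y])

best_inliers, best_model = 0, None
for _ in range(100):                      # assumed number of iterations N
    # 1. sample the minimum number of points for a line (s = 2)
    i, j = rng.choice(len(pts), size=2, replace=False)
    (x1, y1), (x2, y2) = pts[i], pts[j]
    if x1 == x2:                          # skip degenerate samples
        continue
    # 2. solve for the model parameters from the sample
    m = (y2 - y1) / (x2 - x1)
    b = y1 - m * x1
    # 3. score by the number of inliers within the threshold
    n_in = int((np.abs(pts[:, 1] - (m * pts[:, 0] + b)) < 0.5).sum())
    if n_in > best_inliers:
        best_inliers, best_model = n_in, (m, b)
```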
RANSAC:: line fitting example
(Figure: a sampled candidate line scores N_I = 6 inliers)
RANSAC:: line fitting example
(Figure: a different sample yields a line with N_I = 14 inliers)
RANSAC:: parameter selection
s: initial number of points (typically the minimum number needed to fit the model)
N: number of RANSAC iterations
p: desired probability of choosing s inliers in at least one iteration
ω: probability that a chosen point is an inlier
ω^s: probability that all s points in a sample are inliers
1 − ω^s: probability that at least one point in the sample is an outlier
(1 − ω^s)^N = 1 − p: probability that the algorithm never selects a set of s inliers, which gives

N = \frac{\log(1 - p)}{\log(1 - \omega^s)}
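Solving (1 − ω^s)^N = 1 − p for N gives the required iteration count; a small helper (the example values are assumed):

```python
import math

def ransac_iterations(s, w, p=0.99):
    """Iterations N so that, with probability p, at least one random
    sample of s points is all inliers (w = inlier ratio)."""
    return math.ceil(math.log(1.0 - p) / math.log(1.0 - w ** s))

# e.g. line fitting (s = 2) with half the data being outliers (w = 0.5)
n = ransac_iterations(s=2, w=0.5)   # 17 iterations
```

Note how quickly N grows as the inlier ratio w drops or the sample size s rises, which is the computational weakness mentioned below.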
RANSAC:: conclusions
Good
Robust to outliers
Applicable to a larger number of parameters than the Hough transform
Parameters are easier to choose than for the Hough transform
Bad
Computational time grows quickly with fraction of outliers and number of parameters
Not good for getting multiple fits