Non-linear Least-Squares
Why non-linear?
• The model is non-linear (e.g., joints, positions, ...)
• The error function is non-linear
Setup
• Let f be a function such that
  $f : a \in \mathbb{R}^n \mapsto f(a, x) \in \mathbb{R}$
  where x is a vector of parameters
• Let {a_k, b_k} be a set of measurements/constraints.
  We fit f to the data by solving:
  $\min_x \frac{1}{2} \sum_k \left( b_k - f(a_k, x) \right)^2$
  or
  $\min_x \sum_k r_k^2 \quad \text{with} \quad r_k = b_k - f(a_k, x)$
Example
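The example figure from this slide is not recoverable; in its place, here is a minimal sketch of the setup in Python/NumPy, assuming a hypothetical exponential model f(a, x) = x₀·exp(−x₁·a) and synthetic data (the model, parameter values, and noise level are illustrative, not from the source):

```python
import numpy as np

# Hypothetical non-linear model f(a, x) with parameters x = (x0, x1).
def f(a, x):
    return x[0] * np.exp(-x[1] * a)

# Residuals r_k = b_k - f(a_k, x) for measurements {a_k, b_k}.
def residuals(x, a, b):
    return b - f(a, x)

# Least-squares objective E(x) = 1/2 * sum_k r_k^2.
def E(x, a, b):
    r = residuals(x, a, b)
    return 0.5 * np.dot(r, r)

# Synthetic measurements from ground-truth parameters (2.0, 0.5) plus noise.
rng = np.random.default_rng(0)
a = np.linspace(0.0, 4.0, 20)
b = f(a, np.array([2.0, 0.5])) + 0.01 * rng.standard_normal(a.size)
print(E(np.array([1.0, 1.0]), a, b))  # error at an initial guess
```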
Overview
• Existence and uniqueness of a minimum
• Steepest descent
• Newton's method
• Gauss-Newton method
• Levenberg-Marquardt method
A non-linear function: the Rosenbrock function
$z = f(x, y) = (1 - x)^2 + 100\,(y - x^2)^2$
Global minimum at (1, 1)
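A small sketch (assuming NumPy) of the Rosenbrock function and its analytic gradient, handy as a test problem for the methods that follow:

```python
import numpy as np

# Rosenbrock: z = (1 - x)^2 + 100 * (y - x^2)^2, global minimum at (1, 1).
def rosenbrock(p):
    x, y = p
    return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2

# Analytic gradient, obtained by differentiating the two squared terms.
def rosenbrock_grad(p):
    x, y = p
    return np.array([
        -2.0 * (1.0 - x) - 400.0 * x * (y - x ** 2),
        200.0 * (y - x ** 2),
    ])

print(rosenbrock(np.array([1.0, 1.0])))       # 0.0 at the global minimum
print(rosenbrock_grad(np.array([1.0, 1.0])))  # [0. 0.]
```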
Existence of a minimum
A local minimum is characterized by:
1. $\nabla E(x_{\min}) = 0$
2. $h^T H_E(x_{\min})\, h \ge 0$ for all $h$ small enough
   (i.e. $H_E(x_{\min})$ is positive semi-definite)
Existence of a minimum
$E(x_{\min} + h) = E(x_{\min}) + \nabla E(x_{\min})^T h + \frac{1}{2}\, h^T H_E(x_{\min})\, h + \ldots$
so conditions 1 and 2 together ensure $E(x_{\min} + h) \ge E(x_{\min})$ for all small $h$.
Descent algorithm
• Start at an initial position x₀
• Until convergence:
  – Find a minimizing step $dx_k$ using a local approximation of f (a loop sketch follows below)
  – $x_{k+1} = x_k + dx_k$
Produces a sequence x₀, x₁, …, xₙ such that f(x₀) > f(x₁) > … > f(xₙ)
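A sketch of this loop in Python; `step` is a placeholder for whichever local approximation the following slides derive, and the stopping test is an assumed convention:

```python
import numpy as np

def descent(f, step, x0, tol=1e-8, max_iters=10000):
    """Generic descent loop: x_{k+1} = x_k + dx_k with f decreasing."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        dx = step(x)                  # minimizing step from a local model of f
        if np.linalg.norm(dx) < tol:  # steps have stalled: converged
            break
        if not f(x + dx) < f(x):      # guard the f(x0) > f(x1) > ... invariant
            break
        x = x + dx
    return x
```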
Approximation using Taylor series
$E(x + h) = E(x) + \nabla E(x)^T h + \frac{1}{2}\, h^T H_E(x)\, h + o(\|h\|^2)$
with
$E(x) = \frac{1}{2} \sum_{j=1}^m r_j^2, \quad J = \left( \frac{\partial r_j}{\partial x_i} \right), \quad H_{r_j} = \left( \frac{\partial^2 r_j}{\partial x_i\, \partial x_l} \right)$
so that
$\nabla E = \sum_{j=1}^m r_j\, \nabla r_j = J^T r$
$H_E = \sum_{j=1}^m \nabla r_j\, \nabla r_j^T + \sum_{j=1}^m r_j\, H_{r_j} = J^T J + \sum_{j=1}^m r_j\, H_{r_j}$
Steepest descent
• Step: $x_{k+1} = x_k - \alpha\, \nabla E(x_k)$
  where α is chosen such that $f(x_{k+1}) < f(x_k)$, using a line search algorithm:
  $\min_{\alpha_k} f\left( x_k - \alpha_k\, \nabla E(x_k) \right)$
[figure: steepest-descent iterations in the (x1, x2) plane; from x_k, the line search minimizes f along the direction −∇E(x_k), in the plane of the steepest-descent direction, giving x_{k+1}]
[figure: steepest descent on the Rosenbrock function (1000 iterations)]
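A minimal steepest-descent sketch; the backtracking rule stands in for the slide's line search, and its constants (halving from α = 1) are assumptions:

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iters=100000):
    """x_{k+1} = x_k - alpha * grad E(x_k), alpha from a backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        alpha = 1.0
        # Backtracking line search: shrink alpha until the step decreases f.
        while f(x - alpha * g) >= f(x) and alpha > 1e-16:
            alpha *= 0.5
        x = x - alpha * g
    return x

# e.g. steepest_descent(rosenbrock, rosenbrock_grad, np.array([-1.2, 1.0]))
# converges very slowly, zig-zagging along the curved valley (cf. the
# 1000-iteration figure).
```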
Newton's method
• Step: second-order approximation
  $E(x_k + h) \approx N(h) = E(x_k) + \nabla E(x_k)^T h + \frac{1}{2}\, h^T H_E(x_k)\, h$
• At the minimum of N:
  $\nabla N(h) = 0 \iff \nabla E(x_k) + H_E(x_k)\, h = 0 \iff h = -H_E(x_k)^{-1}\, \nabla E(x_k)$
• Step:
  $x_{k+1} = x_k - H_E(x_k)^{-1}\, \nabla E(x_k)$
[figure: Newton step in the (x1, x2) plane; from x_k, the step −H_E(x_k)^{-1} ∇E(x_k) jumps directly to the minimum x_{k+1} of the local quadratic model]
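A sketch of the Newton iteration, solving $H_E(x_k)\, h = -\nabla E(x_k)$ rather than forming the inverse; the Rosenbrock Hessian is included for use with the earlier sketches:

```python
import numpy as np

def newton(grad, hess, x0, tol=1e-10, max_iters=100):
    """Newton's method: x_{k+1} = x_k - H_E(x_k)^{-1} grad E(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        # Solve H_E(x) h = -grad E(x) instead of inverting H_E.
        h = np.linalg.solve(hess(x), -g)
        x = x + h
    return x

# Analytic Rosenbrock Hessian, differentiating the gradient sketched earlier.
def rosenbrock_hess(p):
    x, y = p
    return np.array([
        [2.0 - 400.0 * (y - 3.0 * x ** 2), -400.0 * x],
        [-400.0 * x, 200.0],
    ])
```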
Problem
• If $H_E(x_k)$ is not positive semi-definite, then $-H_E(x_k)^{-1}\, \nabla E(x_k)$ is not necessarily a descent direction: the step can increase the error function
• Solution: use a positive semi-definite approximation of the Hessian based on the Jacobian (quasi-Newton methods)
Gauss-Newton method
• Step: use
  $x_{k+1} = x_k - H_E^{-1}(x_k)\, \nabla E(x_k)$
  with the approximate Hessian
  $H_E = J^T J + \sum_{j=1}^m r_j\, H_{r_j} \approx J^T J$
• Advantages:
  – No second-order derivatives
  – $J^T J$ is positive semi-definite
[figure: Gauss-Newton on the Rosenbrock function (48 evaluations)]
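A Gauss-Newton sketch for the residual form r_k = b_k − f(a_k, x); the finite-difference Jacobian is an illustrative stand-in for an analytic one:

```python
import numpy as np

def gauss_newton(residuals, jac, x0, tol=1e-10, max_iters=100):
    """Gauss-Newton: solve (J^T J) h = -J^T r, then x <- x + h."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        r, J = residuals(x), jac(x)
        g = J.T @ r                          # grad E = J^T r
        if np.linalg.norm(g) < tol:
            break
        h = np.linalg.solve(J.T @ J, -g)     # approximate Hessian J^T J
        x = x + h
    return x

# Finite-difference Jacobian J[j, i] = d r_j / d x_i (for illustration).
def fd_jacobian(residuals, x, eps=1e-7):
    r0 = residuals(x)
    J = np.zeros((r0.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (residuals(x + dx) - r0) / eps
    return J

# e.g. gauss_newton(res, lambda x: fd_jacobian(res, x), x0) for a residual
# function res such as the one sketched in the Setup example.
```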
Levenberg-Marquardt algorithm
• Blends steepest descent and Gauss-Newton
• At each step, solve for the descent direction h:
  $\left( J^T J + \lambda I \right) h = -\nabla E(x_k)$
• If λ is large: $h \approx -\frac{1}{\lambda}\, \nabla E(x_k)$ (steepest descent)
• If λ is small: $h \approx -\left( J^T J \right)^{-1}\, \nabla E(x_k)$ (Gauss-Newton)
Managing the damping parameter λ
• General approach:
  – If the step fails, increase the damping until a step succeeds
  – If the step succeeds, decrease the damping to allow larger steps
• Improved damping:
  $\left( J^T J + \lambda\, \operatorname{diag}(J^T J) \right) h = -\nabla E(x_k)$
[figure: Levenberg-Marquardt on the Rosenbrock function (90 evaluations)]
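A Levenberg-Marquardt sketch combining the damped solve with the schedule from the previous slide; the factor-of-10 updates and the initial λ are assumed conventions:

```python
import numpy as np

def levenberg_marquardt(residuals, jac, x0, lam=1e-3, max_iters=200):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        r, J = residuals(x), jac(x)
        g, A = J.T @ r, J.T @ J
        if np.linalg.norm(g) < 1e-10:
            break
        # Improved damping: scale the diagonal of J^T J rather than I.
        h = np.linalg.solve(A + lam * np.diag(np.diag(A)), -g)
        r_new = residuals(x + h)
        if np.dot(r_new, r_new) < np.dot(r, r):
            x, lam = x + h, lam * 0.1   # step succeeded: accept, damp less
        else:
            lam *= 10.0                  # step failed: reject, damp more
    return x

# e.g. levenberg_marquardt(res, lambda x: fd_jacobian(res, x), x0), reusing
# the residual and Jacobian sketches from the Gauss-Newton slide.
```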
“Resynthesizing Facial Animation through 3D Model-Based Tracking”, Pighin et al.
[figure: a video frame alongside the face model, expressed as a weighted sum of basis face meshes]
Model fitting
[figure: minimize the squared image difference between the video frame and the model, $\min \sum_i \| \text{video frame} - \text{model} \|^2$, yielding the fitted model]
“Resynthesizing Facial Animation through 3D Model-Based Tracking”, Pighin et al.
• Facial tracking by fitting a face model
• Problem:
  – Given an image I
  – and a 3D face model
• Minimize
  $\min_p \frac{1}{2} \sum_j \left( I(x_j, y_j) - \hat{I}(p)(x_j, y_j) \right)^2 + D(p)$
  where $\hat{I}(p)$ is the image produced by the face model and $D(p)$ is a regularization term
Problem setup
• p is a set of parameters including the facial parameters {α_k} and the camera position (R | t)
• $D(p) = \frac{1}{2} \sum_k \left( \min(0, \alpha_k) + \max(0, \alpha_k - 1) \right)^2$
  penalizes blending weights α_k that fall outside [0, 1]
Solving
• Use a Gaussian pyramid of the input image
• For level in [coarsestLevel : finestLevel]:
  – Initialize using the solution at the previous (coarser) level
  – Minimize at the current level using Levenberg-Marquardt
• The solution is the solution at finestLevel (a skeleton of this loop follows below)
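A skeleton of this coarse-to-fine scheme; `build_pyramid` and `lm_minimize` are hypothetical helpers standing in for the Gaussian pyramid construction and the per-level Levenberg-Marquardt solve:

```python
def coarse_to_fine_fit(image, model, p0, num_levels=4):
    """Fit model parameters coarse-to-fine over a Gaussian pyramid."""
    pyramid = build_pyramid(image, num_levels)   # hypothetical helper
    p = p0                                       # initial guess at the coarsest level
    for level in range(num_levels - 1, -1, -1):  # coarsest -> finest
        # Minimize at this level with Levenberg-Marquardt (hypothetical
        # helper), initialized from the solution at the level above.
        p = lm_minimize(pyramid[level], model, p)
    return p  # the solution is the solution at the finest level
```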
Tracking results
Conclusion
• Get a good initial guess:
  – Prediction (temporal/spatial coherency)
  – Partial solve
  – Prior knowledge
• Levenberg-Marquardt is your friend