Introduction to Simulation - Lecture 10
Modified Newton Methods
Jacob White
Thanks to Deepak Ramaswamy, Jaime Peraire, Michal Rewienski, and Karen Veroy
Outline
• Damped Newton Schemes
– Globally Convergent if Jacobian is Nonsingular
– Difficulty with Singular Jacobians
• Introduce Continuation Schemes
– Problem with Source/Load stepping
– More General Continuation Scheme
• Improving Continuation Efficiency
– Better first guess for each continuation step
• Arc Length Continuation
SMA-HPC ©2003 MIT
Multidimensional Newton Method
Newton Algorithm

Newton algorithm for solving F(x) = 0:

x^0 = initial guess, k = 0
Repeat {
    Compute F(x^k), J_F(x^k)
    Solve J_F(x^k)(x^{k+1} - x^k) = -F(x^k) for x^{k+1}
    k = k + 1
} Until ||x^{k+1} - x^k||, ||F(x^{k+1})|| small enough
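The loop above translates almost directly into NumPy. The following is a minimal sketch; the test function, tolerance, and iteration cap are illustrative choices, not from the lecture:

    import numpy as np

    def newton(F, J, x0, tol=1e-10, max_iter=50):
        """Basic multidimensional Newton for F(x) = 0, given the Jacobian J."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            Fx = F(x)
            # Solve J_F(x^k) (x^{k+1} - x^k) = -F(x^k)
            dx = np.linalg.solve(J(x), -Fx)
            x = x + dx
            if np.linalg.norm(dx) < tol and np.linalg.norm(F(x)) < tol:
                return x
        raise RuntimeError("Newton did not converge")

    # Illustrative test: intersect the circle x^2 + y^2 = 2 with the line x = y
    F = lambda x: np.array([x[0]**2 + x[1]**2 - 2.0, x[0] - x[1]])
    J = lambda x: np.array([[2.0*x[0], 2.0*x[1]], [1.0, -1.0]])
    print(newton(F, J, [2.0, 0.5]))   # converges to approximately [1, 1]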
Multidimensional Newton Method
Multidimensional Convergence Theorem: Theorem Statement

Main Theorem

If
    a) ||J_F^{-1}(x^k)|| ≤ β                  (inverse is bounded)
    b) ||J_F(x) - J_F(y)|| ≤ A ||x - y||      (derivative is Lipschitz continuous)
Then Newton's method converges given a sufficiently close initial guess.
Multidimensional Newton Method
Multidimensional Convergence Theorem: Implications

If a function's first derivative never goes to zero, and its second derivative is never too large...

Then Newton's method can be used to find the zero of the function, provided you already know the answer.

Need a way to develop Newton methods which converge regardless of the initial guess!
Non-converging Case
1-D Picture

[Figure: 1-D example of a non-converging Newton iteration on f(x), with iterates x^0 and x^1.]

Limiting the changes in x might improve convergence.
Newton Method with Limiting
Newton Algorithm

Newton algorithm for solving F(x) = 0:

x^0 = initial guess, k = 0
Repeat {
    Compute F(x^k), J_F(x^k)
    Solve J_F(x^k) ∆x^{k+1} = -F(x^k) for ∆x^{k+1}
    x^{k+1} = x^k + limited(∆x^{k+1})
    k = k + 1
} Until ||∆x^{k+1}||, ||F(x^{k+1})|| small enough
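The lecture leaves limited(∆x) unspecified. One simple heuristic is to cap the norm of the step; the sketch below uses that rule, with the cap value max_step as an arbitrary assumption:

    import numpy as np

    def limited(dx, max_step=1.0):
        """Scale the Newton step so its 2-norm never exceeds max_step."""
        n = np.linalg.norm(dx)
        return dx if n <= max_step else dx * (max_step / n)

    def limited_newton(F, J, x0, tol=1e-10, max_iter=200, max_step=1.0):
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            dx = np.linalg.solve(J(x), -F(x))   # full Newton step
            x = x + limited(dx, max_step)       # x^{k+1} = x^k + limited(dx)
            if np.linalg.norm(dx) < tol and np.linalg.norm(F(x)) < tol:
                return x
        raise RuntimeError("limited Newton did not converge")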
Newton Method with Limiting
Damped Newton Scheme: General Damping Scheme

Solve J_F(x^k) ∆x^{k+1} = -F(x^k) for ∆x^{k+1}
x^{k+1} = x^k + α^k ∆x^{k+1}

Key Idea: Line Search

Pick α^k to minimize ||F(x^k + α^k ∆x^{k+1})||_2^2

    ||F(x^k + α^k ∆x^{k+1})||_2^2 ≡ F(x^k + α^k ∆x^{k+1})^T F(x^k + α^k ∆x^{k+1})

The method performs a one-dimensional search in the Newton direction.
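A sketch of the damped scheme in Python. The exact one-dimensional minimization over α is replaced here by picking the best value from a small sample grid, which is a simplification of the line search described above (the grid itself is an assumption):

    import numpy as np

    def damped_newton(F, J, x0, tol=1e-10, max_iter=100):
        """Damped Newton: step along the Newton direction scaled by alpha in (0, 1]."""
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            Fx = F(x)
            if np.linalg.norm(Fx) < tol:
                return x
            dx = np.linalg.solve(J(x), -Fx)
            # Line search: pick alpha in (0, 1] roughly minimizing ||F(x + alpha*dx)||_2^2
            alphas = [1.0, 0.5, 0.25, 0.125, 0.0625, 0.03125]
            alpha = min(alphas, key=lambda a: np.linalg.norm(F(x + a * dx))**2)
            x = x + alpha * dx
        raise RuntimeError("damped Newton did not converge")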
Newton Method with Limiting
Damped Newton Convergence Theorem

If
    a) ||J_F^{-1}(x^k)|| ≤ β                  (inverse is bounded)
    b) ||J_F(x) - J_F(y)|| ≤ A ||x - y||      (derivative is Lipschitz continuous)
Then
    There exists a set of α^k's ∈ (0, 1] such that
    ||F(x^{k+1})|| = ||F(x^k + α^k ∆x^{k+1})|| < γ ||F(x^k)|| with γ < 1

Every step reduces ||F|| -- global convergence!
Newton Method with Limiting
Damped Newton: Nested Iteration

x^0 = initial guess, k = 0
Repeat {
    Compute F(x^k), J_F(x^k)
    Solve J_F(x^k) ∆x^{k+1} = -F(x^k) for ∆x^{k+1}
    Find α^k ∈ (0, 1] such that ||F(x^k + α^k ∆x^{k+1})|| is minimized
    x^{k+1} = x^k + α^k ∆x^{k+1}
    k = k + 1
} Until ||∆x^{k+1}||, ||F(x^{k+1})|| small enough
Newton Method with Limiting
Damped Newton Example

[Circuit: 1 V source at node v1, a 10 Ω resistor between v1 and v2, and a diode with voltage Vd from v2 to ground.]

Constitutive equations:
    I_r - (1/10) V_r = 0
    I_d - I_s (e^{Vd/Vt} - 1) = 0

Nodal equation with numerical values:
    f(v2) = (v2 - 1)/10 + 10^{-16} (e^{(v2 - 0)/0.025} - 1) = 0
Newton Method with Limiting
Damped Newton Example, cont.

    f(v2) = (v2 - 1)/10 + 10^{-16} (e^{(v2 - 0)/0.025} - 1) = 0
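As a concrete illustration, here is a scalar damped Newton solve of this nodal equation, starting from v2 = 0. A full Newton step immediately jumps to v2 = 1 V, where the exponential makes |f| large; the damping picks a smaller α so that |f| keeps decreasing. The sampling grid and tolerances below are assumptions:

    import numpy as np

    # Nodal equation for the 1 V source, 10 ohm resistor, diode circuit
    f  = lambda v: (v - 1.0) / 10.0 + 1e-16 * (np.exp(v / 0.025) - 1.0)
    df = lambda v: 1.0 / 10.0 + (1e-16 / 0.025) * np.exp(v / 0.025)

    v = 0.0                                    # initial guess
    for k in range(100):
        dv = -f(v) / df(v)                     # Newton direction
        # Damping: pick the alpha in (0, 1] that gives the smallest |f(v + alpha*dv)|
        alpha = min([1.0, 0.5, 0.25, 0.125, 0.0625],
                    key=lambda a: abs(f(v + a * dv)))
        v += alpha * dv
        if abs(dv) < 1e-12 and abs(f(v)) < 1e-12:
            break

    print(v)   # about 0.82 V across the diode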
Newton Method with Limiting
Damped Newton: Nested Iteration

x^0 = initial guess, k = 0
Repeat {
    Compute F(x^k), J_F(x^k)
    Solve J_F(x^k) ∆x^{k+1} = -F(x^k) for ∆x^{k+1}
    Find α^k ∈ (0, 1] such that ||F(x^k + α^k ∆x^{k+1})|| is minimized
    x^{k+1} = x^k + α^k ∆x^{k+1}
    k = k + 1
} Until ||∆x^{k+1}||, ||F(x^{k+1})|| small enough

How can one find the damping coefficients?
Newton Method with Limiting
Damped Newton Theorem Proof

By definition of the damped Newton iteration (a step of length α^k along the Newton direction):

    x^{k+1} = x^k - α^k J_F(x^k)^{-1} F(x^k)

Multidimensional Mean Value Lemma:

    ||F(x) - F(y) - J_F(y)(x - y)|| ≤ (A/2) ||x - y||^2

Combining the two (with x = x^{k+1}, y = x^k):

    ||F(x^{k+1}) - F(x^k) + J_F(x^k)[α^k J_F(x^k)^{-1} F(x^k)]|| ≤ (A/2) ||α^k J_F(x^k)^{-1} F(x^k)||^2
Newton Method with Limiting
Damped Newton Theorem Proof, cont.

From the previous slide:

    ||F(x^{k+1}) - F(x^k) + J_F(x^k)[α^k J_F(x^k)^{-1} F(x^k)]|| ≤ (A/2) ||α^k J_F(x^k)^{-1} F(x^k)||^2

Combining terms and moving scalars out of the norms:

    ||F(x^{k+1}) - (1 - α^k) F(x^k)|| ≤ (α^k)^2 (A/2) ||J_F(x^k)^{-1} F(x^k)||^2

Using the Jacobian bound and splitting the norm:

    ||F(x^{k+1})|| ≤ (1 - α^k) ||F(x^k)|| + (α^k)^2 (β^2 A / 2) ||F(x^k)||^2

This yields a quadratic in the damping coefficient.
Newton Method with Limiting
Damped Newton Theorem Proof, cont. II

Simplifying the quadratic from the previous slide:

    ||F(x^{k+1})|| ≤ [1 - α^k + (α^k)^2 (β^2 A / 2) ||F(x^k)||] ||F(x^k)||

Two cases:

1) (β^2 A / 2) ||F(x^k)|| < 1/2:  pick α^k = 1 (standard Newton)

    ⇒ 1 - α^k + (α^k)^2 (β^2 A / 2) ||F(x^k)|| < 1/2

2) (β^2 A / 2) ||F(x^k)|| > 1/2:  pick α^k = 1 / (β^2 A ||F(x^k)||)

    ⇒ 1 - α^k + (α^k)^2 (β^2 A / 2) ||F(x^k)|| ≤ 1 - 1 / (2 β^2 A ||F(x^k)||) < 1
Newton Method with Limiting
Damped Newton Theorem Proof, cont. III

Combining the results from the previous slide:

    ||F(x^{k+1})|| ≤ γ^k ||F(x^k)||

This is not good enough: we need a γ independent of k.

The above result does imply ||F(x^{k+1})|| ≤ ||F(x^0)||, but that is not yet a convergence theorem.

For the case where (β^2 A / 2) ||F(x^k)|| > 1/2:

    γ^k = 1 - 1 / (2 β^2 A ||F(x^k)||) ≤ 1 - 1 / (2 β^2 A ||F(x^0)||) ≡ γ^0 < 1

Note the proof technique:
    First, show that the iterates do not increase ||F||.
    Second, use the non-increasing fact to prove convergence.
Newton Method with Limiting
Damped Newton: Nested Iteration

x^0 = initial guess, k = 0
Repeat {
    Compute F(x^k), J_F(x^k)
    Solve J_F(x^k) ∆x^{k+1} = -F(x^k) for ∆x^{k+1}
    Find α^k ∈ (0, 1] such that ||F(x^k + α^k ∆x^{k+1})|| is minimized
    x^{k+1} = x^k + α^k ∆x^{k+1}
    k = k + 1
} Until ||∆x^{k+1}||, ||F(x^{k+1})|| small enough

Many approaches to finding α^k.
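One common family of approaches is backtracking: start from the full Newton step (α = 1) and shrink until ||F|| decreases. A sketch, with the halving factor and minimum step treated as assumptions:

    import numpy as np

    def backtracking_alpha(F, x, dx, shrink=0.5, alpha_min=1e-6):
        """Return the first alpha in 1, 1/2, 1/4, ... that reduces ||F(x + alpha*dx)||."""
        f0 = np.linalg.norm(F(x))
        alpha = 1.0
        while alpha > alpha_min:
            if np.linalg.norm(F(x + alpha * dx)) < f0:
                return alpha
            alpha *= shrink
        return alpha_min   # give up and take a very small step

More refined variants require a sufficient decrease in ||F|| rather than any decrease at all.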
Newton Method with Limiting
Damped Newton: Singular Jacobian Problem

[Figure: 1-D f(x) with damped Newton iterates x^0, x^1, x^2 approaching a local minimum of |f|.]

Damped Newton methods "push" the iterates toward local minima of ||F||, which are exactly points where the Jacobian is singular.
Continuation Schemes
Basic Concepts

• Newton converges given a close initial guess
    – Generate a sequence of problems
    – Make sure the previous problem generates a guess for the next problem
• Source or Load-Stepping
    Heat-conducting bar example:
    1. Start with the heat off; T = 0 is a very close initial guess
    2. Increase the heat slightly; T = 0 is a good initial guess
    3. Increase the heat again, and so on
Continuation Schemes
Basic Concepts: General Setting

Solve F(x(λ), λ) = 0, where:
    a) F(x(0), 0) = 0 is easy to solve                (starts the continuation)
    b) F(x(1), 1) = F(x(1)) is the original problem   (ends the continuation)
    c) x(λ) is sufficiently smooth                    (hard to ensure!)

[Figure: x(λ) versus λ on [0, 1]; a discontinuous solution path is disallowed.]
Continuation Schemes
Basic Concepts: Template Algorithm

Solve F(x(0), 0) = 0, set x(λ_prev) = x(0)
δλ = 0.01, λ = δλ
While λ < 1 {
    x^0(λ) = x(λ_prev)
    Try to solve F(x(λ), λ) = 0 with Newton
    If Newton converged
        x(λ_prev) = x(λ), λ = λ + δλ, δλ = 2 δλ
    Else
        δλ = δλ / 2, λ = λ_prev + δλ
}
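The template maps directly onto a small driver routine. In the sketch below, solve_newton(x_guess, lam) is a stand-in for a Newton solver for F(x, λ) = 0 that returns the solution and a convergence flag; x0 must already solve the λ = 0 problem, and the step limits are illustrative:

    import numpy as np

    def continuation(solve_newton, x0, dlam=0.01, dlam_min=1e-8):
        """Template continuation: track F(x(lam), lam) = 0 from lam = 0 to lam = 1.

        solve_newton(x_guess, lam) -> (x, converged); x0 solves the lam = 0 problem."""
        x_prev, lam_prev = np.asarray(x0, dtype=float), 0.0
        lam = dlam
        while lam_prev < 1.0:
            lam = min(lam, 1.0)                        # never step past lam = 1
            x, converged = solve_newton(x_prev, lam)   # x^0(lam) = x(lam_prev)
            if converged:
                x_prev, lam_prev = x, lam              # accept the step
                dlam *= 2.0                            # and get more aggressive
            else:
                dlam *= 0.5                            # reject: halve the step
                if dlam < dlam_min:
                    raise RuntimeError("continuation step size underflow")
            lam = lam_prev + dlam
        return x_prev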
Continuation Schemes
Basic Concepts: Source/Load Stepping Examples

[Circuit: voltage source Vs in series with a resistor R driving node v; diode from v to ground.]

    f(v(λ), λ) = i_diode(v) + (1/R)(v - λ Vs) = 0

    ∂f(v, λ)/∂v = ∂i_diode(v)/∂v + 1/R   ← not λ dependent!

More generally, only the source and load terms are scaled by λ:

    F(x, λ):   f_x(x, y) = 0
               f_y(x, y) + λ f_L = 0

Source/load stepping does not alter the Jacobian.
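A sketch of source stepping for this circuit, with the source ramped in ten equal increments and a plain Newton inner loop. The specific step size and tolerances are assumptions; a real driver would adapt δλ as in the template above:

    import numpy as np

    Vs, R, Is, Vt = 1.0, 10.0, 1e-16, 0.025      # values from the earlier diode example

    def f(v, lam):                               # source-stepped nodal equation
        return Is * (np.exp(v / Vt) - 1.0) + (v - lam * Vs) / R

    def dfdv(v, lam):                            # Jacobian: note it does not depend on lam
        return (Is / Vt) * np.exp(v / Vt) + 1.0 / R

    v, lam, dlam = 0.0, 0.0, 0.1
    while lam < 1.0:
        lam = min(lam + dlam, 1.0)
        for _ in range(50):                      # plain Newton from the previous solution
            dv = -f(v, lam) / dfdv(v, lam)
            v += dv
            if abs(dv) < 1e-12:
                break

    print(v)   # diode voltage with the full source applied, about 0.82 V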
Continuation Schemes
Jacobian Altering Scheme: Description

    F(x(λ), λ) = λ F(x(λ)) + (1 - λ) x(λ)

Observations:

λ = 0:   F(x(0), 0) = x(0) = 0              Problem is easy to solve and the
         ∂F(x(0), 0)/∂x = I                 Jacobian is definitely nonsingular.

λ = 1:   F(x(1), 1) = F(x(1))               Back to the original problem
         ∂F(x(1), 1)/∂x = ∂F(x(1))/∂x       and the original Jacobian.
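A sketch of how the Jacobian-altering continuation function and its Jacobian can be built from the original F and J_F (the function name and interface are my own):

    import numpy as np

    def homotopy(F, J, lam):
        """Continuation function F(x, lam) = lam*F(x) + (1 - lam)*x and its Jacobian."""
        Flam = lambda x: lam * F(x) + (1.0 - lam) * np.asarray(x, dtype=float)
        Jlam = lambda x: lam * J(x) + (1.0 - lam) * np.eye(len(x))
        return Flam, Jlam

    # lam = 0: Flam(x) = x,    Jlam = I       (easy problem, nonsingular Jacobian)
    # lam = 1: Flam(x) = F(x), Jlam = J_F(x)  (the original problem and Jacobian)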
Continuation Schemes
Jacobian Altering Scheme: Basic Algorithm

Solve F(x(0), 0) = 0, set x(λ_prev) = x(0)
δλ = 0.01, λ = δλ
While λ < 1 {
    x^0(λ) = x(λ_prev) + ?
    Try to solve F(x(λ), λ) = 0 with Newton
    If Newton converged
        x(λ_prev) = x(λ), λ = λ + δλ, δλ = 2 δλ
    Else
        δλ = δλ / 2, λ = λ_prev + δλ
}
Continuation Schemes
Jacobian Altering Scheme: Initial Guess for Each Step

[Figure: x(λ) versus λ; the zeroth-order guess x^0(λ + δλ) = x(λ) differs from the true x(λ + δλ) by the initial guess error.]
Continuation Schemes
Jacobian Altering Scheme: Update Improvement

Expanding to first order about (x(λ), λ), and using F(x(λ + δλ), λ + δλ) = 0 and F(x(λ), λ) = 0:

    F(x(λ+δλ), λ+δλ) ≈ F(x(λ), λ) + [∂F(x(λ), λ)/∂x] (x(λ+δλ) - x(λ)) + [∂F(x(λ), λ)/∂λ] δλ

    ⇒ [∂F(x(λ), λ)/∂x] (x^0(λ+δλ) - x(λ)) = - [∂F(x(λ), λ)/∂λ] δλ

∂F(x(λ), λ)/∂x is available from the last step's Newton solve; the result is a better guess for the next step's Newton solve.
Continuation Schemes
Jacobian Altering Scheme: Update Improvement, cont.

If
    F(x(λ), λ) = λ F(x(λ)) + (1 - λ) x(λ)
Then
    ∂F(x, λ)/∂λ = F(x) - x(λ)

Easily computed.
Continuation Schemes
Jacobian Altering Scheme: Update Improvement, cont. II

    x^0(λ + δλ) = x(λ) - [∂F(x(λ), λ)/∂x]^{-1} [∂F(x(λ), λ)/∂λ] δλ

[Figure: x(λ) versus λ, with the predicted guess x^0(λ + δλ) shown at λ + δλ.]
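The improved first guess is a first-order (tangent) predictor. A sketch for the Jacobian-altering scheme, where ∂F/∂λ = F(x) - x is used directly (the function name and interface are my own):

    import numpy as np

    def predict_next_x(F, J, x, lam, dlam):
        """First-order predictor x^0(lam + dlam) for the scheme lam*F(x) + (1 - lam)*x.

        x is assumed to solve the continuation equation at the current lam."""
        x = np.asarray(x, dtype=float)
        dF_dx   = lam * J(x) + (1.0 - lam) * np.eye(len(x))   # continuation Jacobian
        dF_dlam = F(x) - x                                     # easily computed, as above
        return x - np.linalg.solve(dF_dx, dF_dlam) * dlam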
Continuation Schemes
Jacobian Altering Scheme: Still Can Have Problems

[Figure: a folded solution path x(λ), annotated with λ steps and arc-length steps; along the fold one must switch from increasing to decreasing λ and then back to increasing λ.]
Continuation Schemes
Jacobian Altering Scheme: Arc-length Steps?

[Figure: the folded path x(λ) traversed with arc-length steps, where arc-length ≈ sqrt((∆x)^2 + (δλ)^2).]

Must solve for λ as well:

    F(x, λ) = 0
    (λ - λ_prev)^2 + ||x - x(λ_prev)||_2^2 - arc^2 = 0
Continuation Schemes
Jacobian Altering Scheme: Arc-length Steps by Newton

    [ ∂F(x^k, λ^k)/∂x           ∂F(x^k, λ^k)/∂λ  ] [ x^{k+1} - x^k ]
    [ 2 (x^k - x(λ_prev))^T     2 (λ^k - λ_prev) ] [ λ^{k+1} - λ^k ]

        = - [ F(x^k, λ^k)                                            ]
            [ (λ^k - λ_prev)^2 + ||x^k - x(λ_prev)||_2^2 - arc^2     ]
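A sketch of one Newton iteration on the arc-length-augmented system, assembling the bordered Jacobian shown above. Fc, dFdx, and dFdlam stand for the continuation function and its partial derivatives (names and interface are my own):

    import numpy as np

    def arclength_newton_step(Fc, dFdx, dFdlam, x, lam, x_prev, lam_prev, arc):
        """One Newton step on F(x, lam) = 0 plus the arc-length constraint
        (lam - lam_prev)^2 + ||x - x_prev||^2 - arc^2 = 0, updating x and lam together."""
        x, x_prev = np.asarray(x, dtype=float), np.asarray(x_prev, dtype=float)
        n = len(x)
        Jb = np.zeros((n + 1, n + 1))
        Jb[:n, :n] = dFdx(x, lam)                 # upper left block
        Jb[:n, n]  = dFdlam(x, lam)               # upper right block
        Jb[n, :n]  = 2.0 * (x - x_prev)           # derivative of the constraint w.r.t. x
        Jb[n, n]   = 2.0 * (lam - lam_prev)       # derivative of the constraint w.r.t. lam
        r = np.concatenate([Fc(x, lam),
                            [(lam - lam_prev)**2 + np.dot(x - x_prev, x - x_prev) - arc**2]])
        delta = np.linalg.solve(Jb, -r)
        return x + delta[:n], lam + delta[n]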
Continuation Schemes
Jacobian Altering Scheme: Arc-length Turning Point

[Figure: the folded path x(λ); what happens at the turning point?]

At the turning point the upper left-hand block of the arc-length Newton matrix is singular:

    [ ∂F(x^k, λ^k)/∂x           ∂F(x^k, λ^k)/∂λ  ]
    [ 2 (x^k - x(λ_prev))^T     2 (λ^k - λ_prev) ]
Summary
• Damped Newton Schemes
– Globally Convergent if Jacobian is Nonsingular
– Difficulty with Singular Jacobians
• Introduce Continuation Schemes
– Problem with Source/Load stepping
– More General Continuation Scheme
• Improving Efficiency
– Better first guess for each continuation step
• Arc-length Continuation