Introduction to Simulation - Lecture 15
Methods for Computing Periodic Steady-State
Jacob White
Thanks to Deepak Ramaswamy, Michal Rewienski, and
Karen Veroy
Outline
• Periodic Steady-state problems
– Application examples and simple cases
• Finite-difference methods
– Formulating large matrices
• Shooting Methods
– State transition function
– Sensitivity matrix
• Matrix-Free Approach
Periodic Steady-State Basics
Basic Definition

    dx(t)/dt = F(x(t)) + u(t)
    (x: state, u: input)
• Suppose the system has a periodic input with period T
  [Figure: periodic input waveform; period marks at T, 2T, 3T]
• Many systems eventually respond periodically:
      x(t + T) = x(t) for t >> 0
SMA-HPC ©2003 MIT
Periodic Steady-State Basics
Interesting Property

• If x satisfies a differential equation that has a unique solution for
  any initial condition,
      dx(t)/dt = F(x(t)) + u(t)
• then if u is periodic with period T and
      x(t0 + T) = x(t0) for some t0
  ⇒ x(t + T) = x(t) for all t > t0
Periodic Steady-State Basics
Application Examples: Swaying Bridge

• Periodic Input
  – Wind
• Response
  – Oscillating platform
• Desired Info
  – Oscillation amplitude
Periodic Steady-State Basics
Application Examples: Communication Integrated Circuit

• Periodic Input
  – Received signal at 900 MHz
• Response
  – Filtered, demodulated signal
• Desired Info
  – Distortion
Periodic Steady-State Basics
Application Examples: Automobile Vibration

• Periodic Input
  – Regularly spaced road bumps
• Response
  – Car shakes
• Desired Info
  – Shake amplitude
Periodic Steady-State Basics
Simple Example: RLC Filter, Spring-Mass-Dashpot

[Figure: RLC circuit and spring-mass-dashpot system driven by a force]

• Both described by a second-order ODE:
      M d²x/dt² + D dx/dt + x = u(t)    (u: input)
Periodic Steady-State Basics
Simple Example: RLC Filter, Spring-Mass-Dashpot (Cont.)

• Both described by a second-order ODE:
      M d²x/dt² + D dx/dt + x = u(t)
• For u(t) = 0 and light damping (D << M), the response is
      x(t) ≈ K e^(−(D/2M) t) cos(t/√M + φ)
Periodic Steady-State Basics
Simple Example: RLC Filter, Spring-Mass-Dashpot (Cont.)

[Figure: decaying oscillation bounded by the envelope ±K e^(−(D/2M) t)]

• A lightly damped system oscillates many times before settling to a
  steady state.
Periodic Steady-State Basics
Computing Steady State: Frequency Domain Approach

• Sinusoidally excited linear time-invariant system:
      dx(t)/dt = Ax(t) + e^(iωt)    (input: e^(iωt))
• Steady-state solution simple to determine:
      x(t) = (iωI − A)^(−1) e^(iωt)

Not useful for nonlinear or time-varying systems.
Periodic Steady-State Basics
Computing Steady State: Time Integration Method

• Time-integrate until steady state is achieved:
      dx(t)/dt = F(x(t)) + u(t)  ⇒  x̂^l = x̂^(l−1) + Δt (F(x̂^l) + u(lΔt))
• Need many timepoints for the lightly damped case!
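To make the "many timepoints" point concrete, here is a minimal sketch (my own example, not from the lecture): backward-Euler time integration of the lightly damped oscillator M x'' + D x' + x = u(t), run period after period until the response repeats. The values of M, D, the resonant drive, and the convergence tolerance are all assumptions.

```python
import numpy as np

M, D = 1.0, 0.02                   # light damping: D << M
T = 2 * np.pi                      # drive period (resonant: natural freq 1/sqrt(M) = 1)
steps = 200                        # timesteps per period
dt = T / steps

# First-order form z = [x, dx/dt]:  dz/dt = A z + [0, u(t)/M]
A = np.array([[0.0, 1.0], [-1.0 / M, -D / M]])
B = np.linalg.inv(np.eye(2) - dt * A)   # backward-Euler step matrix (fixed dt)

def u(t):
    return np.cos(2 * np.pi * t / T)    # periodic input, period T

z = np.zeros(2)
periods = 0
while periods < 20000:
    z_start = z.copy()
    for l in range(1, steps + 1):       # one period of backward-Euler steps
        z = B @ (z + dt * np.array([0.0, u(l * dt) / M]))
    periods += 1
    if np.linalg.norm(z - z_start) < 1e-8 * (1.0 + np.linalg.norm(z)):
        break                            # response repeats period-to-period
```

With this damping the loop takes on the order of a hundred periods (tens of thousands of timesteps) before the period-to-period change is negligible, which is exactly the cost the finite-difference and shooting methods below avoid.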
Aside: Reviewing Integration Methods
Implicit Methods: Solve with Backward-Euler

• Nonlinear system:
      dx(t)/dt = F(x(t)) + u(t)    (x: state, u: input)
      x(0) = x0                    (initial condition)
• Backward-Euler equation for timestep l:
      x̂^l − x̂^(l−1) = Δt (F(x̂^l) + u(lΔt))

How do we solve the backward-Euler equation?
Aside: Reviewing Integration Methods
Implicit Methods: Backward-Euler Example

Forward-Euler (approximating x(t1), x(t2), …, x(tL)):
    x̂^1 = x(0) + Δt f(x(0), u(0))
    x̂^2 = x̂^1 + Δt f(x̂^1, u(t1))
    ⋮
    x̂^L = x̂^(L−1) + Δt f(x̂^(L−1), u(t_(L−1)))
Requires just function evaluations.

Backward-Euler:
    x̂^1 = x(0) + Δt f(x̂^1, u(t1))
    x̂^2 = x̂^1 + Δt f(x̂^2, u(t2))
    ⋮
    x̂^L = x̂^(L−1) + Δt f(x̂^L, u(tL))
Nonlinear equation solution at each step.

Stepwise nonlinear equation solution is needed whenever β0 ≠ 0.
Aside: Reviewing Integration Methods
Implicit Methods: Solution with Newton

Rewrite the multistep equation:

    α0 x̂^l − Δt β0 f(x̂^l, u(tl)) + b = 0,
    b = Σ_(j=1..k) α_j x̂^(l−j) − Δt Σ_(j=1..k) β_j f(x̂^(l−j), u(t_(l−j)))

where b is independent of x̂^l. Solve with Newton:

    [α0 I − Δt β0 ∂f(x̂^(l,j), u(tl))/∂x] (x̂^(l,j+1) − x̂^(l,j))
        = −(α0 x̂^(l,j) − Δt β0 f(x̂^(l,j), u(tl)) + b)

The bracketed matrix is the Jacobian; the right-hand side is −F(x̂^(l,j)).
Here j is the Newton iteration index.
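A minimal sketch of this Newton-per-timestep idea for backward-Euler (k = 1, α0 = β0 = 1): my own scalar example with F(x) = −x³, so the Newton Jacobian is 1 − Δt F'(x) = 1 + 3Δt x². The step size, tolerance, and input are assumptions.

```python
import numpy as np

def F(x):
    return -x**3

def dF(x):
    return -3 * x**2

def be_step(x_prev, u_l, dt, tol=1e-12):
    """Solve the backward-Euler equation x - x_prev - dt*(F(x) + u_l) = 0 by Newton."""
    x = x_prev                                  # simplest predictor: previous value
    for _ in range(50):
        r = x - x_prev - dt * (F(x) + u_l)      # residual F(x^{l,j})
        J = 1.0 - dt * dF(x)                    # Newton Jacobian (1x1 here)
        dx = -r / J
        x += dx
        if abs(dx) < tol:
            return x
    raise RuntimeError("Newton failed to converge")

x = 1.0
dt = 0.1
for l in range(1, 11):                          # ten backward-Euler timesteps
    x = be_step(x, u_l=0.0, dt=dt)
```

With u = 0 the solution decays monotonically toward zero; each timestep's Newton loop typically converges in a handful of iterations because the predictor is already close.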
Aside: Reviewing Integration Methods
Implicit Methods: Solution with Newton (Cont.)

Newton iteration:
    [α0 I − Δt β0 ∂f(x̂^(l,j), u(tl))/∂x] (x̂^(l,j+1) − x̂^(l,j)) = −F(x̂^(l,j))

Solution with Newton is very efficient:
• Easy to generate a good initial guess x̂^(l,0) for the converged
  solution x̂^l using polynomial fitting (a predictor) through the
  previous timepoints x̂^(l−k), …, x̂^(l−1).
• The Jacobian becomes easy to factor for small timesteps:
      α0 I − Δt β0 ∂f(x̂^(l,j), u(tl))/∂x ⇒ α0 I  as Δt → 0
Boundary-Value Problem
Basic Formulation

[Figure: differential equation solution over one period, with the
periodicity constraint tying the endpoint back to the start]

N differential equations:    dx_i(t)/dt = F_i(x(t))
N periodicity constraints:   x_i(T) = x_i(0)
Finite Difference Methods
Boundary-Value Problem: Linear Example Problem

    dx(t)/dt = Ax(t) + u(t),  t ∈ [0, T]    (u: input)
    x(T) = x(0)                             (periodicity constraint)

Discretize with Backward-Euler, Δt = T/L:
    x̂^1 = x̂^0 + Δt (Ax̂^1 + u(Δt))
    x̂^2 = x̂^1 + Δt (Ax̂^2 + u(2Δt))
    ⋮
    x̂^L = x̂^(L−1) + Δt (Ax̂^L + u(LΔt))

Periodicity implies x̂^L = x̂^0.
Finite Difference Methods
Boundary-Value Problem: Linear Example, Matrix Form

Collecting the L backward-Euler equations (with x̂^0 = x̂^L) gives one
(N·L) × (N·L) system:

    [ (1/Δt)I − A                               −(1/Δt)I   ] [ x̂^1 ]   [ u(Δt)  ]
    [ −(1/Δt)I    (1/Δt)I − A                              ] [ x̂^2 ]   [ u(2Δt) ]
    [                 ⋱             ⋱                      ] [  ⋮  ] = [   ⋮    ]
    [                          −(1/Δt)I    (1/Δt)I − A     ] [ x̂^L ]   [ u(LΔt) ]

The matrix is almost lower triangular: block lower bidiagonal, plus the
−(1/Δt)I periodicity block in the upper-right corner.
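A minimal sketch of assembling and solving this system for a small linear test case of my own choosing (the matrix A, the input u, and the sizes are assumptions, not values from the lecture):

```python
import numpy as np

N, L, T = 2, 100, 2 * np.pi
dt = T / L
A = np.array([[0.0, 1.0], [-1.0, -0.05]])     # lightly damped oscillator

def u(t):
    return np.array([0.0, np.cos(t)])         # periodic input, period T

I = np.eye(N)
M = np.zeros((N * L, N * L))
rhs = np.zeros(N * L)
for l in range(L):
    rows = slice(l * N, (l + 1) * N)
    M[rows, rows] = I / dt - A                # diagonal block (1/dt)I - A
    j = (l - 1) % L                           # previous timepoint; wraps to L-1
    M[rows, slice(j * N, (j + 1) * N)] = -I / dt   # subdiagonal / corner block
    rhs[l * N:(l + 1) * N] = u((l + 1) * dt)

# Solve for all L timepoints of the periodic steady state at once.
xhat = np.linalg.solve(M, rhs).reshape(L, N)  # rows are x̂^1 ... x̂^L
```

The point of the formulation: one linear solve replaces the long transient integration, at the cost of a much larger (though very structured) matrix.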
Finite Difference Methods
Boundary-Value Problem: Nonlinear Problem

    dx(t)/dt = F(x(t)) + u(t),  t ∈ [0, T]    (u: input)
    x(T) = x(0)                               (periodicity constraint)

Discretize with Backward-Euler (using x̂^0 = x̂^L):

           [ x̂^1 ]     [ x̂^1 − x̂^L     − Δt (F(x̂^1) + u(Δt))  ]
    H_FD ( [ x̂^2 ] ) = [ x̂^2 − x̂^1     − Δt (F(x̂^2) + u(2Δt)) ] = 0
           [  ⋮  ]     [              ⋮                        ]
           [ x̂^L ]     [ x̂^L − x̂^(L−1) − Δt (F(x̂^L) + u(LΔt)) ]

Solve using Newton's method.
Shooting Method
Boundary-Value Problem: Basic Definitions

Start with
    dx(t)/dt = F(x(t)) + u(t)
and assume x(t) is unique given x(0).

The D.E. defines a state-transition function:
    Φ(y, t0, t1) ≡ x(t1),  where x(t) is the D.E. solution given x(t0) = y
Shooting Method
Boundary-Value Problem: State-Transition Function Example

    dx(t)/dt = λ x(t)
    Φ(y, t0, t1) = e^(λ(t1−t0)) y
Shooting Method
Boundary-Value Problem: Abstract Formulation

Solve
    H(x(0)) = Φ(x(0), 0, T) − x(0) = 0    (note Φ(x(0), 0, T) = x(T))

Use Newton's method:
    J_H(x) = ∂Φ(x, 0, T)/∂x − I
    J_H(x^k)(x^(k+1) − x^k) = −H(x^k)
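A minimal sketch of this shooting-Newton loop for a scalar linear ODE of my own choosing (λ, the drive, and the step count are assumptions). Φ is evaluated by backward-Euler integration, and the 1×1 sensitivity ∂Φ/∂x is approximated by finite-differencing Φ, i.e. the perturbation idea detailed on a later slide; since the problem is linear, Newton converges essentially in one step.

```python
import numpy as np

lam, T, L = -0.5, 2 * np.pi, 200
dt = T / L

def Phi(y):
    """State-transition function Phi(y, 0, T) via backward-Euler."""
    x = y
    for l in range(1, L + 1):
        # x_l = x_{l-1} + dt*(lam*x_l + cos(l*dt))
        x = (x + dt * np.cos(l * dt)) / (1.0 - dt * lam)
    return x

def H(y):
    return Phi(y) - y          # periodicity residual

y = 0.0                        # initial guess for x(0)
for _ in range(5):
    eps = 1e-6
    J = (Phi(y + eps) - Phi(y)) / eps - 1.0    # dPhi/dx - 1, by perturbation
    y = y - H(y) / J                           # Newton update on x(0)
    if abs(H(y)) < 1e-10:
        break
```

The converged y is the periodic initial condition: integrating forward one period from it returns to y.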
Shooting Method
Boundary-Value Problem: Computing Newton

To compute Φ(x(0), 0, T), integrate
    dx(t)/dt = F(x(t)) + u(t)  on [0, T]

What is ∂Φ(x, 0, T)/∂x ?

[Figure: trajectories from x(0) and from x(0) + ε, ending at x(T) and
the perturbed endpoint x^ε(T)]

It indicates the sensitivity of x(T) to changes in x(0).
Shooting Method
Boundary-Value Problem: Sensitivity Matrix by Perturbation

    ∂Φ(x, 0, T)/∂x ≈

    [ (x1^ε1(T) − x1(T))/ε1    ⋯    (x1^εN(T) − x1(T))/εN ]
    [          ⋮                              ⋮           ]
    [ (xN^ε1(T) − xN(T))/ε1    ⋯    (xN^εN(T) − xN(T))/εN ]

Column j is obtained by perturbing the j-th component of x(0) by εj and
re-integrating to get the perturbed solution x^εj(T).
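A minimal sketch of building this matrix column by column, for a small linear test system of my own choosing (A, the input, and the sizes are assumptions). For a linear system the exact sensitivity of the backward-Euler-discretized map is (I − ΔtA)^(−L), which gives an independent check:

```python
import numpy as np

N, T, L = 2, 1.0, 100
dt = T / L
A = np.array([[-1.0, 2.0], [0.0, -3.0]])

def u(t):
    return np.array([np.sin(t), 1.0])

I = np.eye(N)
step = np.linalg.inv(I - dt * A)          # backward-Euler step matrix

def Phi(y):
    """Phi(y, 0, T) via backward-Euler integration."""
    x = y.copy()
    for l in range(1, L + 1):
        x = step @ (x + dt * u(l * dt))
    return x

x0 = np.array([0.3, -0.7])
base = Phi(x0)
eps = 1e-6
S = np.zeros((N, N))
for j in range(N):                        # column j: perturb component j of x(0)
    e = np.zeros(N)
    e[j] = eps
    S[:, j] = (Phi(x0 + e) - base) / eps  # one extra integration per column

# Independent check for the linear case: sensitivity of the discrete map.
S_exact = np.linalg.matrix_power(step, L)
```

The cost is the drawback: N + 1 full integrations of the ODE for one sensitivity matrix, which motivates the efficient evaluation on the next slides.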
Shooting Method
Boundary-Value Problem: Efficient Sensitivity Evaluation

Differentiate the first step of Backward-Euler:

    ∂/∂x(0) [ x̂^1 − x(0) − Δt (F(x̂^1) + u(Δt)) ] = 0

  ⇒ ∂x̂^1/∂x(0) − I − Δt (∂F(x̂^1)/∂x)(∂x̂^1/∂x(0)) = 0

  ⇒ [ I − Δt ∂F(x̂^1)/∂x ] ∂x̂^1/∂x(0) = I
Shooting Method
Boundary-Value Problem: Efficient Sensitivity Matrix (Cont.)

Applying the same trick on the l-th step:

    [ I − Δt ∂F(x̂^l)/∂x ] ∂x̂^l/∂x(0) = ∂x̂^(l−1)/∂x(0)

  ⇒ ∂Φ(x, 0, T)/∂x ≈ ∏_(l=1..L) [ I − Δt ∂F(x̂^l)/∂x ]^(−1)
Shooting Method
Boundary-Value Problem: Observations on the Sensitivity Matrix

Newton at each timestep uses the same matrices:

    ∂Φ(x, 0, T)/∂x ≈ ∏_(l=1..L) [ I − Δt ∂F(x̂^l)/∂x ]^(−1)

Each factor is a timestep Newton Jacobian, already factored during the
integration. The formula simplifies in the linear case:

    ∂Φ(x, 0, T)/∂x ≈ (I − ΔtA)^(−L)
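A quick numeric check of the linear-case formula, using a symmetric test matrix of my own choosing (an assumption, chosen so e^(AT) can be formed by eigendecomposition with numpy alone): as Δt → 0, (I − ΔtA)^(−L) approaches the exact state-transition matrix e^(AT).

```python
import numpy as np

A = np.array([[-2.0, 1.0], [1.0, -3.0]])   # symmetric, eigenvalues negative
T = 1.0

# Exact e^{AT} via eigendecomposition (valid since A is symmetric).
w, V = np.linalg.eigh(A)
expAT = V @ np.diag(np.exp(w * T)) @ V.T

errs = []
for L in (10, 100, 1000):
    dt = T / L
    S = np.linalg.matrix_power(np.linalg.inv(np.eye(2) - dt * A), L)
    errs.append(np.linalg.norm(S - expAT))
```

The error shrinks roughly linearly in Δt, consistent with backward-Euler being a first-order method.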
Shooting Method
Matrix-Free Approach: Basic Setup

Start with
    dx(t)/dt = F(x(t)) + u(t)
    H(x(0)) = Φ(x(0), 0, T) − x(0) = 0

Use Newton's method:
    J_H(x) = ∂Φ(x, 0, T)/∂x − I
    J_H(x^k)(x^(k+1) − x^k) = −H(x^k)
Shooting Method
Matrix-Free Approach: Matrix-Vector Product

Solve the Newton equation with a Krylov-subspace method:

    [ ∂Φ(x^k, 0, T)/∂x − I ] (x^(k+1) − x^k) = x^k − Φ(x^k, 0, T)
      └──────── A ────────┘  └───── x ─────┘  └────── b ──────┘

Matrix-vector product computation, where p_j is the Krylov method
search direction:

    [ ∂Φ(x^k, 0, T)/∂x − I ] p_j ≈ (Φ(x^k + ε p_j, 0, T) − Φ(x^k, 0, T))/ε − p_j
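A minimal sketch of one matrix-free shooting-Newton solve: GCR sees the Jacobian only through matvecs, and each matvec costs one extra integration of Φ from a perturbed initial condition. The test system (A, u), the sizes, and the tolerances are my own assumptions, and the GCR loop is a textbook version, not code from the lecture.

```python
import numpy as np

N, T, L = 2, 1.0, 200
dt = T / L
A = np.array([[-1.0, 0.5], [0.0, -2.0]])

def u(t):
    return np.array([1.0, np.cos(2 * np.pi * t / T)])

step = np.linalg.inv(np.eye(N) - dt * A)

def Phi(y):
    """Phi(y, 0, T) via backward-Euler integration."""
    x = y.copy()
    for l in range(1, L + 1):
        x = step @ (x + dt * u(l * dt))
    return x

def gcr(matvec, b, tol=1e-10, maxit=20):
    """Generalized conjugate residual: solve A x = b given only matvecs."""
    x = np.zeros_like(b)
    r = b.copy()
    ps, aps = [], []                      # search directions and A @ directions
    for _ in range(maxit):
        if np.linalg.norm(r) < tol:
            break
        p, ap = r.copy(), matvec(r)
        for pi, api in zip(ps, aps):      # orthogonalize ap against previous A p_i
            beta = ap @ api
            p, ap = p - beta * pi, ap - beta * api
        nrm = np.linalg.norm(ap)
        p, ap = p / nrm, ap / nrm
        alpha = r @ ap
        x, r = x + alpha * p, r - alpha * ap
        ps.append(p)
        aps.append(ap)
    return x

eps = 1e-7
xk = np.zeros(N)                          # Newton iterate for x(0)
for _ in range(5):
    Phik = Phi(xk)
    Hk = Phik - xk

    def matvec(p, Phik=Phik, xk=xk):
        """[dPhi/dx - I] p by finite-differencing Phi (one extra integration)."""
        nrm = np.linalg.norm(p)
        if nrm == 0.0:
            return -p
        d = (eps / nrm) * p               # scale perturbation to size eps
        return (Phi(xk + d) - Phik) / (eps / nrm) - p

    xk = xk + gcr(matvec, -Hk)
    if np.linalg.norm(Phi(xk) - xk) < 1e-8:
        break
```

No Jacobian is ever stored or factored: the only state is the handful of GCR search directions, which is what makes the approach attractive when N is large.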
Shooting Method
Matrix-Free Approach: Convergence for GCR

Example:
    dx/dt − Ax = 0,  eig(A) real and negative

Shooting-Newton Jacobian:
    ∂Φ(x, 0, T)/∂x − I = e^(AT) − I
Shooting Method
Matrix-Free Approach: Convergence for GCR, Eigenvalues

    e^(AT) − I = S diag(e^(λ1 T), …, e^(λN T)) S^(−1) − I

so the Jacobian's eigenvalues are e^(λi T) − 1:
• Many fast modes (λi T << −1) cluster at −1
• A few slow modes (λi T near 0) lie away from the cluster

With the eigenvalues clustered, GCR converges in only a few iterations.
Summary
• Periodic Steady-state problems
– Application examples and simple cases
• Finite-difference methods
– Formulating large matrices
• Shooting Methods
– State transition function
– Sensitivity matrix