10.34, Numerical Methods Applied to Chemical Engineering
Professor William H. Green
Lecture #15: Differential Algebraic Equations (DAEs). Introduction: Optimization.
Differential-Algebraic Equations (DAEs)
Fully implicit form:  \mathcal{F}(x(t), \dot{x}(t)) = 0        (MATLAB: ode15i)
Mass-matrix form:     M(x)\,\dot{x} = G(x)                      (MATLAB: ode15s)
If M is invertible, \dot{x} = M^{-1}(x)\,G(x), which is an ordinary ODE.
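A minimal MATLAB sketch of the two calling conventions; the toy index-1 DAE used here (x1' = -x1, 0 = x1 + x2) is an assumption for illustration, not an example from lecture:

    % (a) Fully implicit form  F(t, x, x') = 0  ->  ode15i
    F  = @(t, x, xp) [ xp(1) + x(1);        % differential: x1' = -x1
                       x(1) + x(2) ];       % algebraic:    0 = x1 + x2
    x0 = [1; -1];  xp0 = [-1; 1];           % consistent: F(0, x0, xp0) = 0
    [t, x] = ode15i(F, [0 2], x0, xp0);

    % (b) Mass-matrix form  M(x)*x' = G(x)  ->  ode15s with a 'Mass' option
    M  = [1 0; 0 0];                        % singular M, so this is a DAE
    G  = @(t, x) [ -x(1); x(1) + x(2) ];
    [t2, x2] = ode15s(G, [0 2], x0, odeset('Mass', M));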
Quasi-Steady-State Assumption (QSSA)
• turns the stiff differential equations into algebraic equations
Example reactions:

    OH + CO  --k1-->  H + CO2
    H  + O2  --k2-->  OH + O
    O  + H2O --k3-->  2 OH

Apply QSSA to the radicals OH and H:

    d[OH]/dt = k2[H][O2] - k1[OH][CO] + 2 k3[O][H2O] ≈ 0
    d[H]/dt  = k1[OH][CO] - k2[H][O2] ≈ 0
• originally had 6 differential equations
• now have 2 algebraic and 4 differential equations
• QSSA takes you from an ODE system to a DAE system
QSSA is not always helpful: the original ODE system is often faster and more accurate to solve. Solving a DAE is like solving a stiff equation, and QSSA has not removed the original stiffness.
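As a small, concrete illustration of the structural change (a toy series reaction A -> B -> C with a fast intermediate B, assumed for illustration, not the CO/H2/O2 chemistry above): applying QSSA to B replaces its ODE with an algebraic equation, giving a DAE in mass-matrix form with a zero row in M, which ode15s accepts:

    k1 = 1;  k2 = 1e6;                       % assumed rate constants (B is fast)
    g  = @(t, y) [ -k1*y(1);                 % d[A]/dt
                    k1*y(1) - k2*y(2);       % QSSA row: 0 = k1[A] - k2[B]
                    k2*y(2) ];               % d[C]/dt
    M  = diag([1 0 1]);                      % zero row makes the B equation algebraic
    y0 = [1; k1*1/k2; 0];                    % consistent initial [B]
    [t, y] = ode15s(g, [0 5], y0, odeset('Mass', M));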
Another problem with DAEs: Consistent Initial Conditions
You need both x(t_0) and \dot{x}(t_0), and they must satisfy \mathcal{F}(x(t_0), \dot{x}(t_0)) = 0. If \dot{x}(t_0) is not provided, you have to solve for it with Newton's method.
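A minimal sketch of that Newton-type solve using MATLAB's decic, which computes consistent x(t_0) and \dot{x}(t_0) for ode15i. The DAE is the same toy problem assumed in the sketch above, and the guesses are placeholders:

    F = @(t, x, xp) [ xp(1) + x(1);
                      x(1) + x(2) ];
    x0_guess  = [1; 0];                      % fix x1(0) = 1, let x2(0) float
    xp0_guess = [0; 0];                      % x'(t0) unknown; give a rough guess
    [x0, xp0] = decic(F, 0, x0_guess, [1 0], xp0_guess, []);
    [t, x] = ode15i(F, [0 2], x0, xp0);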
High index DAEs
Differentiate the DAE repeatedly:
\mathcal{F} = 0, \qquad \frac{d\mathcal{F}}{dt} = 0, \qquad \frac{d^2\mathcal{F}}{dt^2} = 0, \ldots
Eventually you will have a normal ODE if you take enough derivatives. "High index" refers to a system that requires a high degree of differentiation (the index is the number of differentiations needed to reach an ODE). MATLAB's stiff solvers and DAE solvers cannot solve these equations if the index is too high (in practice they are designed for index-1 problems).
\frac{d\mathcal{F}}{dt} = \sum_n \frac{\partial \mathcal{F}}{\partial x_n}\frac{dx_n}{dt} + \sum_n \frac{\partial \mathcal{F}}{\partial \dot{x}_n}\frac{d\dot{x}_n}{dt} = 0
Introduce new variables v_n \equiv \dot{x}_n, so the unknowns are x and v (2n entries in total). The DAE becomes
\mathcal{F}(x, v) = 0
and differentiating it once in time gives
\sum_n \frac{\partial \mathcal{F}}{\partial x_n}\, v_n + \sum_n \frac{\partial \mathcal{F}}{\partial v_n}\, \dot{v}_n = 0 .
Define M(x, v) with entries M_{in} = \frac{\partial \mathcal{F}_i}{\partial v_n}. Together with the identity matrix I, the augmented system of 2n equations can be written in mass-matrix form:
\begin{pmatrix} M & 0 \\ 0 & I \end{pmatrix}
\begin{pmatrix} \dot{v} \\ \dot{x} \end{pmatrix}
=
\begin{pmatrix}
-\sum_n \frac{\partial \mathcal{F}_1}{\partial x_n} v_n \\
-\sum_n \frac{\partial \mathcal{F}_2}{\partial x_n} v_n \\
\vdots \\
v_1 \\ v_2 \\ v_3 \\ \vdots
\end{pmatrix}
Eventually (after enough differentiations) the matrix M becomes invertible. If M is invertible:
\dot{x} = M^{-1}(x)\,G(x) \;\rightarrow\; \text{ODE}
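A small worked illustration of what "index" means here (a toy problem assumed for illustration, not from lecture): for
\dot{x}_1 = x_2, \qquad 0 = x_1 - f(t),
differentiating the algebraic equation once gives \dot{x}_1 = f'(t), i.e. x_2 = f'(t), which is still algebraic; differentiating once more gives \dot{x}_2 = f''(t), and only now does every variable have its own ODE. Two differentiations are required, so this DAE has index 2, beyond what index-1 solvers such as ode15s, ode15i, or DASSL are designed to handle.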
Professor Barton is an expert in the field.
Predictor-Corrector Method
Fit a polynomial through the previous solution points x(t_{k-2}), x(t_{k-1}), x(t_k) and extrapolate it forward by \Delta t to estimate x(t_{k+1}).
• Extrapolation → "Predictor" = initial guess for the implicit solve
• "Corrector": solve \mathcal{F}(x_{k+1}, \dot{x}_{k+1}) = 0
This method was developed by Bill Gear (the "Gear predictor-corrector"). It uses "BDF polynomials" (Backward Differentiation Formula) → guarantees numerical stability.
The predictor is simple to find through extrapolation; the corrector is the difficult and complicated part.
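A minimal sketch of one predictor-corrector step, using first-order backward Euler in place of the higher-order BDF polynomials that DASSL and ode15s actually use. Here F is assumed to be a function handle for the DAE residual F(t, x, x'), and fsolve (Optimization Toolbox) stands in for the corrector's Newton iteration:

    function x_new = dae_step(F, t_new, x_prev, x_curr, dt)
        % Predictor: extrapolate the last two solution points forward by dt
        x_pred = x_curr + (x_curr - x_prev);
        % Corrector: solve the implicit backward-Euler equations
        %   F(t_new, x_new, (x_new - x_curr)/dt) = 0, starting from the predictor
        resid = @(x) F(t_new, x, (x - x_curr)/dt);
        x_new = fsolve(resid, x_pred, optimset('Display', 'off'));
    end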
DASSL and DASPK are DAE solver packages written by Linda Petzold (Univ. of California, Santa Barbara). Another similar package is DASAC, from the Univ. of Wisconsin, Madison. For linear equations, use MATLAB. Make Δt small in the first few steps to minimize error: the initial guess carries little information, so the early implicit solves take a long time.
Optimization
\min_x f(x) \quad \text{with constraints } g(x) = 0 \text{ and } h(x) \geq 0 .
What derivatives of f(x) can I compute easily?
1) None → simplex algorithms → fminsearch
2) Gradient ∇f →
   a. Newton-type solvers
   b. Conjugate-gradient-type solvers
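A short MATLAB sketch of the two cases, using the Rosenbrock function as a stand-in objective (the objective, gradient, and starting point are assumptions for illustration):

    f  = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;
    x0 = [-1.2; 1];

    % Case 1: no derivatives available -> simplex (Nelder-Mead) search
    x_simplex = fminsearch(f, x0);

    % Case 2: gradient available -> gradient-based solver (Optimization Toolbox)
    fg = @(x) deal(100*(x(2)-x(1)^2)^2 + (1-x(1))^2, ...
                   [-400*x(1)*(x(2)-x(1)^2) - 2*(1-x(1)); 200*(x(2)-x(1)^2)]);
    x_grad = fminunc(fg, x0, optimset('GradObj', 'on'));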
Newton-type
Taylor-expand f about the current point x:
f(x + p) = f(x) + \nabla f|_x \cdot p + \tfrac{1}{2}\, p^T H\, p + O(|p|^3)
where the gradient is
\nabla f = \begin{pmatrix} \partial f/\partial x_1 \\ \partial f/\partial x_2 \\ \partial f/\partial x_3 \\ \vdots \end{pmatrix}
and the Hessian is
H = \begin{pmatrix}
\dfrac{\partial^2 f}{\partial x_1^2} & \dfrac{\partial^2 f}{\partial x_1 \partial x_2} & \dfrac{\partial^2 f}{\partial x_1 \partial x_3} & \cdots \\
\dfrac{\partial^2 f}{\partial x_2 \partial x_1} & \dfrac{\partial^2 f}{\partial x_2^2} & \dfrac{\partial^2 f}{\partial x_2 \partial x_3} & \cdots \\
\vdots & & &
\end{pmatrix}
In practice an approximate Hessian B is used (maybe just B = I):
f(x + p) \approx f(x) + \nabla f|_x \cdot p + \tfrac{1}{2}\, p^T B\, p
At each iteration there are two questions:
1) What direction is the next step? (downhill)
2) How far should I step? (require f(x_{k+1}) < f(x_k))
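A minimal sketch of those two decisions as plain steepest descent with a backtracking step length; f, gradf, and x0 are assumed to be given, and this is not what MATLAB's solvers do internally:

    x = x0;
    for k = 1:200
        g = gradf(x);
        if norm(g) < 1e-6, break; end        % converged
        p = -g / norm(g);                    % 1) direction: straight downhill
        c = 1.0;                             % 2) step length: start long ...
        while f(x + c*p) >= f(x) && c > 1e-12
            c = c / 2;                       % ... and shrink until f decreases
        end
        x = x + c*p;                         % step chosen so f(x_k+1) < f(x_k)
    end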
[Figure 1. A contour map of f(x), showing the initial guess x_guess, the first step downhill, a nearby local maximum, and the global minimum.]
Choice #1: step downhill
p = c \left( -\dfrac{\nabla f}{\|\nabla f\|} \right)
The optimal 'c', if f(x + p) = f(x) + \nabla f|_x \cdot p + \tfrac{1}{2}\, p^T B\, p is exact, gives the "Cauchy point".
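For reference, minimizing the quadratic model along that downhill direction gives the optimal step length explicitly (a standard result, not written out in lecture): with p = -c\,\nabla f / \|\nabla f\|,
\frac{d}{dc}\Big[ f(x) - c\,\|\nabla f\| + \tfrac{1}{2} c^2 \,\frac{\nabla f^T B\, \nabla f}{\|\nabla f\|^2} \Big] = 0
\;\Rightarrow\;
c = \frac{\|\nabla f\|^3}{\nabla f^T B\, \nabla f},
provided \nabla f^T B\, \nabla f > 0.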
Choice #2: assume the 2nd-order expansion is exact
• Newton step
  o dangerous when you are far from the solution
Go downhill first to get closer, and then apply the Newton step.
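For reference, taking the quadratic model as exact and setting its gradient with respect to p to zero gives the Newton step (a standard result): B\,p = -\nabla f|_x, i.e. p = -B^{-1}\,\nabla f|_x.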
Cite as: William Green, Jr., course materials for 10.34 Numerical Methods Applied to Chemical
Engineering, Fall 2006. MIT OpenCourseWare (http://ocw.mit.edu), Massachusetts Institute of
Technology. Downloaded on [DD Month YYYY].