Chebfun: Numerical computing with functions . Alex Townsend

Alex Townsend
Oxford University
IMACS: Nonlinear Evolution Equations and Wave Phenomena
. Chebfun is about Chebyshev technology
Chebyshev technology is a powerful approach for computing with
functions. It is all based on polynomial interpolation at the Chebyshev
points

    x_j = cos(jπ/n),    0 ≤ j ≤ n.

There exist fast, accurate algorithms for integration,
differentiation, rootfinding, minimization, solution of ODEs, etc.
www.chebfun.org
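As a quick illustration (a small Python sketch, not part of Chebfun itself), the Chebyshev points are one line to generate, and they cluster quadratically near ±1:

```python
import numpy as np

def chebpts(n):
    """Chebyshev points of the second kind: x_j = cos(j*pi/n), j = 0..n."""
    j = np.arange(n + 1)
    return np.cos(j * np.pi / n)

x = chebpts(4)
# The n + 1 points lie in [-1, 1] and cluster near the endpoints.
print(x)
```

Note that `chebpts(n)` returns n + 1 points, matching the indexing 0 ≤ j ≤ n above.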
. The Chebfun team
Chebyshev technology can also bring us together. My friends:
[Photos of the Chebfun team]
Chebfun has resulted in a book: ATAP by L. N. Trefethen (2013).
. The Chebfun philosophy
There are two visions behind Chebfun [Trefethen 2007]:
Continuous analogue of vectors: overload Matlab operations on vectors
(and recently matrices) to functions: sum, max, roots, norm, qr, ...
Floating-point analogy: extend IEEE arithmetic from doubles to
functions. In IEEE arithmetic, doubles are rounded to 16 relative
digits; in Chebfun, functions are approximated to 16 digits.

f = chebfun( @(x) sin(pi*x) );
length( f )
ans = 20
length( f.*f )   % without truncation the length would be 40
ans = 29
. Interpolation at Chebyshev points
Key point 1: Polynomial interpolants in equispaced points in
[−1, 1] can have very poor approximation properties, but interpolants in Chebyshev points are excellent.
The Runge function:

    f(x) = 1/(1 + 25x²),    x ∈ [−1, 1].

[Figure, two panels: the degree-20 equispaced interpolant, which exhibits
the Runge phenomenon near ±1, and the degree-20 Chebyshev interpolant,
which matches f closely.]
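The contrast between the two grids is easy to reproduce. The following Python sketch (an illustrative barycentric Lagrange interpolation, not Chebfun code) interpolates the Runge function at both sets of points and compares the maximum errors:

```python
import numpy as np

def baryinterp(xk, fk, x):
    """Barycentric Lagrange interpolant through (xk, fk), evaluated at x."""
    # Barycentric weights (direct product formula; fine for moderate n).
    w = np.array([1.0 / np.prod(xk[j] - np.delete(xk, j))
                  for j in range(len(xk))])
    num = np.zeros_like(x)
    den = np.zeros_like(x)
    with np.errstate(divide='ignore', invalid='ignore'):
        for j in range(len(xk)):
            c = w[j] / (x - xk[j])
            num += c * fk[j]
            den += c
        p = num / den
    # Evaluation points that (nearly) hit a node take the node value.
    for j in range(len(xk)):
        p[np.abs(x - xk[j]) < 1e-13] = fk[j]
    return p

f = lambda x: 1.0 / (1 + 25 * x**2)        # Runge function
n = 20
xe = np.linspace(-1, 1, n + 1)             # equispaced points
xc = np.cos(np.arange(n + 1) * np.pi / n)  # Chebyshev points
xx = np.linspace(-1, 1, 2001)
err_equi = np.max(np.abs(baryinterp(xe, f(xe), xx) - f(xx)))
err_cheb = np.max(np.abs(baryinterp(xc, f(xc), xx) - f(xx)))
# The equispaced error is enormous; the Chebyshev error is small.
print(err_equi, err_cheb)
```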
. The Runge region
At equally-spaced points a function on [−1, 1] must be analytic
inside the Runge region, and there is a convergence rate vs.
ill-conditioning trade-off [Platte, Trefethen, Kuijlaars 2011].
[Figure: the Runge region for equispaced interpolation, crossing the
imaginary axis at ±0.5255249145i.]

In contrast, if f is analytic in a Bernstein ellipse E_ρ and bounded by
M there, then its Chebyshev interpolant satisfies [ATAP 2013]

    ‖f − p_n‖_∞ ≤ 4Mρ⁻ⁿ/(ρ − 1).
. Fast Fourier transform
Key point 2: Chebyshev points are equally-spaced when projected
onto the unit semi-circle.
[Figure: the Chebyshev points for n = 17, equispaced when projected onto
the unit semicircle.]

This fact allows us to convert between the n + 1 values on the grid and
the n + 1 Chebyshev coefficients in O(n log n) operations:

    values  ←→  coefficients    (via the DCT)
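The values-to-coefficients conversion can be sketched in a few lines of Python with a mirrored real FFT (an illustrative implementation of the DCT trick, not Chebfun's own code):

```python
import numpy as np

def vals2coeffs(v):
    """Chebyshev coefficients of the interpolant through values v taken
    at the Chebyshev points x_j = cos(j*pi/n), j = 0..n."""
    n = len(v) - 1
    # Mirror the data and take a real FFT: this is a DCT in disguise.
    V = np.fft.rfft(np.concatenate([v, v[-2:0:-1]]))
    c = V.real[:n + 1] / n
    c[0] /= 2
    c[n] /= 2
    return c

# Values of T_2(x) = cos(2*theta) on a 5-point Chebyshev grid:
v = np.cos(2 * np.arange(5) * np.pi / 4)
print(vals2coeffs(v))  # recovers the coefficient vector of T_2
```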
. Integration
Key point 3: Gauss quadrature is optimal, but Clenshaw–Curtis
quadrature is better.
    I = ∫₋₁¹ f(x) dx  ≈  Σ_{k=0}^{N} w_k f(x_k)  =  I_N.

If E*_n denotes the error of the best degree-n polynomial approximation
to f, then:

Gauss–Legendre quadrature:    |I − I_N| ≤ 4E*_{2N+1}
Clenshaw–Curtis quadrature:   |I − I_N| ≤ 4E*_N

so Gauss is “twice as good” as Clenshaw–Curtis.
. Integration
The full factor of 2 is rarely seen [Trefethen 2008].
[Figure: quadrature error for f(x) = 1/(1 + 16x²) versus n, for
Clenshaw–Curtis and Gauss, compared with E*_n and E*_{2n+1}; a “kink
phenomenon” is visible in the Clenshaw–Curtis curve before both errors
reach 10⁻¹⁶.]
Gauss–Legendre requires O(n²) operations (due only to polynomial
evaluation [Hale & T. 2013]) and Clenshaw–Curtis requires
O(n log n) operations (using the DCT for evaluation).
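Clenshaw–Curtis quadrature itself fits in a few lines: convert values to Chebyshev coefficients with the DCT trick, then integrate the series term by term. A Python sketch (illustrative, not Chebfun code):

```python
import numpy as np

def clenshaw_curtis(f, n):
    """Clenshaw-Curtis quadrature of f over [-1, 1] on n + 1 Chebyshev points."""
    x = np.cos(np.arange(n + 1) * np.pi / n)
    v = f(x)
    # Chebyshev coefficients via a mirrored real FFT (a DCT in disguise).
    c = np.fft.rfft(np.concatenate([v, v[-2:0:-1]])).real[:n + 1] / n
    c[0] /= 2
    c[n] /= 2
    # Integrate the series term by term:
    # int_{-1}^{1} T_k(x) dx = 2/(1 - k^2) for even k, and 0 for odd k.
    k = np.arange(0, n + 1, 2)
    return np.sum(2 * c[k] / (1 - k**2))

# Exact for polynomials of degree <= n:
print(clenshaw_curtis(lambda x: x**2, 8))  # 2/3
```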
. Rootfinding
Key point 4: Rootfinding for large degree polynomials is very practical and remarkably important.
The roots of the polynomial

    p(x) = Σ_{k=0}^{n} a_k T_k(x),    T_k(x) = cos(k cos⁻¹(x)),

are the eigenvalues of the n × n colleague matrix C [Specht 1960, Good
1961]: C is tridiagonal with zeros on the diagonal and 1/2 on the sub-
and superdiagonals, except that C(1, 2) = 1, and the row vector
(a₀, a₁, …, a_{n−1})/(2aₙ) is subtracted from its last row.

For large-degree polynomials we subdivide the interval to obtain an
O(n²) complexity algorithm [Boyd 2003].
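A minimal Python sketch of colleague-matrix rootfinding (without Boyd's subdivision; it assumes degree n ≥ 2 and aₙ ≠ 0):

```python
import numpy as np

def colleague_roots(a):
    """Roots of p(x) = sum_k a[k]*T_k(x) as eigenvalues of the colleague
    matrix. Assumes len(a) >= 3 and a nonzero leading coefficient."""
    n = len(a) - 1
    C = np.zeros((n, n))
    C[0, 1] = 1.0                       # the one exceptional entry
    for k in range(1, n):
        C[k, k - 1] = 0.5               # subdiagonal
        if k + 1 < n:
            C[k, k + 1] = 0.5           # superdiagonal
    C[n - 1, :] -= np.asarray(a[:n], dtype=float) / (2 * a[n])
    return np.sort(np.linalg.eigvals(C).real)

# Roots of T_3(x): +/- sqrt(3)/2 and 0.
print(colleague_roots([0, 0, 0, 1]))
```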
. Rootfinding and global optimization
Two important consequences of robust rootfinding:
1. Locating breakpoints: useful for commands such as abs, sign, max.
2. Global optimization: computed via roots(diff(f)).
x = chebfun(’x’, [0 10]);
h = max( sin(x) + sin(x.^2), 1-abs(x-5)/5 );
mx = max(h);
[Figure: the function h on [0, 10] and its global maximum.]
DEMO
. Chebop: Differential operators
Key point 1: Overload the Matlab backslash command \ for
operators [Driscoll, Bornemann & Trefethen 2008].
L = chebop( @(x,u) diff(u,2) - x.*u, [-30 30] );
L.lbc = 1; L.rbc = 0; u = L \ 0; plot(u)
[Figure: the solution of this Airy equation on [−30, 30].]
. Spectral method basics
Given the values of u on a Chebyshev grid, what are the values of the
derivative u′ on the same grid?

[Diagram: values u₁, …, uₙ on a Chebyshev grid mapped to derivative
values u′₁, …, u′ₙ on the same grid.]

    Dₙ (u₁, …, uₙ)ᵀ = (u′₁, …, u′ₙ)ᵀ,    Dₙ = diffmat(n).

For example, the operator u′(x) + cos(x)u(x) is represented as

    Lₙ = Dₙ + diag(cos(x₁), …, cos(xₙ)) ∈ ℝⁿˣⁿ.
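A Python sketch of a Chebyshev differentiation matrix, following the classic construction in Trefethen's Spectral Methods in MATLAB (illustrative; Chebfun's diffmat is the real thing):

```python
import numpy as np

def diffmat(n):
    """Chebyshev differentiation matrix on x_j = cos(j*pi/n), j = 0..n,
    built as in Trefethen's 'cheb' routine. Returns (D, x)."""
    if n == 0:
        return np.zeros((1, 1)), np.array([1.0])
    x = np.cos(np.arange(n + 1) * np.pi / n)
    c = np.ones(n + 1)
    c[0] = c[n] = 2
    c = c * (-1.0) ** np.arange(n + 1)
    X = np.tile(x, (n + 1, 1)).T
    dX = X - X.T
    D = np.outer(c, 1 / c) / (dX + np.eye(n + 1))  # off-diagonal entries
    D -= np.diag(D.sum(axis=1))                    # diagonal: negative row sums
    return D, x

D, x = diffmat(16)
# Differentiation is exact for polynomials of degree <= n:
print(np.max(np.abs(D @ x**3 - 3 * x**2)))
```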
. Imposing boundary conditions
Key point 2: Imposing boundary conditions by boundary bordering
can be automated.
[Diagram: two ways to impose boundary conditions on the discretized
operator, in value space and in coefficient space.]

Boundary bordering: replace rows of the discretization with rows
enforcing the boundary conditions. Dense, efficient for nonlinear
problems, and can be automated.

Basis recombination: work in a basis whose elements satisfy the
boundary conditions. Sparse, can be well-conditioned, and preserves
structure.
. Rectangular projection
Key point 3: Projection is essential for treating less standard
boundary conditions.
[Diagram: with m boundary conditions, the derivative values are taken on
a second grid of size n − m.]

The second grid is chosen so that the solution is interpolated on a grid
of size n − m [Hale & Driscoll 2013]. This also allows us to impose
non-boundary conditions, e.g.

    0.1u″(x) + xu(x) = sin(x),    u(−1) = 1,    ∫₋₁¹ u(s) ds = 0.
. Four examples of ODEs
[Figures, four panels: a potential barrier problem; an ODE with an
imposed breakpoint; the (nonlinear) Lotka–Volterra predator–prey
equations (rabbits and foxes); and profit/loss versus underlying share
price.]
. AD and nonlinear ODEs
Key point 4: Automatic differentiation and Newton’s method allow us to solve nonlinear differential equations.
Newton’s method for finding f(x) = 0:

    x⁽ᵏ⁺¹⁾ = x⁽ᵏ⁾ − [f′(x⁽ᵏ⁾)]⁻¹ f(x⁽ᵏ⁾).

If N is an operator, then the analogous iteration for finding N(u) = 0
is

    u⁽ᵏ⁺¹⁾ = u⁽ᵏ⁾ + v⁽ᵏ⁾,    N′(u⁽ᵏ⁾) v⁽ᵏ⁾ = −N(u⁽ᵏ⁾),

where N′(u) is the Fréchet derivative of N. We use automatic
differentiation to compute the Fréchet derivative [Birkisson & Driscoll
2012].
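In the scalar case the iteration above looks as follows (a plain Python sketch, unrelated to Chebfun's AD machinery):

```python
def newton(f, fprime, x0, tol=1e-12, maxiter=50):
    """Newton's method: x_{k+1} = x_k - f(x_k)/f'(x_k)."""
    x = x0
    for _ in range(maxiter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:     # stop when the update is tiny
            break
    return x

r = newton(lambda x: x**2 - 2, lambda x: 2*x, 1.0)
print(r)  # converges quadratically to sqrt(2)
```

For operators, the same iteration is run with the Fréchet derivative in place of f′ and a linear solve in place of the division.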
. A nonlinear ODE
% A solution to Carrier equation
N = chebop(@(x,u) 0.01*diff(u,2) + 2*(1-x.^2).*u + u.^2);
N.lbc = 0; N.rbc = 0;
N.init = chebfun(@(x) 2*x.*(x.^2-1).*(1-2./(1+20*x.^2)));
u = N \ 1;
[Figures: the computed solution of the Carrier equation on [−1, 1], and
the norm of the Newton updates versus iteration number.]
If no initial guess is supplied, then chebop computes a low-degree
polynomial satisfying the linearized boundary conditions.
. A fast and well-conditioned spectral method
Key point 5: Spectral methods do not have to result in dense,
ill-conditioned matrices [Olver & T. 2013].
The idea is to use simple relations between Chebyshev polynomials:

    dT_k/dx = k U_{k−1}  (k ≥ 1),    dT₀/dx = 0,

    T_k = (U_k − U_{k−2})/2  (k ≥ 2),    T₁ = U₁/2,    T₀ = U₀,

where U_k is the degree-k Chebyshev polynomial of the second kind. In
coefficient space these relations give sparse matrices: the
differentiation matrix Dₙ has the entries 1, 2, …, n − 1 on its first
superdiagonal (and zeros elsewhere), and the conversion matrix Sₙ has
the diagonal (1, 1/2, 1/2, …) and the entry −1/2 along its second
superdiagonal.
. A fast and well-conditioned spectral method
Well-conditioned The condition number of the N × N linear
system obtained by discretization is typically O(1).
Fast Solution The linear systems are sparse and can sometimes
be solved in O(N) operations (details in [Olver & T.
2013]).
High accuracy The solution is typically accurate to 15-16 digits.
[Figures: a computed solution u(x) on [−1, 1], and the Cauchy error
‖u_{1.01n} − uₙ‖₂ versus n, decreasing to near 10⁻¹⁵ as n grows to
2 × 10⁴.]
DEMO
. Chebfun2 (released on 4th of March 2013)
Chebfun2 is a software project to extend Chebfun to scalar- and
vector-valued functions of two variables [T. & Trefethen 2013a].
Based on low-rank function approximation using Gaussian elimination.
Exploits 1D Chebyshev technology to perform 2D computations (extremely
important).
“Hello World”: my first example (back in October 2011):
[Figure: rank 1, 3, 5, and 10 approximants of the example function.]
. Chebfun2: Chebfun in two dimensions
Key point 1: Gaussian elimination can be used for low rank function
approximation.
The standard point of view:  A = LU.

A different, equally simple point of view [T. & Trefethen 2013b]:

    A ←− A − A(:, k)A(j, :)/A(j, k)    (GE step for matrices)
    f ←− f − f(:, y)f(x, :)/f(x, y)    (GE step for functions)

Each step of GE is a rank-1 update.
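The matrix version of this iteration is a few lines of Python (an illustrative sketch with complete pivoting, not Chebfun2's implementation):

```python
import numpy as np

def ge_lowrank(A, k):
    """k steps of Gaussian elimination with complete pivoting, accumulated
    as a rank-k approximation to A (the matrix analogue of Chebfun2's
    construction)."""
    R = A.astype(float).copy()
    Ak = np.zeros_like(R)
    for _ in range(k):
        j, l = np.unravel_index(np.argmax(np.abs(R)), R.shape)  # pivot
        if R[j, l] == 0:
            break
        update = np.outer(R[:, l], R[j, :]) / R[j, l]  # rank-1 GE step
        Ak += update
        R -= update
    return Ak

# A smooth "function sampled on a grid" is numerically of low rank:
x = np.linspace(-1, 1, 100)
A = 1.0 / (1 + 5 * (x[:, None] - x[None, :])**2)
print(np.linalg.norm(A - ge_lowrank(A, 10)))  # small relative to ||A||
```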
. SVD versus GE
The Singular Value Decomposition (SVD) can be used to find the
best rank-k approximation to a sufficiently smooth function f :
    ‖f − f_k‖²_{L²} = Σ_{j=k+1}^{∞} σ_j² < ∞    (continuous analogue of ‖A − A_k‖²_F).

However, for analytic and differentiable functions GE gives a
near-optimal rank-k approximation:
[Figures: relative L² error versus rank for the SVD and for GE, for
analytic functions (γ = 1, 10, 100) and for functions of finite
smoothness (φ₃,₀ ∈ C⁰, φ₃,₁ ∈ C², φ₃,₃ ∈ C⁶); GE closely tracks the
SVD in both cases.]
. Exploiting 1D technology in 2D computations
Key point 2: Two dimensional computations can efficiently exploit
1D technology.
    f(x, y) ≈ Σ_{j=1}^{k} σ_j u_j(y) v_j(x)

Operations such as sum2, diff, and f(x,y) can be computed using 1D
Chebyshev technology. For example:

    ∫₋₁¹ ∫₋₁¹ f(x, y) dx dy  ≈  Σ_{j=1}^{k} σ_j ( ∫₋₁¹ u_j(y) dy ) ( ∫₋₁¹ v_j(x) dx )
F = @(x,y) exp(-(x.^2 + y.^2 + cos(4*x.*y)));
QUAD2D: I = 1.399888131932670 time = 0.0717 secs
SUM2: I = 1.399888131932670 time = 0.0097 secs
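The separability trick is easy to see in a toy Python sketch (a rank-1 function for simplicity, with Gauss–Legendre for the 1D integrals):

```python
import numpy as np

# For a rank-1 function f(x, y) = u(y) * v(x), the 2D integral over
# [-1, 1]^2 factors into a product of two 1D integrals.
xg, wg = np.polynomial.legendre.leggauss(30)

u = np.exp          # u(y) = e^y
v = np.cos          # v(x) = cos(x)
I_sep = (wg @ u(xg)) * (wg @ v(xg))          # two 1D quadratures
exact = (np.e - 1 / np.e) * 2 * np.sin(1)    # (int e^y dy)(int cos x dx)
print(I_sep, exact)
```

For a rank-k function the same idea gives 2k one-dimensional integrals instead of one two-dimensional one.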
. Zero contours and bivariate rootfinding
Key point 3: The existing Matlab command contourc can be
used for rootfinding.
We can solve two very different rootfinding problems (both by
using contourc):
    f(x, y) = 0    and    f(x, y) = g(x, y) = 0.
[Figures: the fourth SIAM 100-Digit Challenge problem, and the zero
contours of a bivariate function.]
Zero contours are approximated by complex-valued chebfuns!
. Parametric surfaces (for fun)
Functions F : R2 → R3 can be used to represent surfaces with
each component of the vector-valued function represented by a low
rank approximant.
The idea of parametric surfaces in Chebfun2 came from Rodrigo Platte
(Arizona State University).
DEMO
. Chebfun References:
[Trefethen 2007] Computing numerically with functions instead of
numbers (Math. Comp. Sci., 2007).
[Platte, Trefethen, Kuijlaars 2011] Impossibility of fast stable
approximation of analytic functions from equispaced samples,
(SIREV 2011).
[Trefethen 2008] Is Gauss quadrature better than Clenshaw–Curtis?,
(SIREV, 2008).
[Hale, T. 2013] Fast and accurate computation of Gauss–Legendre
and Gauss–Jacobi quadrature nodes and weights, (to appear in
SISC, 2013).
[Driscoll, Bornemann, Trefethen 2008] The chebop system for
automatic solution of differential equations (BIT Numer. Math.,
2008).
. Chebfun References:
[Hale, Driscoll 2013] Rectangular projection, (in preparation, 2013).
[Birkisson, Driscoll 2012] Automatic Fréchet differentiation for the
numerical solution of boundary-value problems, (ACM Trans.
Math. Softw., 2012).
[Olver, T. 2013], A fast and well-conditioned spectral method, (to
appear in SIREV, 2013).
[T., Trefethen 2013a], An extension of Chebfun to two dimensions,
(submitted, 2013).
[T., Trefethen 2013b], Gaussian elimination as an iterative
algorithm, (SIAM news, 2013).