A New Guaranteed Adaptive Trapezoidal Rule Algorithm

Motivation
New algorithm integral
Computational Cost of integral
Discussion
References
Fred J. Hickernell
Department of Applied Mathematics, Illinois Institute of Technology
hickernell@iit.edu www.iit.edu/~hickernell
Joint work with Martha Razo (IIT BS student) and
Sunny Yun (Stevenson High School 2014 graduate)
Supported by NSF-DMS-1115392
February 18, 2015
hickernell@iit.edu
New Adaptive Trapezoidal Rule
Meshfree Methods
1 / 30
We Need Adaptive Numerical Algorithms
- We rely on numerical software to solve mathematical and statistical problems: the NAG library (The Numerical Algorithms Group, 2013), MATLAB (The MathWorks, 2014), Mathematica (Wolfram Research Inc., 2014), and R (R Development Core Team, 2014).
- Functions like cos and erf give us the answer with the desired accuracy automatically.
- Many numerical algorithms that we use are adaptive: MATLAB's integral, fminbnd, and ode45, and the Chebfun MATLAB toolbox (Hale et al., 2014). They determine how much effort is needed to satisfy the error tolerance.
We Need Better Adaptive Numerical Algorithms
Most adaptive algorithms use heuristics. There are no guarantees that they
actually do what they claim. Exceptions are
- guaranteed algorithms for finding one zero of a function and for finding minima of unimodal functions, dating from the early 1970s (Brent, 2013),
- guaranteed adaptive multivariate integration algorithms using Monte Carlo (Hickernell et al., 2014) and quasi-Monte Carlo methods (Hickernell and Jiménez Rugama, 2014; Jiménez Rugama and Hickernell, 2014),
- guaranteed adaptive algorithms for univariate function approximation (Clancy et al., 2014) and optimization of multimodal univariate functions (Tong, 2014) using linear splines, and
- a guaranteed adaptive trapezoidal rule for univariate integration (Clancy et al., 2014).
We Need a Better Adaptive Trapezoidal Rule
$$T_n(f) := \frac{b-a}{2n}\,[f(t_0) + 2f(t_1) + \cdots + 2f(t_{n-1}) + f(t_n)], \qquad t_i = a + \frac{i(b-a)}{n}, \quad i = 0, \ldots, n, \quad n \in \mathbb{N} := \{1, 2, \ldots\}.$$

$$\mathrm{err}(f, n) := \left| \int_a^b f(x)\,dx - T_n(f) \right| \le \frac{(b-a)^2 \operatorname{Var}(f')}{8n^2} =: \overline{\mathrm{err}}(f, n), \qquad n \in \mathbb{N}.$$
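These two displays are all one needs to experiment. Below is a minimal Python sketch (the function name `trap` is ours) of $T_n$, with the error bound checked on $f(x) = x^2$, for which $\operatorname{Var}(f') = 2$:

```python
def trap(f, a, b, n):
    """Composite trapezoidal rule T_n(f) on [a, b] with n trapezoids."""
    h = (b - a) / n
    t = [a + i * h for i in range(n + 1)]          # t_i = a + i(b-a)/n
    return h * (f(t[0]) / 2 + sum(f(x) for x in t[1:-1]) + f(t[-1]) / 2)

# f(x) = x^2 on [0, 1]: f' = 2x, so Var(f') = 2 and the bound reads
# |int_0^1 f - T_n(f)| <= (b-a)^2 Var(f') / (8 n^2) = 1 / (4 n^2).
for n in (4, 8, 16):
    err = abs(1 / 3 - trap(lambda x: x ** 2, 0.0, 1.0, n))
    assert err <= 1 / (4 * n ** 2)
```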
The adaptive trapezoidal rule in Clancy et al. (2014) takes [a, b] = [0, 1] and works for integrands in the cone

$$\mathcal{C}_\tau := \left\{ f \in \mathcal{V} : \operatorname{Var}(f') \le \tau \int_0^1 |f'(x) - f(1) + f(0)|\,dx \right\},$$

where 1/τ ≈ the width of the spike that you want to capture. The computational cost to ensure that err(f, n) ≤ ε is

$$\le \sqrt{\tau \operatorname{Var}(f')/(4\varepsilon)} + \tau + 4.$$

As τ increases there is an additive and a multiplicative penalty. We want to remove the latter.
Three Algorithms
$$\mathrm{err}(f, n) := \left| \int_a^b f(x)\,dx - T_n(f) \right| \le \frac{(b-a)^2 \operatorname{Var}(f')}{8n^2} =: \overline{\mathrm{err}}(f, n), \qquad n \in \mathbb{N}.$$

ballint — Taught in calculus courses. Uses $(b-a)^2\sigma/(8n^2)$ to bound $\mathrm{err}(f, n)$. Non-adaptive. Works for integrands in $\mathcal{B}_\sigma := \{f : \operatorname{Var}(f') \le \sigma\}$.

flawint — Taught in numerical analysis courses. Uses $\widehat{\mathrm{err}}(f, n) := |T_n(f) - T_{n/2}(f)|/3$ to estimate $\mathrm{err}(f, n)$. Adaptive. A bad idea according to James Lyness (1983). Works for what kind of integrands?

integral — Our new algorithm. Adaptive. Need not know Var(f′) but must know the spikiness of f. Details to follow.
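flawint's estimate is easy to mimic. A Python sketch of our own (with `trap` as the composite trapezoidal rule) shows why the estimate looks trustworthy on a smooth integrand such as $x^4$, where it reproduces the true error to leading order:

```python
def trap(f, a, b, n):
    """Composite trapezoidal rule T_n(f) on [a, b] with n trapezoids."""
    h = (b - a) / n
    t = [a + i * h for i in range(n + 1)]
    return h * (f(t[0]) / 2 + sum(f(x) for x in t[1:-1]) + f(t[-1]) / 2)

def err_hat(f, a, b, n):
    """flawint's heuristic error estimate |T_n(f) - T_{n/2}(f)| / 3 (n even)."""
    return abs(trap(f, a, b, n) - trap(f, a, b, n // 2)) / 3

# For f(x) = x^4 on [0, 1] the true error is h^2/3 - h^4/30 (h = 1/n), and
# err_hat reproduces the leading h^2/3 term, so the heuristic looks safe here.
n = 8
true_err = abs(trap(lambda x: x ** 4, 0.0, 1.0, n) - 1 / 5)
est = err_hat(lambda x: x ** 4, 0.0, 1.0, n)
assert abs(est - true_err) < 0.01 * true_err
```

The fluky integrand later in the talk shows how badly this heuristic can be fooled.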
Disclaimer: we are not pursuing interval arithmetic approaches (Rump, 1999; Moore et al., 2009;
Rump, 2010).
Four Typical Integrands
An Easy Integrand
(✓ = works, ✗ = fails)

Algorithm   feasy   fbig   ffluky   fspiky
ballint       ✓       ✗      ✗       ✗
flawint       ✓       ✓      ✗       ✗
integral      ✓       ✓      ✓       ✗

$$f_{\mathrm{easy}}(x) = \sqrt{2/\pi}\, e^{-2x^2}, \qquad \int_0^1 f_{\mathrm{easy}}(x)\,dx = 0.4772, \qquad T_4(f_{\mathrm{easy}}) = 0.4750, \qquad \operatorname{Var}(f_{\mathrm{easy}}') = 1.5038.$$

[Figure: $f_{\mathrm{easy}}$ and its trapezoidal approximation $T_4(f_{\mathrm{easy}})$ on [0, 1].]

$$\mathrm{err}(f_{\mathrm{easy}}, 4) = 0.0022 \le 0.0117 = \overline{\mathrm{err}}(f_{\mathrm{easy}}, 4).$$
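The numbers on this slide can be reproduced in a few lines of Python (`math.erf` supplies the exact integral; `trap` is our sketch of $T_n$):

```python
import math

def trap(f, a, b, n):
    """Composite trapezoidal rule T_n(f) on [a, b] with n trapezoids."""
    h = (b - a) / n
    t = [a + i * h for i in range(n + 1)]
    return h * (f(t[0]) / 2 + sum(f(x) for x in t[1:-1]) + f(t[-1]) / 2)

f_easy = lambda x: math.sqrt(2 / math.pi) * math.exp(-2 * x ** 2)
exact = math.erf(math.sqrt(2)) / 2       # = 0.4772...
t4 = trap(f_easy, 0.0, 1.0, 4)           # = 0.4750...
err = abs(exact - t4)                    # = 0.0022...
bound = 1.5038 / (8 * 4 ** 2)            # (b-a)^2 Var(f')/(8 n^2) = 0.0117...
assert err <= bound
```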
A Big Integrand
(✓ = works, ✗ = fails)

Algorithm   feasy   fbig   ffluky   fspiky
ballint       ✓       ✗      ✗       ✗
flawint       ✓       ✓      ✗       ✗
integral      ✓       ✓      ✓       ✗

$$f_{\mathrm{big}}(x; m) := 1 + \frac{15m^4}{2}\left( \frac{1}{30} - x^2(1-x)^2 \right), \qquad \int_0^1 f_{\mathrm{big}}(x; m)\,dx = 1,$$

$$T_n(f_{\mathrm{big}}(\cdot; m)) = 1 + \frac{m^4}{4n^4}, \qquad \operatorname{Var}(f_{\mathrm{big}}'(\cdot; m)) = \frac{10m^4}{\sqrt{3}}.$$

[Figure: $f_{\mathrm{big}}(x; 16)$ on [0, 1]; vertical scale ±2 × 10⁴.]

$$\mathrm{err}(f_{\mathrm{big}}(\cdot; m), n) = \frac{m^4}{4n^4} \le \frac{5m^4}{4n^4} = \widehat{\mathrm{err}}(f_{\mathrm{big}}(\cdot; m), n).$$
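These identities can be checked numerically. In the Python sketch below, `f_big` is our transcription of the closed form above; the assertions confirm that it reproduces the stated $T_n$ and that flawint's estimate dominates the true error, so flawint succeeds here:

```python
def trap(f, a, b, n):
    """Composite trapezoidal rule T_n(f) on [a, b] with n trapezoids."""
    h = (b - a) / n
    t = [a + i * h for i in range(n + 1)]
    return h * (f(t[0]) / 2 + sum(f(x) for x in t[1:-1]) + f(t[-1]) / 2)

def f_big(x, m):
    # transcribed from the slide: 1 + (15 m^4 / 2)(1/30 - x^2 (1-x)^2)
    return 1 + 7.5 * m ** 4 * (1 / 30 - x ** 2 * (1 - x) ** 2)

m = 16
for n in (4, 8, 16):
    tn = trap(lambda x: f_big(x, m), 0.0, 1.0, n)
    assert abs(tn - (1 + m ** 4 / (4 * n ** 4))) < 1e-6   # T_n = 1 + m^4/(4n^4)

# flawint's estimate |T_n - T_{n/2}|/3 = 5 m^4/(4 n^4) exceeds the true error
n = 8
est = abs(trap(lambda x: f_big(x, m), 0.0, 1.0, n)
          - trap(lambda x: f_big(x, m), 0.0, 1.0, n // 2)) / 3
assert est >= abs(trap(lambda x: f_big(x, m), 0.0, 1.0, n) - 1)
```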
A Fluky Integrand (Inspired by Lyness (1983))
(✓ = works, ✗ = fails)

Algorithm   feasy   fbig   ffluky   fspiky
ballint       ✓       ✗      ✗       ✗
flawint       ✓       ✓      ✗       ✗
integral      ✓       ✓      ✓       ✗

$$f_{\mathrm{fluky}}(x; m) := f_{\mathrm{big}}(x; m) + \frac{15m^2}{2}\left( -\frac{1}{6} + x(1-x) \right), \qquad \int_0^1 f_{\mathrm{fluky}}(x; m)\,dx = 1,$$

$$T_n(f_{\mathrm{fluky}}(\cdot; m)) = 1 + \frac{m^2(m^2 - 5n^2)}{4n^4}.$$

[Figure: $f_{\mathrm{fluky}}(x; 16)$ on [0, 1]; vertical scale ±2 × 10⁴.]

$$\mathrm{err}(f_{\mathrm{fluky}}(\cdot; n), n) = 1 > 0 = \widehat{\mathrm{err}}(f_{\mathrm{fluky}}(\cdot; n), n).$$
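The fluke is easy to reproduce. With m = n = 16, both $T_n$ and $T_{n/2}$ land on exactly the same wrong value, so flawint's estimate $|T_n - T_{n/2}|/3$ is zero while the true error is 1 (Python sketch; the closed forms are our transcriptions from these slides):

```python
def trap(f, a, b, n):
    """Composite trapezoidal rule T_n(f) on [a, b] with n trapezoids."""
    h = (b - a) / n
    t = [a + i * h for i in range(n + 1)]
    return h * (f(t[0]) / 2 + sum(f(x) for x in t[1:-1]) + f(t[-1]) / 2)

def f_big(x, m):
    return 1 + 7.5 * m ** 4 * (1 / 30 - x ** 2 * (1 - x) ** 2)

def f_fluky(x, m):
    return f_big(x, m) + 7.5 * m ** 2 * (x * (1 - x) - 1 / 6)

m = n = 16                        # the fluky choice: m = n
tn = trap(lambda x: f_fluky(x, m), 0.0, 1.0, n)
tn2 = trap(lambda x: f_fluky(x, m), 0.0, 1.0, n // 2)
assert abs(tn) < 1e-8 and abs(tn2) < 1e-8   # T_n = T_{n/2} = 0
assert abs(tn - tn2) / 3 < 1e-8             # flawint estimates zero error...
# ...yet the true error is |1 - T_n| = 1: flawint is fooled.
```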
A Spiky Integrand
(✓ = works, ✗ = fails)

Algorithm   feasy   fbig   ffluky   fspiky
ballint       ✓       ✗      ✗       ✗
flawint       ✓       ✓      ✗       ✗
integral      ✓       ✓      ✓       ✗

$$f_{\mathrm{spiky}}(x; m) = 30\,[\{mx\}(1 - \{mx\})]^2, \qquad \{x\} := x \bmod 1, \qquad \int_0^1 f_{\mathrm{spiky}}(x; m)\,dx = 1,$$

$$\operatorname{Var}(f_{\mathrm{spiky}}'(\cdot; m)) = \frac{40m^2}{\sqrt{3}}, \qquad T_n(f_{\mathrm{spiky}}(\cdot; m)) = 0 \ \text{ for } \frac{m}{n} \in \mathbb{N}.$$

[Figure: $f_{\mathrm{spiky}}(x; 16)$ on [0, 1].]

$$\mathrm{err}(f_{\mathrm{spiky}}(\cdot; m), n) = 1 > 0 = \widehat{\mathrm{err}}(f_{\mathrm{spiky}}(\cdot; m), n) \qquad \text{for } \frac{m}{n} \in \mathbb{N}.$$
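A quick Python check (our transcription of $f_{\mathrm{spiky}}$) confirms that the trapezoidal sum vanishes whenever n divides m, even though the integral is 1:

```python
def trap(f, a, b, n):
    """Composite trapezoidal rule T_n(f) on [a, b] with n trapezoids."""
    h = (b - a) / n
    t = [a + i * h for i in range(n + 1)]
    return h * (f(t[0]) / 2 + sum(f(x) for x in t[1:-1]) + f(t[-1]) / 2)

def f_spiky(x, m):
    u = (m * x) % 1.0            # {mx} := mx mod 1
    return 30 * (u * (1 - u)) ** 2

m = 16
for n in (4, 8, 16):             # m/n is a positive integer
    assert abs(trap(lambda x: f_spiky(x, m), 0.0, 1.0, n)) < 1e-12
# with enough points the quadrature does see the spikes: the integral is 1
assert abs(trap(lambda x: f_spiky(x, m), 0.0, 1.0, 2048) - 1.0) < 1e-3
```

Every node $t_i = i/n$ satisfies $\{m t_i\} = 0$ when $m/n \in \mathbb{N}$, so the rule samples only the zeros of the spikes.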
Cone of Integrands
For Which f Can Var(f′) Be Well Approximated?

$$\operatorname{Var}(f') := \sup\left\{ \sum_{i=1}^n |f'(x_i) - f'(x_{i-1})| : \{x_i\}_{i=0}^n \text{ is a partition}, \ n \in \mathbb{N} \right\}$$

partition: $a = x_0 \le x_1 \le \cdots \le x_n = b$, $\qquad \mathrm{size}(\{x_i\}_{i=0}^n) := \max_{i=1,\ldots,n}(x_i - x_{i-1})$,

$$f'(x) := f'(x^+), \ a \le x < b, \qquad f'(b) := f'(b^-), \qquad \mathcal{V} = \{f : \operatorname{Var}(f') < \infty\}.$$
Define an approximation to Var(f′) as follows:

$$\widehat{V}(f', \{x_i\}_{i=0}^n, \{\Delta_i\}_{i=1}^{n-1}) := \sum_{i=2}^{n-1} |\Delta_i - \Delta_{i-1}| \le \operatorname{Var}(f'), \qquad \Delta_i \text{ between } f'(x_i^-) \text{ and } f'(x_i^+).$$
Define the cone of integrands for which $\widehat{V}(f', \{x_i\}_{i=0}^n, \{\Delta_i\}_{i=1}^{n-1})$ does not underestimate Var(f′) by much:

$$\mathcal{C} := \left\{ f \in \mathcal{V} : \operatorname{Var}(f') \le C\big(\mathrm{size}(\{x_i\}_{i=0}^n)\big)\, \widehat{V}(f', \{x_i\}_{i=0}^n, \{\Delta_i\}_{i=1}^{n-1}) \ \text{ for all } n \in \mathbb{N},\ \{\Delta_i\}_{i=1}^{n-1}, \text{ and } \{x_i\}_{i=0}^n \text{ with } \mathrm{size}(\{x_i\}_{i=0}^n) < \overline{h} \right\}$$

Cut-off $\overline{h} \in (0, b-a]$ and inflation factor $C : [0, \overline{h}) \to [1, \infty)$ non-decreasing.
How Spiky Can f Be?

Recall

$$\operatorname{Var}(f') := \sup\left\{ \sum_{i=1}^n |f'(x_i) - f'(x_{i-1})| : \{x_i\}_{i=0}^n \text{ is a partition}, \ n \in \mathbb{N} \right\},$$

$$\widehat{V}(f', \{x_i\}_{i=0}^n, \{\Delta_i\}_{i=1}^{n-1}) := \sum_{i=2}^{n-1} |\Delta_i - \Delta_{i-1}| \le \operatorname{Var}(f'), \qquad \Delta_i \text{ between } f'(x_i^\pm),$$

$$\mathcal{C} := \{ f \in \mathcal{V} : \operatorname{Var}(f') \le C(h)\,\widehat{V}(f', \{x_i\}_{i=0}^n, \{\Delta_i\}_{i=1}^{n-1}), \ h = \mathrm{size}(\{x_i\}_{i=0}^n) < \overline{h} \}.$$

[Figure: left, peak(x, 0.25, 0.2); right, twopk(x, 0.65, 0.1, +); both on [0, 1].]

$$\mathrm{peak}(x, t, h) := (h - |x - t|)_+ \in \mathcal{C} \ \text{ if } h \ge \overline{h}, \ a + h \le t \le b - 3h,$$

$$\mathrm{twopk}(x, t, h, \pm) := \mathrm{peak}(x, 0, h) \pm \frac{3[C(h) - 1]}{4}\,\mathrm{peak}(x, t, h) \in \mathcal{C}.$$
Practically Bounding Var(f′)
Recall

$$\widehat{V}(f', \{x_i\}_{i=0}^n, \{\Delta_i\}_{i=1}^{n-1}) := \sum_{i=2}^{n-1} |\Delta_i - \Delta_{i-1}| \le \operatorname{Var}(f'), \qquad \Delta_i \text{ between } f'(x_i^\pm),$$

$$\mathcal{C} := \{ f \in \mathcal{V} : \operatorname{Var}(f') \le C(h)\,\widehat{V}(f', \{x_i\}_{i=0}^n, \{\Delta_i\}_{i=1}^{n-1}), \ h = \mathrm{size}(\{x_i\}_{i=0}^n) < \overline{h} \}.$$

But $\widehat{V}$ relies on derivative values. In practice we may use

$$\widetilde{V}_n(f) := \frac{n}{b-a} \sum_{i=1}^{n-1} |f(t_{i+1}) - 2f(t_i) + f(t_{i-1})|, \qquad t_i = a + \frac{i(b-a)}{n},$$

which equals $\widehat{V}(f', \{x_i\}_{i=0}^{n+1}, \{\Delta_i\}_{i=1}^{n})$ for some $\{x_i\}_{i=0}^{n+1}$ and $\{\Delta_i\}_{i=1}^{n}$. So

$$\widetilde{V}_n(f) \le \operatorname{Var}(f') \le C(2(b-a)/n)\,\widetilde{V}_n(f) \qquad \text{for } n > 2(b-a)/\overline{h} \text{ and } f \in \mathcal{C}.$$
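$\widetilde{V}_n$ uses only function values, so it drops straight into code. A Python sketch of our own (the name `v_tilde` is ours), checked against $f(x) = x^2$ on [0, 1], where $\operatorname{Var}(f') = 2$ and every second difference equals $2h^2$:

```python
def v_tilde(f, a, b, n):
    """Derivative-free estimate of Var(f') built from second differences."""
    h = (b - a) / n
    y = [f(a + i * h) for i in range(n + 1)]
    # (n/(b-a)) * sum |f(t_{i+1}) - 2 f(t_i) + f(t_{i-1})| = sum / h
    return sum(abs(y[i + 1] - 2 * y[i] + y[i - 1]) for i in range(1, n)) / h

# f(x) = x^2: each |second difference| = 2 h^2 and there are n - 1 of them,
# so v_tilde = 2 (n-1)/n, which never exceeds Var(f') = 2 and tends to it.
for n in (10, 100, 1000):
    vt = v_tilde(lambda x: x ** 2, 0.0, 1.0, n)
    assert vt <= 2.0 + 1e-12
    assert abs(vt - 2 * (n - 1) / n) < 1e-9
```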
New, Guaranteed Adaptive Algorithm integral
Given an interval [a, b], an inflation function C, a positive cut-off mesh size $\overline{h}$, and a positive error tolerance ε, set j = 1, $n_1 = \lceil 2(b-a)/\overline{h} \rceil + 1$, and $\overline{V}_0 = \infty$.

Step 1. Compute $\widetilde{V}_{n_j}(f)$ and $\overline{V}_j = \min\left( \overline{V}_{j-1},\ C(2(b-a)/n_j)\,\widetilde{V}_{n_j}(f) \right)$. If $\widetilde{V}_{n_j}(f) > \overline{V}_j$, then widen $\mathcal{C}$ and repeat this step. Otherwise, proceed.

Step 2. If $(b-a)^2 \overline{V}_j \le 8 n_j^2 \varepsilon$, then return $T_{n_j}(f)$ as the answer.

Step 3. Otherwise, increase the number of trapezoids to $n_{j+1} = \max(2, m)\, n_j$, where

$$m = \min\{ r \in \mathbb{N} : \eta(r n_j)\,\widetilde{V}_{n_j}(f) \le \varepsilon \}, \qquad \eta(n) := \frac{(b-a)^2\, C(2(b-a)/n)}{8n^2},$$

increase j by one, and go to Step 1.
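The three steps translate almost line for line into code. Below is a Python sketch of our own: the cone-widening branch of Step 1 is omitted, and the inflation function C and cut-off $\overline{h}$ are illustrative defaults, not values prescribed by the algorithm.

```python
import math

def trap(f, a, b, n):
    """Composite trapezoidal rule T_n(f) on [a, b] with n trapezoids."""
    h = (b - a) / n
    t = [a + i * h for i in range(n + 1)]
    return h * (f(t[0]) / 2 + sum(f(x) for x in t[1:-1]) + f(t[-1]) / 2)

def v_tilde(f, a, b, n):
    """Second-difference estimate of Var(f')."""
    h = (b - a) / n
    y = [f(a + i * h) for i in range(n + 1)]
    return sum(abs(y[i + 1] - 2 * y[i] + y[i - 1]) for i in range(1, n)) / h

def integral(f, a, b, eps, hbar=0.1, C=lambda h: 1 + 2 * h):
    """Guaranteed adaptive trapezoidal rule (sketch; C, hbar illustrative)."""
    eta = lambda n: (b - a) ** 2 * C(2 * (b - a) / n) / (8 * n ** 2)
    n = math.ceil(2 * (b - a) / hbar) + 1            # n_1
    v_bar = math.inf                                 # V-bar_0
    while True:
        vt = v_tilde(f, a, b, n)                     # Step 1
        v_bar = min(v_bar, C(2 * (b - a) / n) * vt)
        # (a full implementation would widen the cone here if vt > v_bar)
        if (b - a) ** 2 * v_bar <= 8 * n ** 2 * eps:  # Step 2
            return trap(f, a, b, n)
        r = 1                                        # Step 3
        while eta(r * n) * vt > eps:
            r += 1
        n *= max(2, r)

approx = integral(lambda x: x ** 2, 0.0, 1.0, 1e-6)
assert abs(approx - 1 / 3) <= 1e-6
```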
integral Works as Advertised
Theorem. Algorithm integral is successful, i.e.,

$$\left| \int_a^b f(x)\,dx - \mathrm{integral}(f, a, b, \varepsilon) \right| \le \varepsilon \qquad \forall f \in \mathcal{C}.$$
Bounds on the Computational Cost of integral
Theorem. Let N(f, ε) denote the final number of trapezoids required by integral(f, a, b, ε). Then this number is bounded below and above in terms of the true, yet unknown, Var(f′):

$$\max\left( \left\lceil \frac{2(b-a)}{\overline{h}} \right\rceil + 1,\ \left\lceil (b-a)\sqrt{\frac{\operatorname{Var}(f')}{8\varepsilon}} \right\rceil \right) \le N(f, \varepsilon) \le 2 \min_{0 < \alpha \le 1} \max\left( \left\lceil \frac{2(b-a)}{\alpha\overline{h}} \right\rceil + 1,\ \left\lceil (b-a)\sqrt{\frac{C(\alpha\overline{h})\operatorname{Var}(f')}{8\varepsilon}} \right\rceil \right).$$

The number of function values required by integral(f, a, b, ε) is N(f, ε) + 1.
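The two bounds are easy to tabulate. A Python sketch of our own (the C and $\overline{h}$ below are illustrative choices) evaluates them and checks that the lower bound never exceeds the upper one; the minimum over α is taken on a grid, which can only overestimate the exact minimum and so still yields a valid upper bound:

```python
import math

def cost_bounds(var_fp, eps, a=0.0, b=1.0, hbar=0.1, C=lambda h: 1 + 2 * h):
    """Evaluate the theorem's lower/upper bounds on N(f, eps)."""
    lower = max(math.ceil(2 * (b - a) / hbar) + 1,
                math.ceil((b - a) * math.sqrt(var_fp / (8 * eps))))
    upper = 2 * min(
        max(math.ceil(2 * (b - a) / (alpha * hbar)) + 1,
            math.ceil((b - a) * math.sqrt(C(alpha * hbar) * var_fp / (8 * eps))))
        for alpha in (k / 100 for k in range(1, 101)))   # grid over (0, 1]
    return lower, upper

for var_fp in (1.0, 100.0, 1e4):
    for eps in (1e-2, 1e-6):
        lo, up = cost_bounds(var_fp, eps)
        assert lo <= up
```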
Proof of Lower Bound on Computational Cost
The number of trapezoids must be at least $n_1 = \lceil 2(b-a)/\overline{h} \rceil + 1$.

The number of trapezoids is increased until $(b-a)^2 \overline{V}_j \le 8 n_j^2 \varepsilon$, which implies that

$$\frac{(b-a)^2 \operatorname{Var}(f')}{8 n_j^2} \le \frac{(b-a)^2\, \overline{V}_j}{8 n_j^2} \le \varepsilon.$$

This implies the lower bound on N(f, ε).
Proof of Upper Bound on Computational Cost
Let J be the value of j for which integral terminates, so $N(f, \varepsilon) = n_J$. Since $n_1$ satisfies the upper bound, we may assume that J ≥ 2.

Let $m^* = \max(2, m)$, where m comes from Step 3. Note that $\eta((m^* - 1)\, n_{J-1}) \operatorname{Var}(f') > \varepsilon$. For $m^* = 2$ this follows because

$$\eta(n_{J-1})\operatorname{Var}(f') \ge \frac{(b-a)^2\, C(2(b-a)/n_{J-1})\, \widetilde{V}_{n_{J-1}}(f)}{8 n_{J-1}^2} \ge \frac{(b-a)^2\, \overline{V}_{J-1}}{8 n_{J-1}^2} > \varepsilon.$$

For $m^* = m > 2$ this follows by the definition of m in Step 3.

Since η is a decreasing function, this implies that

$$(m^* - 1)\, n_{J-1} < n^* := \min\left\{ n \in \mathbb{N} : n \ge \left\lceil \frac{2(b-a)}{\overline{h}} \right\rceil + 1,\ \eta(n)\operatorname{Var}(f') \le \varepsilon \right\}.$$
Proof of Upper Bound on Computational Cost cont’d
Since

$$(m^* - 1)\, n_{J-1} < n^* := \min\left\{ n \in \mathbb{N} : n \ge \left\lceil \frac{2(b-a)}{\overline{h}} \right\rceil + 1,\ \eta(n)\operatorname{Var}(f') \le \varepsilon \right\},$$

it follows that

$$n_J = m^* n_{J-1} < \frac{m^*}{m^* - 1}\, n^* \le 2n^*.$$

Now we need an upper bound on n*.
hickernell@iit.edu
New Adaptive Trapezoidal Rule
Meshfree Methods
18 / 30
Motivation
New algorithm integral
Computational Cost of integral
Discussion
References
Bounds on the Computational Cost of integral
Proof of Upper Bound on Computational Cost cont’d
So far we have

$$N(f, \varepsilon) \le 2n^*, \qquad n^* := \min\left\{ n \in \mathbb{N} : n \ge \left\lceil \frac{2(b-a)}{\overline{h}} \right\rceil + 1,\ \eta(n)\operatorname{Var}(f') \le \varepsilon \right\}.$$

For fixed α ∈ (0, 1], we need only consider the case where $n^* > \lceil 2(b-a)/(\alpha\overline{h}) \rceil + 1$, so $n^* - 1 \ge \lceil 2(b-a)/(\alpha\overline{h}) \rceil + 1 > 2(b-a)/(\alpha\overline{h})$. Then

$$n^* - 1 < (n^* - 1)\sqrt{\frac{\eta(n^* - 1)\operatorname{Var}(f')}{\varepsilon}} = (n^* - 1)\sqrt{\frac{(b-a)^2\, C(2(b-a)/(n^* - 1))\operatorname{Var}(f')}{8(n^* - 1)^2\,\varepsilon}} \le (b-a)\sqrt{\frac{C(\alpha\overline{h})\operatorname{Var}(f')}{8\varepsilon}},$$

which completes the proof of the upper bound on n*.
Lower Complexity Bound for Integration on C
Theorem. Let int be any (possibly adaptive) algorithm that succeeds for all integrands in $\mathcal{C}$ and uses only function values. For any error tolerance ε > 0 and any arbitrary value of Var(f′), there will be some $f \in \mathcal{C}$ for which int must use at least

$$-\frac{3}{2} + (b - a - 3\overline{h})\sqrt{\frac{[C(0) - 1]\operatorname{Var}(f')}{32\varepsilon}}$$

function values. As Var(f′)/ε → ∞ the asymptotic rate of increase is the same as the computational cost of integral, provided C(0) > 1.
Proof of Lower Bound on Complexity
Suppose that int(·, a, b, ε) evaluates $\alpha\,\mathrm{peak}(\cdot; 0, \overline{h})$ at n nodes, with

$$a + 3\overline{h} = x_0 \le x_1 \le \cdots \le x_m \le x_{m+1} = b - h, \qquad h := \frac{b - a - 3\overline{h}}{2n + 3}, \qquad m \le n.$$

There must be at least one of these $x_i$ with i = 0, …, m for which

$$\frac{x_{i+1} - x_i}{2} \ge \frac{x_{m+1} - x_0}{2(m+1)} \ge \frac{x_{m+1} - x_0}{2n + 2} = \frac{b - a - 3\overline{h} - h}{2n + 2} = \frac{b - a - 3\overline{h}}{2n + 3} = h.$$

Choose one such $x_i$, and call it t.

int(·, a, b, ε) cannot distinguish between $\alpha\,\mathrm{peak}(\cdot; 0, \overline{h})$ and $\alpha\,\mathrm{twopk}(\cdot; t, h, \pm)$. Since they all belong to $\mathcal{C}$, int is successful for them all.
Proof of Lower Bound on Complexity cont’d
By the definitions of peak and twopk,

$$\varepsilon \ge \frac{1}{2}\left[\left|\int_a^b \alpha\,\mathrm{twopk}(x; t, h, -)\,dx - \mathrm{int}(\alpha\,\mathrm{twopk}(\cdot; t, h, -), a, b, \varepsilon)\right| + \left|\int_a^b \alpha\,\mathrm{twopk}(x; t, h, +)\,dx - \mathrm{int}(\alpha\,\mathrm{twopk}(\cdot; t, h, +), a, b, \varepsilon)\right|\right]$$

$$\ge \frac{1}{2}\left[\left|\mathrm{int}(\alpha\,\mathrm{peak}(\cdot; 0, \overline{h}), a, b, \varepsilon) - \int_a^b \alpha\,\mathrm{twopk}(x; t, h, -)\,dx\right| + \left|\int_a^b \alpha\,\mathrm{twopk}(x; t, h, +)\,dx - \mathrm{int}(\alpha\,\mathrm{peak}(\cdot; 0, \overline{h}), a, b, \varepsilon)\right|\right]$$

$$\ge \frac{1}{2}\left|\int_a^b \alpha\,\mathrm{twopk}(x; t, h, +)\,dx - \int_a^b \alpha\,\mathrm{twopk}(x; t, h, -)\,dx\right|$$

$$= \frac{3[C(h)-1]}{4}\int_a^b \alpha\,\mathrm{peak}(x; t, h)\,dx = \frac{3\alpha[C(h)-1]h^2}{8} = \frac{[C(h)-1]\,h^2 \operatorname{Var}(\alpha\,\mathrm{peak}'(\cdot; 0, \overline{h}))}{8}.$$

(The second inequality replaces int's value on the twopk integrands by its value on $\alpha\,\mathrm{peak}(\cdot; 0, \overline{h})$, which is identical because int cannot distinguish them, and then applies the triangle inequality.)
Proof of Lower Bound on Complexity cont’d
From the previous slide,

$$\varepsilon \ge \frac{[C(h) - 1]\,h^2 \operatorname{Var}(\alpha\,\mathrm{peak}'(\cdot; 0, \overline{h}))}{8}.$$

Substituting for h in terms of n gives a lower bound on n:

$$2n + 3 = \frac{b - a - 3\overline{h}}{h} \ge (b - a - 3\overline{h})\sqrt{\frac{[C(h) - 1]\operatorname{Var}(\alpha\,\mathrm{peak}'(\cdot; 0, \overline{h}))}{8\varepsilon}} \ge (b - a - 3\overline{h})\sqrt{\frac{[C(0) - 1]\operatorname{Var}(\alpha\,\mathrm{peak}'(\cdot; 0, \overline{h}))}{8\varepsilon}}.$$

Since α is an arbitrary positive number, the value of $\operatorname{Var}(\alpha\,\mathrm{peak}'(\cdot; 0, \overline{h}))$ is arbitrary as well.
Why Is This New and Improved?
Why Is Our New integral Improved?
- ballint is non-adaptive and requires $\sigma = \max_f \operatorname{Var}(f')$, which may be affected by both the vertical and horizontal scales of f.
- flawint has a flawed error estimate, as pointed out by Lyness (1983).
- Clancy et al.'s (2014) adaptive quadrature rule has a cost of at most $\sqrt{\tau \operatorname{Var}(f')/(4\varepsilon)} + \tau + 4$, which goes up multiplicatively in τ as τ increases.
- Our new adaptive quadrature algorithm has a cost of at most
$$2 \min_{0 < \alpha \le 1} \max\left( \left\lceil \frac{2(b-a)}{\alpha\overline{h}} \right\rceil + 1,\ \left\lceil (b-a)\sqrt{\frac{C(\alpha\overline{h})\operatorname{Var}(f')}{8\varepsilon}} \right\rceil \right) + 1,$$
which goes up additively in $1/\overline{h}$ as $\overline{h} \to 0$.
Old integral_g.m vs. New integralNoPenalty_g.m
[Figure: the two test integrands on [0, 1] used in the timing comparisons that follow.]
Old integral_g.m vs. New integralNoPenalty_g.m
>> NewOldIntegral
Ordinary peaky function
Old integral_g
Elapsed time is 0.169305 seconds.
Tol = 1e-10, Error = 3.3784e-13, ErrEst = 2.5001e-11
Npts = 1719037
New integralNoPenalty_g
Elapsed time is 0.013951 seconds.
Tol = 1e-10, Error = 3.7074e-11, ErrEst = 8.346e-11
Npts = 164242
But it should use only [75%, 91%] as many Npts if we knew Var(f′)
Old integral_g.m vs. New integralNoPenalty_g.m
>> NewOldIntegral
Very peaky function
nlo=1e4; nhi=nlo;
Old integral_g
Elapsed time is 0.427659 seconds.
Tol = 1e-08, Error = 1.5667e-12, ErrEst = 9.9963e-09
Npts = 6129388
New integralNoPenalty_g
Elapsed time is 0.075723 seconds.
Tol = 1e-08, Error = 8.8407e-11, ErrEst = 8.2902e-09
Npts = 1169884
But it should use only [74%, 91%] as many Npts if we knew Var(f′)
What Comes Next?
- Simpson's rule
- Relative error
- Other problems
References I
Brent, R. P. 2013. Algorithms for minimization without derivatives, Dover Publications, Inc., Mineola, NY. Republication of the 1973 edition by Prentice-Hall, Inc.
Clancy, N., Y. Ding, C. Hamilton, F. J. Hickernell, and Y. Zhang. 2014. The cost of
deterministic, adaptive, automatic algorithms: Cones, not balls, J. Complexity 30, 21–45.
Hale, N., L. N. Trefethen, and T. A. Driscoll. 2014. Chebfun version 5.
Hickernell, F. J., L. Jiang, Y. Liu, and A. B. Owen. 2014. Guaranteed conservative fixed width
confidence intervals via Monte Carlo sampling, Monte Carlo and quasi-Monte Carlo methods
2012, pp. 105–128.
Hickernell, F. J. and Ll. A. Jiménez Rugama. 2014. Reliable adaptive cubature using digital
sequences. submitted for publication, arXiv:1410.8615 [math.NA].
Jiménez Rugama, Ll. A. and F. J. Hickernell. 2014. Adaptive multidimensional integration
based on rank-1 lattices. submitted for publication, arXiv:1411.1966.
Lyness, J. N. 1983. When not to use an automatic quadrature routine, SIAM Rev. 25, 63–87.
Moore, R. E., R. B. Kearfott, and M. J. Cloud. 2009. Introduction to interval analysis,
Cambridge University Press, Cambridge.
R Development Core Team. 2014. The R Project for Statistical Computing.
References II
Rump, S. M. 1999. INTLAB - INTerval LABoratory, Developments in Reliable Computing,
pp. 77–104. http://www.ti3.tuhh.de/rump/.
Rump, S. M. 2010. Verification methods: Rigorous results using floating-point arithmetic, Acta Numer. 19, 287–449.
The MathWorks, Inc. 2014. MATLAB 8.4, Natick, MA.
The Numerical Algorithms Group. 2013. The NAG library, Mark 23, Oxford.
Tong, X. 2014. A guaranteed, adaptive, automatic algorithm for univariate function
minimization, Master’s Thesis.
Wolfram Research Inc. 2014. Mathematica 10, Champaign, IL.