Function approximation: Fourier, Chebyshev, Lagrange

- Orthogonal functions
- Fourier Series
- Discrete Fourier Series
- Fourier Transform: properties
- Chebyshev polynomials
- Convolution
- DFT and FFT
Scope: Understanding where the Fourier Transform comes
from. Moving from the continuous to the discrete world. The
concepts are the basis for pseudospectral methods and the
spectral element approach.
Fourier Series: one way to derive them
The Problem

We are trying to approximate a function f(x) by another function g_N(x) which consists of a sum over N orthogonal functions \Phi_i(x) weighted by some coefficients a_i:

f(x) \approx g_N(x) = \sum_{i=0}^{N} a_i \Phi_i(x)
The Problem
... and we are looking for optimal functions in a least squares (l2) sense ...
\| f(x) - g_N(x) \|_2 = \left[ \int_a^b \{ f(x) - g_N(x) \}^2 \, dx \right]^{1/2} = Min!
... a good choice for the basis functions \Phi_i(x) is a set of orthogonal functions.
What are orthogonal functions? Two functions f and g are said to be
orthogonal in the interval [a,b] if
\int_a^b f(x) g(x) \, dx = 0
How is this related to the more conceivable concept of orthogonal
vectors? Let us look at the original definition of integrals:
Orthogonal Functions
\int_a^b f(x) g(x) \, dx = \lim_{N \to \infty} \left( \sum_{i=1}^{N} f(x_i) g(x_i) \Delta x \right)

... where x_0 = a, x_N = b, and x_i - x_{i-1} = \Delta x ...
If we interpret f(xi) and g(xi) as the ith components of an N component
vector, then this sum corresponds directly to a scalar product of vectors.
The vanishing of the scalar product is the condition for orthogonality of
vectors (or functions).
f_i \cdot g_i = \sum_i f_i g_i = 0
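The correspondence between the integral and the scalar product can be checked numerically. A minimal sketch (assuming Python with NumPy): f(x) = x (odd) and g(x) = x^2 (even) are orthogonal on the symmetric interval [-1, 1], and the midpoint Riemann sum reproduces the vanishing integral:

```python
import numpy as np

# Midpoint Riemann sum of the product f(x) * g(x) on [-1, 1].
# f(x) = x (odd) and g(x) = x^2 (even) are orthogonal on this interval.
a, b, N = -1.0, 1.0, 100000
dx = (b - a) / N
x = a + dx * (np.arange(N) + 0.5)   # midpoints of the N subintervals

dot = np.sum(x * x**2) * dx         # discrete scalar product times dx
print(abs(dot))                     # vanishes (up to rounding)
```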
Periodic functions
Let us assume we have a piecewise continuous function of the form
f(x + 2\pi) = f(x)

[Figure: the 2\pi-periodic extension of f(x) = x^2, plotted over several periods.]
... we want to approximate this function with a linear combination of 2π
periodic functions:
1, \cos(x), \sin(x), \cos(2x), \sin(2x), ..., \cos(nx), \sin(nx)

\Rightarrow f(x) \approx g_N(x) = \frac{1}{2} a_0 + \sum_{k=1}^{N} \{ a_k \cos(kx) + b_k \sin(kx) \}
Orthogonality
... are these functions orthogonal ?
\int_{-\pi}^{\pi} \cos(jx)\cos(kx) \, dx =
  0       for j \ne k
  2\pi    for j = k = 0
  \pi     for j = k > 0

\int_{-\pi}^{\pi} \sin(jx)\sin(kx) \, dx =
  0       for j \ne k, \; j, k > 0
  \pi     for j = k > 0

\int_{-\pi}^{\pi} \cos(jx)\sin(kx) \, dx = 0    for j \ge 0, \; k > 0
... YES, and these relations are valid for any interval of length 2π.
Now we know that this is an orthogonal basis, but how can we obtain the
coefficients for the basis functions?
from minimising \| f(x) - g_N(x) \|_2
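These relations can be verified numerically; a sketch using a midpoint rule over one period (Python/NumPy assumed):

```python
import numpy as np

# Midpoint rule over one period [-pi, pi]; for trigonometric products the
# equispaced sum reproduces the integrals essentially exactly.
N = 20000
dx = 2 * np.pi / N
x = -np.pi + dx * (np.arange(N) + 0.5)

def inner(u, v):
    return np.sum(u * v) * dx

assert abs(inner(np.cos(2 * x), np.cos(3 * x))) < 1e-8                  # j != k
assert abs(inner(np.cos(2 * x), np.cos(2 * x)) - np.pi) < 1e-8          # j = k > 0
assert abs(inner(np.ones_like(x), np.ones_like(x)) - 2 * np.pi) < 1e-8  # j = k = 0
assert abs(inner(np.cos(3 * x), np.sin(2 * x))) < 1e-8                  # cos/sin mixed
print("orthogonality relations confirmed")
```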
Fourier coefficients
The optimal function g_N(x) is obtained if

\| g_N(x) - f(x) \|_2 = Min!

or

\frac{\partial}{\partial a_k} \| g_N(x) - f(x) \|_2^2 = 0
... with the definition of g(x) we get ...
\frac{\partial}{\partial a_k} \| g_N(x) - f(x) \|_2^2
  = \frac{\partial}{\partial a_k} \left[ \int_{-\pi}^{\pi} \left\{ \frac{1}{2} a_0 + \sum_{k=1}^{N} ( a_k \cos(kx) + b_k \sin(kx) ) - f(x) \right\}^2 dx \right]
leading to

g_N(x) = \frac{1}{2} a_0 + \sum_{k=1}^{N} \{ a_k \cos(kx) + b_k \sin(kx) \}

with

a_k = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \cos(kx) \, dx,    k = 0, 1, ..., N

b_k = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \sin(kx) \, dx,    k = 1, 2, ..., N
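As a sketch (Python/NumPy assumed, midpoint quadrature), the coefficient integrals can be evaluated numerically; for f(x) = |x| they reproduce the analytic values a_0 = \pi and a_1 = -4/\pi:

```python
import numpy as np

# Midpoint quadrature of a_k = (1/pi) * integral of f(x) cos(kx) over [-pi, pi]
# for f(x) = |x|.
N = 100000
dx = 2 * np.pi / N
x = -np.pi + dx * (np.arange(N) + 0.5)
f = np.abs(x)

def a(k):
    return np.sum(f * np.cos(k * x)) * dx / np.pi

print(a(0))   # pi:    so the mean-value term is a_0 / 2 = pi / 2
print(a(1))   # -4/pi: first cosine coefficient
print(a(2))   # ~0:    even-k coefficients vanish for k > 0
```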
Fourier approximation of |x|
... Example ...

f(x) = |x|,    -\pi \le x \le \pi

leads to the Fourier series

g(x) = \frac{\pi}{2} - \frac{4}{\pi} \left\{ \frac{\cos(x)}{1^2} + \frac{\cos(3x)}{3^2} + \frac{\cos(5x)}{5^2} + ... \right\}
.. and for n < 4, g(x) looks like

[Figure: f(x) = |x| and its low-order Fourier approximation g(x).]
Fourier approximation of x^2

... another Example ...

f(x) = x^2,    0 < x < 2\pi

leads to the Fourier series

g_N(x) = \frac{4\pi^2}{3} + \sum_{k=1}^{N} \left\{ \frac{4}{k^2} \cos(kx) - \frac{4\pi}{k} \sin(kx) \right\}
.. and for N < 11, g(x) looks like

[Figure: f(x) = x^2 and its Fourier approximation g_N(x).]
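The partial sums can be evaluated directly to watch the convergence; a sketch (Python/NumPy assumed), using the fact that at the smooth interior point x = \pi the series must converge to f(\pi) = \pi^2:

```python
import numpy as np

# Partial sums of the Fourier series of f(x) = x^2 on (0, 2*pi).
def g(x, N):
    k = np.arange(1, N + 1)
    return 4 * np.pi**2 / 3 + np.sum(4 / k**2 * np.cos(k * x)
                                     - 4 * np.pi / k * np.sin(k * x))

# At the interior point x = pi the partial sums approach f(pi) = pi^2.
for N in (10, 100, 1000):
    print(N, abs(g(np.pi, N) - np.pi**2))
```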
Fourier - discrete functions
... what happens if we know our function f(x) only at the points

x_i = \frac{2\pi}{N} i
it turns out that in this particular case the coefficients are given by

a_k^* = \frac{2}{N} \sum_{j=1}^{N} f(x_j) \cos(kx_j),    k = 0, 1, 2, ...

b_k^* = \frac{2}{N} \sum_{j=1}^{N} f(x_j) \sin(kx_j),    k = 1, 2, 3, ...
.. the so-defined Fourier polynomial is the unique interpolating function to the function f(x_j) with N = 2m:

g_m^*(x) = \frac{1}{2} a_0^* + \sum_{k=1}^{m-1} \{ a_k^* \cos(kx) + b_k^* \sin(kx) \} + \frac{1}{2} a_m^* \cos(mx)
Fourier - collocation points
... with the important property that ...

g_m^*(x_i) = f(x_i)
... in our previous examples ...

[Figure: f(x) = |x| (blue), interpolant g(x) (red), collocation points x_i ('+').]
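The interpolation property can be checked by implementing the discrete coefficients directly; a sketch (Python/NumPy assumed) for f(x) = x^2 with N = 16:

```python
import numpy as np

N = 16            # number of samples, N = 2m
m = N // 2
j = np.arange(1, N + 1)
xj = 2 * np.pi * j / N
f = xj**2         # sample values of f(x) = x^2

# Discrete Fourier coefficients a_k*, b_k*
a = np.array([2 / N * np.sum(f * np.cos(k * xj)) for k in range(m + 1)])
b = np.array([2 / N * np.sum(f * np.sin(k * xj)) for k in range(m + 1)])

def g(x):
    # Trigonometric interpolant with the half-weighted Nyquist cosine term.
    s = a[0] / 2 + a[m] / 2 * np.cos(m * x)
    for k in range(1, m):
        s += a[k] * np.cos(k * x) + b[k] * np.sin(k * x)
    return s

# The interpolant reproduces f exactly at the collocation points:
print(np.max(np.abs(g(xj) - f)))   # close to machine precision
```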
Fourier series - convergence

[Figures: f(x) = x^2 (blue), g(x) (red), collocation points x_i ('+'), for an increasing number of points.]
Gibbs phenomenon

[Figure: f(x) = x^2 (blue), g(x) (red), collocation points x_i ('+'), for N = 16, 32, 128, 256.]

The overshoot for equispaced Fourier interpolations is ≈ 14% of the step height.
Chebyshev polynomials
We have seen that Fourier series are excellent for interpolating
(and differentiating) periodic functions defined on a regularly
spaced grid. In many circumstances, however, physical phenomena
are not periodic (in space) and occur in a limited area. This
leads to the use of Chebyshev polynomials.
We start by observing that cos(n\varphi) can be expressed by a
polynomial in cos(\varphi):

\cos(2\varphi) = 2\cos^2\varphi - 1
\cos(3\varphi) = 4\cos^3\varphi - 3\cos\varphi
\cos(4\varphi) = 8\cos^4\varphi - 8\cos^2\varphi + 1
... which leads us to the definition:
Chebyshev polynomials - definition
\cos(n\varphi) = T_n(\cos(\varphi)) = T_n(x),    x = \cos(\varphi),    x \in [-1,1],    n \in \mathbb{N}
... for the Chebyshev polynomials T_n(x). Note that because of
x = cos(\varphi) they are defined in the interval [-1,1] (which, however, can be extended to \mathbb{R}). The first polynomials are
T_0(x) = 1
T_1(x) = x
T_2(x) = 2x^2 - 1
T_3(x) = 4x^3 - 3x
T_4(x) = 8x^4 - 8x^2 + 1

where |T_n(x)| \le 1 for x \in [-1,1] and n \in \mathbb{N}_0
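A quick numerical check of the definition (Python/NumPy assumed): building T_n by the standard three-term recurrence T_{n+1}(x) = 2x T_n(x) - T_{n-1}(x) reproduces cos(n arccos x) and respects the bound |T_n(x)| ≤ 1:

```python
import numpy as np

# Build T_n via the recurrence T_{n+1}(x) = 2x T_n(x) - T_{n-1}(x)
# and compare with the defining relation T_n(cos(phi)) = cos(n*phi).
x = np.linspace(-1, 1, 201)
T = [np.ones_like(x), x.copy()]
for n in range(1, 5):
    T.append(2 * x * T[n] - T[n - 1])

phi = np.arccos(x)
for n in range(6):
    assert np.allclose(T[n], np.cos(n * phi))      # matches the definition
    assert np.max(np.abs(T[n])) <= 1 + 1e-12       # |T_n(x)| <= 1 on [-1, 1]
print("recurrence matches cos(n arccos x) for n = 0..5")
```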
Chebyshev polynomials - Graphical
The first ten polynomials look like:

[Figure: T_n(x) for the first several n, plotted on [-1, 1].]
The n-th polynomial has extrema with values 1 or -1 at

x_k^{(ext)} = \cos\left( \frac{k\pi}{n} \right),    k = 0, 1, 2, 3, ..., n
Chebyshev collocation points
These extrema are not equidistant (as the Fourier extrema are):

x_k^{(ext)} = \cos\left( \frac{k\pi}{n} \right),    k = 0, 1, 2, 3, ..., n

[Figure: the collocation points x(k) cluster near the boundaries x = ±1.]
Chebyshev polynomials - orthogonality
... are the Chebyshev polynomials orthogonal?
Chebyshev polynomials are an orthogonal set of functions in the
interval [-1,1] with respect to the weight function 1/\sqrt{1-x^2}
such that
\int_{-1}^{1} T_k(x) T_j(x) \frac{dx}{\sqrt{1-x^2}} =
  0        for k \ne j
  \pi/2    for k = j > 0
  \pi      for k = j = 0
,    k, j \in \mathbb{N}_0
... this can be easily verified noting that
x = \cos\varphi,    dx = -\sin\varphi \, d\varphi

T_k(x) = \cos(k\varphi),    T_j(x) = \cos(j\varphi)
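Using exactly this substitution, the weighted integrals reduce to plain cosine integrals that can be checked with a midpoint rule (Python/NumPy assumed):

```python
import numpy as np

# After x = cos(phi), the weighted integral of T_k * T_j becomes
# integral_0^pi cos(k*phi) cos(j*phi) dphi, evaluated here by midpoint rule.
Np = 200000
dphi = np.pi / Np
phi = dphi * (np.arange(Np) + 0.5)

def I(k, j):
    return np.sum(np.cos(k * phi) * np.cos(j * phi)) * dphi

assert abs(I(2, 3)) < 1e-8               # k != j
assert abs(I(3, 3) - np.pi / 2) < 1e-8   # k = j > 0
assert abs(I(0, 0) - np.pi) < 1e-8       # k = j = 0
print("Chebyshev orthogonality confirmed")
```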
Chebyshev polynomials - interpolation
... we are now faced with the same problem as with the Fourier
series. We want to approximate a function f(x), this time not a
periodic function but a function defined on [-1,1].
We are looking for g_n(x):
f(x) \approx g_n(x) = \frac{1}{2} c_0 T_0(x) + \sum_{k=1}^{n} c_k T_k(x)
... and we are faced with the problem of how to determine the
coefficients c_k. Again we obtain them by finding the extremum
(minimum):
\frac{\partial}{\partial c_k} \left[ \int_{-1}^{1} \{ g_n(x) - f(x) \}^2 \frac{dx}{\sqrt{1-x^2}} \right] = 0
Chebyshev polynomials - interpolation
... to obtain ...
c_k = \frac{2}{\pi} \int_{-1}^{1} f(x) T_k(x) \frac{dx}{\sqrt{1-x^2}},    k = 0, 1, 2, ..., n
... surprisingly these coefficients can be calculated with FFT
techniques, noting that
c_k = \frac{2}{\pi} \int_{0}^{\pi} f(\cos\varphi) \cos(k\varphi) \, d\varphi,    k = 0, 1, 2, ..., n
... and the fact that f(cosϕ) is a 2π-periodic function ...
c_k = \frac{1}{\pi} \int_{-\pi}^{\pi} f(\cos\varphi) \cos(k\varphi) \, d\varphi,    k = 0, 1, 2, ..., n
... which means that the coefficients ck are the Fourier coefficients
ak of the periodic function F(ϕ)=f(cos ϕ)!
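A sketch of this connection (Python/NumPy assumed): evaluating c_k = (2/\pi) \int_0^\pi f(\cos\varphi)\cos(k\varphi)\,d\varphi numerically for f(x) = x^2 recovers the exact Chebyshev expansion x^2 = (1/2) c_0 T_0 + c_2 T_2 with c_0 = 1 and c_2 = 1/2:

```python
import numpy as np

# Midpoint quadrature of c_k = (2/pi) * integral_0^pi f(cos(phi)) cos(k*phi) dphi
# for f(x) = x^2, i.e. the Fourier cosine coefficients of F(phi) = cos(phi)^2.
Np = 100000
dphi = np.pi / Np
phi = dphi * (np.arange(Np) + 0.5)
F = np.cos(phi)**2                      # f(x) = x^2 evaluated at x = cos(phi)

c = [2 / np.pi * np.sum(F * np.cos(k * phi)) * dphi for k in range(4)]
# Exact expansion: x^2 = (1/2) c_0 T_0(x) + c_2 T_2(x) with c_0 = 1, c_2 = 1/2
print(np.round(c, 6))
```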
Chebyshev - discrete functions
... what happens if we know our function f(x) only at the points

x_i = \cos\left( \frac{\pi}{N} i \right)
in this particular case the coefficients are given by

c_k^* = \frac{2}{N} \sum_{j=1}^{N} f(\cos\varphi_j) \cos(k\varphi_j),    k = 0, 1, 2, ..., N/2
... leading to the polynomial ...

g_m^*(x) = \frac{1}{2} c_0^* T_0 + \sum_{k=1}^{m} c_k^* T_k(x)
... with the property

g_m^*(x_j) = f(x_j)    at    x_j = \cos(\pi j / N),    j = 0, 1, 2, ..., N
Chebyshev - collocation points - |x|

f(x) = |x| => f(x) - blue; g_n(x) - red; x_i - '+'

[Figure: Chebyshev interpolation of |x| with N = 8 (8 points) and N = 16 (16 points).]
Chebyshev - collocation points - |x|

f(x) = |x| => f(x) - blue; g_n(x) - red; x_i - '+'

[Figure: Chebyshev interpolation of |x| with N = 32 (32 points) and N = 128 (128 points).]
Chebyshev - collocation points - x^2

f(x) = x^2 => f(x) - blue; g_n(x) - red; x_i - '+'

[Figure: Chebyshev interpolation of x^2 with N = 8 (8 points) and N = 64 (64 points). The interpolating function g_n(x) was shifted by a small amount to be visible at all!]
Chebyshev vs. Fourier - numerical

[Figure: Chebyshev (N = 16, left) and Fourier (right) interpolation of f(x) = x^2; f(x) - blue, g_N(x) - red, x_i - '+'.]

This graph speaks for itself! Gibbs phenomenon with Chebyshev?
Chebyshev vs. Fourier - Gibbs phenomenon

[Figure: Chebyshev (N = 16, left) and Fourier (N = 16, right) interpolation of f(x) = sign(x - \pi); f(x) - blue, g_N(x) - red, x_i - '+'.]

Gibbs phenomenon with Chebyshev? YES!
Chebyshev vs. Fourier - Gibbs phenomenon (cont'd)

[Figure: the same comparison at higher resolution (Chebyshev N = 64, left; Fourier, right).]

f(x) = sign(x - \pi) => f(x) - blue; g_N(x) - red; x_i - '+'
Fourier vs. Chebyshev

domain:
  Fourier: periodic functions
  Chebyshev: limited area [-1,1]

basis functions:
  Fourier: \cos(nx), \sin(nx)
  Chebyshev: T_n(x) = \cos(n\varphi), x = \cos\varphi

collocation points:
  Fourier: x_i = \frac{2\pi}{N} i
  Chebyshev: x_i = \cos\left( \frac{\pi}{N} i \right)

interpolating function:
  Fourier: g_m^*(x) = \frac{1}{2} a_0^* + \sum_{k=1}^{m-1} \{ a_k^* \cos(kx) + b_k^* \sin(kx) \} + \frac{1}{2} a_m^* \cos(mx)
  Chebyshev: g_m^*(x) = \frac{1}{2} c_0^* T_0 + \sum_{k=1}^{m} c_k^* T_k(x)
Fourier vs. Chebyshev (cont'd)

coefficients:
  Fourier: a_k^* = \frac{2}{N} \sum_{j=1}^{N} f(x_j) \cos(kx_j),    b_k^* = \frac{2}{N} \sum_{j=1}^{N} f(x_j) \sin(kx_j)
  Chebyshev: c_k^* = \frac{2}{N} \sum_{j=1}^{N} f(\cos\varphi_j) \cos(k\varphi_j)

some properties:
  Fourier:
  - Gibbs phenomenon for discontinuous functions
  - efficient calculation via FFT
  - infinite domain through periodicity
  Chebyshev:
  - limited area calculations
  - grid densification at boundaries
  - coefficients via FFT
  - excellent convergence at boundaries
  - Gibbs phenomenon
The Fourier Transform Pair
F(\omega) = \frac{1}{2\pi} \int_{-\infty}^{\infty} f(t) e^{i\omega t} \, dt        Forward transform

f(t) = \int_{-\infty}^{\infty} F(\omega) e^{-i\omega t} \, d\omega                 Inverse transform

Note the conventions concerning the sign of the exponents and the factor 1/(2\pi).
Some properties of the Fourier Transform
Defining f(t) \Rightarrow F(\omega) as the FT:

- Linearity:               a f_1(t) + b f_2(t) \Rightarrow a F_1(\omega) + b F_2(\omega)
- Symmetry:                f(-t) \Rightarrow 2\pi F(-\omega)
- Time shifting:           f(t + \Delta t) \Rightarrow e^{i\omega\Delta t} F(\omega)
- Time differentiation:    \frac{\partial^n f(t)}{\partial t^n} \Rightarrow (-i\omega)^n F(\omega)
Differentiation theorem
- Time differentiation:

\frac{\partial^n f(t)}{\partial t^n} \Rightarrow (-i\omega)^n F(\omega)
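This theorem is the basis of pseudospectral differentiation: transform, multiply by the wavenumber factor, transform back. A sketch (Python/NumPy assumed; note that NumPy's FFT uses the opposite sign convention to the slides, so the derivative factor is +ik here):

```python
import numpy as np

# Pseudospectral differentiation: FFT, multiply by i*k, inverse FFT.
N = 64
x = 2 * np.pi * np.arange(N) / N
f = np.sin(3 * x)

k = np.fft.fftfreq(N, d=1.0 / N)               # integer wavenumbers
df = np.fft.ifft(1j * k * np.fft.fft(f)).real  # spectral derivative

print(np.max(np.abs(df - 3 * np.cos(3 * x))))  # close to machine precision
```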
Convolution
The convolution operation is at the heart of linear systems.
Definition:
f(t) * g(t) = \int_{-\infty}^{\infty} f(t') g(t - t') \, dt' = \int_{-\infty}^{\infty} f(t - t') g(t') \, dt'

Properties:

f(t) * g(t) = g(t) * f(t)
f(t) * \delta(t) = f(t)
f(t) * H(t) = \int_{-\infty}^{t} f(t') \, dt'

H(t) is the Heaviside step function: H(t) = 0 for t < 0 and H(t) = 1 for t \ge 0.
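Discrete analogues of these properties can be checked directly (Python/NumPy assumed; np.convolve implements the discrete convolution sum):

```python
import numpy as np

# Discrete analogues of the convolution properties.
f = np.array([1.0, 2.0, 3.0, 4.0])
g = np.array([0.5, -1.0, 2.0])

# Commutativity: f * g = g * f
assert np.allclose(np.convolve(f, g), np.convolve(g, f))

# Convolution with a discrete delta returns the signal unchanged
delta = np.array([1.0])
assert np.allclose(np.convolve(f, delta), f)

# Convolution with a discrete step (all ones) accumulates running sums,
# the discrete analogue of f * H(t) = integral of f up to t
step = np.ones(len(f))
assert np.allclose(np.convolve(f, step)[:len(f)], np.cumsum(f))
print("convolution properties confirmed")
```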
The convolution theorem
A convolution in the time domain corresponds to a
multiplication in the frequency domain.
… and vice versa …
a convolution in the frequency domain corresponds to a
multiplication in the time domain
f(t) * g(t) \Rightarrow F(\omega) G(\omega)

f(t) g(t) \Rightarrow F(\omega) * G(\omega)
The first relation is of tremendous practical implication!
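For discrete, periodic signals the theorem holds exactly for circular convolution, which is why FFT-based convolution is so useful in practice; a sketch (Python/NumPy assumed):

```python
import numpy as np

# Circular convolution computed two ways: directly from the definition,
# and as ifft(fft(f) * fft(g)) -- the discrete convolution theorem.
rng = np.random.default_rng(0)
N = 8
f = rng.standard_normal(N)
g = rng.standard_normal(N)

# Direct circular convolution: (f * g)[n] = sum_m f[m] g[(n - m) mod N]
direct = np.array([np.sum(f * np.roll(g[::-1], n + 1)) for n in range(N)])

# Same result via the frequency domain
via_fft = np.fft.ifft(np.fft.fft(f) * np.fft.fft(g)).real

print(np.max(np.abs(direct - via_fft)))   # close to machine precision
```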
Summary
- The Fourier Transform can be derived from the problem of approximating an arbitrary function.
- A regular set of points allows exact interpolation (or differentiation) of arbitrary functions.
- There are other basis functions (e.g., Chebyshev polynomials, Legendre polynomials) with similar properties.
- These properties are the basis for the success of the spectral element method.