7.3 The Jacobi and Gauss-Seidel Iterative Methods (cont'd)
The Gauss-Seidel Method. For each $k \ge 1$, generate the components $x_i^{(k)}$ of $\mathbf{x}^{(k)}$ from $\mathbf{x}^{(k-1)}$ by

$$x_i^{(k)} = \frac{1}{a_{ii}}\left[ -\sum_{j=1}^{i-1} a_{ij}\, x_j^{(k)} - \sum_{j=i+1}^{n} a_{ij}\, x_j^{(k-1)} + b_i \right], \qquad i = 1, 2, \ldots, n.$$

Namely,

$$a_{i1} x_1^{(k)} + a_{i2} x_2^{(k)} + \cdots + a_{ii} x_i^{(k)} = -a_{i,i+1} x_{i+1}^{(k-1)} - \cdots - a_{in} x_n^{(k-1)} + b_i, \qquad i = 1, 2, \ldots, n.$$
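As an illustration (not part of the original notes; the small system below is an assumed example), one Gauss-Seidel sweep implementing the componentwise formula above can be sketched in Python/NumPy as follows.

```python
import numpy as np

def gauss_seidel_sweep(A, b, x):
    """One Gauss-Seidel sweep: each x[i] is recomputed using the already-updated
    components x[0..i-1] and the old components x[i+1..n-1]."""
    n = len(b)
    for i in range(n):
        # sum over j < i uses new values, sum over j > i uses old values;
        # updating x in place gives exactly this behavior
        s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
        x[i] = (b[i] - s) / A[i, i]
    return x

# one sweep on a small strictly diagonally dominant system (values assumed)
A = np.array([[10.0, -1.0,  2.0],
              [-1.0, 11.0, -1.0],
              [ 2.0, -1.0, 10.0]])
b = np.array([6.0, 25.0, -11.0])
x = gauss_seidel_sweep(A, b, np.zeros(3))
```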
Matrix form of Gauss-Seidel method.

Collecting the equations above for $i = 1, \ldots, n$ gives, with $A = D - L - U$ as in the Jacobi method ($D$ the diagonal of $A$, $-L$ and $-U$ its strictly lower and upper triangular parts),

$$(D - L)\,\mathbf{x}^{(k)} = U\mathbf{x}^{(k-1)} + \mathbf{b}.$$

Define

$$T_g = (D - L)^{-1} U \quad \text{and} \quad \mathbf{c}_g = (D - L)^{-1}\mathbf{b},$$

and the Gauss-Seidel method can be written as

$$\mathbf{x}^{(k)} = T_g\, \mathbf{x}^{(k-1)} + \mathbf{c}_g.$$

Numerical Algorithm of Gauss-Seidel Method

Input: $A$, $\mathbf{b}$, initial approximation $XO = \mathbf{x}^{(0)}$, tolerance TOL, maximum number of iterations $N$.

Step 1 Set $k = 1$.
Step 2 While ($k \le N$) do Steps 3-6.
Step 3 For $i = 1, \ldots, n$, set
$$x_i = \frac{1}{a_{ii}}\left[ -\sum_{j=1}^{i-1} a_{ij}\, x_j - \sum_{j=i+1}^{n} a_{ij}\, XO_j + b_i \right].$$
Step 4 If $\|\mathbf{x} - XO\| < \text{TOL}$, then OUTPUT ($x_1, \ldots, x_n$); STOP.
Step 5 Set $k = k + 1$.
Step 6 For $i = 1, \ldots, n$, set $XO_i = x_i$.
Step 7 OUTPUT (maximum number of iterations exceeded); STOP.
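A Python/NumPy sketch of Steps 1-7 (an illustration, not part of the notes; the infinity norm is assumed for the stopping test in Step 4):

```python
import numpy as np

def gauss_seidel(A, b, x0, tol, nmax):
    """Gauss-Seidel iteration following Steps 1-7; returns (x, k) on success."""
    n = len(b)
    xo = np.array(x0, dtype=float)                 # XO holds x^(k-1)
    k = 1                                          # Step 1
    while k <= nmax:                               # Step 2
        x = xo.copy()
        for i in range(n):                         # Step 3
            s = A[i, :i] @ x[:i] + A[i, i+1:] @ xo[i+1:]
            x[i] = (b[i] - s) / A[i, i]
        if np.linalg.norm(x - xo, np.inf) < tol:   # Step 4
            return x, k
        k += 1                                     # Step 5
        xo = x                                     # Step 6
    raise RuntimeError("maximum number of iterations exceeded")   # Step 7
```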
Convergence theorems of the iteration methods
Let the iteration method be written as

$$\mathbf{x}^{(k)} = T\mathbf{x}^{(k-1)} + \mathbf{c}, \qquad k = 1, 2, \ldots$$

Lemma 7.18 If the spectral radius satisfies $\rho(T) < 1$, then $(I - T)^{-1}$ exists, and
$$(I - T)^{-1} = I + T + T^2 + \cdots = \sum_{j=0}^{\infty} T^j.$$

Theorem 7.19 For any $\mathbf{x}^{(0)} \in \mathbb{R}^n$, the sequence $\{\mathbf{x}^{(k)}\}_{k=0}^{\infty}$ defined by
$$\mathbf{x}^{(k)} = T\mathbf{x}^{(k-1)} + \mathbf{c}, \qquad k \ge 1,$$
converges to the unique solution of $\mathbf{x} = T\mathbf{x} + \mathbf{c}$ if and only if $\rho(T) < 1$.

Proof (only show that $\rho(T) < 1$ is a sufficient condition). Since
$$\mathbf{x}^{(k)} = T\mathbf{x}^{(k-1)} + \mathbf{c} = T\left(T\mathbf{x}^{(k-2)} + \mathbf{c}\right) + \mathbf{c} = \cdots = T^k \mathbf{x}^{(0)} + \left(T^{k-1} + \cdots + T + I\right)\mathbf{c},$$
and $\rho(T) < 1$ implies $T^k \mathbf{x}^{(0)} \to \mathbf{0}$ as $k \to \infty$, Lemma 7.18 gives
$$\lim_{k\to\infty} \mathbf{x}^{(k)} = \mathbf{0} + \left(\sum_{j=0}^{\infty} T^j\right)\mathbf{c} = (I - T)^{-1}\mathbf{c}.$$
Hence $\mathbf{x} = \lim_{k\to\infty}\mathbf{x}^{(k)}$ satisfies $\mathbf{x} = T\mathbf{x} + \mathbf{c}$.
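The following sketch (an illustration, not part of the notes; the $2\times 2$ test matrix is an arbitrary choice with $\rho(T) < 1$) checks Lemma 7.18 and Theorem 7.19 numerically.

```python
import numpy as np

# assumed test data: any T with spectral radius < 1 will do
T = np.array([[0.2, 0.1],
              [0.3, 0.4]])
c = np.array([1.0, 2.0])

rho = max(abs(np.linalg.eigvals(T)))          # spectral radius, here 0.5
assert rho < 1

# Lemma 7.18: (I - T)^{-1} equals the Neumann series sum_{j>=0} T^j
S = sum(np.linalg.matrix_power(T, j) for j in range(60))
print(np.allclose(S, np.linalg.inv(np.eye(2) - T)))        # True

# Theorem 7.19: x^(k) = T x^(k-1) + c converges to the solution of x = Tx + c
x = np.zeros(2)
for _ in range(200):
    x = T @ x + c
print(np.allclose(x, np.linalg.solve(np.eye(2) - T, c)))   # True
```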
Corollary 7.20 If $\|T\| < 1$ for any natural matrix norm and $\mathbf{c}$ is a given vector, then the sequence $\{\mathbf{x}^{(k)}\}_{k=0}^{\infty}$ defined by $\mathbf{x}^{(k)} = T\mathbf{x}^{(k-1)} + \mathbf{c}$ converges, for any $\mathbf{x}^{(0)} \in \mathbb{R}^n$, to a vector $\mathbf{x}$ with $\mathbf{x} = T\mathbf{x} + \mathbf{c}$, and the following error bounds hold:
(i) $\|\mathbf{x} - \mathbf{x}^{(k)}\| \le \|T\|^k \, \|\mathbf{x} - \mathbf{x}^{(0)}\|$;
(ii) $\|\mathbf{x} - \mathbf{x}^{(k)}\| \le \dfrac{\|T\|^k}{1 - \|T\|} \, \|\mathbf{x}^{(1)} - \mathbf{x}^{(0)}\|$.

Theorem 7.21 If $A$ is strictly diagonally dominant, then for any choice of $\mathbf{x}^{(0)}$, both the Jacobi and Gauss-Seidel methods give sequences $\{\mathbf{x}^{(k)}\}_{k=0}^{\infty}$ that converge to the unique solution of $A\mathbf{x} = \mathbf{b}$.

Rate of Convergence

Corollary 7.20 (i) implies $\|\mathbf{x} - \mathbf{x}^{(k)}\| \approx \rho(T)^k \, \|\mathbf{x} - \mathbf{x}^{(0)}\|$, so the smaller $\rho(T)$ is, the faster the iteration converges.
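To make the rate-of-convergence remark concrete, the sketch below (an illustration, not part of the notes; the strictly diagonally dominant matrix is an assumed example) builds the Jacobi and Gauss-Seidel iteration matrices and compares their spectral radii.

```python
import numpy as np

A = np.array([[10.0, -1.0,  2.0],    # strictly diagonally dominant (assumed example)
              [-1.0, 11.0, -1.0],
              [ 2.0, -1.0, 10.0]])

D = np.diag(np.diag(A))
L = -np.tril(A, -1)                  # negatives of the strictly lower part, so A = D - L - U
U = -np.triu(A, 1)                   # negatives of the strictly upper part

Tj = np.linalg.solve(D, L + U)       # Jacobi iteration matrix  D^{-1}(L + U)
Tg = np.linalg.solve(D - L, U)       # Gauss-Seidel iteration matrix  (D - L)^{-1} U

rho = lambda M: max(abs(np.linalg.eigvals(M)))
print(rho(Tj), rho(Tg))              # both < 1 here; the Gauss-Seidel radius is the smaller
```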
Theorem 7.22 (Stein-Rosenberg) If $a_{ij} \le 0$ for each $i \ne j$ and $a_{ii} > 0$ for each $i = 1, 2, \ldots, n$, then one and only one of the following statements holds:
(i) $0 \le \rho(T_g) < \rho(T_j) < 1$;
(ii) $1 < \rho(T_j) < \rho(T_g)$;
(iii) $\rho(T_j) = \rho(T_g) = 0$;
(iv) $\rho(T_j) = \rho(T_g) = 1$.
7.4 Relaxation Techniques for Solving Linear Systems
Definition Suppose $\tilde{\mathbf{x}} \in \mathbb{R}^n$ is an approximation to the solution of the linear system defined by $A\mathbf{x} = \mathbf{b}$. The residual vector for $\tilde{\mathbf{x}}$ with respect to this system is $\mathbf{r} = \mathbf{b} - A\tilde{\mathbf{x}}$.
Objective of accelerating convergence: make the residual vector converge to 0 rapidly.
In the Gauss-Seidel method, we first associate with each calculation of an approximate component $x_i^{(k)}$ of the solution a residual vector

$$\mathbf{r}_i^{(k)} = \left(r_{1i}^{(k)}, r_{2i}^{(k)}, \ldots, r_{ni}^{(k)}\right)^t,$$

the residual of the vector $\left(x_1^{(k)}, \ldots, x_{i-1}^{(k)}, x_i^{(k-1)}, \ldots, x_n^{(k-1)}\right)^t$. The $i$th component of $\mathbf{r}_i^{(k)}$ is

$$r_{ii}^{(k)} = b_i - \sum_{j=1}^{i-1} a_{ij}\, x_j^{(k)} - \sum_{j=i+1}^{n} a_{ij}\, x_j^{(k-1)} - a_{ii}\, x_i^{(k-1)},$$

so

$$a_{ii}\, x_i^{(k-1)} + r_{ii}^{(k)} = b_i - \sum_{j=1}^{i-1} a_{ij}\, x_j^{(k)} - \sum_{j=i+1}^{n} a_{ij}\, x_j^{(k-1)}.$$

Also, $x_i^{(k)}$ is computed by

$$x_i^{(k)} = \frac{1}{a_{ii}}\left[ b_i - \sum_{j=1}^{i-1} a_{ij}\, x_j^{(k)} - \sum_{j=i+1}^{n} a_{ij}\, x_j^{(k-1)} \right].$$

Therefore

$$a_{ii}\, x_i^{(k-1)} + r_{ii}^{(k)} = a_{ii}\, x_i^{(k)}.$$

Gauss-Seidel method is characterized by choosing each $x_i^{(k)}$ so that

$$x_i^{(k)} = x_i^{(k-1)} + \frac{r_{ii}^{(k)}}{a_{ii}}.$$
Now consider the residual vector $\mathbf{r}_{i+1}^{(k)}$ associated with the vector $\left(x_1^{(k)}, \ldots, x_i^{(k)}, x_{i+1}^{(k-1)}, \ldots, x_n^{(k-1)}\right)^t$. The $i$th component of $\mathbf{r}_{i+1}^{(k)}$ is

$$r_{i,i+1}^{(k)} = b_i - \sum_{j=1}^{i-1} a_{ij}\, x_j^{(k)} - \sum_{j=i+1}^{n} a_{ij}\, x_j^{(k-1)} - a_{ii}\, x_i^{(k)}.$$

By the way $x_i^{(k)}$ is computed, this component vanishes: $r_{i,i+1}^{(k)} = 0$.
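The two identities above can be checked numerically; the sketch below (an illustration, not part of the notes, with assumed data) performs the update $x_i^{(k)} = x_i^{(k-1)} + r_{ii}^{(k)}/a_{ii}$ and verifies that the $i$th residual component is zero afterwards.

```python
import numpy as np

A = np.array([[10.0, -1.0,  2.0],
              [-1.0, 11.0, -1.0],
              [ 2.0, -1.0, 10.0]])
b = np.array([6.0, 25.0, -11.0])

x = np.zeros(3)                      # plays the role of x^(k-1) (assumed start)
for i in range(len(b)):
    r_i = b[i] - A[i] @ x            # i-th residual component before updating x_i
    x[i] += r_i / A[i, i]            # Gauss-Seidel step written via the residual
    print(b[i] - A[i] @ x)           # i-th residual component afterwards: 0 (up to rounding)
```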
Idea of Successive Over-Relaxation (SOR) (technique to accelerate convergence)

Modify the Gauss-Seidel update $x_i^{(k)} = x_i^{(k-1)} + r_{ii}^{(k)}/a_{ii}$ to

$$x_i^{(k)} = x_i^{(k-1)} + \omega\, \frac{r_{ii}^{(k)}}{a_{ii}},$$

so that the norm of the residual vector converges to 0 rapidly. Here $\omega > 0$:

Under-relaxation method when $0 < \omega < 1$;
Over-relaxation method when $\omega > 1$.

Use $x_i^{(k)} = x_i^{(k-1)} + \omega\, r_{ii}^{(k)}/a_{ii}$ and the expression for $r_{ii}^{(k)}$ to obtain

$$x_i^{(k)} = (1 - \omega)\, x_i^{(k-1)} + \frac{\omega}{a_{ii}} \left[ b_i - \sum_{j=1}^{i-1} a_{ij}\, x_j^{(k)} - \sum_{j=i+1}^{n} a_{ij}\, x_j^{(k-1)} \right]. \qquad (5)$$
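A minimal Python sketch of one sweep of Eq. (5) (an illustration, not part of the notes; the function name and in-place update are choices, not fixed by the text):

```python
import numpy as np

def sor_sweep(A, b, x, omega):
    """One SOR sweep, Eq. (5):
    x_i <- (1 - omega) x_i + (omega / a_ii) [b_i - sum_{j<i} a_ij x_j^new - sum_{j>i} a_ij x_j^old]."""
    for i in range(len(b)):
        s = A[i, :i] @ x[:i] + A[i, i+1:] @ x[i+1:]
        x[i] = (1 - omega) * x[i] + omega * (b[i] - s) / A[i, i]
    return x
```

Repeated sweeps with $1 < \omega$ give over-relaxation, while $0 < \omega < 1$ gives under-relaxation; $\omega = 1$ recovers the Gauss-Seidel sweep.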
Matrix form of SOR

Rewrite Eq. (5) as

$$a_{ii}\, x_i^{(k)} + \omega \sum_{j=1}^{i-1} a_{ij}\, x_j^{(k)} = (1 - \omega)\, a_{ii}\, x_i^{(k-1)} - \omega \sum_{j=i+1}^{n} a_{ij}\, x_j^{(k-1)} + \omega b_i,$$

which in matrix form is

$$(D - \omega L)\,\mathbf{x}^{(k)} = \left[(1 - \omega) D + \omega U\right]\mathbf{x}^{(k-1)} + \omega \mathbf{b}.$$

Define

$$T_\omega = (D - \omega L)^{-1}\left[(1 - \omega) D + \omega U\right] \quad \text{and} \quad \mathbf{c}_\omega = \omega\, (D - \omega L)^{-1}\mathbf{b}.$$

SOR can be written as

$$\mathbf{x}^{(k)} = T_\omega\, \mathbf{x}^{(k-1)} + \mathbf{c}_\omega.$$

Example Use SOR with $\omega = \ldots$ to solve $\ldots$ with $\mathbf{x}^{(0)} = \ldots$
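As an illustration of the matrix form (not part of the notes; the system, $\omega$, and starting vector below are assumptions, not the data of the example above), $T_\omega$ and $\mathbf{c}_\omega$ can be built explicitly:

```python
import numpy as np

def sor_matrix_form(A, b, x0, omega, nsteps):
    """Iterate x^(k) = T_w x^(k-1) + c_w with T_w = (D - wL)^{-1}[(1 - w)D + wU]."""
    D = np.diag(np.diag(A))
    L = -np.tril(A, -1)
    U = -np.triu(A, 1)
    M = D - omega * L
    Tw = np.linalg.solve(M, (1 - omega) * D + omega * U)
    cw = omega * np.linalg.solve(M, b)
    x = np.array(x0, dtype=float)
    for _ in range(nsteps):
        x = Tw @ x + cw
    return x

# assumed illustrative data
A = np.array([[4.0, 3.0, 0.0],
              [3.0, 4.0, -1.0],
              [0.0, -1.0, 4.0]])
b = np.array([24.0, 30.0, -24.0])
print(sor_matrix_form(A, b, np.ones(3), omega=1.25, nsteps=25))
```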