Hindawi Publishing Corporation
Journal of Applied Mathematics
Volume 2012, Article ID 274636, 16 pages
doi:10.1155/2012/274636
Research Article
Improving Results on Convergence of AOR Method
Guangbin Wang, Ting Wang, and Yanli Du
Department of Mathematics, Qingdao University of Science and Technology, Qingdao 266061, China
Correspondence should be addressed to Guangbin Wang, wguangbin750828@sina.com
Received 16 June 2012; Revised 27 August 2012; Accepted 4 October 2012
Academic Editor: Carlos J. S. Alves
Copyright © 2012 Guangbin Wang et al. This is an open access article distributed under the
Creative Commons Attribution License, which permits unrestricted use, distribution, and
reproduction in any medium, provided the original work is properly cited.
We present some sufficient conditions for convergence of the AOR method for solving Ax = b with A
a strictly doubly α diagonally dominant matrix. Moreover, we give two numerical examples
to show the advantage of the new results.
1. Introduction
Let C^{n×n} denote the set of all n × n complex matrices and C^n the set of all complex n-vectors. For A = (a_{ij}) ∈ C^{n×n}, we denote by ρ(A) the spectral radius of the matrix A.

Let us consider the linear system Ax = b, where b ∈ C^n is a given vector and x ∈ C^n is an unknown vector. Let A = D − T − S, where D is the diagonal matrix of A and −T and −S are the strictly lower and strictly upper triangular parts of A, respectively, and denote

L = D^{−1}T,  U = D^{−1}S,  (1.1)

where det(D) ≠ 0.

Then the AOR method [1] can be written as

x^{(k+1)} = M_{σ,ω} x^{(k)} + d,  k = 0, 1, …,  x^{(0)} ∈ C^n,  (1.2)

where

M_{σ,ω} = (I − σL)^{−1}[(1 − ω)I + (ω − σ)L + ωU],  d = ω(I − σL)^{−1}b,  ω, σ ∈ R.  (1.3)
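The iteration (1.2)–(1.3) is easy to prototype. Below is a minimal NumPy sketch (the function names are ours, not the paper's); note that b is premultiplied by D^{−1} so that the fixed point of the iteration solves Ax = b:

```python
import numpy as np

def aor_matrices(A, sigma, omega):
    """Return (M, L) for the AOR splitting A = D - T - S with L = D^{-1}T, U = D^{-1}S."""
    n = A.shape[0]
    D = np.diag(np.diag(A))
    L = np.linalg.solve(D, -np.tril(A, -1))   # -T is the strictly lower part of A
    U = np.linalg.solve(D, -np.triu(A, 1))    # -S is the strictly upper part of A
    I = np.eye(n)
    M = np.linalg.solve(I - sigma * L,
                        (1 - omega) * I + (omega - sigma) * L + omega * U)
    return M, L

def aor_solve(A, b, sigma, omega, tol=1e-12, maxit=1000):
    """Iterate x^{(k+1)} = M x^{(k)} + d with d = omega (I - sigma L)^{-1} D^{-1} b."""
    M, L = aor_matrices(A, sigma, omega)
    n = A.shape[0]
    d = omega * np.linalg.solve(np.eye(n) - sigma * L,
                                b / np.diag(A))          # D^{-1} b, entrywise
    x = np.zeros(n)
    for _ in range(maxit):
        x_new = M @ x + d
        if np.linalg.norm(x_new - x, np.inf) < tol:
            break
        x = x_new
    return x_new
```

For σ = ω the scheme reduces to SOR, and for σ = 0, ω = 1 to the Jacobi method.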
2. Preliminaries
We denote

R_i(A) = Σ_{j≠i} |a_{ij}|,  S_i(A) = Σ_{j≠i} |a_{ji}|,  P_{i,α}(A) = αR_i(A) + (1 − α)S_i(A),  ∀i ∈ N.  (2.1)

For any matrix A = (a_{ij}) ∈ C^{n×n}, the comparison matrix M(A) = (m_{ij}) ∈ R^{n×n} is defined by

m_{ii} = |a_{ii}|,  m_{ij} = −|a_{ij}|,  i, j ∈ N, i ≠ j,  N = {1, 2, …, n}.  (2.2)
Definition 2.1 (see [2]). A matrix A ∈ C^{n×n} is called a strictly diagonally dominant matrix (SD) if

|a_{ii}| > R_i(A),  ∀i ∈ N.  (2.3)

A matrix A ∈ C^{n×n} is called a strictly doubly diagonally dominant matrix (DD) if

|a_{ii}||a_{jj}| > R_i(A)R_j(A),  ∀i, j ∈ N, i ≠ j.  (2.4)

Definition 2.2 (see [3]). A matrix A ∈ C^{n×n} is called a strictly α diagonally dominant matrix (D(α)) if there exists α ∈ [0, 1] such that

|a_{ii}| > αR_i(A) + (1 − α)S_i(A),  ∀i ∈ N.  (2.5)

Definition 2.3 (see [4]). Let A = (a_{ij}) ∈ C^{n×n}. If there exists α ∈ [0, 1] such that

|a_{ii}||a_{jj}| > αR_i(A)R_j(A) + (1 − α)S_i(A)S_j(A),  ∀i, j ∈ N, i ≠ j,  (2.6)

then A is called a strictly doubly α diagonally dominant matrix (DD(α)).
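Definition 2.3 is straightforward to check numerically. The following sketch (the helper names are ours, not the paper's) tests the strict inequality (2.6) over all pairs i ≠ j:

```python
import numpy as np

def R(A, i):
    """Deleted absolute row sum R_i(A) = sum over j != i of |a_ij|."""
    return np.sum(np.abs(A[i])) - abs(A[i, i])

def S(A, i):
    """Deleted absolute column sum S_i(A) = sum over j != i of |a_ji|."""
    return np.sum(np.abs(A[:, i])) - abs(A[i, i])

def is_dd_alpha(A, alpha):
    """Definition 2.3: |a_ii||a_jj| > alpha R_i R_j + (1 - alpha) S_i S_j for all i != j."""
    n = A.shape[0]
    return all(
        abs(A[i, i]) * abs(A[j, j])
        > alpha * R(A, i) * R(A, j) + (1 - alpha) * S(A, i) * S(A, j)
        for i in range(n) for j in range(n) if i != j
    )
```

Taking α = 1 recovers the DD test of Definition 2.1; for instance, the matrix of Example 5.1 below passes the test with α = 1/2 although it is not strictly diagonally dominant.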
The convergence of the AOR method for solving the linear system Ax = b was studied in [3, 5, 6], where the corresponding areas of convergence were given. In [5], Cvetković and Herceg studied the convergence of the AOR method for strictly diagonally dominant matrices. In [3], Huang and Wang studied the convergence of the AOR method for strictly α diagonally dominant matrices. In [6], Gao and Huang studied the convergence of the AOR method for strictly doubly diagonally dominant matrices.
Theorem 2.4 (see [3]). Let A ∈ D(α); then the AOR method converges for

(I) 0 ≤ σ < s = 2/(1 + ρ(M_{0,1}(M(A)))),  0 < ω < max{2σ/(1 + ρ(M_{σ,σ})), t},  or
(II) max_i [−ω(1 − P_{i,α}(L + U)) + 2 max(0, ω − 1)]/(2P_{i,α}(L)) < σ < 0,  0 < ω < t,  or
(III) t ≤ σ < min_i [ω(1 + P_{i,α}(L) − P_{i,α}(U)) + 2 min(0, 1 − ω)]/(2P_{i,α}(L)),  0 < ω < t,  (2.7)

where t = 2/(1 + max_i P_{i,α}(L + U)).
Theorem 2.5 (see [6]). Let A ∈ DD; then the AOR method converges for

(I) 0 ≤ σ < s = 2/(1 + ρ(M_{0,1}(M(A)))),  0 < ω < max{2σ/(1 + ρ(M_{σ,σ})), t},  where t = min_{i,j; i≠j} 2/(1 + √(R_i(L + U)R_j(L + U))),  or
(II) max_{i,j; i≠j} [ωP_1 − √(ω²P_2² + P_3 min{ω², (ω − 2)²})]/P_3 < σ < 0,  0 < ω < t,  or
(III) t ≤ σ < min_{i,j; i≠j} [ωP_4 + √(ω²P_5² + P_3 min{ω², (ω − 2)²})]/P_3,  0 < ω < t,  (2.8)

where

P_1 = R_i(L)R_j(L + U) + R_i(L + U)R_j(L),
P_2 = R_i(L)R_j(L + U) − R_i(L + U)R_j(L),
P_3 = 4R_i(L)R_j(L),
P_4 = R_i(L)[R_j(L) − R_j(U)] + R_j(L)[R_i(L) − R_i(U)],
P_5 = R_i(L)[R_j(L) − R_j(U)] − R_j(L)[R_i(L) − R_i(U)].  (2.9)
3. Upper Bound for the Spectral Radius of M_{σ,ω}

In the following, we present an upper bound for the spectral radius of the AOR iteration matrix M_{σ,ω} for a strictly doubly α diagonally dominant coefficient matrix.

Lemma 3.1 (see [4]). If A ∈ DD(α), then A is a nonsingular H-matrix.
Theorem 3.2. Let A ∈ DD(α). If 1 − σ²[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] > 0 for all i, j ∈ N, i ≠ j, then

ρ(M_{σ,ω}) ≤ max_{i,j; i≠j} [A_2 + √(A_2² − 4A_1A_3)]/(2A_1),  (3.1)

where

A_1 = 1 − σ²[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)],
A_2 = 2|1 − ω| + 2|ω − σ||σ|[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] + |ω||σ|[αR_i(L)R_j(U) + (1 − α)S_i(U)S_j(L)] + |ω||σ|[αR_i(U)R_j(L) + (1 − α)S_i(L)S_j(U)],
A_3 = (1 − ω)² − α[|ω − σ|R_i(L) + |ω|R_i(U)][|ω − σ|R_j(L) + |ω|R_j(U)] − (1 − α)[|ω − σ|S_i(L) + |ω|S_i(U)][|ω − σ|S_j(L) + |ω|S_j(U)].  (3.2)
Proof. Let λ be an eigenvalue of M_{σ,ω}; then

det(λI − M_{σ,ω}) = 0,  (3.3)

that is,

det[λ(I − σL) − (1 − ω)I − (ω − σ)L − ωU] = 0.  (3.4)

If λ(I − σL) − (1 − ω)I − (ω − σ)L − ωU ∈ DD(α), then by Lemma 3.1 the matrix λ(I − σL) − (1 − ω)I − (ω − σ)L − ωU is nonsingular and λ is not an eigenvalue of the iteration matrix M_{σ,ω}; that is, if

|λ − (1 − ω)|² > α [Σ_{t≠i} |λσl_{it} + (ω − σ)l_{it} + ωu_{it}|] [Σ_{l≠j} |λσl_{jl} + (ω − σ)l_{jl} + ωu_{jl}|]
  + (1 − α) [Σ_{k≠i} |λσl_{ki} + (ω − σ)l_{ki} + ωu_{ki}|] [Σ_{s≠j} |λσl_{sj} + (ω − σ)l_{sj} + ωu_{sj}|],  ∀i, j ∈ N, i ≠ j,  (3.5)

then λ is not an eigenvalue of M_{σ,ω}. In particular, if

(|λ| − |1 − ω|)² > α [Σ_{t≠i} (|λ||σ||l_{it}| + |ω − σ||l_{it}| + |ω||u_{it}|)] [Σ_{l≠j} (|λ||σ||l_{jl}| + |ω − σ||l_{jl}| + |ω||u_{jl}|)]
  + (1 − α) [Σ_{k≠i} (|λ||σ||l_{ki}| + |ω − σ||l_{ki}| + |ω||u_{ki}|)] [Σ_{s≠j} (|λ||σ||l_{sj}| + |ω − σ||l_{sj}| + |ω||u_{sj}|)],  (3.6)

then λ is not an eigenvalue of M_{σ,ω}.
If λ is an eigenvalue of M_{σ,ω}, then there exists at least one pair (i, j), i ≠ j, such that

(|λ| − |1 − ω|)² ≤ α [Σ_{t≠i} (|λ||σ||l_{it}| + |ω − σ||l_{it}| + |ω||u_{it}|)] [Σ_{l≠j} (|λ||σ||l_{jl}| + |ω − σ||l_{jl}| + |ω||u_{jl}|)]
  + (1 − α) [Σ_{k≠i} (|λ||σ||l_{ki}| + |ω − σ||l_{ki}| + |ω||u_{ki}|)] [Σ_{s≠j} (|λ||σ||l_{sj}| + |ω − σ||l_{sj}| + |ω||u_{sj}|)],  (3.7)

that is,

A_1|λ|² − A_2|λ| + A_3 ≤ 0.  (3.8)

Since A_1 = 1 − σ²[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] > 0 and the discriminant Δ of the quadratic in |λ| satisfies Δ ≥ 0, the solution of (3.8) satisfies

[A_2 − √(A_2² − 4A_1A_3)]/(2A_1) ≤ |λ| ≤ [A_2 + √(A_2² − 4A_1A_3)]/(2A_1).  (3.9)
So

ρ(M_{σ,ω}) ≤ max_{i,j; i≠j} [A_2 + √(A_2² − 4A_1A_3)]/(2A_1).  (3.10)
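To make the bound concrete, the following sketch evaluates the right-hand side of (3.1) with the quantities A_1, A_2, A_3 of (3.2), maximised over i ≠ j (the function names and the sample parameters are ours, not the paper's):

```python
import numpy as np

def deleted_row_sum(M, i):
    return np.sum(np.abs(M[i])) - abs(M[i, i])

def deleted_col_sum(M, i):
    return np.sum(np.abs(M[:, i])) - abs(M[i, i])

def lu_parts(A):
    """L = D^{-1}T and U = D^{-1}S for the splitting A = D - T - S."""
    D = np.diag(np.diag(A))
    return (np.linalg.solve(D, -np.tril(A, -1)),
            np.linalg.solve(D, -np.triu(A, 1)))

def aor_radius_bound(A, sigma, omega, alpha):
    """Evaluate the bound (3.1) using A_1, A_2, A_3 from (3.2)."""
    L, U = lu_parts(A)
    n = A.shape[0]
    best = 0.0
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            RiL, RjL = deleted_row_sum(L, i), deleted_row_sum(L, j)
            RiU, RjU = deleted_row_sum(U, i), deleted_row_sum(U, j)
            SiL, SjL = deleted_col_sum(L, i), deleted_col_sum(L, j)
            SiU, SjU = deleted_col_sum(U, i), deleted_col_sum(U, j)
            q = alpha * RiL * RjL + (1 - alpha) * SiL * SjL
            A1 = 1 - sigma**2 * q
            A2 = (2 * abs(1 - omega) + 2 * abs(omega - sigma) * abs(sigma) * q
                  + abs(omega) * abs(sigma) * (alpha * RiL * RjU + (1 - alpha) * SiU * SjL)
                  + abs(omega) * abs(sigma) * (alpha * RiU * RjL + (1 - alpha) * SiL * SjU))
            A3 = ((1 - omega)**2
                  - alpha * (abs(omega - sigma) * RiL + abs(omega) * RiU)
                          * (abs(omega - sigma) * RjL + abs(omega) * RjU)
                  - (1 - alpha) * (abs(omega - sigma) * SiL + abs(omega) * SiU)
                               * (abs(omega - sigma) * SjL + abs(omega) * SjU))
            best = max(best, (A2 + np.sqrt(A2**2 - 4 * A1 * A3)) / (2 * A1))
    return best
```

For the matrix of Example 5.1 below with α = 1/2, σ = 0.5, ω = 0.8, the bound comes out below 1 and above the true spectral radius of M_{σ,ω}, as Theorem 3.2 predicts.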
4. Improving Results on Convergence of AOR Method
In this section, we present new results on the convergence of the AOR method.
Theorem 4.1. Let A ∈ DD(α); then the AOR method converges if ω, σ satisfy either

(I) 0 ≤ σ < s = 2/(1 + ρ(M_{0,1}(M(A)))),  0 < ω < max{2σ/(1 + ρ(M_{σ,σ})), t},  where t = min_{i,j; i≠j} 2/(1 + αR_i(L + U)R_j(L + U) + (1 − α)S_i(L + U)S_j(L + U)),  or
(II) max_{i,j; i≠j} [ωP_1 − √(ω²P_2² + P_3 min{ω², (ω − 2)²})]/P_3 < σ < 0,  0 < ω < t,  or
(III) t ≤ σ < min_{i,j; i≠j} [ωP_4 + √(ω²P_5² + P_3 min{ω², (ω − 2)²})]/P_3,  0 < ω < t,  (4.1)

where

P_1 = αR_i(L)R_j(L + U) + (1 − α)S_i(L)S_j(L + U) + αR_i(L + U)R_j(L) + (1 − α)S_i(L + U)S_j(L),
P_2 = αR_i(L)R_j(L + U) + (1 − α)S_i(L)S_j(L + U) − αR_i(L + U)R_j(L) − (1 − α)S_i(L + U)S_j(L),
P_3 = 4[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)],
P_4 = αR_i(L)[R_j(L) − R_j(U)] + (1 − α)S_i(L)[S_j(L) − S_j(U)] + αR_j(L)[R_i(L) − R_i(U)] + (1 − α)S_j(L)[S_i(L) − S_i(U)],
P_5 = αR_i(L)[R_j(L) − R_j(U)] + (1 − α)S_i(L)[S_j(L) − S_j(U)] − αR_j(L)[R_i(L) − R_i(U)] − (1 − α)S_j(L)[S_i(L) − S_i(U)].  (4.2)
Proof. It is easy to verify that, for each σ satisfying one of the conditions (I)–(III), we have

1 − σ²[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] > 0,  ∀i, j ∈ N, i ≠ j.  (4.3)

Firstly, we consider case (I). Since A is a strictly doubly α diagonally dominant matrix, by Lemma 3.1 we know that A is a nonsingular H-matrix; therefore M(A) is a nonsingular M-matrix, and it follows from [7] that ρ(M_{σ,σ}) < 1 holds for 0 ≤ σ < s. For σ ≠ 0,

M_{σ,ω} = (1 − ω/σ)I + (ω/σ)M_{σ,σ}.  (4.4)
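The extrapolation identity (4.4) is exact and can be checked numerically (a small sketch with illustrative names; the sample matrix is the one from Example 5.1 below):

```python
import numpy as np

def aor_matrix(A, sigma, omega):
    """M_{sigma,omega} = (I - sigma L)^{-1} [(1 - omega) I + (omega - sigma) L + omega U]."""
    n = A.shape[0]
    D = np.diag(np.diag(A))
    L = np.linalg.solve(D, -np.tril(A, -1))
    U = np.linalg.solve(D, -np.triu(A, 1))
    I = np.eye(n)
    return np.linalg.solve(I - sigma * L,
                           (1 - omega) * I + (omega - sigma) * L + omega * U)

A = np.array([[5., 3., 2.], [2., 6., 3.], [2., 1., 9.]])
sigma, omega = 0.9, 0.6
lhs = aor_matrix(A, sigma, omega)
rhs = (1 - omega / sigma) * np.eye(3) + (omega / sigma) * aor_matrix(A, sigma, sigma)
assert np.allclose(lhs, rhs)   # M_{sigma,omega} = (1 - omega/sigma) I + (omega/sigma) M_{sigma,sigma}
```

The identity follows by writing both sides with the common factor (I − σL)^{−1}.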
If 0 < ω < max{2σ/(1 + ρ(M_{σ,σ})), t} = 2σ/(1 + ρ(M_{σ,σ})), then

0 < ω/σ < 2/(1 + ρ(M_{σ,σ})),  (4.5)

and by the extrapolation theorem [6] we have ρ(M_{σ,ω}) < 1.

If 0 < ω < max{2σ/(1 + ρ(M_{σ,σ})), t} = t, then it remains to analyze the case

2σ/(1 + ρ(M_{σ,σ})) ≤ ω < t,  0 ≤ σ < s.  (4.6)

Since, when ρ(M_{σ,σ}) < 1,

σ < 2σ/(1 + ρ(M_{σ,σ})),  (4.7)

then 0 ≤ σ < ω. From

A_2 − A_3 < A_1  (a),  A_2 < 2A_1 ≤ 2  (b)  ⟺  √(A_2² − 4A_1A_3) < 2A_1 ≤ 2  (c),

we have

ρ(M_{σ,ω}) ≤ max_{i,j; i≠j} [A_2 + √(A_2² − 4A_1A_3)]/(2A_1) < 1.  (4.8)
(1) When ω ≤ 1, it is easy to verify that (4.8) holds.

(2) When ω > 1, since

A_1 = 1 − σ²[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)],
A_2 = 2(ω − σ)σ[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] + ωσ[αR_i(L)R_j(U) + (1 − α)S_i(U)S_j(L)] + ωσ[αR_i(U)R_j(L) + (1 − α)S_i(L)S_j(U)] + 2|1 − ω|,
A_3 = (1 − ω)² − α[(ω − σ)R_i(L) + ωR_i(U)][(ω − σ)R_j(L) + ωR_j(U)] − (1 − α)[(ω − σ)S_i(L) + ωS_i(U)][(ω − σ)S_j(L) + ωS_j(U)],  (4.9)

then by A_2 − A_3 < A_1 and 1 − σ²[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] > 0, for all i, j ∈ N, i ≠ j, we have

ω²[1 − (αR_i(L + U)R_j(L + U) + (1 − α)S_i(L + U)S_j(L + U))²] − 4ω + 4 > 0.  (4.10)

It is easy to verify that the discriminant Δ of the quadratic in ω satisfies Δ > 0, and so there holds

ω_1 > 2/(1 − αR_i(L + U)R_j(L + U) − (1 − α)S_i(L + U)S_j(L + U)),  (4.11)

or

ω_2 < 2/(1 + αR_i(L + U)R_j(L + U) + (1 − α)S_i(L + U)S_j(L + U)),  ∀i ≠ j.  (4.12)

For ω_1 we have A_2 > 2, which contradicts (4.8)(b); therefore ω_1 should be discarded.
Secondly, we prove (II).

(1) When 0 < ω ≤ 1 and σ < 0,

A_2 = 2(1 − ω) − 2(ω − σ)σ[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] − ωσ[αR_i(L)R_j(U) + (1 − α)S_i(U)S_j(L)] − ωσ[αR_i(U)R_j(L) + (1 − α)S_i(L)S_j(U)],
A_3 = (1 − ω)² − α[(ω − σ)R_i(L) + ωR_i(U)][(ω − σ)R_j(L) + ωR_j(U)] − (1 − α)[(ω − σ)S_i(L) + ωS_i(U)][(ω − σ)S_j(L) + ωS_j(U)].  (4.13)
By A_2 − A_3 < A_1, we have

4σ²[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] − 2ωσ[αR_i(L)R_j(L + U) + (1 − α)S_i(L)S_j(L + U) + αR_i(L + U)R_j(L) + (1 − α)S_i(L + U)S_j(L)] + ω²[αR_i(L + U)R_j(L + U) + (1 − α)S_i(L + U)S_j(L + U) − 1] < 0.  (4.14)

It is easy to verify that the discriminant Δ of the quadratic in σ satisfies Δ > 0, and so there holds

[ωP_1 − ω√(P_2² + P_3)]/P_3 < σ < [ωP_1 + ω√(P_2² + P_3)]/P_3.  (4.15)

By σ < 0 and 1 − σ²[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] > 0, for all i, j ∈ N, i ≠ j, we obtain

max_{i,j; i≠j} [ωP_1 − ω√(P_2² + P_3)]/P_3 < σ < 0.  (4.16)
(2) When 1 < ω < t and σ < 0,

A_2 = 2(ω − 1) − 2(ω − σ)σ[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] − ωσ[αR_i(L)R_j(U) + (1 − α)S_i(U)S_j(L)] − ωσ[αR_i(U)R_j(L) + (1 − α)S_i(L)S_j(U)],
A_3 = (1 − ω)² − α[(ω − σ)R_i(L) + ωR_i(U)][(ω − σ)R_j(L) + ωR_j(U)] − (1 − α)[(ω − σ)S_i(L) + ωS_i(U)][(ω − σ)S_j(L) + ωS_j(U)].  (4.17)

By A_2 − A_3 < A_1, we have

4σ²[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] − 2ωσ[αR_i(L)R_j(L + U) + (1 − α)S_i(L)S_j(L + U) + αR_i(L + U)R_j(L) + (1 − α)S_i(L + U)S_j(L)] + ω²[αR_i(L + U)R_j(L + U) + (1 − α)S_i(L + U)S_j(L + U)] − ω² + 4ω − 4 < 0.  (4.18)
It is easy to verify that the discriminant Δ of the quadratic in σ satisfies Δ > 0, and so there holds

[ωP_1 − √(ω²P_2² + (ω − 2)²P_3)]/P_3 < σ < [ωP_1 + √(ω²P_2² + (ω − 2)²P_3)]/P_3.  (4.19)

By σ < 0 and 1 − σ²[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] > 0, for all i, j ∈ N, i ≠ j, we obtain

max_{i,j; i≠j} [ωP_1 − √(ω²P_2² + (ω − 2)²P_3)]/P_3 < σ < 0.  (4.20)

Therefore, by (4.16) and (4.20), we get

max_{i,j; i≠j} [ωP_1 − √(ω²P_2² + P_3 min{ω², (ω − 2)²})]/P_3 < σ < 0.  (4.21)
Finally, we prove (III).

(1) When 0 < ω ≤ 1 and σ ≥ t,

A_2 = 2(1 − ω) + 2(σ − ω)σ[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] + ωσ[αR_i(L)R_j(U) + (1 − α)S_i(U)S_j(L)] + ωσ[αR_i(U)R_j(L) + (1 − α)S_i(L)S_j(U)],
A_3 = (1 − ω)² − α[(σ − ω)R_i(L) + ωR_i(U)][(σ − ω)R_j(L) + ωR_j(U)] − (1 − α)[(σ − ω)S_i(L) + ωS_i(U)][(σ − ω)S_j(L) + ωS_j(U)].  (4.22)

By A_2 − A_3 < A_1, we have

4σ²[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] − 2ωσ[αR_i(L)(R_j(L) − R_j(U)) + (1 − α)S_i(L)(S_j(L) − S_j(U))] − 2ωσ[αR_j(L)(R_i(L) − R_i(U)) + (1 − α)S_j(L)(S_i(L) − S_i(U))] + ω²[α(R_i(L) − R_i(U))(R_j(L) − R_j(U)) + (1 − α)(S_i(L) − S_i(U))(S_j(L) − S_j(U))] − ω² < 0.  (4.23)
It is easy to verify that the discriminant Δ of the quadratic in σ satisfies Δ > 0, and so there holds

[ωP_4 − ω√(P_5² + P_3)]/P_3 < σ < [ωP_4 + ω√(P_5² + P_3)]/P_3.  (4.24)

By σ ≥ t, we obtain

t ≤ σ < min_{i,j; i≠j} [ωP_4 + ω√(P_5² + P_3)]/P_3.  (4.25)
(2) When 1 < ω < t and σ ≥ t,

A_2 = 2(ω − 1) + 2(σ − ω)σ[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] + ωσ[αR_i(L)R_j(U) + (1 − α)S_i(U)S_j(L)] + ωσ[αR_i(U)R_j(L) + (1 − α)S_i(L)S_j(U)],
A_3 = (1 − ω)² − α[(σ − ω)R_i(L) + ωR_i(U)][(σ − ω)R_j(L) + ωR_j(U)] − (1 − α)[(σ − ω)S_i(L) + ωS_i(U)][(σ − ω)S_j(L) + ωS_j(U)].  (4.26)

By A_2 − A_3 < A_1, we have

4σ²[αR_i(L)R_j(L) + (1 − α)S_i(L)S_j(L)] − 2ωσ[αR_i(L)(R_j(L) − R_j(U)) + (1 − α)S_i(L)(S_j(L) − S_j(U))] − 2ωσ[αR_j(L)(R_i(L) − R_i(U)) + (1 − α)S_j(L)(S_i(L) − S_i(U))] + ω²[α(R_i(L) − R_i(U))(R_j(L) − R_j(U)) + (1 − α)(S_i(L) − S_i(U))(S_j(L) − S_j(U))] − ω² + 4ω − 4 < 0.  (4.27)
It is easy to verify that the discriminant Δ of the quadratic in σ satisfies Δ > 0, and so there holds

[ωP_4 − √(ω²P_5² + (ω − 2)²P_3)]/P_3 < σ < [ωP_4 + √(ω²P_5² + (ω − 2)²P_3)]/P_3.  (4.28)

By σ ≥ t, we obtain

t ≤ σ < min_{i,j; i≠j} [ωP_4 + √(ω²P_5² + (ω − 2)²P_3)]/P_3.  (4.29)
Therefore, by (4.25) and (4.29), we obtain

t ≤ σ < min_{i,j; i≠j} [ωP_4 + √(ω²P_5² + P_3 min{ω², (ω − 2)²})]/P_3.  (4.30)
We can obtain the following results easily.

Theorem 4.2. Let A ∈ DD(α) with R_i(L + U)R_j(L + U) − S_i(L + U)S_j(L + U) > 0. If, when 0 < ω ≤ 1, the following conditions hold:

(I) 0 ≤ α < [√(R_i(L + U)R_j(L + U)) − S_i(L + U)S_j(L + U)] / [R_i(L + U)R_j(L + U) − S_i(L + U)S_j(L + U)],
(II) [ωP_1 − √(ω²P_2² + ω²P_3)]/P_3 < [ωP_1′ − √(ω²P_2′² + ω²P_3′)]/P_3′,
(III) [ωP_4 + √(ω²P_5² + ω²P_3)]/P_3 > [ωP_4′ + √(ω²P_5′² + ω²P_3′)]/P_3′,  (4.31)

or, when 1 < ω < t, the following conditions hold:

(I) 0 ≤ α < [√(R_i(L + U)R_j(L + U)) − S_i(L + U)S_j(L + U)] / [R_i(L + U)R_j(L + U) − S_i(L + U)S_j(L + U)],
(II) [ωP_1 − √(ω²P_2² + (ω − 2)²P_3)]/P_3 < [ωP_1′ − √(ω²P_2′² + (ω − 2)²P_3′)]/P_3′,
(III) [ωP_4 + √(ω²P_5² + (ω − 2)²P_3)]/P_3 > [ωP_4′ + √(ω²P_5′² + (ω − 2)²P_3′)]/P_3′,  (4.32)

then the area of convergence of the AOR method obtained by Theorem 4.1 is larger than that obtained by Theorem 2.5. Here P_1, …, P_5 denote the quantities of (4.2) and P_1′, …, P_5′ the corresponding quantities of (2.9).
Theorem 4.3. Let A ∈ DD(α) with R_i(L + U)R_j(L + U) − S_i(L + U)S_j(L + U) < 0. If, when 0 < ω ≤ 1, the following conditions hold:

(I) [√(R_i(L + U)R_j(L + U)) − S_i(L + U)S_j(L + U)] / [R_i(L + U)R_j(L + U) − S_i(L + U)S_j(L + U)] < α ≤ 1,
(II) [ωP_1 − √(ω²P_2² + ω²P_3)]/P_3 < [ωP_1′ − √(ω²P_2′² + ω²P_3′)]/P_3′,
(III) [ωP_4 + √(ω²P_5² + ω²P_3)]/P_3 > [ωP_4′ + √(ω²P_5′² + ω²P_3′)]/P_3′,  (4.33)

or, when 1 < ω < t, the following conditions hold:

(I) [√(R_i(L + U)R_j(L + U)) − S_i(L + U)S_j(L + U)] / [R_i(L + U)R_j(L + U) − S_i(L + U)S_j(L + U)] < α ≤ 1,
(II) [ωP_1 − √(ω²P_2² + (ω − 2)²P_3)]/P_3 < [ωP_1′ − √(ω²P_2′² + (ω − 2)²P_3′)]/P_3′,
(III) [ωP_4 + √(ω²P_5² + (ω − 2)²P_3)]/P_3 > [ωP_4′ + √(ω²P_5′² + (ω − 2)²P_3′)]/P_3′,  (4.34)

then the area of convergence of the AOR method obtained by Theorem 4.1 is larger than that obtained by Theorem 2.5. Here P_1, …, P_5 denote the quantities of (4.2) and P_1′, …, P_5′ the corresponding quantities of (2.9).
5. Examples
In the following examples, we give the areas of convergence of the AOR method to show that our results are better than those obtained by Theorems 2.4 and 2.5.
Example 5.1 (see [6]). Let

A = [ 5 3 2
      2 6 3
      2 1 9 ] = D − T − S,  (5.1)

where

D = [ 5 0 0          T = [  0  0 0         S = [ 0 −3 −2
      0 6 0                −2  0 0               0  0 −3
      0 0 9 ],             −2 −1 0 ],            0  0  0 ],

L = D^{−1}T = [    0    0  0          U = D^{−1}S = [ 0 −3/5 −2/5
                −1/3    0  0                          0    0 −1/2
                −2/9 −1/9  0 ],                       0    0    0 ],

L + U = [    0 −3/5 −2/5
          −1/3    0 −1/2
          −2/9 −1/9    0 ].  (5.2)

Obviously, A ∉ SD, but A ∈ DD(1/2).
By Theorem 4.1, we have the following area of convergence:

(1) 0 ≤ σ < 1.1896,  0 < ω < max{1.2390, 2σ/(1 + ρ(M_{σ,σ}))},  or
(2) 0 < ω ≤ 1,  −1.0266ω < σ < 0,  or  1 < ω < 1.2390,  [22ω − 3√(201ω² − 800ω + 800)]/20 < σ < 0,  or
(3) 0 < ω ≤ 1,  1.2390 < σ < 2.0266ω,  or  1 < ω < 1.2390,  1.2390 ≤ σ < [−2ω + 3√(201ω² − 800ω + 800)]/20.  (5.3)
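As a sanity check (our own script, not part of the paper), one can pick a point inside region (1) of (5.3), for example σ = 0.8, ω = 1.0, and verify that the iteration matrix is contractive:

```python
import numpy as np

A = np.array([[5., 3., 2.], [2., 6., 3.], [2., 1., 9.]])
D = np.diag(np.diag(A))
L = np.linalg.solve(D, -np.tril(A, -1))
U = np.linalg.solve(D, -np.triu(A, 1))
sigma, omega = 0.8, 1.0          # inside region (1) of (5.3)
M = np.linalg.solve(np.eye(3) - sigma * L,
                    (1 - omega) * np.eye(3) + (omega - sigma) * L + omega * U)
rho = max(abs(np.linalg.eigvals(M)))
assert rho < 1                    # the AOR method converges at this (sigma, omega)
```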
Obviously, A ∈ DD. By Theorem 2.5, we have the following area of convergence:

(1) 0 ≤ σ < 1.1896,  0 < ω < max{1.0455, 2σ/(1 + ρ(M_{σ,σ}))},  or
(2) 0 < ω ≤ 1,  −0.6712ω < σ < 0,  or  1 < ω < 1.0455,  [7ω − 3√(17ω² − 64ω + 64)]/8 < σ < 0,  or
(3) 0 < ω ≤ 1,  1.0455 ≤ σ < 1.6712ω,  or  1 < ω < 1.0455,  1.0455 ≤ σ < [ω + 3√(17ω² − 64ω + 64)]/8.  (5.4)

In addition, A ∈ D(1/2).
Figure 1

Figure 2
By Theorem 2.4, we have the following area of convergence:

(1) 0 ≤ σ < 1.1896,  0 < ω < max{1.1250, 2σ/(1 + ρ(M_{σ,σ}))},  or
(2) 0 < ω ≤ 1,  −0.6ω < σ < 0,  or  1 < ω < 1.1250,  (7/5)ω − 9/5 < σ < 0,  or
(3) 0 < ω ≤ 1,  1.1250 < σ < 1.31ω,  or  1 < ω < 1.1250,  1.1250 ≤ σ < 1.8 − 0.49ω.  (5.5)
Now we give two figures. In Figure 1, we can see that the area of convergence obtained by Theorem 4.1 (solid line) is larger than that obtained by Theorem 2.5 (dashed line). In Figure 2, we can see that the area of convergence obtained by Theorem 4.1 (solid line) is larger than that obtained by Theorem 2.4 (dashed line). From the above we know that the area of convergence obtained by Theorem 4.1 is larger than those obtained by Theorems 2.5 and 2.4.
Example 5.2. Let

A = [ 3 1 1
      1 2 1
      2 1 3 ].  (5.6)

Obviously, A ∈ DD(1/3), A ∉ D(α), and A ∉ DD, so we cannot use Theorems 2.4 and 2.5. By Theorem 4.1, we have the following area of convergence:

(1) 0 ≤ σ < 1.0724,  0 < ω < max{1.0693, 2σ/(1 + ρ(M_{σ,σ}))},  or
(2) 0 < ω ≤ 1,  −0.1973ω < σ < 0,  or  1 < ω < 1.0693,  [37ω − √(1945ω² − 7776ω + 7776)]/36 < σ < 0,  or
(3) 0 < ω ≤ 1,  1.0693 < σ < 1.1973ω,  or  1 < ω < 1.0693,  1.0693 ≤ σ < [√(1945ω² − 7776ω + 7776) − ω]/36.  (5.7)
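The same kind of numerical spot check (again our own script) applies here: σ = 0.5, ω = 1.0 lies inside region (1) of (5.7), and the corresponding iteration matrix is indeed contractive:

```python
import numpy as np

A = np.array([[3., 1., 1.], [1., 2., 1.], [2., 1., 3.]])
D = np.diag(np.diag(A))
L = np.linalg.solve(D, -np.tril(A, -1))
U = np.linalg.solve(D, -np.triu(A, 1))
sigma, omega = 0.5, 1.0          # inside region (1) of (5.7)
M = np.linalg.solve(np.eye(3) - sigma * L,
                    (1 - omega) * np.eye(3) + (omega - sigma) * L + omega * U)
rho = max(abs(np.linalg.eigvals(M)))
assert rho < 1
```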
Acknowledgments

The authors would like to thank the referees for their valuable comments and suggestions, which greatly improved the original version of this paper. This work was supported by the National Natural Science Foundation of China (Grant no. 11001144) and the Science and Technology Program of Shandong Universities of China (J10LA06).
References

[1] A. Hadjidimos, "Accelerated overrelaxation method," Mathematics of Computation, vol. 32, no. 141, pp. 149–157, 1978.
[2] B. Li and M. J. Tsatsomeros, "Doubly diagonally dominant matrices," Linear Algebra and Its Applications, vol. 261, pp. 221–235, 1997.
[3] T. Z. Huang and G. B. Wang, "Convergence theorems for the AOR method," Applied Mathematics and Mechanics, vol. 23, no. 11, pp. 1183–1187, 2002.
[4] M. Li and Y. X. Sun, "Discussion on α-diagonally dominant matrices and their applications," Chinese Journal of Engineering Mathematics, vol. 26, no. 5, pp. 941–945, 2009.
[5] L. Cvetković and D. Herceg, "Convergence theory for AOR method," Journal of Computational Mathematics, vol. 8, no. 2, pp. 128–134, 1990.
[6] Z.-X. Gao and T.-Z. Huang, "Convergence of AOR method," Applied Mathematics and Computation, vol. 176, no. 1, pp. 134–140, 2006.
[7] A. Hadjidimos and A. Yeyios, "The principle of extrapolation in connection with the accelerated overrelaxation method," Linear Algebra and Its Applications, vol. 30, pp. 115–128, 1980.