Hindawi Publishing Corporation
Abstract and Applied Analysis
Volume 2012, Article ID 941063, 10 pages
doi:10.1155/2012/941063
Research Article
Exponential Convergence for Cellular Neural
Networks with Time-Varying Delays in the Leakage
Terms
Zhibin Chen1 and Junxia Meng2
1 School of Science, Hunan University of Technology, Hunan, Zhuzhou 412000, China
2 College of Mathematics, Physics and Information Engineering, Jiaxing University, Zhejiang, Jiaxing 314001, China
Correspondence should be addressed to Junxia Meng, mengjunxia1968@yahoo.com.cn
Received 18 August 2012; Accepted 3 October 2012
Academic Editor: Narcisa C. Apreutesei
Copyright © 2012 Z. Chen and J. Meng. This is an open access article distributed under the
Creative Commons Attribution License, which permits unrestricted use, distribution, and
reproduction in any medium, provided the original work is properly cited.
We consider a class of cellular neural networks with time-varying delays in the leakage terms. By applying the Lyapunov functional method and differential inequality techniques, we establish new results which ensure that all solutions of the networks converge exponentially to the zero point.
1. Introduction
It is well known that delayed cellular neural networks (CNNs) have been successfully applied to signal and image processing, pattern recognition, and optimization (see [1]). Hence, they have been the object of intensive analysis by numerous authors in the past decades. In particular, extensive results on the existence and stability of the equilibrium point for CNNs have been reported in the literature; we refer the reader to [2–6] and the references cited therein. Recently, to incorporate time delays in the leakage terms, Gopalsamy [7] and Wang et al. [8] investigated a class of CNNs described by
\[
x_i'(t) = -c_i(t)\,x_i\bigl(t-\eta_i(t)\bigr)
+ \sum_{j=1}^{n} a_{ij}(t)\,f_j\bigl(x_j(t-\tau_{ij}(t))\bigr)
+ \sum_{j=1}^{n} b_{ij}(t)\int_{0}^{\infty} K_{ij}(u)\,g_j\bigl(x_j(t-u)\bigr)\,du
+ I_i(t), \qquad i=1,2,\dots,n,
\tag{1.1}
\]
in which $n$ corresponds to the number of units in the neural network, $x_i(t)$ corresponds to the state of the $i$th unit at time $t$, and $c_i(t)$ represents the rate with which the $i$th unit resets its potential to the resting state in isolation when disconnected from the network and external inputs at time $t$. $a_{ij}(t)$ and $b_{ij}(t)$ are the connection weights at time $t$, $\eta_i(t)\ge 0$ and $\tau_{ij}(t)\ge 0$ denote the leakage delay and the transmission delay, respectively, $I_i(t)$ denotes the external bias on the $i$th unit at time $t$, $f_j$ and $g_j$ are activation functions of signal transmission, and $i,j=1,2,\dots,n$.
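For readers who wish to experiment with (1.1) numerically, the following Python sketch (not part of the original paper) shows one way to evaluate the right-hand side of (1.1) for a two-unit network; all coefficient and activation functions below are illustrative placeholders, and the distributed-delay integral is truncated at a finite horizon $U$.

```python
# Illustrative sketch only: evaluating the right-hand side of system (1.1)
# for a two-unit network.  All coefficient functions are placeholders, and
# the distributed-delay integral is truncated at u = U.
import numpy as np

n = 2
U, du = 20.0, 0.01                 # truncation horizon and grid step for the kernel integral
u_grid = np.arange(0.0, U, du)

c   = [lambda t: 19.0] * n                         # leakage rates c_i(t)
eta = [lambda t: 0.001] * n                        # leakage delays eta_i(t)
a   = [[lambda t: 0.2 * np.sin(t)] * n] * n        # connection weights a_ij(t)
b   = [[lambda t: 0.1 * np.cos(t)] * n] * n        # connection weights b_ij(t)
tau = [[lambda t: 2.0 * np.sin(t) ** 2] * n] * n   # transmission delays tau_ij(t)
K   = [[lambda u: np.exp(-u)] * n] * n             # delay kernels K_ij(u)
I   = [lambda t: np.exp(-t) * np.sin(t)] * n       # external inputs I_i(t)
f   = [lambda x: x * np.cos(x ** 3)] * n           # activations f_j
g   = [lambda x: x * np.sin(x ** 2)] * n           # activations g_j

def rhs(t, x_hist):
    """Right-hand side of (1.1); x_hist(s) returns the state vector at any s <= t."""
    dx = np.zeros(n)
    for i in range(n):
        dx[i] = -c[i](t) * x_hist(t - eta[i](t))[i] + I[i](t)
        for j in range(n):
            dx[i] += a[i][j](t) * f[j](x_hist(t - tau[i][j](t))[j])
            # truncated distributed-delay term: int_0^U K_ij(u) g_j(x_j(t-u)) du
            past_j = np.array([x_hist(t - u)[j] for u in u_grid])
            dx[i] += b[i][j](t) * np.trapz(K[i][j](u_grid) * g[j](past_j), u_grid)
    return dx

# Example call with a constant history x(s) = (1, 1):
print(rhs(0.0, lambda s: np.ones(n)))
```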
Suppose that the following conditions are satisfied:

(H0) $c_i$ and $\eta_i$ are constants, $i=1,2,\dots,n$;

(H0*) for each $j\in\{1,2,\dots,n\}$, there exists a nonnegative constant $\tilde{L}_j$ such that
\[
|f_j(u)-f_j(v)| \le \tilde{L}_j\,|u-v| \quad \text{for all } u,v\in\mathbb{R}.
\tag{1.2}
\]

Under these assumptions, and without the continuously distributed delay terms, the authors of [7, 8] proved that all solutions of system (1.1) converge to the equilibrium point or to the periodic solution.
However, to the best of our knowledge, few authors have considered the convergence behavior of the solutions of system (1.1) without assumptions (H0) and (H0*). Thus, it is worthwhile to continue to investigate the convergence behavior of system (1.1) in this case.

The main purpose of this paper is to give new criteria for the convergence behavior of all solutions of system (1.1). By applying the Lyapunov functional method and differential inequality techniques, without assuming (H0) and (H0*), we derive some new sufficient conditions ensuring that all solutions of system (1.1) converge exponentially to the zero point. Moreover, an example is provided to illustrate the effectiveness of our results.
Throughout this paper, for $i,j=1,2,\dots,n$, it will be assumed that $c_i, I_i, \eta_i, a_{ij}, b_{ij}, \tau_{ij}:\mathbb{R}\to\mathbb{R}$ and $K_{ij}:[0,\infty)\to\mathbb{R}$ are continuous functions, and that there exist constants $\bar{c}_i$, $\bar{\eta}_i$, $\bar{a}_{ij}$, $\bar{b}_{ij}$, and $\bar{\tau}_{ij}$ such that
\[
\bar{c}_i=\sup_{t\in\mathbb{R}} c_i(t), \quad
\bar{\eta}_i=\sup_{t\in\mathbb{R}} \eta_i(t), \quad
\bar{a}_{ij}=\sup_{t\in\mathbb{R}} |a_{ij}(t)|, \quad
\bar{b}_{ij}=\sup_{t\in\mathbb{R}} |b_{ij}(t)|, \quad
\bar{\tau}_{ij}=\sup_{t\in\mathbb{R}} \tau_{ij}(t).
\tag{1.3}
\]
We also assume that the following conditions (H1), (H2), and (H3) hold:

(H1) for each $j\in\{1,2,\dots,n\}$, there exist nonnegative constants $\tilde{L}_j$ and $L_j$ such that
\[
|f_j(u)| \le \tilde{L}_j |u|, \qquad |g_j(u)| \le L_j |u| \quad \text{for all } u\in\mathbb{R};
\tag{1.4}
\]
(H2) for all $t>0$ and $i,j\in\{1,2,\dots,n\}$, there exist constants $\eta>0$, $\lambda>0$, and $\xi_i>0$ such that
\[
\int_{0}^{\infty} |K_{ij}(u)|\,e^{\lambda u}\,du < \infty,
\]
and
\[
\begin{aligned}
-\eta >{}& \Bigl\{-\bigl[c_i(t)-\lambda e^{-\lambda\eta_i(t)}\bigr]
+\eta_i(t)c_i(t)\bigl(\lambda+\bar{c}_i e^{\lambda\bar{\eta}_i}\bigr)e^{\lambda\eta_i(t)}\Bigr\}\xi_i \\
&+\sum_{j=1}^{n}\tilde{L}_j\Bigl[|a_{ij}(t)|e^{\lambda\tau_{ij}(t)}
+\bar{a}_{ij}\,\eta_i(t)c_i(t)e^{\lambda\eta_i(t)}e^{\lambda\bar{\tau}_{ij}}\Bigr]\xi_j \\
&+\sum_{j=1}^{n}L_j\int_{0}^{\infty}|K_{ij}(u)|e^{\lambda u}\,du\,
\Bigl[|b_{ij}(t)|+\bar{b}_{ij}\,\eta_i(t)c_i(t)e^{\lambda\eta_i(t)}\Bigr]\xi_j;
\end{aligned}
\tag{1.5}
\]

(H3) $I_i(t)=O\bigl(e^{-\lambda t}\bigr)$ as $t\to+\infty$, $i=1,2,\dots,n$.
The initial conditions associated with system (1.1) are of the form
\[
x_i(s)=\varphi_i(s), \qquad s\in(-\infty,0],\ i=1,2,\dots,n,
\tag{1.6}
\]
where $\varphi_i(\cdot)$ denotes a real-valued bounded continuous function defined on $(-\infty,0]$.
2. Main Results
Theorem 2.1. Let (H1), (H2), and (H3) hold. Then, for every solution $Z(t)=(x_1(t),x_2(t),\dots,x_n(t))^T$ of system (1.1) with any initial value $\varphi=(\varphi_1(t),\varphi_2(t),\dots,\varphi_n(t))^T$, there exists a positive constant $K$ such that
\[
|x_i(t)| \le K\xi_i e^{-\lambda t} \quad \text{for all } t>0,\ i=1,2,\dots,n.
\tag{2.1}
\]

Proof. Let $Z(t)=(x_1(t),x_2(t),\dots,x_n(t))^T$ be a solution of system (1.1) with any initial value $\varphi=(\varphi_1(t),\varphi_2(t),\dots,\varphi_n(t))^T$, and let
\[
X_i(t)=e^{\lambda t}x_i(t), \qquad i=1,2,\dots,n.
\tag{2.2}
\]
In view of (1.1), we have
\[
\begin{aligned}
X_i'(t) ={}& \lambda X_i(t)+e^{\lambda t}\Biggl[-c_i(t)x_i\bigl(t-\eta_i(t)\bigr)
+\sum_{j=1}^{n}a_{ij}(t)f_j\bigl(x_j(t-\tau_{ij}(t))\bigr) \\
&\qquad+\sum_{j=1}^{n}b_{ij}(t)\int_{0}^{\infty}K_{ij}(u)g_j\bigl(x_j(t-u)\bigr)\,du+I_i(t)\Biggr] \\
={}& \lambda X_i(t)-c_i(t)e^{\lambda\eta_i(t)}X_i\bigl(t-\eta_i(t)\bigr)
+e^{\lambda t}\Biggl[\sum_{j=1}^{n}a_{ij}(t)f_j\Bigl(e^{-\lambda(t-\tau_{ij}(t))}X_j\bigl(t-\tau_{ij}(t)\bigr)\Bigr) \\
&\qquad+\sum_{j=1}^{n}b_{ij}(t)\int_{0}^{\infty}K_{ij}(u)g_j\Bigl(e^{-\lambda(t-u)}X_j(t-u)\Bigr)\,du+I_i(t)\Biggr],
\qquad i=1,2,\dots,n.
\end{aligned}
\tag{2.3}
\]
Let
\[
M=\max_{i=1,2,\dots,n}\,\sup_{s\le 0}\bigl|e^{\lambda s}\varphi_i(s)\bigr|.
\tag{2.4}
\]
From (1.3), (H2), and (H3), we can choose a positive constant $K$ such that
\[
K\xi_i>M, \qquad
\eta > \Bigl[\eta_i(t)c_i(t)e^{\lambda\eta_i(t)}+1\Bigr]\,
\frac{\sup_{t\in\mathbb{R}}|I_i(t)|e^{\lambda t}}{K}
\quad \text{for all } t>0,\ i=1,2,\dots,n.
\tag{2.5}
\]
Then, it is easy to see that
\[
|X_i(t)| \le M < K\xi_i \quad \text{for all } t\le 0,\ i=1,2,\dots,n.
\tag{2.6}
\]
We now claim that
\[
|X_i(t)| < K\xi_i \quad \text{for all } t>0,\ i=1,2,\dots,n.
\tag{2.7}
\]
If this is not valid, then one of the following two cases must occur:

(1) there exist $i\in\{1,2,\dots,n\}$ and $t^{*}>0$ such that
\[
X_i(t^{*})=K\xi_i, \qquad |X_j(t)|<K\xi_j \quad \text{for all } t<t^{*},\ j=1,2,\dots,n;
\tag{2.8}
\]

(2) there exist $i\in\{1,2,\dots,n\}$ and $t^{**}>0$ such that
\[
X_i(t^{**})=-K\xi_i, \qquad |X_j(t)|<K\xi_j \quad \text{for all } t<t^{**},\ j=1,2,\dots,n.
\tag{2.9}
\]

We now consider these two cases.
Case (i). Suppose that (2.8) holds. Then, from (2.3), (2.5), and (H1)–(H3), we have
\[
\begin{aligned}
0 \le{}& X_i'(t^{*}) \\
={}& \lambda X_i(t^{*})-c_i(t^{*})e^{\lambda\eta_i(t^{*})}X_i\bigl(t^{*}-\eta_i(t^{*})\bigr)
+e^{\lambda t^{*}}\Biggl[\sum_{j=1}^{n}a_{ij}(t^{*})f_j\Bigl(e^{-\lambda(t^{*}-\tau_{ij}(t^{*}))}X_j\bigl(t^{*}-\tau_{ij}(t^{*})\bigr)\Bigr) \\
&\qquad+\sum_{j=1}^{n}b_{ij}(t^{*})\int_{0}^{\infty}K_{ij}(u)g_j\Bigl(e^{-\lambda(t^{*}-u)}X_j(t^{*}-u)\Bigr)\,du+I_i(t^{*})\Biggr] \\
={}& -\Bigl[c_i(t^{*})e^{\lambda\eta_i(t^{*})}-\lambda\Bigr]X_i(t^{*})
+c_i(t^{*})e^{\lambda\eta_i(t^{*})}\Bigl[X_i(t^{*})-X_i\bigl(t^{*}-\eta_i(t^{*})\bigr)\Bigr]
+e^{\lambda t^{*}}\bigl[\cdots\bigr] \\
={}& -\Bigl[c_i(t^{*})e^{\lambda\eta_i(t^{*})}-\lambda\Bigr]X_i(t^{*})
+c_i(t^{*})e^{\lambda\eta_i(t^{*})}\int_{t^{*}-\eta_i(t^{*})}^{t^{*}}X_i'(s)\,ds
+e^{\lambda t^{*}}\bigl[\cdots\bigr] \\
={}& -\Bigl[c_i(t^{*})e^{\lambda\eta_i(t^{*})}-\lambda\Bigr]X_i(t^{*})
+c_i(t^{*})e^{\lambda\eta_i(t^{*})}\int_{t^{*}-\eta_i(t^{*})}^{t^{*}}
\Biggl\{\lambda X_i(s)-c_i(s)e^{\lambda\eta_i(s)}X_i\bigl(s-\eta_i(s)\bigr) \\
&\qquad+e^{\lambda s}\Biggl(\sum_{j=1}^{n}a_{ij}(s)f_j\Bigl(e^{-\lambda(s-\tau_{ij}(s))}X_j\bigl(s-\tau_{ij}(s)\bigr)\Bigr)
+\sum_{j=1}^{n}b_{ij}(s)\int_{0}^{\infty}K_{ij}(u)g_j\Bigl(e^{-\lambda(s-u)}X_j(s-u)\Bigr)\,du+I_i(s)\Biggr)\Biggr\}\,ds \\
&+e^{\lambda t^{*}}\Biggl[\sum_{j=1}^{n}a_{ij}(t^{*})f_j\Bigl(e^{-\lambda(t^{*}-\tau_{ij}(t^{*}))}X_j\bigl(t^{*}-\tau_{ij}(t^{*})\bigr)\Bigr)
+\sum_{j=1}^{n}b_{ij}(t^{*})\int_{0}^{\infty}K_{ij}(u)g_j\Bigl(e^{-\lambda(t^{*}-u)}X_j(t^{*}-u)\Bigr)\,du+I_i(t^{*})\Biggr] \\
\le{}& \Bigl\{-\Bigl[c_i(t^{*})e^{\lambda\eta_i(t^{*})}-\lambda\Bigr]
+\eta_i(t^{*})c_i(t^{*})e^{\lambda\eta_i(t^{*})}\bigl(\lambda+\bar{c}_i e^{\lambda\bar{\eta}_i}\bigr)\Bigr\}\xi_i K \\
&+\sum_{j=1}^{n}\tilde{L}_j\Bigl[|a_{ij}(t^{*})|e^{\lambda\tau_{ij}(t^{*})}
+\bar{a}_{ij}\,\eta_i(t^{*})c_i(t^{*})e^{\lambda\eta_i(t^{*})}e^{\lambda\bar{\tau}_{ij}}\Bigr]\xi_j K \\
&+\sum_{j=1}^{n}L_j\int_{0}^{\infty}|K_{ij}(u)|e^{\lambda u}\,du\,
\Bigl[|b_{ij}(t^{*})|+\bar{b}_{ij}\,\eta_i(t^{*})c_i(t^{*})e^{\lambda\eta_i(t^{*})}\Bigr]\xi_j K
+\Bigl[\eta_i(t^{*})c_i(t^{*})e^{\lambda\eta_i(t^{*})}+1\Bigr]\sup_{t\in\mathbb{R}}|I_i(t)|e^{\lambda t} \\
\le{}& \Biggl\{\Bigl[-\bigl(c_i(t^{*})-\lambda e^{-\lambda\eta_i(t^{*})}\bigr)
+\eta_i(t^{*})c_i(t^{*})\bigl(\lambda+\bar{c}_i e^{\lambda\bar{\eta}_i}\bigr)e^{\lambda\eta_i(t^{*})}\Bigr]\xi_i
+\sum_{j=1}^{n}\tilde{L}_j\Bigl[|a_{ij}(t^{*})|e^{\lambda\tau_{ij}(t^{*})}
+\bar{a}_{ij}\,\eta_i(t^{*})c_i(t^{*})e^{\lambda\eta_i(t^{*})}e^{\lambda\bar{\tau}_{ij}}\Bigr]\xi_j \\
&\quad+\sum_{j=1}^{n}L_j\int_{0}^{\infty}|K_{ij}(u)|e^{\lambda u}\,du\,
\Bigl[|b_{ij}(t^{*})|+\bar{b}_{ij}\,\eta_i(t^{*})c_i(t^{*})e^{\lambda\eta_i(t^{*})}\Bigr]\xi_j
+\Bigl[\eta_i(t^{*})c_i(t^{*})e^{\lambda\eta_i(t^{*})}+1\Bigr]\frac{\sup_{t\in\mathbb{R}}|I_i(t)|e^{\lambda t}}{K}\Biggr\}K \\
<{}& \Biggl\{-\eta+\Bigl[\eta_i(t^{*})c_i(t^{*})e^{\lambda\eta_i(t^{*})}+1\Bigr]
\frac{\sup_{t\in\mathbb{R}}|I_i(t)|e^{\lambda t}}{K}\Biggr\}K < 0.
\end{aligned}
\tag{2.10}
\]
This contradiction implies that (2.8) does not hold.
Case (ii). Suppose that (2.9) holds. Then, from (2.3), (2.5), and (H1)–(H3), we get
\[
\begin{aligned}
0 \ge{}& X_i'(t^{**}) \\
={}& -\Bigl[c_i(t^{**})e^{\lambda\eta_i(t^{**})}-\lambda\Bigr]X_i(t^{**})
+c_i(t^{**})e^{\lambda\eta_i(t^{**})}\int_{t^{**}-\eta_i(t^{**})}^{t^{**}}
\Biggl\{\lambda X_i(s)-c_i(s)e^{\lambda\eta_i(s)}X_i\bigl(s-\eta_i(s)\bigr) \\
&\qquad+e^{\lambda s}\Biggl(\sum_{j=1}^{n}a_{ij}(s)f_j\Bigl(e^{-\lambda(s-\tau_{ij}(s))}X_j\bigl(s-\tau_{ij}(s)\bigr)\Bigr)
+\sum_{j=1}^{n}b_{ij}(s)\int_{0}^{\infty}K_{ij}(u)g_j\Bigl(e^{-\lambda(s-u)}X_j(s-u)\Bigr)\,du+I_i(s)\Biggr)\Biggr\}\,ds \\
&+e^{\lambda t^{**}}\Biggl[\sum_{j=1}^{n}a_{ij}(t^{**})f_j\Bigl(e^{-\lambda(t^{**}-\tau_{ij}(t^{**}))}X_j\bigl(t^{**}-\tau_{ij}(t^{**})\bigr)\Bigr)
+\sum_{j=1}^{n}b_{ij}(t^{**})\int_{0}^{\infty}K_{ij}(u)g_j\Bigl(e^{-\lambda(t^{**}-u)}X_j(t^{**}-u)\Bigr)\,du+I_i(t^{**})\Biggr] \\
\ge{}& \Bigl\{-\Bigl[c_i(t^{**})e^{\lambda\eta_i(t^{**})}-\lambda\Bigr]
+\eta_i(t^{**})c_i(t^{**})e^{\lambda\eta_i(t^{**})}\bigl(\lambda+\bar{c}_i e^{\lambda\bar{\eta}_i}\bigr)\Bigr\}\bigl(-\xi_i K\bigr) \\
&+\sum_{j=1}^{n}\tilde{L}_j\Bigl[|a_{ij}(t^{**})|e^{\lambda\tau_{ij}(t^{**})}
+\bar{a}_{ij}\,\eta_i(t^{**})c_i(t^{**})e^{\lambda\eta_i(t^{**})}e^{\lambda\bar{\tau}_{ij}}\Bigr]\bigl(-\xi_j K\bigr) \\
&+\sum_{j=1}^{n}L_j\int_{0}^{\infty}|K_{ij}(u)|e^{\lambda u}\,du\,
\Bigl[|b_{ij}(t^{**})|+\bar{b}_{ij}\,\eta_i(t^{**})c_i(t^{**})e^{\lambda\eta_i(t^{**})}\Bigr]\bigl(-\xi_j K\bigr)
-\Bigl[\eta_i(t^{**})c_i(t^{**})e^{\lambda\eta_i(t^{**})}+1\Bigr]\sup_{t\in\mathbb{R}}|I_i(t)|e^{\lambda t} \\
\ge{}& -\Biggl\{\Bigl[-\bigl(c_i(t^{**})-\lambda e^{-\lambda\eta_i(t^{**})}\bigr)
+\eta_i(t^{**})c_i(t^{**})\bigl(\lambda+\bar{c}_i e^{\lambda\bar{\eta}_i}\bigr)e^{\lambda\eta_i(t^{**})}\Bigr]\xi_i
+\sum_{j=1}^{n}\tilde{L}_j\Bigl[|a_{ij}(t^{**})|e^{\lambda\tau_{ij}(t^{**})}
+\bar{a}_{ij}\,\eta_i(t^{**})c_i(t^{**})e^{\lambda\eta_i(t^{**})}e^{\lambda\bar{\tau}_{ij}}\Bigr]\xi_j \\
&\quad+\sum_{j=1}^{n}L_j\int_{0}^{\infty}|K_{ij}(u)|e^{\lambda u}\,du\,
\Bigl[|b_{ij}(t^{**})|+\bar{b}_{ij}\,\eta_i(t^{**})c_i(t^{**})e^{\lambda\eta_i(t^{**})}\Bigr]\xi_j
+\Bigl[\eta_i(t^{**})c_i(t^{**})e^{\lambda\eta_i(t^{**})}+1\Bigr]\frac{\sup_{t\in\mathbb{R}}|I_i(t)|e^{\lambda t}}{K}\Biggr\}K \\
>{}& -\Biggl\{-\eta+\Bigl[\eta_i(t^{**})c_i(t^{**})e^{\lambda\eta_i(t^{**})}+1\Bigr]
\frac{\sup_{t\in\mathbb{R}}|I_i(t)|e^{\lambda t}}{K}\Biggr\}K > 0,
\end{aligned}
\tag{2.11}
\]
which is a contradiction and shows that (2.9) does not hold.
Consequently, (2.7) holds. Thus,
\[
|x_i(t)| \le K\xi_i e^{-\lambda t} \quad \text{for all } t>0,\ i=1,2,\dots,n,
\tag{2.12}
\]
which completes the proof of Theorem 2.1.
3. An Example
Example 3.1. Consider the following CNNs with time-varying delays in the leakage terms:
\[
\begin{aligned}
x_1'(t) ={}& -\Bigl(20-\frac{1+|t|\sin^2 t}{1+2|t|}\Bigr)\,x_1\Bigl(t-\frac{1+|\sin t|}{2000}\Bigr)
+\frac{|t|^3\sin t}{1+4|t|^3}\,f_1\bigl(x_1(t-2\sin^2 t)\bigr)
+\frac{|t|^5\sin t}{1+36|t|^5}\,f_2\bigl(x_2(t-3\sin^2 t)\bigr) \\
&+\frac{|t|^7\sin t}{1+4|t|^7}\int_0^{\infty}e^{-u}g_1\bigl(x_1(t-u)\bigr)\,du
+\frac{t^2\sin t}{1+36t^2}\int_0^{\infty}e^{-u}g_2\bigl(x_2(t-u)\bigr)\,du
+e^{-3t}\sin t, \\
x_2'(t) ={}& -\Bigl(40-\frac{1+|t|^7\cos^2 t}{1+2|t|^7}\Bigr)\,x_2\Bigl(t-\frac{1+|\cos t|}{2000}\Bigr)
+\frac{t^5\cos t}{1+2|t|^5}\,f_1\bigl(x_1(t-2\sin^2 t)\bigr)
+\frac{t\cos t}{1+5|t|}\,f_2\bigl(x_2(t-5\sin^2 t)\bigr) \\
&+\frac{|t|^3\cos t}{1+6|t|^3}\int_0^{\infty}e^{-u}g_1\bigl(x_1(t-u)\bigr)\,du
+\frac{1+|t|\cos t}{7+7|t|}\int_0^{\infty}e^{-u}g_2\bigl(x_2(t-u)\bigr)\,du
+e^{-t}\sin t,
\end{aligned}
\tag{3.1}
\]
where $f_1(x)=f_2(x)=x\cos(x^3)$ and $g_1(x)=g_2(x)=x\sin(x^2)$.
Note that
\[
\begin{gathered}
18 \le c_1(t)=20-\frac{1+|t|\sin^2 t}{1+2|t|}\le 20, \qquad
38 \le c_2(t)=40-\frac{1+|t|^7\cos^2 t}{1+2|t|^7}\le 40, \\
\eta_1(t)=\frac{1+|\sin t|}{2000}\le\frac{1}{1000}, \qquad
\eta_2(t)=\frac{1+|\cos t|}{2000}\le\frac{1}{1000}, \\
a_{11}(t)=\frac{|t|^3\sin t}{1+4|t|^3}, \quad
a_{12}(t)=\frac{|t|^5\sin t}{1+36|t|^5}, \quad
a_{21}(t)=\frac{t^5\cos t}{1+2|t|^5}, \quad
a_{22}(t)=\frac{t\cos t}{1+5|t|}, \\
b_{11}(t)=\frac{|t|^7\sin t}{1+4|t|^7}, \quad
b_{12}(t)=\frac{t^2\sin t}{1+36t^2}, \quad
b_{21}(t)=\frac{|t|^3\cos t}{1+6|t|^3}, \quad
b_{22}(t)=\frac{1+|t|\cos t}{7+7|t|}, \\
\tau_{11}(t)=\tau_{21}(t)=2\sin^2 t, \qquad
\tau_{12}(t)=3\sin^2 t, \qquad
\tau_{22}(t)=5\sin^2 t, \\
K_{ij}(u)=e^{-u}, \qquad \tilde{L}_i=L_i=1, \qquad i,j=1,2.
\end{gathered}
\tag{3.2}
\]
Define a continuous function $\Gamma_i(\omega)$ by setting
\[
\begin{aligned}
\Gamma_i(\omega) ={}& -\bigl[c_i(t)-\omega e^{-\omega\eta_i(t)}\bigr]
+\eta_i(t)c_i(t)\bigl(\omega+\bar{c}_i e^{\omega\bar{\eta}_i}\bigr)e^{\omega\eta_i(t)}
+\sum_{j=1}^{2}\tilde{L}_j\Bigl[|a_{ij}(t)|e^{\omega\tau_{ij}(t)}
+\bar{a}_{ij}\,\eta_i(t)c_i(t)e^{\omega\eta_i(t)}e^{\omega\bar{\tau}_{ij}}\Bigr] \\
&+\sum_{j=1}^{2}L_j\int_{0}^{\infty}K_{ij}(u)e^{\omega u}\,du\,
\Bigl[|b_{ij}(t)|+\bar{b}_{ij}\,\eta_i(t)c_i(t)e^{\omega\eta_i(t)}\Bigr],
\end{aligned}
\tag{3.3}
\]
where $t>0$ and $i=1,2$. Then, we obtain
\[
\Gamma_i(0) = -\bigl[c_i(t)-\eta_i(t)c_i(t)\bar{c}_i\bigr]
+\sum_{j=1}^{2}\tilde{L}_j\Bigl[|a_{ij}(t)|+\bar{a}_{ij}\,\eta_i(t)c_i(t)\Bigr]
+\sum_{j=1}^{2}L_j\int_{0}^{\infty}K_{ij}(u)\,du\,
\Bigl[|b_{ij}(t)|+\bar{b}_{ij}\,\eta_i(t)c_i(t)\Bigr], \qquad i=1,2.
\tag{3.4}
\]
Therefore,
\[
\begin{aligned}
\Gamma_1(0) \le{}& -\Bigl[18-\frac{1}{1000}\times 20\times 20\Bigr]
+2\Bigl[\Bigl(\frac{1}{4}+\frac{1}{4}\times\frac{1}{1000}\times 20\Bigr)
+\Bigl(\frac{1}{36}+\frac{1}{36}\times\frac{1}{1000}\times 20\Bigr)\Bigr] < -10, \\
\Gamma_2(0) \le{}& -\Bigl[38-\frac{1}{1000}\times 40\times 40\Bigr]
+\Bigl[\Bigl(\frac{1}{2}+\frac{1}{2}\times\frac{1}{1000}\times 40\Bigr)
+\Bigl(\frac{1}{5}+\frac{1}{5}\times\frac{1}{1000}\times 40\Bigr)\Bigr] \\
&+\Bigl[\Bigl(\frac{1}{6}+\frac{1}{6}\times\frac{1}{1000}\times 40\Bigr)
+\Bigl(\frac{1}{7}+\frac{1}{7}\times\frac{1}{1000}\times 40\Bigr)\Bigr] < -20,
\end{aligned}
\tag{3.5}
\]
which, together with the continuity of $\Gamma_i(\omega)$, implies that we can choose positive constants $\lambda>0$ and $\eta>0$ such that, for all $t>0$,
\[
\begin{aligned}
-\eta > \Gamma_i(\lambda) ={}& \Bigl\{-\bigl[c_i(t)-\lambda e^{-\lambda\eta_i(t)}\bigr]
+\eta_i(t)c_i(t)\bigl(\lambda+\bar{c}_i e^{\lambda\bar{\eta}_i}\bigr)e^{\lambda\eta_i(t)}\Bigr\}\xi_i
+\sum_{j=1}^{2}\tilde{L}_j\Bigl[|a_{ij}(t)|e^{\lambda\tau_{ij}(t)}
+\bar{a}_{ij}\,\eta_i(t)c_i(t)e^{\lambda\eta_i(t)}e^{\lambda\bar{\tau}_{ij}}\Bigr]\xi_j \\
&+\sum_{j=1}^{2}L_j\int_{0}^{\infty}K_{ij}(u)e^{\lambda u}\,du\,
\Bigl[|b_{ij}(t)|+\bar{b}_{ij}\,\eta_i(t)c_i(t)e^{\lambda\eta_i(t)}\Bigr]\xi_j,
\end{aligned}
\tag{3.6}
\]
where $\xi_i=1$, $i=1,2$.
This shows that system (3.1) satisfies (H1), (H2), and (H3). Hence, by Theorem 2.1, all solutions of system (3.1) converge exponentially to the zero point $(0,0)^T$.
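As a quick informal check of the arithmetic behind (3.5) (not part of the original paper), the following lines evaluate the two upper bounds directly.

```python
# Quick check of the arithmetic in (3.5): both bounds hold comfortably.
g1 = -(18 - (1 / 1000) * 20 * 20) + 2 * ((1 / 4 + (1 / 4) * (1 / 1000) * 20)
                                         + (1 / 36 + (1 / 36) * (1 / 1000) * 20))
g2 = -(38 - (1 / 1000) * 40 * 40) + ((1 / 2 + (1 / 2) * (1 / 1000) * 40)
                                     + (1 / 5 + (1 / 5) * (1 / 1000) * 40)
                                     + (1 / 6 + (1 / 6) * (1 / 1000) * 40)
                                     + (1 / 7 + (1 / 7) * (1 / 1000) * 40))
print(g1, g1 < -10)   # approximately -17.03, True
print(g2, g2 < -20)   # approximately -35.35, True
```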
Remark 3.2. Since $f_1(x)=f_2(x)=x\cos(x^3)$ and $g_1(x)=g_2(x)=x\sin(x^2)$, and since the CNN (3.1) is a very simple CNN with time-varying delays in the leakage terms, it is clear that conditions (H0) and (H0*) are not satisfied. Therefore, the results in [7–9] and the references therein cannot be applied to show that all solutions of system (3.1) converge exponentially to the zero point.
Acknowledgments
The authors would like to express their sincere appreciation to the reviewers for their helpful comments, which improved the presentation and quality of the paper. This work was supported by the National Natural Science Foundation of China (Grant no. 11201184), the Hunan Provincial Natural Science Foundation of China (Grant no. 12JJ3007), the Natural Scientific Research Fund of Zhejiang Province of China (Grants nos. Y6110436 and LY2A01018), and the Natural Scientific Research Fund of the Zhejiang Provincial Education Department of China (Grant no. Z201122436).
References
[1] L. O. Chua and T. Roska, "Cellular neural networks with nonlinear and delay-type template elements," in Proceedings of IEEE International Workshop on Cellular Neural Networks and Their Applications, pp. 12–25, 1990.
[2] H. J. Cho and J. H. Park, "Novel delay-dependent robust stability criterion of delayed cellular neural networks," Chaos, Solitons and Fractals, vol. 32, no. 3, pp. 1194–1200, 2007.
[3] C. Ou, "Almost periodic solutions for shunting inhibitory cellular neural networks," Nonlinear Analysis: Real World Applications, vol. 10, no. 5, pp. 2652–2658, 2009.
[4] B. Liu and L. Huang, "Global exponential stability of BAM neural networks with recent-history distributed delays and impulses," Neurocomputing, vol. 69, no. 16-18, pp. 2090–2096, 2006.
[5] B. Liu, "Exponential convergence for a class of delayed cellular neural networks with time-varying coefficients," Physics Letters A, vol. 372, no. 4, pp. 424–428, 2008.
[6] H. Zhang, W. Wang, and B. Xiao, "Exponential convergence for high-order recurrent neural networks with a class of general activation functions," Applied Mathematical Modelling, vol. 35, no. 1, pp. 123–129, 2011.
[7] K. Gopalsamy, "Leakage delays in BAM," Journal of Mathematical Analysis and Applications, vol. 325, no. 2, pp. 1117–1132, 2007.
[8] H. Wang, C. Li, and H. Xu, "Existence and global exponential stability of periodic solution of cellular neural networks with impulses and leakage delay," International Journal of Bifurcation and Chaos in Applied Sciences and Engineering, vol. 19, no. 3, pp. 831–842, 2009.
[9] B. Liu, "Global exponential stability for BAM neural networks with time-varying delays in the leakage terms," Nonlinear Analysis: Real World Applications, vol. 14, no. 1, pp. 559–566, 2013.