
Hindawi Publishing Corporation
Mathematical Problems in Engineering
Volume 2009, Article ID 826908, 13 pages
doi:10.1155/2009/826908
Research Article
Robust Stability Analysis of Fuzzy Neural Network
with Delays
Kaihong Zhao¹,² and Yongkun Li¹
¹ Department of Mathematics, Yunnan University, Kunming, Yunnan 650091, China
² Department of Mathematics, Yuxi Normal University, Yuxi, Yunnan 653100, China
Correspondence should be addressed to Yongkun Li, yklie@ynu.edu.cn
Received 19 April 2009; Revised 6 July 2009; Accepted 29 December 2009
Recommended by Tamas Kalmar-Nagy
We investigate the local robust stability of fuzzy neural networks (FNNs) with time-varying and S-type distributed delays. We derive some sufficient conditions for the local robust stability of equilibrium points and estimate the attracting domains of the equilibrium points other than the unstable ones. Our results not only show local robust stability of equilibrium points but also allow much broader application to fuzzy neural networks with or without delays. An example is given to illustrate the effectiveness of our results.
Copyright © 2009 K. Zhao and Y. Li. This is an open access article distributed under the Creative
Commons Attribution License, which permits unrestricted use, distribution, and reproduction in
any medium, provided the original work is properly cited.
1. Introduction
For the study of current neural networks, two basic mathematical models are commonly adopted: local field neural network models and static neural network models. The basic model of a local field neural network is described as

$$\dot{x}_i(t) = -x_i(t) + \sum_{j=1}^{n}\omega_{ij}g_j(x_j(t)) + I_i, \quad i = 1, 2, \ldots, n, \tag{1.1}$$
where $g_j$ denotes the activation function of the $j$th neuron, $x_i$ is the state of the $i$th neuron, $I_i$ is the external input imposed on the $i$th neuron, $\omega_{ij}$ denotes the synaptic connectivity value between the $i$th neuron and the $j$th neuron, and $n$ is the number of neurons in the network. With the same notation, static neural network models can be written as
$$\dot{x}_i(t) = -x_i(t) + g_i\left(\sum_{j=1}^{n}\omega_{ij}x_j(t) + I_i\right), \quad i = 1, 2, \ldots, n. \tag{1.2}$$
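The structural difference between (1.1) and (1.2) is easy to see numerically. The following minimal Python sketch (not part of the original paper) integrates both basic models with a forward Euler scheme; the weight matrix `W`, the input `I`, and the activation `g = tanh` are illustrative placeholders rather than data taken from this article.

```python
import numpy as np

def simulate(model, x0, W, I, g, dt=0.01, steps=5000):
    """Forward-Euler integration of either basic model (1.1) or (1.2)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        if model == "local_field":      # (1.1): x' = -x + W g(x) + I
            dx = -x + W @ g(x) + I
        else:                           # (1.2): x' = -x + g(W x + I)
            dx = -x + g(W @ x + I)
        x += dt * dx
    return x

if __name__ == "__main__":
    W = np.array([[0.5, -0.2], [0.1, 0.3]])   # illustrative weights (assumption)
    I = np.array([0.1, -0.1])                 # illustrative inputs (assumption)
    g = np.tanh                               # a typical bounded activation
    print(simulate("local_field", [0.0, 0.0], W, I, g))
    print(simulate("static", [0.0, 0.0], W, I, g))
```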
It is well known that the local field neural network model covers not only Hopfield-type networks [1] but also bidirectional associative memory networks [2] and cellular neural networks [3]. Many deep theoretical results have been obtained for local field neural networks; we refer to [4–12] and the references cited therein. Meanwhile, static neural networks have great application potential. They include not only the recurrent back-propagation networks [13–15] but also other extensively studied neural networks such as the optimization-type networks introduced in [16–18] and the brain-state-in-a-box (BSB) type networks [19, 20]. In the past few years, there has been increasing interest in studying dynamical characteristics such as stability, persistence, periodicity, local robust stability of equilibrium points, and domains of attraction of local field neural networks (see [21–25]).
However, in the mathematical modeling of real-world problems, we encounter further complications, for example, complexity and uncertainty or vagueness. Fuzzy theory is considered a more suitable setting for taking vagueness into consideration. Based on traditional cellular neural networks (CNNs), Yang and Yang proposed fuzzy CNNs (FCNNs) [26], which integrate fuzzy logic into the structure of traditional CNNs and maintain local connectedness among cells. Unlike previous CNN structures, FCNNs have fuzzy logic between their template input and/or output besides the sum-of-product operation. FCNNs are a very useful paradigm for image processing problems, a cornerstone of image processing and pattern recognition. Therefore, it is necessary to consider both the fuzzy logic and the delay effect on the dynamical behaviors of neural networks. Nevertheless, to the best of our knowledge, there are few published papers considering the local robust stability of equilibrium points and domains of attraction for fuzzy neural networks (FNNs).
Therefore, in this paper, we study the local robust stability of fuzzy neural networks with time-varying and S-type distributed delays:

$$\begin{aligned}
\dot{u}_i(t) ={}& -c_i(\lambda)u_i(t) + g_i\left(\sum_{j=1}^{n}\int_{-\tau(\lambda)}^{0}u_j(t+\theta)\,d\omega_{ij}(\theta,\lambda) + I_i\right) + \sum_{j=1}^{n}a_{ij}(\lambda)f_j(u_j(t)) \\
& + \bigwedge_{j=1}^{n}\alpha_{ij}(\lambda)f_j\bigl(u_j(t-\tau_j(t))\bigr) + \bigvee_{j=1}^{n}\beta_{ij}(\lambda)f_j\bigl(u_j(t-\tau_j(t))\bigr), \quad i = 1, 2, \ldots, n,
\end{aligned}\tag{1.3}$$
where $\alpha_{ij}(\lambda)$ and $\beta_{ij}(\lambda)$ are elements of the fuzzy feedback MIN template and the fuzzy feedback MAX template, respectively, and $a_{ij}(\lambda)$ are elements of the feedback template. $u_i(t)$ stands for the state of the $i$th neuron, $\tau_j(t)$ is the transmission delay, and $f_j(\cdot)$ is the activation function. $\bigwedge$ and $\bigvee$ denote the fuzzy AND and fuzzy OR operations, respectively, and $\lambda \in \Xi \subset \mathbb{R}$ is the parameter. The main purpose of this paper is to investigate the local robust stability of equilibrium points of FNNs (1.3). Sufficient conditions are obtained for the local robust stability of equilibrium points. Meanwhile, the attracting domains of the equilibrium points are also estimated.
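As a concrete reading of (1.3), the sketch below assembles its right-hand side in Python for the special case in which each measure $d\omega_{ij}(\cdot,\lambda)$ is a point mass at $\theta = 0$ (so the S-type integral collapses to $\widetilde{\omega}_{ij}u_j(t)$, as in the example of Section 4); the fuzzy AND and OR then reduce to componentwise minima and maxima. All argument names are our own illustrative conventions.

```python
import numpy as np

def fnn_rhs(u, u_delayed, c, a, alpha, beta, w_tilde, I, g, f):
    """Right-hand side of FNNs (1.3) when each d(omega_ij) is a point mass at
    theta = 0, so the distributed-delay integral reduces to w_tilde[i] @ u."""
    n = len(u)
    du = np.empty(n)
    fu = np.array([f[j](u[j]) for j in range(n)])              # f_j(u_j(t))
    fu_del = np.array([f[j](u_delayed[j]) for j in range(n)])  # f_j(u_j(t - tau_j(t)))
    for i in range(n):
        feedback = g[i](w_tilde[i] @ u + I[i])   # g_i(sum_j w~_ij u_j + I_i)
        fuzzy_and = np.min(alpha[i] * fu_del)    # fuzzy AND (min) template term
        fuzzy_or = np.max(beta[i] * fu_del)      # fuzzy OR (max) template term
        du[i] = -c[i] * u[i] + feedback + a[i] @ fu + fuzzy_and + fuzzy_or
    return du
```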
Throughout this paper, we always assume the following.

(A) $a_{ij}(\lambda)$, $\alpha_{ij}(\lambda)$, and $\beta_{ij}(\lambda)$ are bounded on $\Xi$, $i, j = 1, 2, \ldots, n$.

(H1) $\inf_{\lambda\in\Xi}c_i(\lambda) > 0$, $0 \le \tau(\lambda) \le \tau$, and $\omega_{ij}(\theta,\lambda)$ ($i, j = 1, 2, \ldots, n$) are nondecreasing bounded variation functions on $[-\tau(\lambda), 0]$ with $\omega_{ij}(\theta,\lambda) > 0$, and $\int_{-\tau(\lambda)}^{0}u_j(t+\theta)\,d\omega_{ij}(\theta,\lambda)$ is a Lebesgue-Stieltjes integral. $I = (I_1, I_2, \ldots, I_n)^T$ is a constant vector which denotes an external input.

(H2) $g_i(\cdot)$, $i = 1, 2, \ldots, n$, are second-order differentiable, bounded, and Lipschitz continuous; there exist positive constants $L_i$ and $B_i$ such that $|g_i(x) - g_i(y)| \le L_i|x - y|$ and $|g_i(x)| \le B_i$ for any $x, y \in \mathbb{R}$.

(H3) The activation functions $f_i(u)$ with $f_i(0) = 0$ are bounded and Lipschitz continuous; that is, there are numbers $\mu_i > 0$ and $l_i > 0$ such that $|f_i(u)| \le \mu_i$ and $|f_i(u) - f_i(v)| \le l_i|u - v|$ for any $u, v \in \mathbb{R}$, $i = 1, 2, \ldots, n$.

(H4) The functions $\tau_j(t)$, $j = 1, 2, \ldots, n$, are nonnegative, bounded, and continuously differentiable on $\mathbb{R}^+$, and $0 \le \tau_j(t) \le \tau(\lambda)$.
The rest of this paper is organized as follows. In Section 2, we give some basic definitions and basic results about the attracting domains of FNNs (1.3). In Section 3, we discuss the local robust stability of equilibrium points of FNNs (1.3). In Section 4, an example is given to illustrate the effectiveness of our results. Finally, we make a conclusion in Section 5.
2. Preliminaries
As usual, we denote by $C([-\tau(\lambda), 0], \mathbb{R}^n)$ the set of all real-valued continuous mappings from $[-\tau(\lambda), 0]$ to $\mathbb{R}^n$ equipped with the supremum norm $\|\cdot\|_\infty$ defined by

$$\|\varphi\|_\infty = \max_{1\le i\le n}\ \sup_{-\tau(\lambda) < t \le 0}|\varphi_i(t)|, \tag{2.1}$$

where $\varphi = (\varphi_1, \varphi_2, \ldots, \varphi_n)^T \in C([-\tau(\lambda), 0], \mathbb{R}^n)$. Denote by $u(t, \varphi, \lambda)$ the solution of FNNs (1.3) with initial condition $\varphi \in C([-\tau(\lambda), 0], \mathbb{R}^n)$.
Definition 2.1. A vector $u^*(\lambda) = (u_1^*(\lambda), u_2^*(\lambda), \ldots, u_n^*(\lambda))^T$ is said to be an equilibrium point of FNNs (1.3) if for each $i = 1, 2, \ldots, n$ one has

$$c_i(\lambda)u_i^*(\lambda) = g_i\left(\sum_{j=1}^{n}\widetilde{\omega}_{ij}(\lambda)u_j^*(\lambda) + I_i\right) + \sum_{j=1}^{n}a_{ij}(\lambda)f_j\bigl(u_j^*(\lambda)\bigr) + \bigwedge_{j=1}^{n}\alpha_{ij}(\lambda)f_j\bigl(u_j^*(\lambda)\bigr) + \bigvee_{j=1}^{n}\beta_{ij}(\lambda)f_j\bigl(u_j^*(\lambda)\bigr), \quad i = 1, 2, \ldots, n, \tag{2.2}$$

where $\widetilde{\omega}_{ij}(\lambda) := \int_{-\tau(\lambda)}^{0}d\omega_{ij}(\theta, \lambda)$. Denote by $\Omega$ the set of all equilibrium points of FNNs (1.3).
Definition 2.2. Let $u^*(\lambda) \in \Omega$. $u^*(\lambda)$ is said to be a locally robust attractive equilibrium point if for any given $\lambda \in \Xi$ there is a neighborhood $Y_\lambda(u^*(\lambda)) \subset C([-\tau(\lambda), 0], \mathbb{R}^n)$ such that $\varphi \in Y_\lambda(u^*(\lambda))$ implies that $\lim_{t\to\infty}\|u(t, \varphi, \lambda) - u^*(\lambda)\| = 0$. Otherwise, $u^*(\lambda)$ is said not to be a locally robust attractive equilibrium point. Denote by $\Omega_0$ the set of all not locally robust attractive equilibrium points of FNNs (1.3).
Definition 2.3. Let $D$, $\widetilde{D}$ be subsets of $\mathbb{R}^n$ and let $u(t, \varphi, \lambda)$ be a solution of FNNs (1.3) with $\varphi \in C([-\tau(\lambda), 0], \mathbb{R}^n)$.

(i) For any given $\lambda \in \Xi$, if $u(\sigma, \varphi, \lambda) \in D$ for some $\sigma \ge 0$ implies that $u(t, \varphi, \lambda) \in D$ for all $t \ge \sigma$, then $D$ is said to be an attracting domain of FNNs (1.3).

(ii) For any given $\lambda \in \Xi$, if $\varphi(\theta) \in \widetilde{D}$ for all $\theta \in [-\tau(\lambda), 0]$ implies that $u(t, \varphi, \lambda)$ converges to $u^*(\lambda)$, then $\widetilde{D}$ is said to be an attracting domain of $u^*(\lambda) \in \Omega$.

Correspondingly, the union of all attracting domains of equilibrium points of $\Omega$ is said to be an attracting domain of $\Omega$.
For a class of differential equations with fuzzy AND and fuzzy OR terms, the following useful inequality holds.

Lemma 2.4 (see [26]). Let $u = (u_1, u_2, \ldots, u_n)^T$ and $v = (v_1, v_2, \ldots, v_n)^T$ be two states of (1.3); then one has

$$\left|\bigwedge_{j=1}^{n}\alpha_{ij}f_j(u_j) - \bigwedge_{j=1}^{n}\alpha_{ij}f_j(v_j)\right| \le \sum_{j=1}^{n}|\alpha_{ij}|\,\bigl|f_j(u_j) - f_j(v_j)\bigr|,$$
$$\left|\bigvee_{j=1}^{n}\alpha_{ij}f_j(u_j) - \bigvee_{j=1}^{n}\alpha_{ij}f_j(v_j)\right| \le \sum_{j=1}^{n}|\alpha_{ij}|\,\bigl|f_j(u_j) - f_j(v_j)\bigr|. \tag{2.3}$$
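As a quick sanity check on Lemma 2.4 (not part of the original paper), the following randomized Python test verifies the min/max inequality on sample data; the coefficients and the Lipschitz function `f = tanh` are arbitrary choices of ours.

```python
import numpy as np

# Randomized check of the Lemma 2.4 inequality for the AND (min) and OR (max)
# templates; alpha, u, v are random samples, f = tanh is an arbitrary example.
rng = np.random.default_rng(0)
f = np.tanh
for _ in range(1000):
    alpha = rng.normal(size=5)
    u, v = rng.normal(size=5), rng.normal(size=5)
    lhs_and = abs(np.min(alpha * f(u)) - np.min(alpha * f(v)))
    lhs_or = abs(np.max(alpha * f(u)) - np.max(alpha * f(v)))
    rhs = np.sum(np.abs(alpha) * np.abs(f(u) - f(v)))
    assert lhs_and <= rhs + 1e-12 and lhs_or <= rhs + 1e-12
print("Lemma 2.4 inequality held on all random samples")
```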
Lemma 2.5. Let $u(t)$ be any solution of FNNs (1.3). Then $u(t)$ is uniformly bounded. Moreover, $H$ is an attracting domain of FNNs (1.3), where

$$H := H_1 \times H_2 \times \cdots \times H_n, \qquad H_i = \left[-\frac{B_i + M_i}{\inf_{\lambda\in\Xi}c_i(\lambda)},\ \frac{B_i + M_i}{\inf_{\lambda\in\Xi}c_i(\lambda)}\right], \quad i = 1, 2, \ldots, n. \tag{2.4}$$

Proof. By (1.3) and Lemma 2.4, we have

$$\frac{d}{dt}|u_i(t)| \le -\inf_{\lambda\in\Xi}c_i(\lambda)|u_i(t)| + B_i + M_i, \tag{2.5}$$

where

$$M_i = n\max_{1\le j\le n}\left\{\mu_j\sup_{\lambda\in\Xi}\left(|a_{ij}(\lambda)| + |\alpha_{ij}(\lambda)| + |\beta_{ij}(\lambda)|\right)\right\}, \quad i = 1, 2, \ldots, n. \tag{2.6}$$

By using a differential inequality, we have for $t \ge \sigma$,

$$|u_i(t)| \le \exp\left((\sigma - t)\inf_{\lambda\in\Xi}c_i(\lambda)\right)\left(|u_i(\sigma)| - \frac{B_i + M_i}{\inf_{\lambda\in\Xi}c_i(\lambda)}\right) + \frac{B_i + M_i}{\inf_{\lambda\in\Xi}c_i(\lambda)}, \quad i = 1, 2, \ldots, n, \tag{2.7}$$

which leads to the uniform boundedness of $u(t)$. Furthermore, given any $|u_i(\sigma)| \le (B_i + M_i)/\inf_{\lambda\in\Xi}c_i(\lambda)$, $i = 1, 2, \ldots, n$, we get for all $t \ge \sigma$,

$$|u_i(t)| \le \frac{B_i + M_i}{\inf_{\lambda\in\Xi}c_i(\lambda)}. \tag{2.8}$$

Hence $H$ is an attracting domain of FNNs (1.3). The proof is complete.
By Lemma 2.5, we have the following theorem.

Theorem 2.6. All equilibrium points of FNNs (1.3) lie in the attracting domain $H$; that is, $\Omega \subset H$.
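The box $H$ in (2.4) is easy to compute once the bounds in (H2)-(H3) and the template suprema are known. The helper below is a small sketch (the function name and argument layout are ours, not the paper's); fed with the data of the example in Section 4 ($B_i = \mu_i = 1$, template suprema $0.01$, $\inf c_1 = \tanh 2$, $\inf c_2 = \tanh 1.3$) it returns the half-widths $1.06/\tanh 2$ and $1.06/\tanh 1.3$ quoted there.

```python
import numpy as np

def attracting_box_halfwidths(B, mu, c_inf, a_sup, alpha_sup, beta_sup):
    """Half-widths (B_i + M_i) / inf_c_i of the box H in (2.4), with M_i from
    (2.6): M_i = n * max_j { mu_j * (sup|a_ij| + sup|alpha_ij| + sup|beta_ij|) }."""
    B, mu, c_inf = map(np.asarray, (B, mu, c_inf))
    n = len(B)
    M = np.array([n * np.max(mu * (a_sup[i] + alpha_sup[i] + beta_sup[i]))
                  for i in range(n)])
    return (B + M) / c_inf

# Data of the two-neuron example in Section 4 (template suprema all 0.01).
s = 0.01 * np.ones((2, 2))
print(attracting_box_halfwidths([1, 1], [1, 1], [np.tanh(2), np.tanh(1.3)], s, s, s))
```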
3. Local Robust Stability of Equilibrium Points
In this section, we investigate the local robust stability of equilibrium points of FNNs (1.3). We derive some sufficient conditions that guarantee the local robust stability of equilibrium points in $\Omega\setminus\Omega_0$ and estimate the attracting domains of these equilibrium points.
Theorem 3.1. Let $u^*(\lambda) = (u_1^*(\lambda), u_2^*(\lambda), \ldots, u_n^*(\lambda))^T \in \Omega$. If there exist positive constants $\beta_i$ ($i = 1, 2, \ldots, n$) such that for each $i = 1, 2, \ldots, n$

$$\sum_{j=1}^{n}\beta_j\sup_{\lambda\in\Xi}\left\{\widetilde{\omega}_{ji}(\lambda)\bigl|\dot{g}_j(\kappa_j(\lambda))\bigr| + l_j\left(|a_{ij}(\lambda)| + |\alpha_{ij}(\lambda)| + |\beta_{ij}(\lambda)|\right)\right\} < \beta_i\inf_{\lambda\in\Xi}c_i(\lambda), \tag{3.1}$$

where $\kappa_i(\lambda) = \sum_{j=1}^{n}\widetilde{\omega}_{ij}(\lambda)u_j^*(\lambda) + I_i$, then one has the following.

(1) $u^*(\lambda) \in \Omega\setminus\Omega_0$, that is, $u^*(\lambda)$ is locally robust stable.

(2) Let

$$R := 2\min_{i\in N}\left\{\frac{\beta_i\inf_{\lambda\in\Xi}c_i(\lambda) - \sum_{j=1}^{n}\beta_j\sup_{\lambda\in\Xi}\left\{\widetilde{\omega}_{ji}(\lambda)\bigl|\dot{g}_j(\kappa_j(\lambda))\bigr| + l_j\left(|a_{ij}(\lambda)| + |\alpha_{ij}(\lambda)| + |\beta_{ij}(\lambda)|\right)\right\}}{\sum_{j=1}^{n}\sum_{k=1}^{n}\beta_j\max_{\zeta\in\mathbb{R}}\bigl|\ddot{g}_j(\zeta)\bigr|\sup_{\lambda\in\Xi}\widetilde{\omega}_{ji}(\lambda)\widetilde{\omega}_{jk}(\lambda)}\right\}. \tag{3.2}$$

Then every solution $u(t,\varphi,\lambda)$ of FNNs (1.3) with $\varphi \in O(u^*(\lambda))$ satisfies

$$\lim_{t\to\infty}\bigl\|u(t,\varphi,\lambda) - u^*(\lambda)\bigr\|_\infty = 0, \tag{3.3}$$

where

$$O(u^*(\lambda)) = \left\{\varphi \in C([-\tau(\lambda), 0], \mathbb{R}^n) : \|\varphi - u^*(\lambda)\|_\infty < \frac{R}{\sum_{i=1}^{n}\beta_i/\min_{1\le i\le n}\beta_i}\right\}. \tag{3.4}$$

(3) The open set

$$\bigcup_{u^*(\lambda)\in\Omega}B(u^*(\lambda)) := \bigcup_{u^*(\lambda)\in\Omega}\left\{u \in \mathbb{R}^n : \|u - u^*(\lambda)\|_\infty < \frac{R}{\sum_{i=1}^{n}\beta_i/\min_{1\le i\le n}\beta_i}\right\} \tag{3.5}$$

is an attracting domain of $\Omega$, and $B(u^*(\lambda))$ is an attracting domain of $u^*(\lambda)$.
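Condition (3.1) and the radius $R$ in (3.2) can be checked numerically when the templates and $\widetilde{\omega}_{ij}(\lambda)$ are given in closed form. The sketch below is our own helper with assumed argument conventions; it approximates the suprema and infima over $\Xi$ on a grid of $\lambda$ values, and it bounds the supremum of the sum in (3.1) by the sum of the separate suprema, so a positive answer is conservative.

```python
import numpy as np

def check_theorem_31(beta, c_of, w_tilde_of, gdot_kappa_of,
                     templates_sup, l, gddot_max, lambdas):
    """Approximate check of (3.1) and evaluation of R from (3.2) on the grid
    `lambdas`.  Assumed conventions: c_of(lam) -> vector c_i(lam),
    w_tilde_of(lam) -> matrix w~_ij(lam), gdot_kappa_of(lam) -> vector
    |g'_j(kappa_j(lam))|, templates_sup[i, j] = sup_lam(|a_ij|+|alpha_ij|+|beta_ij|),
    gddot_max[j] = max_zeta |g''_j(zeta)|."""
    beta = np.asarray(beta, dtype=float)
    n = len(beta)
    c_inf = np.min([c_of(lam) for lam in lambdas], axis=0)
    lhs = np.zeros(n)
    denom = np.zeros(n)
    for i in range(n):
        for j in range(n):
            # sup over lambda of w~_ji |g'_j(kappa_j)|, plus l_j times the
            # template suprema (an upper bound for the sup of the sum in (3.1)).
            sup_term = max(w_tilde_of(lam)[j, i] * gdot_kappa_of(lam)[j]
                           for lam in lambdas) + l[j] * templates_sup[i, j]
            lhs[i] += beta[j] * sup_term
            for k in range(n):
                denom[i] += beta[j] * gddot_max[j] * max(
                    w_tilde_of(lam)[j, i] * w_tilde_of(lam)[j, k]
                    for lam in lambdas)
    holds = bool(np.all(lhs < beta * c_inf))
    R = 2.0 * float(np.min((beta * c_inf - lhs) / denom)) if holds else None
    return holds, R
```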
The proof of Theorem 3.1 relies on the following lemma.
Lemma 3.2. Let $u^*(\lambda) = (u_1^*(\lambda), u_2^*(\lambda), \ldots, u_n^*(\lambda))^T \in \Omega$ satisfy (3.1). Let $u(t,\varphi,\lambda)$ be an arbitrary solution of FNNs (1.3) other than $u^*(\lambda)$, where $\varphi \in C([-\tau(\lambda), 0], \mathbb{R}^n)$. Let

$$V(t) = \sum_{i=1}^{n}\beta_i\bigl|u_i(t,\varphi,\lambda) - u_i^*(\lambda)\bigr|, \tag{3.6}$$

where $\beta_i$ is given by (3.1). Then one has the following.

(A1) If $\|u(\sigma+\cdot,\varphi,\lambda) - u^*(\lambda)\|_\infty < R$ for some $\sigma \ge 0$, then $D^+V(\sigma) < 0$.

(A2) If $\|\varphi - u^*(\lambda)\|_\infty < R/\bigl(\sum_{i=1}^{n}\beta_i/\min_{1\le i\le n}\beta_i\bigr)$ and $\sup_{\sigma-\tau\le s\le\sigma}V(s) \le \sup_{-\tau\le s\le 0}V(s)$ for some $\sigma \ge 0$, then $\|u(\sigma+\cdot,\varphi,\lambda) - u^*(\lambda)\|_\infty < R$.

(A3) If $\|\varphi - u^*(\lambda)\|_\infty < R/\bigl(\sum_{i=1}^{n}\beta_i/\min_{1\le i\le n}\beta_i\bigr)$, then $D^+V(t) < 0$ for all $t \ge 0$.
Proof. Under the transformation $y(t) = u(t,\varphi,\lambda) - u^*(\lambda)$, we get that

$$\begin{aligned}
\frac{d|y_i(t)|}{dt} \le{}& -c_i(\lambda)|y_i(t)| + \sum_{j=1}^{n}l_j|a_{ij}(\lambda)|\,|y_j(t)| + \sum_{j=1}^{n}l_j\left(|\alpha_{ij}(\lambda)| + |\beta_{ij}(\lambda)|\right)\bigl|y_j(t-\tau_j(t))\bigr| \\
& + \bigl|\dot{g}_i(\kappa_i(\lambda))\bigr|\sum_{j=1}^{n}\int_{-\tau(\lambda)}^{0}\bigl|y_j(t+\theta)\bigr|\,d\omega_{ij}(\theta,\lambda) + \frac{\bigl|\ddot{g}_i(\zeta_i)\bigr|}{2}\left(\sum_{j=1}^{n}\int_{-\tau(\lambda)}^{0}y_j(t+\theta)\,d\omega_{ij}(\theta,\lambda)\right)^2,
\end{aligned}\tag{3.7}$$
due to

$$\begin{aligned}
& g_i\left(\sum_{j=1}^{n}\int_{-\tau(\lambda)}^{0}u_j(t+\theta)\,d\omega_{ij}(\theta,\lambda) + I_i\right) - g_i\left(\sum_{j=1}^{n}\int_{-\tau(\lambda)}^{0}u_j^*(\lambda)\,d\omega_{ij}(\theta,\lambda) + I_i\right) \\
&\quad = \dot{g}_i\left(\sum_{j=1}^{n}\widetilde{\omega}_{ij}(\lambda)u_j^*(\lambda) + I_i\right)\sum_{j=1}^{n}\int_{-\tau(\lambda)}^{0}y_j(t+\theta)\,d\omega_{ij}(\theta,\lambda) + \frac{\ddot{g}_i(\zeta_i)}{2}\left(\sum_{j=1}^{n}\int_{-\tau(\lambda)}^{0}y_j(t+\theta)\,d\omega_{ij}(\theta,\lambda)\right)^2,
\end{aligned}\tag{3.8}$$

where $\zeta_i$ lies between $\sum_{j=1}^{n}\int_{-\tau(\lambda)}^{0}u_j(t+\theta)\,d\omega_{ij}(\theta,\lambda) + I_i$ and $\sum_{j=1}^{n}\int_{-\tau(\lambda)}^{0}u_j^*(\lambda)\,d\omega_{ij}(\theta,\lambda) + I_i$. From (3.7) we can derive that
$$\begin{aligned}
\frac{dV(t)}{dt} \le{}& \sum_{i=1}^{n}\beta_i\Bigg\{-\inf_{\lambda\in\Xi}c_i(\lambda)|y_i(t)| + \sum_{j=1}^{n}l_j|a_{ij}(\lambda)|\,|y_j(t)| + \sum_{j=1}^{n}l_j\left(|\alpha_{ij}(\lambda)| + |\beta_{ij}(\lambda)|\right)\bigl|y_j(t-\tau_j(t))\bigr| \\
&\qquad + \left[\bigl|\dot{g}_i(\kappa_i(\lambda))\bigr| + \frac{\bigl|\ddot{g}_i(\zeta_i)\bigr|}{2}\sum_{j=1}^{n}\int_{-\tau(\lambda)}^{0}\bigl|y_j(t+\theta)\bigr|\,d\omega_{ij}(\theta,\lambda)\right]\times\sum_{j=1}^{n}\int_{-\tau(\lambda)}^{0}\bigl|y_j(t+\theta)\bigr|\,d\omega_{ij}(\theta,\lambda)\Bigg\} \\
\le{}& \sum_{i=1}^{n}\beta_i\Bigg\{-\inf_{\lambda\in\Xi}c_i(\lambda)|y_i(t)| + \sum_{j=1}^{n}l_j\sup_{\lambda\in\Xi}\left(|a_{ij}(\lambda)| + |\alpha_{ij}(\lambda)| + |\beta_{ij}(\lambda)|\right)\sup_{t-\tau\le s\le t}|y_j(s)| \\
&\qquad + \left[\bigl|\dot{g}_i(\kappa_i(\lambda))\bigr| + \frac{\bigl|\ddot{g}_i(\zeta_i)\bigr|}{2}\sum_{j=1}^{n}\widetilde{\omega}_{ij}(\lambda)\sup_{t-\tau\le s\le t}|y_j(s)|\right]\times\sum_{j=1}^{n}\widetilde{\omega}_{ij}(\lambda)\sup_{t-\tau\le s\le t}|y_j(s)|\Bigg\} \\
\le{}& \sum_{i=1}^{n}\Bigg\{-\beta_i\inf_{\lambda\in\Xi}c_i(\lambda) + \sum_{j=1}^{n}\beta_jl_j\sup_{\lambda\in\Xi}\left(|a_{ij}(\lambda)| + |\alpha_{ij}(\lambda)| + |\beta_{ij}(\lambda)|\right) \\
&\qquad + \sum_{j=1}^{n}\beta_j\widetilde{\omega}_{ji}(\lambda)\left[\bigl|\dot{g}_j(\kappa_j(\lambda))\bigr| + \frac{\bigl|\ddot{g}_j(\zeta_j)\bigr|}{2}\sum_{k=1}^{n}\widetilde{\omega}_{jk}(\lambda)\sup_{t-\tau\le s\le t}|y_k(s)|\right]\Bigg\}\sup_{t-\tau\le s\le t}|y_i(s)|.
\end{aligned}\tag{3.9}$$
As $\|u(\sigma+\cdot,\varphi,\lambda) - u^*(\lambda)\|_\infty < R$, we have $\sup_{\sigma-\tau\le s\le\sigma}|y_i(s)| < R$ for each $i = 1, 2, \ldots, n$, which, together with (3.1) and (3.9), implies that $D^+V(\sigma) < 0$.

Since $\min_{1\le i\le n}\{\beta_i\}\,\|u(\sigma+\cdot,\varphi,\lambda) - u^*(\lambda)\|_\infty \le \sup_{\sigma-\tau\le s\le\sigma}V(s) \le \sup_{-\tau\le s\le 0}V(s)$ under the hypothesis of (A2), and

$$\sup_{-\tau\le s\le 0}V(s) = \sup_{-\tau\le s\le 0}\sum_{i=1}^{n}\beta_i\bigl|u_i(s,\varphi,\lambda) - u_i^*(\lambda)\bigr| \le \sum_{i=1}^{n}\beta_i\|\varphi - u^*(\lambda)\|_\infty, \tag{3.10}$$

we have $\|u(\sigma+\cdot,\varphi,\lambda) - u^*(\lambda)\|_\infty \le \left(\sum_{i=1}^{n}\beta_i/\min_{1\le i\le n}\beta_i\right)\|\varphi - u^*(\lambda)\|_\infty < R$.

Since $\|\varphi - u^*(\lambda)\|_\infty < R/\bigl(\sum_{i=1}^{n}\beta_i/\min_{1\le i\le n}\beta_i\bigr) < R$, from (A1) we know that $D^+V(0) < 0$. We assert that (A3) holds. Otherwise, there exists $t_0 > 0$ such that $D^+V(t_0) \ge 0$ and $D^+V(t) < 0$ for all $t \in [0, t_0)$. This implies that $V(t)$ is strictly monotonically decreasing on the interval $[0, t_0)$, so $\sup_{t_0-\tau\le s\le t_0}V(s) \le \sup_{-\tau\le s\le t_0}V(s) = \sup_{-\tau\le s\le 0}V(s)$. By using (A2), we get that $\|u(t_0+\cdot,\varphi,\lambda) - u^*(\lambda)\|_\infty < R$. From (A1), $D^+V(t_0) < 0$. This leads to a contradiction. Hence $D^+V(t) < 0$ for all $t \ge 0$.
Now we are in a position to complete the proof of Theorem 3.1.
Proof. Let $u(t,\varphi,\lambda)$ be an arbitrary solution of FNNs (1.3) other than $u^*(\lambda)$ and satisfying $\|\varphi - u^*(\lambda)\|_\infty < R/\bigl(\sum_{i=1}^{n}\beta_i/\min_{1\le i\le n}\beta_i\bigr)$. It follows from (A3) that $D^+V(t) < 0$ for all $t \ge 0$, that is, $\sup_{t-\tau\le s\le t}V(s) \le \sup_{-\tau\le s\le 0}V(s)$ for all $t \ge 0$. Together with (A2), we get $\|u(t+\cdot,\varphi,\lambda) - u^*(\lambda)\|_\infty < R$ for all $t \ge 0$. Take

$$\begin{aligned}
\chi_i &= \beta_i\inf_{\lambda\in\Xi}c_i(\lambda) - \sum_{j=1}^{n}\beta_j\sup_{\lambda\in\Xi}\left\{\widetilde{\omega}_{ji}(\lambda)\bigl|\dot{g}_j(\kappa_j(\lambda))\bigr| + l_j\left(|a_{ij}(\lambda)| + |\alpha_{ij}(\lambda)| + |\beta_{ij}(\lambda)|\right)\right\},\\
\eta_i &= \sum_{k=1}^{n}\sum_{j=1}^{n}\beta_j\max_{\zeta\in\mathbb{R}}\bigl|\ddot{g}_j(\zeta)\bigr|\sup_{\lambda\in\Xi}\widetilde{\omega}_{ji}(\lambda)\widetilde{\omega}_{jk}(\lambda)\,R.
\end{aligned}\tag{3.11}$$

It is obvious that $\chi_i - \eta_i > 0$ for each $i = 1, 2, \ldots, n$. From (3.9) we have

$$D^+V(t) < -\min_{1\le i\le n}\{\chi_i - \eta_i\}\sum_{i=1}^{n}\sup_{t-\tau\le s\le t}|y_i(s)| \le -\min_{1\le i\le n}\{\chi_i - \eta_i\}\sum_{i=1}^{n}|y_i(t)|. \tag{3.12}$$
By integrating both sides of the above inequality from 0 to $t$, we have

$$V(t) + \min_{1\le i\le n}\{\chi_i - \eta_i\}\int_{0}^{t}\sum_{i=1}^{n}|y_i(s)|\,ds \le V(0). \tag{3.13}$$

It follows that

$$\limsup_{t\to\infty}\ \min_{1\le i\le n}\{\chi_i - \eta_i\}\int_{0}^{t}\sum_{i=1}^{n}|y_i(s)|\,ds \le V(0) < \infty. \tag{3.14}$$

Note that $u(t,\varphi,\lambda)$ is bounded on $\mathbb{R}^+$ by Lemma 2.5; it then follows from FNNs (1.3) that $\dot{u}$ is bounded on $\mathbb{R}^+$. Hence $|u(t,\varphi,\lambda) - u^*(\lambda)|$ is uniformly continuous on $\mathbb{R}^+$, and together with (3.14) this gives $\lim_{t\to\infty}\sum_{i=1}^{n}|u_i(t,\varphi,\lambda) - u_i^*(\lambda)| = 0$. So the assertions of (1) and (2) hold. Let us consider an arbitrary solution $u(t,\varphi,\lambda)$ of FNNs (1.3) satisfying $\varphi(s) \in B(u^*(\lambda))$ for all $s \in [-\tau(\lambda), 0]$ and some $u^*(\lambda) \in \Omega$. Then it is obvious that

$$\|\varphi - u^*(\lambda)\|_\infty < \frac{R}{\sum_{i=1}^{n}\beta_i/\min_{1\le i\le n}\beta_i}. \tag{3.15}$$

From (2), we get $\lim_{t\to\infty}\|u(t,\varphi,\lambda) - u^*(\lambda)\|_\infty = 0$. Hence $B(u^*(\lambda))$ is an attracting domain of $u^*(\lambda)$. Consequently, the open set $\bigcup_{u^*(\lambda)\in\Omega}B(u^*(\lambda))$ is an attracting domain of $\Omega$. The proof is complete.
Corollary 3.3. Let $u^*(\lambda) = (u_1^*(\lambda), u_2^*(\lambda), \ldots, u_n^*(\lambda))^T \in \Omega$. If there exist positive constants $\beta_i$ ($i = 1, 2, \ldots, n$) such that for each $i = 1, 2, \ldots, n$

$$\sum_{j=1}^{n}\beta_j\left\{\widetilde{\omega}_{ji}\bigl|\dot{g}_j(\kappa_j)\bigr| + l_j\sup_{\lambda\in\Xi}\left(|a_{ij}(\lambda)| + |\alpha_{ij}(\lambda)| + |\beta_{ij}(\lambda)|\right)\right\} < \beta_ic_i, \tag{3.16}$$

where $\kappa_i = \sum_{j=1}^{n}\widetilde{\omega}_{ij}u_j^*(\lambda) + I_i$, then one has the following.

(1) $u^*(\lambda) \in \Omega\setminus\Omega_0$, that is, $u^*(\lambda)$ is locally asymptotically stable.

(2) Let

$$R := 2\min_{i\in N}\left\{\frac{\beta_ic_i - \sum_{j=1}^{n}\beta_j\left\{\widetilde{\omega}_{ji}\bigl|\dot{g}_j(\kappa_j)\bigr| + l_j\sup_{\lambda\in\Xi}\left(|a_{ij}(\lambda)| + |\alpha_{ij}(\lambda)| + |\beta_{ij}(\lambda)|\right)\right\}}{\sum_{j=1}^{n}\sum_{k=1}^{n}\beta_j\max_{\zeta\in\mathbb{R}}\bigl|\ddot{g}_j(\zeta)\bigr|\,\widetilde{\omega}_{ji}\widetilde{\omega}_{jk}}\right\}. \tag{3.17}$$

Then every solution $u(t,\varphi,\lambda)$ of FNNs (1.3) with $\varphi \in O(u^*(\lambda))$ satisfies

$$\lim_{t\to\infty}\bigl\|u(t,\varphi,\lambda) - u^*(\lambda)\bigr\|_\infty = 0, \tag{3.18}$$

where

$$O(u^*(\lambda)) = \left\{\varphi \in C([-\tau, 0], \mathbb{R}^n) : \|\varphi - u^*(\lambda)\|_\infty < \frac{R}{\sum_{i=1}^{n}\beta_i/\min_{1\le i\le n}\beta_i}\right\}. \tag{3.19}$$

(3) The open set

$$\bigcup_{u^*(\lambda)\in\Omega}B(u^*(\lambda)) := \bigcup_{u^*(\lambda)\in\Omega}\left\{u \in \mathbb{R}^n : \|u - u^*(\lambda)\|_\infty < \frac{R}{\sum_{i=1}^{n}\beta_i/\min_{1\le i\le n}\beta_i}\right\} \tag{3.20}$$

is an attracting domain of $\Omega$, and $B(u^*(\lambda))$ is an attracting domain of $u^*(\lambda)$.
4. Illustrative Example
For convenience of illustration, we consider only a simple fuzzy neural network with time-varying and S-type distributed delays satisfying

$$\omega_{ij}(\theta, \lambda) = \begin{cases}\omega_{ij}(\lambda), & \theta = 0,\\ 0, & -\tau \le \theta < 0,\end{cases}\qquad \tau(\lambda) \equiv \tau. \tag{4.1}$$

Then the fuzzy neural network with two neurons can be modeled by

$$\begin{aligned}
\dot{u}_i(t) ={}& -c_i(\lambda)u_i(t) + g_i\left(\sum_{j=1}^{2}u_j(t)\omega_{ij}(\lambda) + I_i\right) + \sum_{j=1}^{2}a_{ij}(\lambda)f_j(u_j(t)) \\
& + \bigwedge_{j=1}^{2}\alpha_{ij}(\lambda)f_j\bigl(u_j(t-\tau_j(t))\bigr) + \bigvee_{j=1}^{2}\beta_{ij}(\lambda)f_j\bigl(u_j(t-\tau_j(t))\bigr), \quad i = 1, 2.
\end{aligned}\tag{4.2}$$

Take

$$\begin{gathered}
c_1(\lambda) = \tanh(4 - 2\sin\lambda), \quad c_2(\lambda) = \tanh(2.3 - \cos\lambda), \quad g_1(\xi) = g_2(\xi) = \tanh\xi, \quad \tau_j(t) = \frac{2}{\pi}\tau\arctan t, \quad j = 1, 2,\\
f_1(\xi) = f_2(\xi) = \sin(\pi\xi), \quad \omega_{11}(\lambda) = 4.02 - 2\sin\lambda, \quad \omega_{12}(\lambda) = 0.02, \quad \omega_{21}(\lambda) = 0.01, \quad \omega_{22}(\lambda) = 2.31 - \cos\lambda,\\
\Xi = \left[0, \frac{\pi}{2}\right], \quad I_1 = -0.02, \quad I_2 = -0.01, \quad a_{ij}(\lambda) = \alpha_{ij}(\lambda) = \beta_{ij}(\lambda) = \frac{-\sin\lambda}{100}, \quad i, j = 1, 2.
\end{gathered}\tag{4.3}$$
It is easy to check that (A) and (H1)–(H4) hold and $L_i = B_i = \mu_i = l_i = 1$ for $i = 1, 2$. We can check that

$$\sum_{i=1}^{2}\left\{\sup_{\lambda\in[0,\pi/2]}\omega_{i1}(\lambda)L_i + l_1\sup_{\lambda\in[0,\pi/2]}\left(|a_{i1}(\lambda)| + |\alpha_{i1}(\lambda)| + |\beta_{i1}(\lambda)|\right)\right\} \approx 4.1 > \inf_{\lambda\in[0,\pi/2]}c_1(\lambda) = \tanh 2. \tag{4.4}$$

From simple calculations, we know that $[-1.06/\tanh 2,\ 1.06/\tanh 2] \times [-1.06/\tanh 1.3,\ 1.06/\tanh 1.3]$ is an attracting domain of FNNs (4.2), and all equilibrium points of FNNs (4.2) lie in $[-1.06/\tanh 2,\ 1.06/\tanh 2] \times [-1.06/\tanh 1.3,\ 1.06/\tanh 1.3]$. From some calculations, we find two equilibrium points $O_1 = (1, 0)$ and $O_2 = (0, 1)$.

For the equilibrium points $O_1$ and $O_2$, we have $\kappa_1(\lambda) = 4 - 2\sin\lambda$ and $\kappa_2(\lambda) = 2.3 - \cos\lambda$, and $\sup_{\lambda\in[0,\pi/2]}|\dot{g}_1(\kappa_1(\lambda))| = 0.0680$, $\sup_{\lambda\in[0,\pi/2]}|\dot{g}_2(\kappa_2(\lambda))| = 0.0386$. Taking $\beta_1 = \beta_2 = 1$, we get

$$\sum_{j=1}^{2}\sup_{\lambda\in[0,\pi/2]}\left\{\widetilde{\omega}_{1j}(\lambda)\bigl|\dot{g}_j(\kappa_j(\lambda))\bigr| + l_j\left(|a_{1j}(\lambda)| + |\alpha_{1j}(\lambda)| + |\beta_{1j}(\lambda)|\right)\right\} < 0.04 < \tanh 2 = \inf_{\lambda\in[0,\pi/2]}c_1(\lambda),$$
$$\sum_{j=1}^{2}\sup_{\lambda\in[0,\pi/2]}\left\{\widetilde{\omega}_{2j}(\lambda)\bigl|\dot{g}_j(\kappa_j(\lambda))\bigr| + l_j\left(|a_{2j}(\lambda)| + |\alpha_{2j}(\lambda)| + |\beta_{2j}(\lambda)|\right)\right\} < 0.04 < \tanh 1.3 = \inf_{\lambda\in[0,\pi/2]}c_2(\lambda). \tag{4.5}$$

Similarly, we can check that (3.1) holds for $O_k$ ($k = 1, 2$). Therefore, from Theorem 3.1, the two equilibrium points $O_k$ ($k = 1, 2$) are locally robust stable and their convergence radius is 0.04.
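The quantities quoted above can be reproduced with a few lines of Python. The script below is our own check (using a grid to approximate $\Xi$); it evaluates the infima of $c_i(\lambda)$, the half-widths of the attracting box from Lemma 2.5, and the residuals of the equilibrium equation (2.2) at $O_1 = (1, 0)$ and $O_2 = (0, 1)$, where the fuzzy terms vanish because $f_j(k) = \sin(\pi k) = 0$ at integers.

```python
import numpy as np

# Parameters of the two-neuron example (4.2)-(4.3).
lams = np.linspace(0.0, np.pi / 2, 2001)      # grid approximating Xi = [0, pi/2]
c1 = np.tanh(4 - 2 * np.sin(lams))
c2 = np.tanh(2.3 - np.cos(lams))
w11, w12 = 4.02 - 2 * np.sin(lams), 0.02
w21, w22 = 0.01, 2.31 - np.cos(lams)
I1, I2 = -0.02, -0.01

print("inf c_1 =", c1.min(), "  tanh 2   =", np.tanh(2))
print("inf c_2 =", c2.min(), "  tanh 1.3 =", np.tanh(1.3))

# Attracting box of Lemma 2.5: B_i = 1, M_i = 2 * max_j(0.03) = 0.06.
B_plus_M = 1.06
print("box half-widths:", B_plus_M / c1.min(), B_plus_M / c2.min())

# Residuals of the nonzero components of (2.2) at O1 = (1, 0) and O2 = (0, 1);
# the zero components hold since tanh(w21 - 0.01) = tanh(w12 - 0.02) = 0 and
# f_j vanishes at integers, so the a, alpha, beta terms drop out.
res_O1 = np.max(np.abs(c1 - np.tanh(w11 * 1 + w12 * 0 + I1)))
res_O2 = np.max(np.abs(c2 - np.tanh(w21 * 0 + w22 * 1 + I2)))
print("equilibrium residuals at O1, O2:", res_O1, res_O2)   # both ~ 0
```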
Remark 4.1. The above example shows that the system may have multiple equilibrium points under the relevant assumption of monotone nondecreasing activation functions; hence its trajectories do not globally converge to a unique equilibrium point.
5. Conclusions
In this paper, we derive some sufficient conditions for the local robust stability of fuzzy neural networks with time-varying and S-type distributed delays and give an estimate of the attracting domains of the equilibrium points other than the unstable ones. Our results not only show local robust stability of equilibrium points but also allow much broader application to fuzzy neural networks with or without delays. An example is given to show the effectiveness of our results.
Acknowledgment
This work was supported by the National Natural Science Foundation of China under Grant 10971183.
References
[1] J. J. Hopfield, “Neurons with graded response have collective computational properties like those of two-state neurons,” Proceedings of the National Academy of Sciences of the United States of America, vol. 81, no. 10, pp. 3088–3092, 1984.
[2] B. Kosko, “Bidirectional associative memories,” IEEE Transactions on Systems, Man, and Cybernetics, vol. 18, no. 1, pp. 49–60, 1988.
[3] L. O. Chua and L. Yang, “Cellular neural networks: theory,” IEEE Transactions on Circuits and Systems, vol. 35, no. 10, pp. 1257–1272, 1988.
[4] C.-Y. Cheng, K.-H. Lin, and C.-W. Shih, “Multistability in recurrent neural networks,” SIAM Journal on Applied Mathematics, vol. 66, no. 4, pp. 1301–1320, 2006.
[5] X. F. Yang, X. F. Liao, Y. Tang, and D. J. Evans, “Guaranteed attractivity of equilibrium points in a class of delayed neural networks,” International Journal of Bifurcation and Chaos in Applied Sciences and Engineering, vol. 16, no. 9, pp. 2737–2743, 2006.
[6] Y. M. Chen, “Global stability of neural networks with distributed delays,” Neural Networks, vol. 15, no. 7, pp. 867–871, 2002.
[7] H. Zhao, “Global asymptotic stability of Hopfield neural network involving distributed delays,” Neural Networks, vol. 17, no. 1, pp. 47–53, 2004.
[8] J. Cao and J. Wang, “Global asymptotic and robust stability of recurrent neural networks with time delays,” IEEE Transactions on Circuits and Systems I, vol. 52, no. 2, pp. 417–426, 2005.
[9] S. Mohamad, “Global exponential stability in DCNNs with distributed delays and unbounded activations,” Journal of Computational and Applied Mathematics, vol. 205, no. 1, pp. 161–173, 2007.
[10] S. Mohamad, K. Gopalsamy, and H. Akça, “Exponential stability of artificial neural networks with distributed delays and large impulses,” Nonlinear Analysis: Real World Applications, vol. 9, no. 3, pp. 872–888, 2008.
[11] Z. K. Huang, Y. H. Xia, and X. H. Wang, “The existence and exponential attractivity of κ-almost periodic sequence solution of discrete time neural networks,” Nonlinear Dynamics, vol. 50, no. 1-2, pp. 13–26, 2007.
[12] Z. K. Huang, X. H. Wang, and F. Gao, “The existence and global attractivity of almost periodic sequence solution of discrete-time neural networks,” Physics Letters A, vol. 350, no. 3-4, pp. 182–191, 2006.
[13] L. B. Almeida, “Backpropagation in perceptrons with feedback,” in Neural Computers, pp. 199–208, Springer, New York, NY, USA, 1988.
[14] F. J. Pineda, “Generalization of back-propagation to recurrent neural networks,” Physical Review Letters, vol. 59, no. 19, pp. 2229–2232, 1987.
[15] R. Rohwer and B. Forrest, “Training time-dependence in neural networks,” in Proceedings of the 1st IEEE International Conference on Neural Networks, pp. 701–708, San Diego, Calif, USA, 1987.
[16] M. Forti and A. Tesi, “New conditions for global stability of neural networks with application to linear and quadratic programming problems,” IEEE Transactions on Circuits and Systems I, vol. 42, no. 7, pp. 354–366, 1995.
[17] Y. S. Xia and J. Wang, “A general methodology for designing globally convergent optimization neural networks,” IEEE Transactions on Neural Networks, vol. 9, no. 6, pp. 1331–1343, 1998.
[18] Y. S. Xia and J. Wang, “On the stability of globally projected dynamical systems,” Journal of Optimization Theory and Applications, vol. 106, no. 1, pp. 129–150, 2000.
[19] J.-H. Li, A. N. Michel, and W. Porod, “Analysis and synthesis of a class of neural networks: linear systems operating on a closed hypercube,” IEEE Transactions on Circuits and Systems, vol. 36, no. 11, pp. 1405–1422, 1989.
[20] I. Varga, G. Elek, and S. H. Zak, “On the brain-state-in-a-convex-domain neural models,” Neural Networks, vol. 9, no. 7, pp. 1173–1184, 1996.
[21] H. Qiao, J. Peng, Z.-B. Xu, and B. Zhang, “A reference model approach to stability analysis of neural networks,” IEEE Transactions on Systems, Man, and Cybernetics B, vol. 33, no. 6, pp. 925–936, 2003.
[22] Z.-B. Xu, H. Qiao, J. Peng, and B. Zhang, “A comparative study of two modeling approaches in neural networks,” Neural Networks, vol. 17, no. 1, pp. 73–85, 2004.
[23] M. Wang and L. Wang, “Global asymptotic robust stability of static neural network models with S-type distributed delays,” Mathematical and Computer Modelling, vol. 44, no. 1-2, pp. 218–222, 2006.
[24] P. Li and J. D. Cao, “Stability in static delayed neural networks: a nonlinear measure approach,” Neurocomputing, vol. 69, no. 13–15, pp. 1776–1781, 2006.
[25] Z. K. Huang and Y. H. Xia, “Exponential p-stability of second order Cohen-Grossberg neural networks with transmission delays and learning behavior,” Simulation Modelling Practice and Theory, vol. 15, no. 6, pp. 622–634, 2007.
[26] T. Yang and L.-B. Yang, “The global stability of fuzzy cellular neural network,” IEEE Transactions on Circuits and Systems I, vol. 43, no. 10, pp. 880–883, 1996.