EXISTENCE AND GLOBAL STABILITY OF PERIODIC
SOLUTION FOR DELAYED DISCRETE HIGH-ORDER
HOPFIELD-TYPE NEURAL NETWORKS
HONG XIANG, KE-MING YAN, AND BAI-YAN WANG
Received 6 February 2005
By using coincidence degree theory together with a priori estimates and a Lyapunov functional, we study the existence and global stability of periodic solutions for discrete delayed high-order Hopfield-type neural networks. We obtain some easily verifiable sufficient conditions ensuring that there exists a unique periodic solution and that all other solutions converge to it.
1. Introduction
It is well known that studies of neural dynamical systems involve not only stability properties but also other dynamical behaviors such as periodic oscillation, bifurcation, and chaos. In many applications, periodic oscillatory solutions are of great interest. For example, the human brain can operate in a periodic oscillatory or chaotic state, so it is of prime importance to study periodic oscillation and chaos phenomena in neural networks. Recently, Liu and Liao [8] and Zhou and Liu [15] considered the existence and global exponential stability of periodic solutions of delayed Hopfield neural networks and delayed cellular neural networks, respectively. Liu et al. [7] addressed the existence and global exponential stability of periodic solutions of delayed BAM neural networks.
Since high-order neural networks have stronger approximation properties, faster convergence rates, greater storage capacity, and higher fault tolerance than lower-order neural networks, they have attracted considerable attention (see, e.g., [1, 2, 4, 5, 10, 11, 13, 14]). In our previous paper [12], we studied the global exponential stability and existence of periodic solutions of the following high-order Hopfield-type neural networks:
$$
\begin{aligned}
\frac{dx_i(t)}{dt} = {} & -a_i(t)x_i(t) + \sum_{j=1}^{m} b_{ij}(t)\, f_j\bigl(x_j(t)\bigr) + \sum_{j=1}^{m} \bar b_{ij}(t)\, f_j\bigl(x_j(t-\tau_{ij}(t))\bigr) \\
& + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(t)\, f_j\bigl(x_j(t)\bigr) f_l\bigl(x_l(t)\bigr) \\
& + \sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}(t)\, f_j\bigl(x_j(t-\sigma_{ijl}(t))\bigr) f_l\bigl(x_l(t-\sigma_{ijl}(t))\bigr) + I_i(t),
\end{aligned}
\tag{1.1}
$$

Copyright © 2005 Hindawi Publishing Corporation
Discrete Dynamics in Nature and Society 2005:3 (2005) 281–297
DOI: 10.1155/DDNS.2005.281
where i = 1,2,...,m, t > 0, and x_i(t) denotes the potential (or voltage) of cell i at time t. The a_i(t) are positive ω-periodic functions denoting the rate at which cell i resets its potential to the resting state when isolated from the other cells and inputs; b_{ij}(t), \bar b_{ij}(t) and e_{ijl}(t), \bar e_{ijl}(t) are the first- and second-order connection weights of the neural network, respectively; and I_i(t) denotes the external input introduced from outside the network to cell i.
In [6], Li investigated global stability and existence of periodic solutions of discrete delayed cellular neural networks. However, few authors have studied the dynamical behavior of discrete-time analogues of delayed high-order Hopfield-type neural networks with variable coefficients. In this paper, we are concerned with the following discrete analogue of (1.1):
$$
\begin{aligned}
x_i(n+1) = {} & e^{-a_i(n)h} x_i(n) + \theta_i(h)\sum_{j=1}^{m} b_{ij}(n)\, f_j\bigl(x_j(n)\bigr) + \theta_i(h)\sum_{j=1}^{m} \bar b_{ij}(n)\, f_j\bigl(x_j(n-\tau_{ij}(n))\bigr) \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(n)\, f_j\bigl(x_j(n)\bigr) f_l\bigl(x_l(n)\bigr) \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}(n)\, f_j\bigl(x_j(n-\sigma_{ijl}(n))\bigr) f_l\bigl(x_l(n-\sigma_{ijl}(n))\bigr) \\
& + \theta_i(h) I_i(n), \qquad i = 1,2,\dots,m,
\end{aligned}
\tag{1.2}
$$
in which θ_i, a_i, b_{ij}, \bar b_{ij}, e_{ijl}, \bar e_{ijl}, i, j, l = 1,2,...,m, will be specified in the next section.
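To fix ideas, the iteration (1.2) can be simulated directly. The following sketch uses a two-neuron instance with tanh activations; all weights, rates, delays, and the periodic input below are invented for illustration and are not parameters from the paper.

```python
import math

m = 2            # number of neurons (toy choice)
h = 0.1          # discretization step size
omega = 20       # common period of the coefficients

def a(i, n):     # a_i(n) > 0, omega-periodic (hypothetical rates)
    return 1.0 + 0.1 * i + 0.2 * math.cos(2 * math.pi * n / omega)

def theta(i, n): # theta_i(h) = (1 - e^{-a_i(n) h}) / a_i(n), cf. (2.5)
    ai = a(i, n)
    return (1.0 - math.exp(-ai * h)) / ai

b  = [[0.3, -0.2], [0.1, 0.4]]    # first-order weights b_ij
bb = [[0.1, 0.05], [-0.1, 0.2]]   # delayed first-order weights bar b_ij
e  = 0.02                          # all second-order weights set to one constant
tau = 3                            # common constant delay
f = math.tanh                      # activation: Lipschitz and bounded

def step(hist, n):
    """One iteration of (1.2); hist[k] is the state vector at time k."""
    x, xd = hist[n], hist[n - tau]
    new = []
    for i in range(m):
        s = math.exp(-a(i, n) * h) * x[i]
        th = theta(i, n)
        s += th * sum(b[i][j] * f(x[j]) for j in range(m))
        s += th * sum(bb[i][j] * f(xd[j]) for j in range(m))
        s += th * sum(e * f(x[j]) * f(x[l]) for j in range(m) for l in range(m))
        s += th * sum(e * f(xd[j]) * f(xd[l]) for j in range(m) for l in range(m))
        s += th * math.sin(2 * math.pi * n / omega)  # omega-periodic input I_i(n)
        new.append(s)
    return new

hist = [[0.5, -0.5]] * (tau + 1)  # constant initial sequence on [-tau, 0]
for n in range(tau, tau + 400):
    hist.append(step(hist, n))
```

With decay rates of this size the trajectory remains bounded, consistent with the a priori estimate derived in Section 3.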
With the help of Mawhin's continuation theorem of coincidence degree theory and by constructing a Lyapunov functional, we obtain sufficient conditions ensuring that the discrete network (1.2) has a unique periodic solution and that all other solutions converge to it. To the best of our knowledge, this is the first study of the existence and global attractivity of periodic solutions for discrete-time analogues of delayed high-order Hopfield-type neural networks with variable coefficients.
The outline of this paper is as follows. In Section 2, following the semi-discretization technique [6, 9], we obtain a discrete-time analogue of (1.1). In Section 3, with the help of Mawhin's continuation theorem of coincidence degree theory, we study the existence of periodic solutions of (1.2). In Section 4, by constructing a Lyapunov functional, we derive sufficient conditions ensuring that the periodic solution of (1.2) is globally asymptotically stable.
2. Discrete-time analogues
There is no unique way of deriving a discrete-time version of a dynamical equation from its continuous-time formulation. First, following [6, 9], we reformulate system (1.1) by an approximation of the form
$$
\begin{aligned}
\frac{dx_i(t)}{dt} = {} & -a_i\Bigl(\Bigl[\tfrac{t}{h}\Bigr]h\Bigr)x_i(t)
+ \sum_{j=1}^{m} b_{ij}\Bigl(\Bigl[\tfrac{t}{h}\Bigr]h\Bigr) f_j\Bigl(x_j\Bigl(\Bigl[\tfrac{t}{h}\Bigr]h\Bigr)\Bigr) \\
& + \sum_{j=1}^{m} \bar b_{ij}\Bigl(\Bigl[\tfrac{t}{h}\Bigr]h\Bigr) f_j\Bigl(x_j\Bigl(\Bigl(\Bigl[\tfrac{t}{h}\Bigr]-\tau_{ij}\Bigl(\Bigl[\tfrac{t}{h}\Bigr]h\Bigr)\Bigr)h\Bigr)\Bigr) \\
& + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}\Bigl(\Bigl[\tfrac{t}{h}\Bigr]h\Bigr) f_j\Bigl(x_j\Bigl(\Bigl[\tfrac{t}{h}\Bigr]h\Bigr)\Bigr) f_l\Bigl(x_l\Bigl(\Bigl[\tfrac{t}{h}\Bigr]h\Bigr)\Bigr) \\
& + \sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}\Bigl(\Bigl[\tfrac{t}{h}\Bigr]h\Bigr) f_j\Bigl(x_j\Bigl(\Bigl(\Bigl[\tfrac{t}{h}\Bigr]-\sigma_{ijl}\Bigl(\Bigl[\tfrac{t}{h}\Bigr]h\Bigr)\Bigr)h\Bigr)\Bigr) f_l\Bigl(x_l\Bigl(\Bigl(\Bigl[\tfrac{t}{h}\Bigr]-\sigma_{ijl}\Bigl(\Bigl[\tfrac{t}{h}\Bigr]h\Bigr)\Bigr)h\Bigr)\Bigr) \\
& + I_i\Bigl(\Bigl[\tfrac{t}{h}\Bigr]h\Bigr),
\end{aligned}
\tag{2.1}
$$
where h is a positive number denoting a uniform discretization step size and [t/h] denotes the greatest integer in t/h. For convenience, we write [t/h] = n, n ∈ Z_0^+, and abbreviate a_i(nh) = a_i(n), b_{ij}(nh) = b_{ij}(n), \bar b_{ij}(nh) = \bar b_{ij}(n), e_{ijl}(nh) = e_{ijl}(n), \bar e_{ijl}(nh) = \bar e_{ijl}(n), τ_{ij}(nh) = τ_{ij}(n), σ_{ijl}(nh) = σ_{ijl}(n), x_i(nh) = x_i(n), I_i(nh) = I_i(n). Thus (2.1) takes the form
$$
\begin{aligned}
\frac{dx_i(t)}{dt} = {} & -a_i(n)x_i(t) + \sum_{j=1}^{m} b_{ij}(n)\, f_j\bigl(x_j(n)\bigr) + \sum_{j=1}^{m} \bar b_{ij}(n)\, f_j\bigl(x_j(n-\tau_{ij}(n))\bigr) \\
& + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(n)\, f_j\bigl(x_j(n)\bigr) f_l\bigl(x_l(n)\bigr) \\
& + \sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}(n)\, f_j\bigl(x_j(n-\sigma_{ijl}(n))\bigr) f_l\bigl(x_l(n-\sigma_{ijl}(n))\bigr) + I_i(n), \qquad n \in \mathbb{Z}_0^+.
\end{aligned}
\tag{2.2}
$$
Multiplying by the integrating factor e^{a_i(n)t} and integrating over the interval [nh, t] with t < (n + 1)h, we obtain
$$
\begin{aligned}
x_i(t)e^{a_i(n)t} - x_i(n)e^{a_i(n)nh}
= \frac{e^{a_i(n)t} - e^{a_i(n)nh}}{a_i(n)} \Biggl[ & \sum_{j=1}^{m} b_{ij}(n)\, f_j\bigl(x_j(n)\bigr) + \sum_{j=1}^{m} \bar b_{ij}(n)\, f_j\bigl(x_j(n-\tau_{ij}(n))\bigr) \\
& + \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(n)\, f_j\bigl(x_j(n)\bigr) f_l\bigl(x_l(n)\bigr) \\
& + \sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}(n)\, f_j\bigl(x_j(n-\sigma_{ijl}(n))\bigr) f_l\bigl(x_l(n-\sigma_{ijl}(n))\bigr) + I_i(n) \Biggr],
\end{aligned}
\tag{2.3}
$$

i = 1,2,...,m.
Letting t → (n + 1)h, we obtain

$$
\begin{aligned}
x_i(n+1) = {} & x_i(n)e^{-a_i(n)h} + \theta_i(h)\sum_{j=1}^{m} b_{ij}(n)\, f_j\bigl(x_j(n)\bigr) + \theta_i(h)\sum_{j=1}^{m} \bar b_{ij}(n)\, f_j\bigl(x_j(n-\tau_{ij}(n))\bigr) \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(n)\, f_j\bigl(x_j(n)\bigr) f_l\bigl(x_l(n)\bigr) \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}(n)\, f_j\bigl(x_j(n-\sigma_{ijl}(n))\bigr) f_l\bigl(x_l(n-\sigma_{ijl}(n))\bigr) \\
& + \theta_i(h) I_i(n), \qquad i = 1,2,\dots,m,\ n \in \mathbb{Z}_0^+,
\end{aligned}
\tag{2.4}
$$
where

$$
\theta_i(h) = \frac{1 - e^{-a_i(n)h}}{a_i(n)}, \qquad i = 1,2,\dots,m,\ n \in \mathbb{Z}_0^+.
\tag{2.5}
$$

It is not difficult to verify that θ_i(h) > 0 whenever a_i, h > 0, and that θ_i(h) = h + O(h²) for small h > 0.
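These two properties of θ_i(h) are easy to probe numerically; the sketch below uses an arbitrary constant rate a_i(n) = 1.3 (an invented value) and checks positivity together with the Taylor bound |θ_i(h) − h| ≤ a h²/2, which follows from x − x²/2 ≤ 1 − e^{−x} ≤ x.

```python
import math

a = 1.3  # hypothetical rate a_i(n)
for h in [1.0, 0.1, 0.01, 0.001]:
    theta = (1.0 - math.exp(-a * h)) / a   # cf. (2.5)
    assert theta > 0                        # positivity for a, h > 0
    assert abs(theta - h) <= a * h * h / 2  # theta_i(h) = h + O(h^2)
```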
Also, one can see that (1.2) converges towards (1.1) as h → 0⁺. The system (1.2) is supplemented with initial values given by

$$
x_i(s) = \varphi_i(s), \qquad s \in [-\tau^*, 0]_{\mathbb{Z}}, \qquad
\tau^* = \max_{1 \le i,j,l \le m}\Bigl\{ \max_{n \in \mathbb{Z}} \bigl(\tau_{ij}(n), \sigma_{ijl}(n)\bigr) \Bigr\}.
\tag{2.6}
$$
In this paper, we assume that:
(H1) a_i : Z → (0, ∞); b_{ij}, \bar b_{ij}, e_{ijl}, \bar e_{ijl}, I_i : Z → R; τ_{ij}, σ_{ijl} : Z → Z_0^+, i, j, l = 1,2,...,m; h ∈ (0, ∞).
(H2) The f_j are Lipschitzian with Lipschitz constants L_j > 0, that is,

$$
\bigl|f_j(x) - f_j(y)\bigr| \le L_j |x - y|
\tag{2.7}
$$

for any x, y ∈ R (j ∈ {1,...,m}).
(H3) There exist positive constants N_j > 0, j ∈ {1,...,m}, such that

$$
\bigl|f_j(x)\bigr| \le N_j, \qquad j \in \{1,\dots,m\}.
\tag{2.8}
$$
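A standard activation satisfying both hypotheses is f_j = tanh, with L_j = 1 in (H2) and N_j = 1 in (H3); the check below probes both conditions on a grid (the grid itself is just an illustration device).

```python
import math

f = math.tanh
pts = [x / 10.0 for x in range(-50, 51)]  # grid on [-5, 5]
for x in pts:
    assert abs(f(x)) <= 1.0               # (H3) with N_j = 1
    for y in pts:
        # (H2) with L_j = 1; small slack for floating-point rounding
        assert abs(f(x) - f(y)) <= abs(x - y) + 1e-12
```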
For convenience, we introduce the notation

$$
I_\omega = \{0,1,\dots,\omega-1\}, \qquad \bar u = \frac{1}{\omega}\sum_{k=0}^{\omega-1} u(k),
\tag{2.9}
$$

where u(k) is an ω-periodic sequence of real numbers defined for k ∈ Z, and

$$
\begin{gathered}
a_i = \min_{n \in I_\omega} a_i(n), \qquad I_i^M = \max_{n \in I_\omega} \bigl|I_i(n)\bigr|, \qquad i = 1,2,\dots,m, \qquad
M = \max_{1 \le j \le m} \sup_{u \in \mathbb{R}} \bigl|f_j(u)\bigr|, \\
b_{ij}^M = \max_{n \in I_\omega} \bigl|b_{ij}(n)\bigr|, \qquad
\bar b_{ij}^M = \max_{n \in I_\omega} \bigl|\bar b_{ij}(n)\bigr|, \qquad
e_{ijl}^M = \max_{n \in I_\omega} \bigl|e_{ijl}(n)\bigr|, \qquad
\bar e_{ijl}^M = \max_{n \in I_\omega} \bigl|\bar e_{ijl}(n)\bigr|
\end{gathered}
\tag{2.10}
$$

(in what follows, a_i without an argument always denotes this minimum).
3. Existence of periodic solution
In this section, based on Mawhin's continuation theorem, we study the existence of periodic solutions of the discrete-time high-order Hopfield-type neural networks (1.2). First, we make some preparations.
Let X and Z be two Banach spaces. Consider an operator equation

$$
Lx = \lambda Nx, \qquad \lambda \in (0,1),
\tag{3.1}
$$

where L : Dom L ∩ X → Z is a linear operator and λ is a parameter. Let P and Q denote two projectors such that

$$
P : X \cap \operatorname{Dom} L \longrightarrow \operatorname{Ker} L, \qquad
Q : Z \longrightarrow Z / \operatorname{Im} L.
\tag{3.2}
$$

Let H : Im Q → Ker L be an isomorphism of Im Q onto Ker L. In the sequel, we will use the following result of Mawhin [3, page 40].
Lemma 3.1. Let X and Z be two Banach spaces and L a Fredholm mapping of index zero. Assume that N : \bar Ω → Z is L-compact on \bar Ω with Ω open bounded in X. Furthermore assume:
(a) for each λ ∈ (0,1) and x ∈ ∂Ω ∩ Dom L,

$$
Lx \ne \lambda Nx;
\tag{3.3}
$$

(b) for each x ∈ ∂Ω ∩ Ker L,

$$
QNx \ne 0, \qquad \deg\{HQNx,\ \Omega \cap \operatorname{Ker} L,\ 0\} \ne 0.
\tag{3.4}
$$

Then the equation Lx = Nx has at least one solution in Dom L ∩ \bar Ω.
Recall that a linear mapping L : Dom L ⊂ X → Z with Ker L = L⁻¹(0) and Im L = L(Dom L) is called a Fredholm mapping if the following two conditions hold:
(i) Ker L has finite dimension;
(ii) Im L is closed and has finite codimension.
Recall also that the codimension of Im L is the dimension of Z/Im L, that is, the dimension of the cokernel coker L of L. When L is a Fredholm mapping, its index is the integer Ind L = dim Ker L − codim Im L. We say that a mapping N is L-compact on \bar Ω if the mapping QN : \bar Ω → Z is continuous, QN(\bar Ω) is bounded, and K_p(I − Q)N : \bar Ω → X is compact, that is, continuous with K_p(I − Q)N(\bar Ω) relatively compact, where K_p : Im L → Dom L ∩ Ker P is the inverse of the restriction L_p of L to Dom L ∩ Ker P, so that LK_p = I and K_p L = I − P.
Next, we will state and prove the existence of periodic solutions of system (1.2).
Theorem 3.2. Assume that conditions (H1), (H2), and (H3) are satisfied. Furthermore, assume that
(H4) a_i, b_{ij}, \bar b_{ij}, e_{ijl}, \bar e_{ijl}, I_i, i, j, l = 1,2,...,m, are all ω-periodic functions.
Then system (1.2) has at least one ω-periodic solution.
Proof. As in [6], we define

$$
l^m = \bigl\{ x = \{x(k)\} : x(k) \in \mathbb{R}^m,\ k \in \mathbb{Z} \bigr\}.
\tag{3.5}
$$

Let l^ω ⊂ l^m denote the subspace of all ω-periodic sequences equipped with the norm

$$
\|x\| = \sum_{i=1}^{m} \max_{k \in I_\omega} \bigl|x_i(k)\bigr|
\quad \text{for any } x = \bigl\{\bigl(x_1(k),\dots,x_m(k)\bigr)^T : k \in \mathbb{Z}\bigr\} \in l^\omega.
\tag{3.6}
$$

It is not difficult to show that l^ω is a finite-dimensional Banach space. Let

$$
l_0^\omega = \Bigl\{ x \in l^\omega : \sum_{k=0}^{\omega-1} x(k) = 0 \Bigr\}, \qquad
l_c^\omega = \bigl\{ x \in l^\omega : x(k) = c \in \mathbb{R}^m,\ k \in \mathbb{Z} \bigr\};
\tag{3.7}
$$

then it follows that l_0^ω and l_c^ω are both closed linear subspaces of l^ω and

$$
l^\omega = l_0^\omega \oplus l_c^\omega, \qquad \dim l_c^\omega = m.
\tag{3.8}
$$
In order to apply Lemma 3.1 to system (1.2), we take X = Y = l^ω, (Lx)(k) = x(k + 1) − x(k), and define N componentwise by

$$
\begin{aligned}
(Nx)_i(n) = {} & x_i(n)\bigl(e^{-a_i(n)h} - 1\bigr) + \theta_i(h)\sum_{j=1}^{m} b_{ij}(n)\, f_j\bigl(x_j(n)\bigr)
+ \theta_i(h)\sum_{j=1}^{m} \bar b_{ij}(n)\, f_j\bigl(x_j(n-\tau_{ij}(n))\bigr) \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(n)\, f_j\bigl(x_j(n)\bigr) f_l\bigl(x_l(n)\bigr) \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}(n)\, f_j\bigl(x_j(n-\sigma_{ijl}(n))\bigr) f_l\bigl(x_l(n-\sigma_{ijl}(n))\bigr)
+ \theta_i(h) I_i(n),
\end{aligned}
\tag{3.9}
$$

i = 1,2,...,m.
It is easy to see that L is a bounded linear operator with

$$
\operatorname{Ker} L = l_c^\omega, \qquad \operatorname{Im} L = l_0^\omega,
\tag{3.10}
$$

as well as

$$
\dim \operatorname{Ker} L = m = \operatorname{codim} \operatorname{Im} L;
\tag{3.11}
$$

then it follows that L is a Fredholm mapping of index zero.
Define

$$
Px = \frac{1}{\omega}\sum_{s=0}^{\omega-1} x(s), \quad x \in X, \qquad
Qz = \frac{1}{\omega}\sum_{s=0}^{\omega-1} z(s), \quad z \in Y.
\tag{3.12}
$$

It is not difficult to show that P and Q are continuous projectors such that

$$
\operatorname{Im} P = \operatorname{Ker} L, \qquad
\operatorname{Im} L = \operatorname{Ker} Q = \operatorname{Im}(I - Q).
\tag{3.13}
$$
Furthermore, the generalized inverse (of L) K_p : Im L → Ker P ∩ Dom L has the form

$$
K_p(z)(k) = \sum_{s=0}^{k-1} z(s) - \frac{1}{\omega}\sum_{s=0}^{\omega-1} (\omega - s)\, z(s).
\tag{3.14}
$$
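The two defining properties of K_p can be verified numerically: for z ∈ Im L = l_0^ω (zero mean over a period), x = K_p z satisfies (Lx)(k) = x(k+1) − x(k) = z(k) and Px = 0. The scalar check below uses ω = 7 and random data, both arbitrary choices for the sketch.

```python
import random

random.seed(0)
omega = 7
z = [random.uniform(-1, 1) for _ in range(omega)]
mean = sum(z) / omega
z = [v - mean for v in z]        # project z onto Im L = l_0^omega

def Kp(z, k):
    # formula (3.14), scalar case
    return sum(z[s] for s in range(k)) - sum((omega - s) * z[s] for s in range(omega)) / omega

x = [Kp(z, k) for k in range(omega)]
assert abs(sum(x)) < 1e-12                          # Px = 0, i.e. x in Ker P
for k in range(omega - 1):
    assert abs((x[k + 1] - x[k]) - z[k]) < 1e-12    # L K_p = I on Im L
```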
Clearly, QN and K_p(I − Q)N are continuous. Since X is a finite-dimensional Banach space, it follows from the Arzelà-Ascoli theorem that QN(\bar Ω) and K_p(I − Q)N(\bar Ω) are relatively compact for any open bounded set Ω ⊂ X. Hence N is L-compact on \bar Ω for any open bounded set Ω ⊂ X. We are now in a position to search for an appropriate open bounded subset Ω for the application of Lemma 3.1. Corresponding to the operator equation Lx = λNx, λ ∈ (0,1), we have
$$
\begin{aligned}
x_i(n+1) - x_i(n) = \lambda \Biggl[ & x_i(n)\bigl(e^{-a_i(n)h} - 1\bigr) + \theta_i(h)\sum_{j=1}^{m} b_{ij}(n)\, f_j\bigl(x_j(n)\bigr)
+ \theta_i(h)\sum_{j=1}^{m} \bar b_{ij}(n)\, f_j\bigl(x_j(n-\tau_{ij}(n))\bigr) \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(n)\, f_j\bigl(x_j(n)\bigr) f_l\bigl(x_l(n)\bigr) \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}(n)\, f_j\bigl(x_j(n-\sigma_{ijl}(n))\bigr) f_l\bigl(x_l(n-\sigma_{ijl}(n))\bigr)
+ \theta_i(h) I_i(n) \Biggr],
\end{aligned}
\tag{3.15}
$$

i = 1,2,...,m.
Assume that x(n) = (x_1(n),...,x_m(n))^T ∈ X is a solution of system (3.15) for some λ ∈ (0,1). From (3.15), the periodicity of x, and hypotheses (H2)-(H3), we obtain

$$
\begin{aligned}
\max_{n \in I_\omega} \bigl|x_i(n)\bigr| = \max_{n \in I_\omega} \bigl|x_i(n+1)\bigr|
\le {} & \bigl[1 + \lambda\bigl(e^{-a_i(n)h} - 1\bigr)\bigr] \max_{n \in I_\omega} \bigl|x_i(n)\bigr| \\
& + \lambda\theta_i(h)\bigl( m b^M M + m \bar b^M M + m^2 e^M M^2 + m^2 \bar e^M M^2 + I^M \bigr),
\end{aligned}
\tag{3.16}
$$

where b^M = max_{i,j} b_{ij}^M, \bar b^M = max_{i,j} \bar b_{ij}^M, e^M = max_{i,j,l} e_{ijl}^M, \bar e^M = max_{i,j,l} \bar e_{ijl}^M, and I^M = max_i I_i^M. Hence

$$
\bigl(1 - e^{-a_i(n)h}\bigr) \max_{n \in I_\omega} \bigl|x_i(n)\bigr|
\le \theta_i(h)\bigl( m b^M M + m \bar b^M M + m^2 e^M M^2 + m^2 \bar e^M M^2 + I^M \bigr),
\tag{3.17}
$$

that is,

$$
\max_{n \in I_\omega} \bigl|x_i(n)\bigr|
\le \frac{\theta_i(h)\bigl( m b^M M + m \bar b^M M + m^2 e^M M^2 + m^2 \bar e^M M^2 + I^M \bigr)}{1 - e^{-a_i(n)h}}
:= A_i, \qquad i = 1,2,\dots,m.
\tag{3.18}
$$

Denote A = Σ_{i=1}^m A_i + E, where E > 0 is a constant taken large enough that every solution x of (3.15) satisfies ‖x‖ < A; clearly, A is independent of λ. Now take Ω = {x ∈ X : ‖x‖ < A}. It is clear that Ω satisfies requirement (a) of Lemma 3.1.
When x ∈ ∂Ω ∩ Ker L, x is a constant vector in R^m with ‖x‖ = A. Take H : Im Q → Ker L, r ↦ r. Enlarging A if necessary, we can guarantee that

$$
\begin{aligned}
(x_1,\dots,x_m)\, HQNx
= \sum_{i=1}^{m} \Biggl\{ & x_i^2\, \frac{1}{\omega}\sum_{s=0}^{\omega-1}\bigl(e^{-a_i(s)h} - 1\bigr)
+ \theta_i(h)\, x_i \sum_{j=1}^{m} \Bigl(\overline{b_{ij}} + \overline{\bar b_{ij}}\Bigr) f_j(x_j) \\
& + \theta_i(h)\, x_i \sum_{j=1}^{m}\sum_{l=1}^{m} \Bigl(\overline{e_{ijl}} + \overline{\bar e_{ijl}}\Bigr) f_j(x_j)\, f_l(x_l)
+ \theta_i(h)\, \overline{I_i}\, x_i \Biggr\} < 0
\end{aligned}
\tag{3.19}
$$

(the overlines denote the means defined in (2.9); since x is constant, the delays disappear, and the quadratic term dominates for large ‖x‖ = A because e^{-a_i(s)h} − 1 < 0 while the remaining terms grow at most linearly in ‖x‖). Hence QNx ≠ 0 for any x ∈ ∂Ω ∩ Ker L. Furthermore, let Ψ(r; x) = −rx + (1 − r)HQNx; then x^TΨ(r; x) < 0 for any x ∈ ∂Ω ∩ Ker L and r ∈ [0,1], so Ψ has no zeros on ∂Ω ∩ Ker L, and by the homotopy invariance of the degree,

$$
\deg\{HQNx,\ \Omega \cap \operatorname{Ker} L,\ 0\} = \deg\{-x,\ \Omega \cap \operatorname{Ker} L,\ 0\} \ne 0.
\tag{3.20}
$$

Condition (b) of Lemma 3.1 is also satisfied. We have now proved that Ω satisfies all the requirements of Lemma 3.1. Hence, system (1.2) has at least one ω-periodic solution. The proof is complete. □
4. Global stability of the periodic solution
In this section, we obtain sufficient conditions for the global asymptotic stability and global exponential stability of the periodic solution of the discrete high-order Hopfield-type networks (1.2).
Theorem 4.1. Assume that conditions (H1), (H2), (H3), and (H4) are satisfied. Furthermore, assume that τ_{ij}(n) ≡ τ_{ij} ∈ Z^+, σ_{ijl}(n) ≡ σ_{ijl} ∈ Z^+, and
(H5) there exist positive real numbers α_i such that

$$
\lambda_i = \alpha_i\bigl(1 - e^{-a_i h}\bigr)
- L_i \sum_{j=1}^{m} \alpha_j \theta_j(h)\bigl(b_{ji}^M + \bar b_{ji}^M\bigr)
- L_i \sum_{j=1}^{m}\sum_{l=1}^{m} \alpha_j \theta_j(h)\bigl(e_{jil}^M + e_{jli}^M + \bar e_{jil}^M + \bar e_{jli}^M\bigr) M > 0,
\qquad i = 1,2,\dots,m.
\tag{4.1}
$$

Then the ω-periodic solution of (1.2) is unique and all other solutions of (1.2) converge to it.
Proof. According to Theorem 3.2, system (1.2) has an ω-periodic solution x*(n) = (x_1*(n), x_2*(n),...,x_m*(n))^T. Obviously, if this periodic solution is globally attractive, then it is unique. Let x(n) = (x_1(n), x_2(n),...,x_m(n))^T be an arbitrary solution of (1.2). Then

$$
\begin{aligned}
x_i(n+1) - x_i^*(n+1) = {} & e^{-a_i(n)h}\bigl(x_i(n) - x_i^*(n)\bigr)
+ \theta_i(h)\sum_{j=1}^{m} b_{ij}(n)\bigl[f_j\bigl(x_j(n)\bigr) - f_j\bigl(x_j^*(n)\bigr)\bigr] \\
& + \theta_i(h)\sum_{j=1}^{m} \bar b_{ij}(n)\bigl[f_j\bigl(x_j(n-\tau_{ij})\bigr) - f_j\bigl(x_j^*(n-\tau_{ij})\bigr)\bigr] \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}(n)\bigl[f_j\bigl(x_j(n)\bigr) f_l\bigl(x_l(n)\bigr) - f_j\bigl(x_j^*(n)\bigr) f_l\bigl(x_l^*(n)\bigr)\bigr] \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}(n)\bigl[f_j\bigl(x_j(n-\sigma_{ijl})\bigr) f_l\bigl(x_l(n-\sigma_{ijl})\bigr)
- f_j\bigl(x_j^*(n-\sigma_{ijl})\bigr) f_l\bigl(x_l^*(n-\sigma_{ijl})\bigr)\bigr],
\end{aligned}
\tag{4.2}
$$

i = 1,2,...,m.
Hence, using (H2), (H3), and the decomposition f_j(u)f_l(v) − f_j(u*)f_l(v*) = f_j(u)[f_l(v) − f_l(v*)] + f_l(v*)[f_j(u) − f_j(u*)],

$$
\begin{aligned}
\bigl|x_i(n+1) - x_i^*(n+1)\bigr|
\le {} & e^{-a_i h}\bigl|x_i(n) - x_i^*(n)\bigr|
+ \theta_i(h)\sum_{j=1}^{m} b_{ij}^M L_j \bigl|x_j(n) - x_j^*(n)\bigr| \\
& + \theta_i(h)\sum_{j=1}^{m} \bar b_{ij}^M L_j \bigl|x_j(n-\tau_{ij}) - x_j^*(n-\tau_{ij})\bigr| \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^M M \bigl( L_j\bigl|x_j(n) - x_j^*(n)\bigr| + L_l\bigl|x_l(n) - x_l^*(n)\bigr| \bigr) \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}^M M \bigl( L_j\bigl|x_j(n-\sigma_{ijl}) - x_j^*(n-\sigma_{ijl})\bigr|
+ L_l\bigl|x_l(n-\sigma_{ijl}) - x_l^*(n-\sigma_{ijl})\bigr| \bigr),
\end{aligned}
\tag{4.3}
$$

i = 1,2,...,m.
Define a Lyapunov functional V(·) by

$$
\begin{aligned}
V(n) = \sum_{i=1}^{m} \alpha_i \Biggl\{ & \bigl|x_i(n) - x_i^*(n)\bigr|
+ \theta_i(h)\sum_{j=1}^{m} \bar b_{ij}^M L_j \sum_{k=n}^{n+\tau_{ij}-1} \bigl|x_j(k-\tau_{ij}) - x_j^*(k-\tau_{ij})\bigr| \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}^M M \Biggl[ L_j \sum_{k=n}^{n+\sigma_{ijl}-1} \bigl|x_j(k-\sigma_{ijl}) - x_j^*(k-\sigma_{ijl})\bigr|
+ L_l \sum_{k=n}^{n+\sigma_{ijl}-1} \bigl|x_l(k-\sigma_{ijl}) - x_l^*(k-\sigma_{ijl})\bigr| \Biggr] \Biggr\}.
\end{aligned}
\tag{4.4}
$$
Then, calculating ∆V(n) = V(n + 1) − V(n) along (4.3), we obtain

$$
\begin{aligned}
\Delta V(n) \le {} & \sum_{i=1}^{m} \alpha_i \Biggl\{ \bigl(e^{-a_i h} - 1\bigr)\bigl|x_i(n) - x_i^*(n)\bigr|
+ \theta_i(h)\sum_{j=1}^{m} \bigl(b_{ij}^M + \bar b_{ij}^M\bigr) L_j \bigl|x_j(n) - x_j^*(n)\bigr| \\
& \qquad\quad + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} \bigl(e_{ijl}^M + \bar e_{ijl}^M\bigr) M
\bigl( L_j\bigl|x_j(n) - x_j^*(n)\bigr| + L_l\bigl|x_l(n) - x_l^*(n)\bigr| \bigr) \Biggr\} \\
\le {} & -\sum_{i=1}^{m} \Biggl[ \alpha_i\bigl(1 - e^{-a_i h}\bigr)
- L_i \sum_{j=1}^{m} \alpha_j \theta_j(h)\bigl(b_{ji}^M + \bar b_{ji}^M\bigr)
- L_i \sum_{j=1}^{m}\sum_{l=1}^{m} \alpha_j \theta_j(h)\bigl(e_{jil}^M + e_{jli}^M + \bar e_{jil}^M + \bar e_{jli}^M\bigr) M \Biggr] \bigl|x_i(n) - x_i^*(n)\bigr| \\
= {} & -\sum_{i=1}^{m} \lambda_i \bigl|x_i(n) - x_i^*(n)\bigr| \le 0.
\end{aligned}
\tag{4.5}
$$
Summing both sides of (4.5) from 0 to n − 1, we get

$$
V(n) + \sum_{k=0}^{n-1}\sum_{i=1}^{m} \lambda_i \bigl|x_i(k) - x_i^*(k)\bigr| \le V(0),
\tag{4.6}
$$

which yields

$$
\sum_{k=0}^{\infty}\sum_{i=1}^{m} \lambda_i \bigl|x_i(k) - x_i^*(k)\bigr| \le V(0) < \infty.
\tag{4.7}
$$

Therefore,

$$
\lim_{n \to \infty} \bigl|x_i(n) - x_i^*(n)\bigr| = 0,
\tag{4.8}
$$

and we conclude that the ω-periodic solution of (1.2) is globally attractive. This completes the proof of the theorem. □
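The attractivity asserted by Theorem 4.1 can be illustrated numerically: two solutions of a toy instance of (1.2), started from different initial sequences, are iterated under the same periodic input. All parameters below (rates, weights, delays) are invented for the sketch and merely chosen small against the decay rates, in the spirit of (H5).

```python
import math

m, h, tau = 2, 0.1, 2
a = [1.0, 1.2]                                    # constant rates a_i
theta = [(1 - math.exp(-ai * h)) / ai for ai in a]
b  = [[0.2, -0.1], [0.1, 0.2]]
bb = [[0.05, 0.05], [0.05, 0.05]]
e  = 0.01
f = math.tanh

def step(hist, n, I):
    x, xd = hist[n], hist[n - tau]
    out = []
    for i in range(m):
        s = math.exp(-a[i] * h) * x[i]
        s += theta[i] * sum(b[i][j] * f(x[j]) for j in range(m))
        s += theta[i] * sum(bb[i][j] * f(xd[j]) for j in range(m))
        s += theta[i] * sum(e * f(x[j]) * f(x[l]) for j in range(m) for l in range(m))
        s += theta[i] * sum(e * f(xd[j]) * f(xd[l]) for j in range(m) for l in range(m))
        s += theta[i] * I
        out.append(s)
    return out

h1 = [[1.0, -1.0]] * (tau + 1)    # two different initial sequences on [-tau, 0]
h2 = [[-0.5, 0.5]] * (tau + 1)
for n in range(tau, tau + 500):
    I = math.sin(2 * math.pi * n / 10)   # common periodic input
    h1.append(step(h1, n, I))
    h2.append(step(h2, n, I))

dist = sum(abs(u - v) for u, v in zip(h1[-1], h2[-1]))
assert dist < 1e-6                 # the two trajectories have merged
```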
Next, we will study global exponential stability of the periodic solution of discrete
high-order Hopfield-type networks (1.2).
Theorem 4.2. Assume that conditions (H1), (H2), (H3), and (H4) are satisfied. Furthermore, assume that τ_{ij}(n) ≡ τ_{ij} ∈ Z^+, σ_{ijl}(n) ≡ σ_{ijl} ∈ Z^+, and
(H5′)

$$
a_i > L_i \sum_{j=1}^{m} \bigl(b_{ji}^M + \bar b_{ji}^M\bigr)
+ L_i \sum_{j=1}^{m}\sum_{l=1}^{m} \bigl(e_{jil}^M + e_{jli}^M + \bar e_{jil}^M + \bar e_{jli}^M\bigr) M,
\qquad i = 1,2,\dots,m.
\tag{4.9}
$$

Then the ω-periodic solution of (1.2) is unique and globally exponentially stable.
Proof. According to Theorem 3.2, system (1.2) has an ω-periodic solution x*(n) = (x_1*(n), x_2*(n),...,x_m*(n))^T. Let x(n) = (x_1(n), x_2(n),...,x_m(n))^T be an arbitrary solution of (1.2). Then, exactly as in (4.3),

$$
\begin{aligned}
\bigl|x_i(n+1) - x_i^*(n+1)\bigr|
\le {} & e^{-a_i h}\bigl|x_i(n) - x_i^*(n)\bigr|
+ \theta_i(h)\sum_{j=1}^{m} b_{ij}^M L_j \bigl|x_j(n) - x_j^*(n)\bigr| \\
& + \theta_i(h)\sum_{j=1}^{m} \bar b_{ij}^M L_j \bigl|x_j(n-\tau_{ij}) - x_j^*(n-\tau_{ij})\bigr| \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^M M \bigl( L_j\bigl|x_j(n) - x_j^*(n)\bigr| + L_l\bigl|x_l(n) - x_l^*(n)\bigr| \bigr) \\
& + \theta_i(h)\sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}^M M \bigl( L_j\bigl|x_j(n-\sigma_{ijl}) - x_j^*(n-\sigma_{ijl})\bigr|
+ L_l\bigl|x_l(n-\sigma_{ijl}) - x_l^*(n-\sigma_{ijl})\bigr| \bigr),
\end{aligned}
\tag{4.10}
$$

i = 1,2,...,m.
Let F_i(·,·), i ∈ {1,...,m}, be defined by

$$
\begin{aligned}
F_i(\upsilon_i, n) = {} & 1 - \upsilon_i e^{-a_i h}
- L_i \theta_i(h)\, \upsilon_i \sum_{j=1}^{m} b_{ji}^M
- L_i \theta_i(h) \sum_{j=1}^{m} \bar b_{ji}^M\, \upsilon_i^{\tau_{ji}+1} \\
& - L_i \theta_i(h)\, \upsilon_i \sum_{j=1}^{m}\sum_{l=1}^{m} \bigl(e_{jil}^M + e_{jli}^M\bigr) M
- L_i \theta_i(h) \sum_{j=1}^{m}\sum_{l=1}^{m} \bigl(\bar e_{jil}^M + \bar e_{jli}^M\bigr) M\, \upsilon_i^{\sigma_{jil}+1},
\end{aligned}
\tag{4.11}
$$

where υ_i ∈ [1, ∞), n ∈ I_ω, i ∈ {1,...,m}.
Since 1 − e^{-a_i h} ≥ a_i θ_i(h) (the function (1 − e^{-ah})/a is nonincreasing in a, and a_i = min_n a_i(n)), hypothesis (H5′) gives

$$
\begin{aligned}
F_i(1, n) = {} & 1 - e^{-a_i h}
- L_i \theta_i(h) \sum_{j=1}^{m} \bigl(b_{ji}^M + \bar b_{ji}^M\bigr)
- L_i \theta_i(h) \sum_{j=1}^{m}\sum_{l=1}^{m} \bigl(e_{jil}^M + e_{jli}^M + \bar e_{jil}^M + \bar e_{jli}^M\bigr) M \\
\ge {} & \theta_i(h) \Biggl[ a_i - L_i \sum_{j=1}^{m} \bigl(b_{ji}^M + \bar b_{ji}^M\bigr)
- L_i \sum_{j=1}^{m}\sum_{l=1}^{m} \bigl(e_{jil}^M + e_{jli}^M + \bar e_{jil}^M + \bar e_{jli}^M\bigr) M \Biggr] \\
\ge {} & \min_{1 \le i \le m}\bigl\{\theta_i(h)\bigr\}\, \gamma > 0, \qquad i = 1,2,\dots,m,
\end{aligned}
\tag{4.12}
$$

where

$$
\gamma = \min_{1 \le i \le m}\Biggl\{ a_i - L_i \sum_{j=1}^{m} \bigl(b_{ji}^M + \bar b_{ji}^M\bigr)
- L_i \sum_{j=1}^{m}\sum_{l=1}^{m} \bigl(e_{jil}^M + e_{jli}^M + \bar e_{jil}^M + \bar e_{jli}^M\bigr) M \Biggr\} > 0.
\tag{4.13}
$$
Using the continuity of F_i(υ_i, n) on [1, ∞) with respect to υ_i and the fact that F_i(υ_i, n) → −∞ as υ_i → ∞ uniformly in n ∈ I_ω, i = 1,2,...,m, we see that there exist υ_i*(n) ∈ (1, ∞) such that F_i(υ_i*(n), n) = 0 for n ∈ I_ω, i = 1,2,...,m. Choosing λ = min_{i,n}{υ_i*(n)} > 1, we obtain F_i(λ, n) ≥ 0 for n ∈ I_ω, i = 1,2,...,m, that is,

$$
\lambda e^{-a_i h}
+ L_i \theta_i(h)\, \lambda \sum_{j=1}^{m} b_{ji}^M
+ L_i \theta_i(h) \sum_{j=1}^{m} \bar b_{ji}^M\, \lambda^{\tau_{ji}+1}
+ L_i \theta_i(h)\, \lambda \sum_{j=1}^{m}\sum_{l=1}^{m} \bigl(e_{jil}^M + e_{jli}^M\bigr) M
+ L_i \theta_i(h) \sum_{j=1}^{m}\sum_{l=1}^{m} \bigl(\bar e_{jil}^M + \bar e_{jli}^M\bigr) M\, \lambda^{\sigma_{jil}+1} \le 1.
\tag{4.14}
$$
Now let us consider

$$
u_i(n) = \lambda^n\, \frac{\bigl|x_i(n) - x_i^*(n)\bigr|}{\theta_i(h)}, \qquad n \in [-\tau^*, \infty)_{\mathbb{Z}},\ i = 1,2,\dots,m,
\tag{4.15}
$$

with λ > 1 as above.
Then it follows from (4.10) and (4.15) that

$$
\begin{aligned}
u_i(n+1) \le {} & \lambda e^{-a_i h} u_i(n)
+ \lambda \sum_{j=1}^{m} b_{ij}^M L_j \theta_j(h)\, u_j(n)
+ \sum_{j=1}^{m} \bar b_{ij}^M L_j \theta_j(h)\, \lambda^{\tau_{ij}+1}\, u_j(n-\tau_{ij}) \\
& + \lambda \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^M M \bigl( L_j \theta_j(h)\, u_j(n) + L_l \theta_l(h)\, u_l(n) \bigr) \\
& + \sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}^M M\, \lambda^{\sigma_{ijl}+1} \bigl( L_j \theta_j(h)\, u_j(n-\sigma_{ijl}) + L_l \theta_l(h)\, u_l(n-\sigma_{ijl}) \bigr).
\end{aligned}
\tag{4.16}
$$
Define a Lyapunov functional V(·) by

$$
\begin{aligned}
V(n) = \sum_{i=1}^{m} \Biggl\{ & u_i(n)
+ \sum_{j=1}^{m} \bar b_{ij}^M L_j \theta_j(h)\, \lambda^{\tau_{ij}+1} \sum_{s=n-\tau_{ij}}^{n-1} u_j(s) \\
& + \sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}^M M\, \lambda^{\sigma_{ijl}+1} \Biggl[ L_j \theta_j(h) \sum_{s=n-\sigma_{ijl}}^{n-1} u_j(s)
+ L_l \theta_l(h) \sum_{s=n-\sigma_{ijl}}^{n-1} u_l(s) \Biggr] \Biggr\};
\end{aligned}
\tag{4.17}
$$
then, calculating ∆V(n) = V(n + 1) − V(n) along (4.16), we have

$$
\begin{aligned}
\Delta V(n) = {} & \sum_{i=1}^{m} \Biggl\{ u_i(n+1) - u_i(n)
+ \sum_{j=1}^{m} \bar b_{ij}^M L_j \theta_j(h)\lambda^{\tau_{ij}+1}\bigl[u_j(n) - u_j(n-\tau_{ij})\bigr] \\
& \qquad + \sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}^M M \lambda^{\sigma_{ijl}+1}\bigl[ L_j \theta_j(h)\bigl(u_j(n) - u_j(n-\sigma_{ijl})\bigr)
+ L_l \theta_l(h)\bigl(u_l(n) - u_l(n-\sigma_{ijl})\bigr) \bigr] \Biggr\} \\
\le {} & \sum_{i=1}^{m} \Biggl\{ \bigl(\lambda e^{-a_i h} - 1\bigr) u_i(n)
+ \lambda \sum_{j=1}^{m} b_{ij}^M L_j \theta_j(h)\, u_j(n)
+ \sum_{j=1}^{m} \bar b_{ij}^M L_j \theta_j(h)\lambda^{\tau_{ij}+1} u_j(n) \\
& \qquad + \lambda \sum_{j=1}^{m}\sum_{l=1}^{m} e_{ijl}^M M \bigl(L_j\theta_j(h)u_j(n) + L_l\theta_l(h)u_l(n)\bigr)
+ \sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}^M \lambda^{\sigma_{ijl}+1} M \bigl(L_j\theta_j(h)u_j(n) + L_l\theta_l(h)u_l(n)\bigr) \Biggr\} \\
\le {} & -\sum_{i=1}^{m} F_i(\lambda, n)\, u_i(n) \le 0,
\end{aligned}
\tag{4.18}
$$

since F_i(λ, n) ≥ 0,
and hence from (4.17) we have

$$
\sum_{i=1}^{m} u_i(n) \le V(n) \le V(0) \qquad \text{for } n \in \mathbb{Z}^+.
\tag{4.19}
$$
Thus

$$
\begin{aligned}
\sum_{i=1}^{m} u_i(n) = \lambda^n \sum_{i=1}^{m} \frac{\bigl|x_i(n) - x_i^*(n)\bigr|}{\theta_i(h)}
\le {} & V(0) \\
= {} & \sum_{i=1}^{m} \Biggl\{ u_i(0)
+ \sum_{j=1}^{m} \bar b_{ij}^M L_j \theta_j(h)\lambda^{\tau_{ij}+1} \sum_{s=-\tau_{ij}}^{-1} u_j(s) \\
& \qquad + \sum_{j=1}^{m}\sum_{l=1}^{m} \bar e_{ijl}^M M \lambda^{\sigma_{ijl}+1} \Biggl[ L_j \theta_j(h) \sum_{s=-\sigma_{ijl}}^{-1} u_j(s)
+ L_l \theta_l(h) \sum_{s=-\sigma_{ijl}}^{-1} u_l(s) \Biggr] \Biggr\} \\
\le {} & \Upsilon \sum_{i=1}^{m} \sup_{s \in [-\tau^*, 0]_{\mathbb{Z}}} u_i(s),
\end{aligned}
\tag{4.20}
$$

where

$$
\Upsilon = \max_{1 \le i \le m}\Biggl\{ 1 + L_i \theta_i(h) \sum_{j=1}^{m} \bar b_{ji}^M \lambda^{\tau_{ji}+1} \tau_{ji}
+ L_i \theta_i(h)\, M \sum_{j=1}^{m}\sum_{l=1}^{m} \Bigl( \bar e_{jil}^M \lambda^{\sigma_{jil}+1} \sigma_{jil}
+ \bar e_{jli}^M \lambda^{\sigma_{jli}+1} \sigma_{jli} \Bigr) \Biggr\} \ge 1;
\tag{4.21}
$$
then

$$
\sum_{i=1}^{m} \bigl|x_i(n) - x_i^*(n)\bigr|
\le \frac{\max_{1\le i\le m}\theta_i(h)}{\min_{1\le i\le m}\theta_i(h)}\, \Upsilon\, \Bigl(\frac{1}{\lambda}\Bigr)^{n} \sum_{i=1}^{m} \sup_{s \in [-\tau^*, 0]_{\mathbb{Z}}} \bigl|x_i(s) - x_i^*(s)\bigr|
= \delta \Bigl(\frac{1}{\lambda}\Bigr)^{n} \sum_{i=1}^{m} \sup_{s \in [-\tau^*, 0]_{\mathbb{Z}}} \bigl|x_i(s) - x_i^*(s)\bigr|,
\tag{4.22}
$$

where

$$
\delta = \frac{\max_{1\le i\le m}\theta_i(h)}{\min_{1\le i\le m}\theta_i(h)}\, \Upsilon \ge 1,
\tag{4.23}
$$

and we conclude that the ω-periodic solution of (1.2) is globally exponentially stable. This completes the proof of the theorem. □
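The key device of this proof, finding υ* > 1 with F_i(υ*, n) = 0, can be sketched by bisection. The one-neuron parameters below are invented and chosen so that (H5′) holds; then F(1) > 0 while F(υ) → −∞, so a root υ* > 1 exists, and any λ ∈ (1, υ*] yields the geometric decay rate 1/λ in (4.22).

```python
import math

h, tau, sigma = 0.1, 2, 3
a, L, M = 1.5, 1.0, 1.0
bM, bbM, eM, eeM = 0.3, 0.1, 0.02, 0.02  # toy weight bounds satisfying (H5')
theta = (1 - math.exp(-a * h)) / a

def F(v):
    # one-neuron specialisation of F_i in (4.11)
    return (1 - v * math.exp(-a * h)
            - L * theta * v * bM
            - L * theta * bbM * v ** (tau + 1)
            - L * theta * v * 2 * eM * M
            - L * theta * 2 * eeM * M * v ** (sigma + 1))

assert F(1.0) > 0          # (H5'): a > L*(bM + bbM + 2*(eM + eeM)*M)
lo, hi = 1.0, 2.0
while F(hi) > 0:           # F(v) -> -infinity, so this terminates
    hi *= 2
for _ in range(60):        # bisection for the root v*
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if F(mid) > 0 else (lo, mid)
lam = lo
assert lam > 1.0 and abs(F(lam)) < 1e-9
```

Numerically computing such a λ gives an explicit, checkable exponential convergence rate for a concrete network.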
Remark 4.3. If e_{ijl}(t) = \bar e_{ijl}(t) ≡ 0, then system (1.2) reduces to a discrete cellular neural network, and our Theorems 3.2, 4.1, and 4.2 become [6, Theorems 3.1, 4.1, and 4.2], respectively. Thus our results generalize the main results of [6].
Acknowledgment
This work was supported by the NSF of Gansu Province of China, the NSF of Bureau of
Education of Gansu Province of China (0416B-08), the Key Research and Development
Program for Outstanding Groups of Lanzhou University of Technology, and the Development Program for Outstanding Young Teachers in Lanzhou University of Technology.
References
[1] M. Brucoli, L. Carnimeo, and G. Grassi, Associative memory design using discrete-time second-order neural networks with local interconnections, IEEE Trans. Circuits Systems I Fund. Theory Appl. 44 (1997), no. 2, 153–158.
[2] A. Dembo, O. Farotimi, and T. Kailath, High-order absolutely stable neural networks, IEEE Trans. Circuits and Systems 38 (1991), no. 1, 57–65.
[3] R. E. Gaines and J. L. Mawhin, Coincidence Degree, and Nonlinear Differential Equations, Lecture Notes in Mathematics, vol. 568, Springer, Berlin, 1977.
[4] E. B. Kosmatopoulos, M. M. Polycarpou, M. A. Christodoulou, and P. A. Ioannou, High-order neural network structures for identification of dynamical systems, IEEE Trans. Neural Networks 6 (1995), no. 2, 422–431.
[5] E. B. Kosmatopoulos and M. A. Christodoulou, Structural properties of gradient recurrent high-order neural networks, IEEE Trans. Circuits Syst. II: Analog Digital Signal Process. 42 (1995), no. 9, 592–603.
[6] Y. Li, Global stability and existence of periodic solutions of discrete delayed cellular neural networks, Phys. Lett. A 333 (2004), no. 1-2, 51–61.
[7] Z. Liu, A. Chen, J. Cao, and L. Huang, Existence and global exponential stability of periodic solution for BAM neural networks with periodic coefficients and time-varying delays, IEEE Trans. Circuits Systems I Fund. Theory Appl. 50 (2003), no. 9, 1162–1173.
[8] Z. Liu and L. Liao, Existence and global exponential stability of periodic solution of cellular neural networks with time-varying delays, J. Math. Anal. Appl. 290 (2004), no. 1, 247–262.
[9] S. Mohamad and K. Gopalsamy, Dynamics of a class of discrete-time neural networks and their continuous-time counterparts, Math. Comput. Simulation 53 (2000), no. 1-2, 1–39.
[10] J. Su, A. Q. Hu, and Z. Y. He, Stability analysis of analogue neural networks, Electron. Lett. 33 (1997), no. 6, 506–507.
[11] J. Su, A. Q. Hu, and Z. Y. He, Solving a kind of nonlinear programming problems via analog neural networks, Neurocomputing 18 (1998), no. 1-3, 1–9.
[12] H. Xiang, K.-M. Yan, and B.-Y. Wang, Existence and global exponential stability of periodic solution for delayed high-order Hopfield-type neural networks, submitted.
[13] B. Xu, X. Liu, and X. Liao, Global asymptotic stability of high-order Hopfield type neural networks with time delays, Comput. Math. Appl. 45 (2003), no. 10-11, 1729–1737.
[14] T. Zhang, S. S. Ge, and C. C. Hang, Neural-based direct adaptive control for a class of general nonlinear systems, Internat. J. Systems Sci. 28 (1997), no. 10, 1011–1020.
[15] J. Zhou, Z. Liu, and G. Chen, Dynamics of periodic delayed neural networks, Neural Networks 17 (2004), no. 1, 87–101.
Hong Xiang: Institute of Applied Mathematics, Lanzhou University of Technology, Lanzhou,
Gansu 730050, China
E-mail address: xiangh1969@sohu.com
Ke-Ming Yan: Institute of Applied Mathematics, Lanzhou University of Technology, Lanzhou,
Gansu 730050, China
E-mail address: yankm@lut.cn
Bai-Yan Wang: Institute of Applied Mathematics, Lanzhou University of Technology, Lanzhou,
Gansu 730050, China