Exponential Stability of Periodic Solution to Wilson-Cowan Networks with Time-Varying Delays on Time Scales

Hindawi Publishing Corporation, Advances in Artificial Neural Systems, Volume 2014, Article ID 750532, 10 pages. http://dx.doi.org/10.1155/2014/750532

Research Article. Jinxiang Cai, Zhenkun Huang, and Honghua Bin. School of Science, Jimei University, Xiamen 361021, China. Correspondence should be addressed to Zhenkun Huang; hzk974226@jmu.edu.cn. Received 31 December 2013; Accepted 12 February 2014; Published 2 April 2014. Academic Editor: Songcan Chen.

Copyright © 2014 Jinxiang Cai et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract. We present a stability analysis of delayed Wilson-Cowan networks on time scales. By applying the theory of calculus on time scales, the contraction mapping principle, and a Lyapunov functional, new sufficient conditions are obtained that ensure the existence and exponential stability of a periodic solution of the considered system. The results are general and apply to both discrete-time and continuous-time Wilson-Cowan networks.

1. Introduction

The activity of a cortical column may be described mathematically by the model developed by Wilson and Cowan [1, 2]. The model consists of two nonlinear ordinary differential equations representing the interactions between two populations of neurons, distinguished by whether their synapses are excitatory or inhibitory [2]. A comprehensive review by Destexhe and Sejnowski [3] summarizes the important developments and theoretical results for Wilson-Cowan networks. Its applications include pattern analysis and image processing [4]. Theoretical results on the existence of asymptotically stable limit cycles and of chaos have been reported in [5, 6], and exponential stability of a unique almost periodic solution of a delayed Wilson-Cowan type model was obtained in [7]. However, few investigations have focused on the periodicity of the Wilson-Cowan model [8], and it is troublesome to study stability and periodicity separately for continuous and for discrete systems with oscillatory coefficients. It is therefore significant to study Wilson-Cowan networks on time scales [9, 10], which unify the continuous and discrete settings.

Motivated by recent results [11–13], we consider the following dynamic Wilson-Cowan network on a time scale $\mathbb{T}$:

$$
\begin{aligned}
X_P^{\Delta}(t) &= -a_P(t)X_P(t) + \left[k_P(t)-r_P(t)X_P(t)\right] G\!\left[w_P^1(t)X_P(t-\tau_P(t)) - w_N^1(t)X_N(t-\tau_N(t)) + I_P(t)\right],\\
X_N^{\Delta}(t) &= -a_N(t)X_N(t) + \left[k_N(t)-r_N(t)X_N(t)\right] G\!\left[w_P^2(t)X_P(t-\tau_P(t)) - w_N^2(t)X_N(t-\tau_N(t)) + I_N(t)\right],
\end{aligned}
\tag{1}
$$

for $t\in\mathbb{T}$, where $X_P(t)$ and $X_N(t)$ represent the proportions of excitatory and inhibitory neurons firing per unit time at the instant $t$, respectively; $a_P(t)>0$ and $a_N(t)>0$ are the natural decay rates of the excitatory and inhibitory activity; $r_P(t)$ and $r_N(t)$ are related to the duration of the refractory period; $k_P(t)$ and $k_N(t)$ are positive scaling coefficients; $w_P^1(t)$, $w_N^1(t)$, $w_P^2(t)$, and $w_N^2(t)$ are the strengths of the connections between the populations; and $I_P(t)$, $I_N(t)$ are the external inputs to the excitatory and inhibitory populations.
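On the discrete time scale $\mathbb{T}=\mathbb{Z}$, the delta derivative is the forward difference $X^{\Delta}(n)=X(n+1)-X(n)$, so system (1) can be iterated directly. The sketch below simulates such a network; the coefficients and the piecewise-linear response $G(x)=\tfrac12(|x+1|-|x-1|)$ mirror Case 2 of Section 4, while the constant initial history (0.01) is an arbitrary illustrative choice.

```python
import math

def G(x):
    # Response function; the theory only requires G Lipschitz with G(0) = 0.
    # This is the choice made in Section 4: G(x) = (|x+1| - |x-1|) / 2.
    return (abs(x + 1) - abs(x - 1)) / 2

def simulate(steps=60, tau_p=2, tau_n=1):
    # On T = Z system (1) reads X(n+1) = X(n) - a(n)X(n) + [k(n) - r(n)X(n)] G[...].
    # Coefficients a = 1/2, k = r = 0.01, w = 0.1 follow Case 2 of Section 4.
    hist = max(tau_p, tau_n)
    XP = [0.01] * (hist + 1)      # constant initial history on [-tau_0, 0]
    XN = [0.01] * (hist + 1)
    for n in range(hist, hist + steps):
        IP = 1 + math.sin(n * math.pi / 3)   # 6-periodic external inputs
        IN = math.cos(n * math.pi / 3)
        gp = G(0.1 * XP[n - tau_p] - 0.1 * XN[n - tau_n] + IP)
        gn = G(0.1 * XP[n - tau_p] - 0.1 * XN[n - tau_n] + IN)
        XP.append(XP[n] - 0.5 * XP[n] + (0.01 - 0.01 * XP[n]) * gp)
        XN.append(XN[n] - 0.5 * XN[n] + (0.01 - 0.01 * XN[n]) * gn)
    return XP, XN
```

Because $|G|\le 1$ here, each iterate satisfies $|X(n+1)|\le 0.51|X(n)|+0.01$, so the trajectory stays in a small ball, consistent with the boundedness used in the analysis below.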
$G(\cdot)$ is the response function of neuronal activity, and $\tau_P(t)$, $\tau_N(t)$ are the time-varying transmission delays.

The main aim of this paper is to unify discrete and continuous Wilson-Cowan networks with periodic coefficients and time-varying delays under one common framework and to obtain generalized results ensuring the existence and exponential stability of a periodic solution on time scales. The main technique is based on the theory of time scales, the contraction mapping principle, and the Lyapunov functional method.

2. Preliminaries

In this section we give some definitions and lemmas on time scales, which can be found in the books [14, 15].

Definition 1. A time scale $\mathbb{T}$ is an arbitrary nonempty closed subset of the real set $\mathbb{R}$. The forward and backward jump operators $\sigma,\rho:\mathbb{T}\to\mathbb{T}$ and the graininess $\mu:\mathbb{T}\to\mathbb{R}^{+}$ are defined, respectively, by

$$
\sigma(t):=\inf\{s\in\mathbb{T}: s>t\},\qquad \rho(t):=\sup\{s\in\mathbb{T}: s<t\},\qquad \mu(t):=\sigma(t)-t. \tag{2}
$$

These jump operators classify a point $t$ of a time scale as right-dense, right-scattered, left-dense, or left-scattered according to whether

$$
\sigma(t)=t,\qquad \sigma(t)>t,\qquad \rho(t)=t,\qquad \rho(t)<t, \tag{3}
$$

respectively, for any $t\in\mathbb{T}$. The notation $[a,b]_{\mathbb{T}}$ means $[a,b]_{\mathbb{T}}:=\{t\in\mathbb{T}: a\le t\le b\}$. Denote $\mathbb{T}^{+}:=\{t\in\mathbb{T}: t\ge 0\}$.

Definition 2. A time scale $\mathbb{T}$ is periodic if there exists $p>0$ such that $t\in\mathbb{T}$ implies $t\pm p\in\mathbb{T}$; the smallest such positive number $p$ is called the period of the time scale. Clearly, if $\mathbb{T}$ is a $p$-periodic time scale, then $\sigma(t+np)=\sigma(t)+np$ and $\mu(t+np)=\mu(t)$, so $\mu(t)$ is a $p$-periodic function.

Definition 3. Let $\mathbb{T}\,(\neq\mathbb{R})$ be a periodic time scale with period $p$. The function $f:\mathbb{T}\to\mathbb{R}$ is periodic with period $\omega>0$ if there exists a natural number $n$ such that $\omega=np$, $f(t+\omega)=f(t)$ for all $t\in\mathbb{T}$, and $\omega$ is the smallest number with this property. If $\mathbb{T}=\mathbb{R}$, we say $f$ is periodic with period $\omega>0$ if $\omega$ is the smallest positive number such that $f(t+\omega)=f(t)$ for all $t\in\mathbb{R}$.

Definition 4 (Lakshmikantham and Vatsala [16]). For each $t\in\mathbb{T}$, let $N$ be a neighborhood of $t$. One defines the generalized derivative (or Dini derivative) $D^{+}u^{\Delta}(t)$ to mean that, given $\varepsilon>0$, there exists a right neighborhood $N(\varepsilon)\subset N$ of $t$ such that

$$
\frac{u(\sigma(t))-u(s)}{\mu(t,s)} < D^{+}u^{\Delta}(t)+\varepsilon \tag{4}
$$

for each $s\in N(\varepsilon)$, $s>t$, where $\mu(t,s)=\sigma(t)-s$. In case $t$ is right-scattered and $u(t)$ is continuous at $t$, one gets

$$
D^{+}u^{\Delta}(t)=\frac{u(\sigma(t))-u(t)}{\sigma(t)-t}. \tag{5}
$$

Definition 5. A function $f:\mathbb{T}\to\mathbb{R}$ is called right-dense continuous provided that it is continuous at right-dense points of $\mathbb{T}$ and the left-side limit exists (finite) at left-dense points of $\mathbb{T}$. The set of all right-dense continuous functions on $\mathbb{T}$ is denoted by $C_{\mathrm{rd}}=C_{\mathrm{rd}}(\mathbb{T},\mathbb{R})$.

Definition 6. A function $p:\mathbb{T}\to\mathbb{R}$ is called regressive if and only if $1+p(t)\mu(t)\neq 0$. The set of all regressive and right-dense continuous functions is denoted by $\mathcal{R}$. Let $\mathcal{R}^{+}:=\{p\in C_{\mathrm{rd}}: 1+p(t)\mu(t)>0\ \text{for all}\ t\in\mathbb{T}\}$. Next, we give the definition of the exponential function and list its useful properties.

Definition 7 (Bohner and Peterson [14]). If $p\in C_{\mathrm{rd}}$ is a regressive function, then the generalized exponential function $e_p(t,s)$ is defined by

$$
e_p(t,s)=\exp\left\{\int_s^t \xi_{\mu(\tau)}\big(p(\tau)\big)\,\Delta\tau\right\},\qquad t,s\in\mathbb{T}, \tag{6}
$$

with the cylinder transformation

$$
\xi_h(z)=\begin{cases}\dfrac{\operatorname{Log}(1+hz)}{h}, & h\neq 0,\\[4pt] z, & h=0.\end{cases} \tag{7}
$$

Definition 8. The periodic solution

$$
Z^{*}(t)=\big(X_P^{*}(t),X_N^{*}(t)\big)^{\top} \tag{8}
$$

of (1) is said to be globally exponentially stable if there exist a positive constant $\varepsilon$ and $N=N(\varepsilon)>0$ such that all solutions

$$
Z(t)=\big(X_P(t),X_N(t)\big)^{\top} \tag{9}
$$

of (1) satisfy

$$
\big|X_P(t)-X_P^{*}(t)\big|+\big|X_N(t)-X_N^{*}(t)\big|\le N(\varepsilon)\,e_{\ominus\varepsilon}(t,\alpha)\Big(\sup_{s\in[-\tau_0,0]_{\mathbb{T}}}\big|X_P(s)-X_P^{*}(s)\big|+\sup_{s\in[-\tau_0,0]_{\mathbb{T}}}\big|X_N(s)-X_N^{*}(s)\big|\Big),\qquad t\in\mathbb{T}. \tag{10}
$$

Lemma 9 (Bohner and Peterson [15]).
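Definition 7 is easy to make concrete on the two standard time scales. On $\mathbb{T}=\mathbb{R}$ ($\mu\equiv 0$), $\xi_0(z)=z$ and $e_p(t,s)=\exp\int_s^t p(\tau)\,d\tau$. On $\mathbb{T}=\mathbb{Z}$ ($\mu\equiv 1$), $\xi_1(z)=\operatorname{Log}(1+z)$ and the integral collapses to a sum, giving $e_p(t,s)=\prod_{k=s}^{t-1}(1+p(k))$. A sketch of the discrete case (the helper name `exp_Z` is ours):

```python
def exp_Z(p, t, s):
    # Generalized exponential e_p(t, s) of Definition 7 on T = Z:
    # mu(k) = 1 and xi_1(z) = Log(1 + z), so
    # e_p(t, s) = exp(sum_{k=s}^{t-1} log(1 + p(k))) = prod_{k=s}^{t-1} (1 + p(k)).
    assert t >= s
    prod = 1.0
    for k in range(s, t):
        assert 1 + p(k) != 0, "Definition 6: p must be regressive"
        prod *= 1 + p(k)
    return prod

# For constant p this is the familiar compound factor (1 + p)**(t - s), and the
# semigroup law e_p(t, s) * e_p(s, r) = e_p(t, r) of Lemma 9(v) can be checked directly.
```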
If $p,q\in\mathcal{R}$, then

(i) $e_0(t,s)\equiv 1$ and $e_p(t,t)\equiv 1$;
(ii) $e_p(\sigma(t),s)=(1+\mu(t)p(t))\,e_p(t,s)$;
(iii) $1/e_p(t,s)=e_{\ominus p}(t,s)$, where $\ominus p(t)=-p(t)/(1+\mu(t)p(t))$;
(iv) $e_p(t,s)=1/e_p(s,t)=e_{\ominus p}(s,t)$;
(v) $e_p(t,s)\,e_p(s,r)=e_p(t,r)$;
(vi) $e_p(t,s)\,e_q(t,s)=e_{p\oplus q}(t,s)$;
(vii) $e_p(t,s)/e_q(t,s)=e_{p\ominus q}(t,s)$;
(viii) $\big(1/e_p(\cdot,s)\big)^{\Delta}=-p(t)/e_p^{\sigma}(\cdot,s)$.

Lemma 10 (contraction mapping principle [17]). If $\Omega$ is a closed subset of a Banach space $X$ and $\mathcal{F}:\Omega\to\Omega$ is a contraction, then $\mathcal{F}$ has a unique fixed point in $\Omega$.

For any $\omega$-periodic function $v$ defined on $\mathbb{T}$, denote $\bar v=\max_{t\in[0,\omega]}v(t)$, $\underline v=\min_{t\in[0,\omega]}v(t)$, $|\bar v|=\max_{t\in[0,\omega]}|v(t)|$, and $|\underline v|=\min_{t\in[0,\omega]}|v(t)|$. Throughout this paper, we make the following assumptions:

$(A_1)$ $k_P(t)$, $k_N(t)$, $r_P(t)$, $r_N(t)$, $w_P^1(t)$, $w_N^1(t)$, $w_P^2(t)$, $w_N^2(t)$, $a_P(t)$, $a_N(t)$, $\tau_P(t)$, $\tau_N(t)$, $I_P(t)$, and $I_N(t)$ are $\omega$-periodic functions defined on $\mathbb{T}$, with $-a_P(t),-a_N(t)\in\mathcal{R}^{+}$.

$(A_2)$ $G(\cdot):\mathbb{R}\to\mathbb{R}$ is Lipschitz continuous; that is, $|G(u)-G(v)|\le L|u-v|$ for all $u,v\in\mathbb{R}$, $G(0)=0$, and $\sup_{v\in\mathbb{R}}|G(v)|\le M$.

For simplicity, we use the following notation:

$$
\begin{aligned}
R&=\max\{\bar r_P,\bar r_N\}, \qquad I=\max\{L|\bar I_P|,L|\bar I_N|\}, \qquad K=\max\{\bar k_P,\bar k_N\},\\
W&=\max\{L\bar w_P^1,L\bar w_N^1,L\bar w_P^2,L\bar w_N^2\}, \qquad \tau_0=\min\{|\underline\tau_P|,|\underline\tau_N|\}.
\end{aligned}
\tag{11}
$$

Lemma 11. Suppose $(A_1)$ holds; then $Z(t)$ is an $\omega$-periodic solution of (1) if and only if $Z(t)$ is a solution of the following system:

$$
\begin{aligned}
X_P(t)={}&\frac{1}{e_{\ominus(-a_P)}(\omega,0)-1}\int_t^{t+\omega}\frac{e_{\ominus(-a_P)}(s,t)}{1-\mu(s)a_P(s)}\,\big[k_P(s)-r_P(s)X_P(s)\big]\\
&\times G\big[w_P^1(s)X_P(s-\tau_P(s))-w_N^1(s)X_N(s-\tau_N(s))+I_P(s)\big]\,\Delta s,\\
X_N(t)={}&\frac{1}{e_{\ominus(-a_N)}(\omega,0)-1}\int_t^{t+\omega}\frac{e_{\ominus(-a_N)}(s,t)}{1-\mu(s)a_N(s)}\,\big[k_N(s)-r_N(s)X_N(s)\big]\\
&\times G\big[w_P^2(s)X_P(s-\tau_P(s))-w_N^2(s)X_N(s-\tau_N(s))+I_N(s)\big]\,\Delta s.
\end{aligned}
\tag{12}
$$

Proof. Let $Z(t)=(X_P(t),X_N(t))^{\top}$ be a solution of (1); since $X^{\sigma}(t)=X(t)+\mu(t)X^{\Delta}(t)$, we can rewrite (1) as

$$
\begin{aligned}
X_P^{\Delta}(t)+a_P(t)\big(X_P^{\sigma}(t)-\mu(t)X_P^{\Delta}(t)\big)&=\big[k_P(t)-r_P(t)X_P(t)\big]\,G\big[w_P^1(t)X_P(t-\tau_P(t))-w_N^1(t)X_N(t-\tau_N(t))+I_P(t)\big],\\
X_N^{\Delta}(t)+a_N(t)\big(X_N^{\sigma}(t)-\mu(t)X_N^{\Delta}(t)\big)&=\big[k_N(t)-r_N(t)X_N(t)\big]\,G\big[w_P^2(t)X_P(t-\tau_P(t))-w_N^2(t)X_N(t-\tau_N(t))+I_N(t)\big],
\end{aligned}
\tag{13}
$$

which leads to

$$
\begin{aligned}
X_P^{\Delta}(t)+\ominus(-a_P)(t)\,X_P^{\sigma}(t)&=\frac{k_P(t)-r_P(t)X_P(t)}{1-\mu(t)a_P(t)}\,G\big[w_P^1(t)X_P(t-\tau_P(t))-w_N^1(t)X_N(t-\tau_N(t))+I_P(t)\big],\\
X_N^{\Delta}(t)+\ominus(-a_N)(t)\,X_N^{\sigma}(t)&=\frac{k_N(t)-r_N(t)X_N(t)}{1-\mu(t)a_N(t)}\,G\big[w_P^2(t)X_P(t-\tau_P(t))-w_N^2(t)X_N(t-\tau_N(t))+I_N(t)\big].
\end{aligned}
\tag{14}
$$

Multiplying both sides of the above equalities by $e_{\ominus(-a_P)}(t,0)$ and $e_{\ominus(-a_N)}(t,0)$, respectively, we have

$$
\begin{aligned}
\big[e_{\ominus(-a_P)}(t,0)\,X_P(t)\big]^{\Delta}&=\big[k_P(t)-r_P(t)X_P(t)\big]\,G\big[w_P^1(t)X_P(t-\tau_P(t))-w_N^1(t)X_N(t-\tau_N(t))+I_P(t)\big]\,e_{\ominus(-a_P)}(\sigma(t),0),\\
\big[e_{\ominus(-a_N)}(t,0)\,X_N(t)\big]^{\Delta}&=\big[k_N(t)-r_N(t)X_N(t)\big]\,G\big[w_P^2(t)X_P(t-\tau_P(t))-w_N^2(t)X_N(t-\tau_N(t))+I_N(t)\big]\,e_{\ominus(-a_N)}(\sigma(t),0).
\end{aligned}
\tag{15}
$$

Integrating both sides from $t$ to $t+\omega$ and using $X_P(t+\omega)=X_P(t)$ and $X_N(t+\omega)=X_N(t)$, we have

$$
X_P(t)=\frac{\int_t^{t+\omega}\big[k_P(s)-r_P(s)X_P(s)\big]\,G\big[w_P^1(s)X_P(s-\tau_P(s))-w_N^1(s)X_N(s-\tau_N(s))+I_P(s)\big]\,e_{\ominus(-a_P)}(\sigma(s),0)\,\Delta s}{e_{\ominus(-a_P)}(t+\omega,0)-e_{\ominus(-a_P)}(t,0)},
\tag{16}
$$

and similarly for $X_N(t)$ with $e_{\ominus(-a_N)}$. Since

$$
\frac{e_{\ominus(-a_P)}(s,t)}{1-\mu(s)a_P(s)}=e_{\ominus(-a_P)}(\sigma(s),t),\qquad
\frac{e_{\ominus(-a_N)}(s,t)}{1-\mu(s)a_N(s)}=e_{\ominus(-a_N)}(\sigma(s),t),
\tag{17}
$$

and $a_P(t+\omega)=a_P(t)$, $a_N(t+\omega)=a_N(t)$, we obtain the representation (12). The proof is completed.
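Lemma 11 recasts the $\omega$-periodic solutions of (1) as fixed points of an integral operator, and Section 3 verifies that this operator is a contraction, so Lemma 10 applies. Numerically, the fixed point of any contraction can be located by Picard iteration. A minimal, self-contained illustration with a hypothetical contraction on $\mathbb{R}^2$ (not the operator $\mathcal{F}$ of the paper itself):

```python
def picard(F, z0, tol=1e-12, max_iter=200):
    # Iterate z <- F(z) until successive iterates agree to within tol.
    # The Banach fixed point theorem (Lemma 10) guarantees convergence
    # whenever F is a contraction on a closed subset of a Banach space.
    z = z0
    for _ in range(max_iter):
        z_next = F(z)
        if max(abs(a - b) for a, b in zip(z_next, z)) < tol:
            return z_next
        z = z_next
    raise RuntimeError("no convergence; is F a contraction?")

# Hypothetical contraction on R^2 with Lipschitz constant 0.3 < 1 in the
# sup norm; its unique fixed point solves z = F(z).
F = lambda z: (0.3 * z[1] + 1.0, 0.3 * z[0] + 2.0)
fp = picard(F, (0.0, 0.0))
```

The geometric convergence rate mirrors the role of the contraction constant $\alpha W<1$ in Theorem 12 below.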
3. Main Results

In this section, we prove the existence and uniqueness of the periodic solution to (1).

Theorem 12. Suppose $(A_1)$-$(A_2)$ hold and $\max\{\alpha,W\}<1$. Then (1) has a unique $\omega$-periodic solution, where $\beta:=I/(1-W)$,

$$
\alpha_1:=\frac{\omega\exp\Big(\int_0^{\omega}\big|\xi_{\mu(\tau)}\big(\ominus(-a_P)(\tau)\big)\big|\,\Delta\tau\Big)\big(K+\beta R+RM/W\big)}{\big|e_{\ominus(-a_P)}(\omega,0)-1\big|\,\big|1-\bar a_P\bar\mu\big|},\qquad
\alpha_2:=\frac{\omega\exp\Big(\int_0^{\omega}\big|\xi_{\mu(\tau)}\big(\ominus(-a_N)(\tau)\big)\big|\,\Delta\tau\Big)\big(K+\beta R+RM/W\big)}{\big|e_{\ominus(-a_N)}(\omega,0)-1\big|\,\big|1-\bar a_N\bar\mu\big|},
\tag{19}
$$

and $\alpha:=\max\{\alpha_1,\alpha_2\}$.

Proof. Let $\mathbb{X}=\{Z(t)=(z_P(t),z_N(t))^{\top}\mid Z\in C_{\mathrm{rd}}(\mathbb{T},\mathbb{R}^2),\ Z(t+\omega)=Z(t)\}$ with the norm $\|Z\|=\sup_{t\in\mathbb{T}}\{|z_P(t)|+|z_N(t)|\}$; then $\mathbb{X}$ is a Banach space [14]. Define

$$
\mathcal{F}:\mathbb{X}\to\mathbb{X},\qquad (\mathcal{F}Z)(t)=\big((\mathcal{F}Z)_P(t),(\mathcal{F}Z)_N(t)\big), \tag{20}
$$

where, for $Z(t)=(z_P(t),z_N(t))^{\top}\in\mathbb{X}$ and $t\in\mathbb{T}$,

$$
\begin{aligned}
(\mathcal{F}Z)_P(t)={}&\frac{1}{e_{\ominus(-a_P)}(\omega,0)-1}\int_t^{t+\omega}\frac{e_{\ominus(-a_P)}(s,t)}{1-\mu(s)a_P(s)}\,\big[k_P(s)-r_P(s)z_P(s)\big]\\
&\times G\big[w_P^1(s)z_P(s-\tau_P(s))-w_N^1(s)z_N(s-\tau_N(s))+I_P(s)\big]\,\Delta s,\\
(\mathcal{F}Z)_N(t)={}&\frac{1}{e_{\ominus(-a_N)}(\omega,0)-1}\int_t^{t+\omega}\frac{e_{\ominus(-a_N)}(s,t)}{1-\mu(s)a_N(s)}\,\big[k_N(s)-r_N(s)z_N(s)\big]\\
&\times G\big[w_P^2(s)z_P(s-\tau_P(s))-w_N^2(s)z_N(s-\tau_N(s))+I_N(s)\big]\,\Delta s.
\end{aligned}
\tag{21}
$$

Note that, for $s\in[t,t+\omega]_{\mathbb{T}}$,

$$
e_{\ominus(-a_P)}(s,t)=e^{\int_t^{s}\xi_{\mu(\tau)}(\ominus(-a_P)(\tau))\,\Delta\tau}
\le e^{\int_t^{t+\omega}\left|\xi_{\mu(\tau)}(\ominus(-a_P)(\tau))\right|\Delta\tau}
= e^{\int_0^{\omega}\left|\xi_{\mu(\tau)}(\ominus(-a_P)(\tau))\right|\Delta\tau}. \tag{22}
$$

Let $\Omega=\{Z(t)\mid Z\in\mathbb{X},\ \|Z\|\le I/(1-W)\}$ and recall $\beta=I/(1-W)$. Obviously, $\Omega$ is a closed nonempty subset of $\mathbb{X}$. First, we prove that $\mathcal{F}$ maps $\Omega$ into itself. In fact, for any $Z(t)\in\Omega$ we have, using $|G(u)|\le L|u|$,

$$
\begin{aligned}
\big|(\mathcal{F}Z)_P(t)\big|
&\le \frac{\exp\Big(\int_0^{\omega}\big|\xi_{\mu(\tau)}(\ominus(-a_P)(\tau))\big|\,\Delta\tau\Big)(K+\beta R)}{\big|e_{\ominus(-a_P)}(\omega,0)-1\big|\,\big|1-\bar a_P\bar\mu\big|}
\int_t^{t+\omega}\Big|W z_P(s-\tau_P(s))+W z_N(s-\tau_N(s))+I\Big|\,\Delta s\\
&\le \alpha_1\Big(I+W\sup_{t\in\mathbb{T}}\big(|z_P(t)|+|z_N(t)|\big)\Big).
\end{aligned}
\tag{23}
$$

Similarly, we have

$$
\big|(\mathcal{F}Z)_N(t)\big|\le \alpha_2\Big(I+W\sup_{t\in\mathbb{T}}\big(|z_P(t)|+|z_N(t)|\big)\Big). \tag{24}
$$

It follows from (23) and (24) that

$$
\|\mathcal{F}Z\|\le \alpha\big(I+W\|Z\|\big)\le \frac{I}{1-W}. \tag{25}
$$

Hence $\mathcal{F}Z\in\Omega$. Next, we prove that $\mathcal{F}$ is a contraction mapping. For any $Z(t)=(z_P(t),z_N(t))^{\top}\in\Omega$ and $Z'(t)=(z_P'(t),z_N'(t))^{\top}\in\Omega$, we have

$$
\begin{aligned}
\big|(\mathcal{F}Z)_P(t)-(\mathcal{F}Z')_P(t)\big|
&\le \frac{\exp\Big(\int_0^{\omega}\big|\xi_{\mu(\tau)}(\ominus(-a_P)(\tau))\big|\,\Delta\tau\Big)\big(KW+\beta WR+RM\big)}{\big|e_{\ominus(-a_P)}(\omega,0)-1\big|\,\big|1-\bar a_P\bar\mu\big|}\\
&\quad\times\int_t^{t+\omega}\Big(\big|z_P(s-\tau_P(s))-z_P'(s-\tau_P(s))\big|+\big|z_N(s-\tau_N(s))-z_N'(s-\tau_N(s))\big|\Big)\,\Delta s\\
&\le \alpha_1 W\sup_{t\in\mathbb{T}}\Big[\big|z_P(t)-z_P'(t)\big|+\big|z_N(t)-z_N'(t)\big|\Big].
\end{aligned}
\tag{26}
$$

Similarly, we have

$$
\big|(\mathcal{F}Z)_N(t)-(\mathcal{F}Z')_N(t)\big|\le \alpha_2 W\sup_{t\in\mathbb{T}}\Big[\big|z_P(t)-z_P'(t)\big|+\big|z_N(t)-z_N'(t)\big|\Big]. \tag{27}
$$

From (26) and (27), we get

$$
\big\|(\mathcal{F}Z)-(\mathcal{F}Z')\big\|\le \alpha W\big\|Z-Z'\big\|. \tag{28}
$$

Note that $\alpha W<1$. Thus $\mathcal{F}$ is a contraction mapping. By the fixed point theorem in a Banach space (Lemma 10), $\mathcal{F}$ possesses a unique fixed point, which by Lemma 11 is the unique $\omega$-periodic solution of (1). The proof is completed.

Theorem 13. Under the conditions of Theorem 12, suppose further the following.

$(A_3)$ There exist constants $\epsilon>0$, $\xi>0$, $\xi'>0$ such that

$$
\Big(1+\frac{\xi'}{\xi}\Big)\frac{\big(1+\epsilon\mu(t+\tau_0)\big)(K+\beta R)W\,e_{\epsilon}(t+\tau_0,t)}{(\underline a_P-RM)\big(1+\epsilon\mu(t)\big)-\epsilon}<1,\qquad
\Big(1+\frac{\xi}{\xi'}\Big)\frac{\big(1+\epsilon\mu(t+\tau_0)\big)(K+\beta R)W\,e_{\epsilon}(t+\tau_0,t)}{(\underline a_N-RM)\big(1+\epsilon\mu(t)\big)-\epsilon}<1.
\tag{29}
$$

Then the periodic solution of (1) is globally exponentially stable.
Proof. It follows from Theorem 12 that (1) has an $\omega$-periodic solution $Z^{*}(t)=(X_P^{*}(t),X_N^{*}(t))^{\top}$. Let $Z(t)=(X_P(t),X_N(t))^{\top}$ be any solution of (1); then

$$
\begin{aligned}
\big(X_P(t)-X_P^{*}(t)\big)^{\Delta}={}&-a_P(t)\big(X_P(t)-X_P^{*}(t)\big)\\
&+k_P(t)\,G\big[w_P^1(t)X_P(t-\tau_P(t))-w_N^1(t)X_N(t-\tau_N(t))+I_P(t)\big]\\
&-k_P(t)\,G\big[w_P^1(t)X_P^{*}(t-\tau_P(t))-w_N^1(t)X_N^{*}(t-\tau_N(t))+I_P(t)\big]\\
&-r_P(t)X_P(t)\,G\big[w_P^1(t)X_P(t-\tau_P(t))-w_N^1(t)X_N(t-\tau_N(t))+I_P(t)\big]\\
&+r_P(t)X_P^{*}(t)\,G\big[w_P^1(t)X_P^{*}(t-\tau_P(t))-w_N^1(t)X_N^{*}(t-\tau_N(t))+I_P(t)\big],
\end{aligned}
\tag{30}
$$

and the analogous identity holds for $\big(X_N(t)-X_N^{*}(t)\big)^{\Delta}$ with $a_N$, $k_N$, $r_N$, $w_P^2$, $w_N^2$, and $I_N$. This leads to

$$
\begin{aligned}
D^{+}\big|X_P(t)-X_P^{*}(t)\big|^{\Delta}\le{}&-(\underline a_P-RM)\big|X_P(t)-X_P^{*}(t)\big|\\
&+(K+R\beta)W\Big(\big|X_P(t-\tau_0)-X_P^{*}(t-\tau_0)\big|+\big|X_N(t-\tau_0)-X_N^{*}(t-\tau_0)\big|\Big),\\
D^{+}\big|X_N(t)-X_N^{*}(t)\big|^{\Delta}\le{}&-(\underline a_N-RM)\big|X_N(t)-X_N^{*}(t)\big|\\
&+(K+R\beta)W\Big(\big|X_P(t-\tau_0)-X_P^{*}(t-\tau_0)\big|+\big|X_N(t-\tau_0)-X_N^{*}(t-\tau_0)\big|\Big).
\end{aligned}
\tag{31}
$$

For any $\alpha\in[-\tau_0,0]_{\mathbb{T}}$, construct the Lyapunov functional $V(t)=V_1(t)+V_2(t)+V_3(t)+V_4(t)$, where

$$
\begin{aligned}
V_1(t)&=\xi\,e_{\epsilon}(t,\alpha)\big|X_P(t)-X_P^{*}(t)\big|,\qquad
V_3(t)=\xi'\,e_{\epsilon}(t,\alpha)\big|X_N(t)-X_N^{*}(t)\big|,\\
V_2(t)&=\xi\int_{t-\tau_0}^{t}\big(1+\epsilon\mu(s+\tau_0)\big)e_{\epsilon}(s+\tau_0,\alpha)(K+R\beta)W
\Big(\big|X_P(s)-X_P^{*}(s)\big|+\big|X_N(s)-X_N^{*}(s)\big|\Big)\Delta s,\\
V_4(t)&=\xi'\int_{t-\tau_0}^{t}\big(1+\epsilon\mu(s+\tau_0)\big)e_{\epsilon}(s+\tau_0,\alpha)(K+R\beta)W
\Big(\big|X_P(s)-X_P^{*}(s)\big|+\big|X_N(s)-X_N^{*}(s)\big|\Big)\Delta s.
\end{aligned}
\tag{32}
$$

Calculating $D^{+}V^{\Delta}(t)$ along (1), and using $e_{\epsilon}(\sigma(t),\alpha)=(1+\epsilon\mu(t))\,e_{\epsilon}(t,\alpha)$ together with (31), we get

$$
\begin{aligned}
D^{+}V_1^{\Delta}(t)\big|_{(1)}\le{}&\xi\Big[\epsilon\,e_{\epsilon}(t,\alpha)\big|X_P(t)-X_P^{*}(t)\big|+e_{\epsilon}(\sigma(t),\alpha)\,D^{+}\big|X_P(t)-X_P^{*}(t)\big|^{\Delta}\Big]\\
\le{}&\xi\big[\epsilon-(\underline a_P-RM)(1+\epsilon\mu(t))\big]e_{\epsilon}(t,\alpha)\big|X_P(t)-X_P^{*}(t)\big|\\
&+\xi\big(1+\epsilon\mu(t)\big)e_{\epsilon}(t,\alpha)(K+R\beta)W\Big(\big|X_P(t-\tau_0)-X_P^{*}(t-\tau_0)\big|+\big|X_N(t-\tau_0)-X_N^{*}(t-\tau_0)\big|\Big),
\end{aligned}
\tag{33}
$$

$$
\begin{aligned}
D^{+}V_2^{\Delta}(t)\big|_{(1)}={}&\xi\big(1+\epsilon\mu(t+\tau_0)\big)e_{\epsilon}(t+\tau_0,\alpha)(K+R\beta)W\Big(\big|X_P(t)-X_P^{*}(t)\big|+\big|X_N(t)-X_N^{*}(t)\big|\Big)\\
&-\xi\big(1+\epsilon\mu(t)\big)e_{\epsilon}(t,\alpha)(K+R\beta)W\Big(\big|X_P(t-\tau_0)-X_P^{*}(t-\tau_0)\big|+\big|X_N(t-\tau_0)-X_N^{*}(t-\tau_0)\big|\Big),
\end{aligned}
\tag{34}
$$

and the analogous estimates (35)-(36) hold for $V_3$ and $V_4$ with $\xi'$ and $\underline a_N$ in place of $\xi$ and $\underline a_P$. Combining (33)-(36) and using $e_{\epsilon}(t+\tau_0,\alpha)=e_{\epsilon}(t+\tau_0,t)\,e_{\epsilon}(t,\alpha)$, we can get

$$
\begin{aligned}
D^{+}V^{\Delta}(t)\big|_{(1)}\le{}&\Big\{\xi\big[\epsilon-(\underline a_P-RM)(1+\epsilon\mu(t))\big]+(\xi+\xi')\big(1+\epsilon\mu(t+\tau_0)\big)e_{\epsilon}(t+\tau_0,t)(K+R\beta)W\Big\}e_{\epsilon}(t,\alpha)\big|X_P(t)-X_P^{*}(t)\big|\\
&+\Big\{\xi'\big[\epsilon-(\underline a_N-RM)(1+\epsilon\mu(t))\big]+(\xi+\xi')\big(1+\epsilon\mu(t+\tau_0)\big)e_{\epsilon}(t+\tau_0,t)(K+R\beta)W\Big\}e_{\epsilon}(t,\alpha)\big|X_N(t)-X_N^{*}(t)\big|.
\end{aligned}
\tag{37}
$$

By assumption $(A_3)$, both braces are negative, so $V(t)\le V(0)$ for $t\in\mathbb{T}^{+}$. On the other hand, we have

$$
\begin{aligned}
V(0)\le{}&\Big[\xi\,e_{\epsilon}(0,\alpha)+(\xi+\xi')\int_{-\tau_0}^{0}\big(1+\epsilon\mu(s+\tau_0)\big)e_{\epsilon}(s+\tau_0,\alpha)(K+R\beta)W\,\Delta s\Big]\sup_{s\in[-\tau_0,0]_{\mathbb{T}}}\big|X_P(s)-X_P^{*}(s)\big|\\
&+\Big[\xi'\,e_{\epsilon}(0,\alpha)+(\xi+\xi')\int_{-\tau_0}^{0}\big(1+\epsilon\mu(s+\tau_0)\big)e_{\epsilon}(s+\tau_0,\alpha)(K+R\beta)W\,\Delta s\Big]\sup_{s\in[-\tau_0,0]_{\mathbb{T}}}\big|X_N(s)-X_N^{*}(s)\big|\\
={}&\Gamma(\epsilon)\Big(\sup_{s\in[-\tau_0,0]_{\mathbb{T}}}\big|X_P(s)-X_P^{*}(s)\big|+\sup_{s\in[-\tau_0,0]_{\mathbb{T}}}\big|X_N(s)-X_N^{*}(s)\big|\Big),
\end{aligned}
\tag{38}
$$

where $\Gamma(\epsilon)=\max\{\Delta_1,\Delta_2\}$ and

$$
\begin{aligned}
\Delta_1&=\xi\,e_{\epsilon}(0,\alpha)+(\xi+\xi')\int_{-\tau_0}^{0}\big(1+\epsilon\mu(s+\tau_0)\big)e_{\epsilon}(s+\tau_0,\alpha)(K+R\beta)W\,\Delta s,\\
\Delta_2&=\xi'\,e_{\epsilon}(0,\alpha)+(\xi+\xi')\int_{-\tau_0}^{0}\big(1+\epsilon\mu(s+\tau_0)\big)e_{\epsilon}(s+\tau_0,\alpha)(K+R\beta)W\,\Delta s.
\end{aligned}
\tag{39}
$$

It is obvious that

$$
\xi\,e_{\epsilon}(t,\alpha)\big|X_P(t)-X_P^{*}(t)\big|+\xi'\,e_{\epsilon}(t,\alpha)\big|X_N(t)-X_N^{*}(t)\big|\le V(t)\le V(0), \tag{40}
$$

which means that

$$
\min\{\xi,\xi'\}\,e_{\epsilon}(t,\alpha)\Big(\big|X_P(t)-X_P^{*}(t)\big|+\big|X_N(t)-X_N^{*}(t)\big|\Big)\le V(0). \tag{41}
$$
Thus, we finally get

$$
\big|X_P(t)-X_P^{*}(t)\big|+\big|X_N(t)-X_N^{*}(t)\big|\le \frac{\Gamma(\epsilon)\,e_{\ominus\epsilon}(t,\alpha)}{\min\{\xi,\xi'\}}\Big(\sup_{s\in[-\tau_0,0]_{\mathbb{T}}}\big|X_P(s)-X_P^{*}(s)\big|+\sup_{s\in[-\tau_0,0]_{\mathbb{T}}}\big|X_N(s)-X_N^{*}(s)\big|\Big). \tag{42}
$$

Therefore, the unique periodic solution of (1) is globally exponentially stable. The proof is completed.

4. Examples

In this section, two numerical examples are given to verify the effectiveness of the results obtained in the previous section. Consider the following Wilson-Cowan neural network with delays on a time scale $\mathbb{T}$:

$$
\begin{aligned}
X_P^{\Delta}(t)&=-a_P(t)X_P(t)+\big[k_P(t)-r_P(t)X_P(t)\big]\,G\big[w_P^1(t)X_P(t-2)-w_N^1(t)X_N(t-1)+I_P(t)\big],\\
X_N^{\Delta}(t)&=-a_N(t)X_N(t)+\big[k_N(t)-r_N(t)X_N(t)\big]\,G\big[w_P^2(t)X_P(t-2)-w_N^2(t)X_N(t-1)+I_N(t)\big].
\end{aligned}
\tag{43}
$$

Case 1. Consider $\mathbb{T}=\mathbb{R}$. Take $(a_P(t),a_N(t))^{\top}=(2+\sin t,\,2+\cos t)^{\top}$. Obviously, $\underline a_P=\underline a_N=1$ and

$$
\frac{\exp\big(\int_0^{2\pi}a_P(s)\,ds\big)}{\exp\big(\int_0^{2\pi}a_P(s)\,ds\big)-1}=\frac{e^{4\pi}}{e^{4\pi}-1},\qquad
\frac{\exp\big(\int_0^{2\pi}a_N(s)\,ds\big)}{\exp\big(\int_0^{2\pi}a_N(s)\,ds\big)-1}=\frac{e^{4\pi}}{e^{4\pi}-1}.
\tag{44}
$$

Take $(I_P(t),I_N(t))^{\top}=(-1+\sin t,\ \cos t)^{\top}$, $k_P(t)=k_N(t)=r_P(t)=r_N(t)=0.01$, $w_P^1(t)=w_N^1(t)=w_P^2(t)=w_N^2(t)=0.1$, and $G(x)=\tfrac12(|x+1|-|x-1|)$. We have $L=1$. Let $\xi=1$, $\xi'=2$. One can easily verify that

$$
\begin{aligned}
\alpha_1&=\omega\Big(K+\beta R+\frac{RM}{W}\Big)\frac{\exp\big(\int_0^{2\pi}a_P(s)\,ds\big)}{\exp\big(\int_0^{2\pi}a_P(s)\,ds\big)-1}\approx 0.831<1,\\
\alpha_2&=\omega\Big(K+\beta R+\frac{RM}{W}\Big)\frac{\exp\big(\int_0^{2\pi}a_N(s)\,ds\big)}{\exp\big(\int_0^{2\pi}a_N(s)\,ds\big)-1}\approx 0.831<1,\\
&-\xi(\underline a_P-RM)+(\xi+\xi')(K+R\beta)W\approx -0.980<0,\\
&-\xi'(\underline a_N-RM)+(\xi+\xi')(K+R\beta)W\approx -1.970<0.
\end{aligned}
\tag{45}
$$

It follows from Theorems 12 and 13 that (43) has a unique $2\pi$-periodic solution which is globally exponentially stable (see Figure 1).

[Figure 1: Globally exponentially stable periodic solution of (43).]

Case 2. Consider $\mathbb{T}=\mathbb{Z}$. Equation (43) reduces to the following difference equation:

$$
\begin{aligned}
X_P(n+1)-X_P(n)&=-a_P(n)X_P(n)+\big[k_P(n)-r_P(n)X_P(n)\big]\,G\big[w_P^1(n)X_P(n-2)-w_N^1(n)X_N(n-1)+I_P(n)\big],\\
X_N(n+1)-X_N(n)&=-a_N(n)X_N(n)+\big[k_N(n)-r_N(n)X_N(n)\big]\,G\big[w_P^2(n)X_P(n-2)-w_N^2(n)X_N(n-1)+I_N(n)\big],
\end{aligned}
\tag{46}
$$

for $n\in\mathbb{Z}^{+}$. Take $(a_P(n),a_N(n))^{\top}=(1/2,1/2)^{\top}$; obviously $\bar a_P=\bar a_N=1/2$ and $\underline a_P=\underline a_N=1/2$. Take $(I_P(n),I_N(n))^{\top}=(1+\sin(n\pi/3),\ \cos(n\pi/3))^{\top}$, $k_P(t)=k_N(t)=r_P(t)=r_N(t)=0.01$, $w_P^1(t)=w_N^1(t)=w_P^2(t)=w_N^2(t)=0.1$, and $G(x)=\tfrac12(|x+1|-|x-1|)$. We have $L=1$. Let $\xi=1$, $\xi'=2$. If $\mathbb{T}=\mathbb{Z}$ ($\mu(t)=1$), choosing $\omega=6$, by simple calculation, we have

$$
\alpha_1=\omega\Big(K+\beta R+\frac{RM}{W}\Big)\,
\frac{\prod_{k=1}^{\omega-1}\big(1-a_P(k)\big)\,\exp\Big(\sum_{k=0}^{\omega-1}\big|\operatorname{Log}\big(1-a_P(k)\big)\big|\Big)}
{\Big|1-\prod_{k=1}^{\omega-1}\big(1-a_P(k)\big)\Big|\,\big|1-\underline a_P\big|}\approx 0.015<1,
$$

with the analogous expression for $\alpha_2$ (using $a_N$) giving $\alpha_2\approx 0.015<1$, together with

$$
-\xi(\underline a_P-RM)+(\xi+\xi')(K+R\beta)W\approx-0.480<0,\qquad
-\xi'(\underline a_N-RM)+(\xi+\xi')(K+R\beta)W\approx-0.970<0.
\tag{47}
$$

It follows from Theorems 12 and 13 that (46) has a unique 6-periodic solution which is globally exponentially stable (see Figure 2).

[Figure 2: Globally exponentially stable periodic solution of (46).]

5. Concluding Remarks

In this paper, we studied the stability of delayed Wilson-Cowan networks on periodic time scales and obtained generalized results ensuring the existence, uniqueness, and global exponential stability of the periodic solution. These results give significant insight into the complex dynamical structure of Wilson-Cowan type models, and the conditions are easily checked in practice by simple algebraic methods.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This research was supported by the National Natural Science Foundation of China (11101187 and 11361010), the Foundation for Young Professors of Jimei University, the Excellent Youth Foundation of Fujian Province (2012J06001 and NCETFJ JA11144), and the Foundation of Fujian Higher Education (JA10184 and JA11154).

References

[1] H. R. Wilson and J. D. Cowan, "Excitatory and inhibitory interactions in localized populations of model neurons," Biophysical Journal, vol. 12, no. 1, pp. 1-24, 1972.
[2] H. R. Wilson and J. D. Cowan, "A mathematical theory of the functional dynamics of cortical and thalamic nervous tissue," Kybernetik, vol. 13, no. 2, pp. 55-80, 1973.
[3] A. Destexhe and T. J. Sejnowski, "The Wilson-Cowan model, 36 years later," Biological Cybernetics, vol. 101, no. 1, pp. 1-2, 2009.
[4] K. Mantere, J. Parkkinen, T. Jaaskelainen, and M. M. Gupta, "Wilson-Cowan neural-network model in image processing," Journal of Mathematical Imaging and Vision, vol. 2, no. 2-3, pp. 251-259, 1992.
[5] C. van Vreeswijk and H. Sompolinsky, "Chaos in neuronal networks with balanced excitatory and inhibitory activity," Science, vol. 274, no. 5293, pp. 1724-1726, 1996.
[6] L. H. A. Monteiro, M. A. Bussab, and J. G. Berlinck, "Analytical results on a Wilson-Cowan neuronal network modified model," Journal of Theoretical Biology, vol. 219, no. 1, pp. 83-91, 2002.
[7] S. Xie and Z. Huang, "Almost periodic solution for Wilson-Cowan type model with time-varying delays," Discrete Dynamics in Nature and Society, vol. 2013, Article ID 683091, 7 pages, 2013.
[8] V. W. Noonburg, D. Benardete, and B. Pollina, "A periodically forced Wilson-Cowan system," SIAM Journal on Applied Mathematics, vol. 63, no. 5, pp. 1585-1603, 2003.
[9] S. Hilger, "Analysis on measure chains - a unified approach to continuous and discrete calculus," Results in Mathematics, vol. 18, pp. 18-56, 1990.
[10] S. Hilger, "Differential and difference calculus - unified!," Nonlinear Analysis: Theory, Methods & Applications, vol. 30, no. 5, pp. 2683-2694, 1997.
[11] A. Chen and F. Chen, "Periodic solution to BAM neural network with delays on time scales," Neurocomputing, vol. 73, no. 1-3, pp. 274-282, 2009.
[12] Y. Li, X. Chen, and L. Zhao, "Stability and existence of periodic solutions to delayed Cohen-Grossberg BAM neural networks with impulses on time scales," Neurocomputing, vol. 72, no. 7-9, pp. 1621-1630, 2009.
[13] Z. Huang, Y. N. Raffoul, and C. Cheng, "Scale-limited activating sets and multiperiodicity for threshold-linear networks on time scales," IEEE Transactions on Cybernetics, vol. 44, no. 4, pp. 488-499, 2014.
[14] M. Bohner and A. Peterson, Dynamic Equations on Time Scales: An Introduction with Applications, Birkhäuser, Boston, Mass, USA, 2001.
[15] M. Bohner and A. Peterson, Advances in Dynamic Equations on Time Scales, Birkhäuser, Boston, Mass, USA, 2003.
[16] V. Lakshmikantham and A. S. Vatsala, "Hybrid systems on time scales," Journal of Computational and Applied Mathematics, vol. 141, no. 1-2, pp. 227-235, 2002.
[17] A. Ruffing and M. Simon, "Corresponding Banach spaces on time scales," Journal of Computational and Applied Mathematics, vol. 179, no. 1-2, pp. 313-326, 2005.

Exponential Stability of Periodic Solution to Wilson-Cowan Networks with Time-Varying Delays on Time Scales

Loading next page...
 
/lp/hindawi-publishing-corporation/exponential-stability-of-periodic-solution-to-wilson-cowan-networks-dnZmX0ertM
Publisher
Hindawi Publishing Corporation
Copyright
Copyright © 2014 Jinxiang Cai et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
ISSN
1687-7594
DOI
10.1155/2014/750532
Publisher site
See Article on Publisher Site

Abstract

Hindawi Publishing Corporation Advances in Artificial Neural Systems Volume 2014, Article ID 750532, 10 pages http://dx.doi.org/10.1155/2014/750532 Research Article Exponential Stability of Periodic Solution to Wilson-Cowan Networks with Time-Varying Delays on Time Scales Jinxiang Cai, Zhenkun Huang, and Honghua Bin School of Science, Jimei University, Xiamen 361021, China Correspondence should be addressed to Zhenkun Huang; hzk974226@jmu.edu.cn Received 31 December 2013; Accepted 12 February 2014; Published 2 April 2014 Academic Editor: Songcan Chen Copyright © 2014 Jinxiang Cai et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. We present stability analysis of delayed Wilson-Cowan networks on time scales. By applying the theory of calculus on time scales, the contraction mapping principle, and Lyapunov functional, new sufficient conditions are obtained to ensure the existence and exponential stability of periodic solution to the considered system. The obtained results are general and can be applied to discrete- time or continuous-time Wilson-Cowan networks. 1. Introduction Motivated by recent results [11–13], we consider the following dynamic Wilson-Cowan networks on time scale T: eTh activity of a cortical column may be mathematically 𝑡 =−𝑎 𝑡 𝑋 𝑡 +[𝑘 𝑡 −𝑟 𝑡 𝑋 𝑡 ] 𝑋 () () () () () () 𝑃 𝑃 𝑃 𝑃 𝑃 𝑃 described through the model developed by Wilson and Cowan [1, 2]. Such a model consists of two nonlinear ordinary 1 ×𝐺[𝑤 𝑡 𝑋 (𝑡 − 𝜏 𝑡 ) () () 𝑃 𝑃 𝑃 differential equations representing the interactions between two populations of neurons that are distinguished by the −𝑤 (𝑡 )𝑋 (𝑡 − 𝜏 (𝑡 ))+𝐼 (𝑡 )], 𝑁 𝑁 𝑁 𝑃 fact that their synapses are either excitatory or inhibitory 𝑋 (𝑡 )=−𝑎 (𝑡 )𝑋 (𝑡 )+[𝑘 (𝑡 )−𝑟 (𝑡 )𝑋 (𝑡 )] [2]. 
$G(\cdot)$ is the response function of neuronal activity, and $\tau_P(t)$, $\tau_N(t)$ correspond to the transmission time-varying delays.

The main aim of this paper is to unify the discrete and continuous Wilson-Cowan networks with periodic coefficients and time-varying delays under one common framework and to obtain some generalized results ensuring the existence and exponential stability of a periodic solution on time scales. The main technique is based on the theory of time scales, the contraction mapping principle, and the Lyapunov functional method.

2. Preliminaries

In this section, we give some definitions and lemmas on time scales which can be found in the books [14, 15].

Definition 1. A time scale $\mathbb{T}$ is an arbitrary nonempty closed subset of the real set $\mathbb{R}$. The forward and backward jump operators $\sigma, \rho: \mathbb{T} \to \mathbb{T}$ and the graininess $\mu: \mathbb{T} \to \mathbb{R}^+$ are defined, respectively, by

$\sigma(t) := \inf\{s \in \mathbb{T} : s > t\}$, $\quad \rho(t) := \sup\{s \in \mathbb{T} : s < t\}$, $\quad \mu(t) := \sigma(t) - t$. (2)

These jump operators enable us to classify a point $t$ of a time scale as right-dense, right-scattered, left-dense, or left-scattered depending on whether

$\sigma(t) = t$, $\quad \sigma(t) > t$, $\quad \rho(t) = t$, $\quad \rho(t) < t$, (3)

respectively, for any $t \in \mathbb{T}$. The notation $[a,b]_{\mathbb{T}}$ means $[a,b]_{\mathbb{T}} := \{t \in \mathbb{T} : a \le t \le b\}$. Denote $\mathbb{T}^+ := \{t \in \mathbb{T} : t \ge 0\}$.

Definition 2. One says that a time scale $\mathbb{T}$ is periodic if there exists $p > 0$ such that $t \in \mathbb{T}$ implies $t \pm p \in \mathbb{T}$; the smallest such positive number $p$ is called the period of the time scale. Clearly, if $\mathbb{T}$ is a $p$-periodic time scale, then $\sigma(t + np) = \sigma(t) + np$ and $\mu(t + np) = \mu(t)$, so $\mu(t)$ is a $p$-periodic function.

Definition 3. Let $\mathbb{T}\,(\neq \mathbb{R})$ be a periodic time scale with period $p$. One says that a function $f: \mathbb{T} \to \mathbb{R}$ is periodic with period $\omega > 0$ if there exists a natural number $n$ such that $\omega = np$, $f(t+\omega) = f(t)$ for all $t \in \mathbb{T}$, and $\omega$ is the smallest number with this property. If $\mathbb{T} = \mathbb{R}$, one says that $f$ is periodic with period $\omega > 0$ if $\omega$ is the smallest positive number such that $f(t+\omega) = f(t)$ for all $t \in \mathbb{R}$.

Definition 4 (Lakshmikantham and Vatsala [16]). For each $t \in \mathbb{T}$, let $N$ be a neighborhood of $t$. Then one defines the generalized derivative (or Dini derivative) $D^+ u^\Delta(t)$ to mean that, given $\varepsilon > 0$, there exists a right neighborhood $N(\varepsilon) \subset N$ of $t$ such that

$\dfrac{u(\sigma(t)) - u(s)}{\mu(t,s)} < D^+ u^\Delta(t) + \varepsilon$ (4)

for each $s \in N(\varepsilon)$, $s > t$, where $\mu(t,s) = \sigma(t) - s$. In case $t$ is right-scattered and $u(t)$ is continuous at $t$, one gets

$D^+ u^\Delta(t) = \dfrac{u(\sigma(t)) - u(t)}{\sigma(t) - t}$. (5)

Definition 5. A function $f: \mathbb{T} \to \mathbb{R}$ is called right-dense continuous provided that it is continuous at right-dense points of $\mathbb{T}$ and the left-side limit exists (finite) at left-dense points of $\mathbb{T}$. The set of all right-dense continuous functions on $\mathbb{T}$ is denoted by $C_{rd} = C_{rd}(\mathbb{T}, \mathbb{R})$.

Definition 6. A function $p: \mathbb{T} \to \mathbb{R}$ is called a regressive function if and only if $1 + p(t)\mu(t) \neq 0$. The set of all regressive and right-dense continuous functions is denoted by $\mathcal{R}$. Let $\mathcal{R}^+ := \{p \in C_{rd} : 1 + p(t)\mu(t) > 0 \text{ for all } t \in \mathbb{T}\}$. Next, we give the definition of the exponential function and list its useful properties.

Definition 7 (Bohner and Peterson [14]). If $p \in C_{rd}$ is a regressive function, then the generalized exponential function $e_p(t,s)$ is defined by

$e_p(t,s) = \exp\Big\{\int_s^t \xi_{\mu(\tau)}(p(\tau))\,\Delta\tau\Big\}$, $\quad s, t \in \mathbb{T}$, (6)

with the cylinder transformation

$\xi_h(z) = \begin{cases} \mathrm{Log}(1+hz)/h, & h \neq 0, \\ z, & h = 0. \end{cases}$ (7)

Definition 8. The periodic solution

$Z^*(t) = (X_P^*(t), X_N^*(t))^\top$ (8)

of (1) is said to be globally exponentially stable if there exist a positive constant $\epsilon$ and $N = N(\epsilon) > 0$ such that all solutions

$Z(t) = (X_P(t), X_N(t))^\top$ (9)

of (1) satisfy

$|X_P(t)-X_P^*(t)| + |X_N(t)-X_N^*(t)| \le N(\epsilon)\, e_{\ominus\epsilon}(t,\alpha)\Big(\sup_{s\in[-\tau_0,0]_{\mathbb{T}}}|X_P(s)-X_P^*(s)| + \sup_{s\in[-\tau_0,0]_{\mathbb{T}}}|X_N(s)-X_N^*(s)|\Big)$, $\quad t \in \mathbb{T}$. (10)

Lemma 9 (Bohner and Peterson [15]). If $p, q \in \mathcal{R}$, then

(i) $e_0(t,s) \equiv 1$ and $e_p(t,t) \equiv 1$;
(ii) $e_p(\sigma(t),s) = (1 + \mu(t)p(t))\,e_p(t,s)$;
(iii) $1/e_p(t,s) = e_{\ominus p}(t,s)$, where $\ominus p(t) = -p(t)/(1 + \mu(t)p(t))$;
(iv) $e_p(t,s) = 1/e_p(s,t) = e_{\ominus p}(s,t)$;
(v) $e_p(t,s)\,e_p(s,r) = e_p(t,r)$;
(vi) $e_p(t,s)\,e_q(t,s) = e_{p\oplus q}(t,s)$;
(vii) $e_p(t,s)/e_q(t,s) = e_{p\ominus q}(t,s)$;
(viii) $(1/e_p(\cdot,s))^\Delta = -p(t)/e_p^\sigma(\cdot,s)$.
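To make Definition 7 concrete, here is a small sketch (ours, not from the paper) that computes $e_p(t,s)$ on the time scale $\mathbb{T} = \mathbb{Z}$, where $\mu \equiv 1$, $\sigma(t) = t+1$, and the $\Delta$-integral reduces to a finite sum; it also checks properties (i)-(iii) of Lemma 9 numerically. The helper names are ours.

```python
import math

# Generalized exponential function e_p(t, s) on the time scale T = Z
# (graininess mu = 1), via the cylinder transformation in (6)-(7):
# e_p(t, s) = exp( sum_{tau=s}^{t-1} Log(1 + p(tau)) ),
# which equals the product prod_{tau=s}^{t-1} (1 + p(tau)).
def e_p(p, t, s):
    assert t >= s, "this sketch only handles t >= s"
    return math.exp(sum(math.log(1.0 + p(tau)) for tau in range(s, t)))

def circle_minus(p):
    # (ominus p)(t) = -p(t) / (1 + mu(t) p(t)), with mu = 1 on Z
    return lambda t: -p(t) / (1.0 + p(t))

p = lambda t: 0.5 + 0.25 * math.sin(t)   # a regressive function: 1 + p(t) > 0

# Lemma 9(ii): e_p(sigma(t), s) = (1 + mu(t) p(t)) e_p(t, s); sigma(t) = t + 1 on Z
assert abs(e_p(p, 6, 2) - (1.0 + p(5)) * e_p(p, 5, 2)) < 1e-12
# Lemma 9(iii): e_{ominus p}(t, s) = 1 / e_p(t, s)
assert abs(e_p(circle_minus(p), 7, 0) - 1.0 / e_p(p, 7, 0)) < 1e-12
# Lemma 9(i): e_p(t, t) = 1
assert e_p(p, 3, 3) == 1.0
```

The same cylinder transformation recovers the usual exponential $e^{\int_s^t p}$ when $\mu \equiv 0$ ($\mathbb{T} = \mathbb{R}$), which is why one set of formulas covers both settings.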
Lemma 10 (contraction mapping principle [17]). If $\Omega$ is a closed subset of a Banach space $X$ and $\mathcal{F}: \Omega \to \Omega$ is a contraction, then $\mathcal{F}$ has a unique fixed point in $\Omega$.

For any $\omega$-periodic function $v$ defined on $\mathbb{T}$, denote $\bar{v} = \max_{t\in[0,\omega]} v(t)$, $\underline{v} = \min_{t\in[0,\omega]} v(t)$, $|\bar{v}| = \max_{t\in[0,\omega]} |v(t)|$, and $|\underline{v}| = \min_{t\in[0,\omega]} |v(t)|$. Throughout this paper, we make the following assumptions:

$(A_1)$ $k_P(t)$, $k_N(t)$, $r_P(t)$, $r_N(t)$, $w_P^1(t)$, $w_N^1(t)$, $w_P^2(t)$, $w_N^2(t)$, $a_P(t)$, $a_N(t)$, $\tau_P(t)$, $\tau_N(t)$, $I_P(t)$, and $I_N(t)$ are $\omega$-periodic functions defined on $\mathbb{T}$, with $-a_P(t), -a_N(t) \in \mathcal{R}^+$.

$(A_2)$ $G(\cdot): \mathbb{R} \to \mathbb{R}$ is Lipschitz continuous; that is, $|G(u) - G(v)| \le L|u-v|$ for all $u, v \in \mathbb{R}$, $G(0) = 0$, and $\sup_{v\in\mathbb{R}} |G(v)| \le M$.

For simplicity, take the following denotations:

$R = \max\{\bar{r}_P, \bar{r}_N\}$, $\quad I = \max\{L|\bar{I}_P|, L|\bar{I}_N|\}$, $\quad K = \max\{\bar{k}_P, \bar{k}_N\}$, $\quad W = \max\{L\bar{w}_P^1, L\bar{w}_N^1, L\bar{w}_P^2, L\bar{w}_N^2\}$, $\quad \tau_0 = \min\{|\underline{\tau}_P|, |\underline{\tau}_N|\}$. (11)

Lemma 11. Suppose $(A_1)$ holds; then $Z(t)$ is an $\omega$-periodic solution of (1) if and only if $Z(t)$ is a solution of the following system:

$X_P(t) = \dfrac{1}{e_{\ominus(-a_P)}(\omega,0)-1} \displaystyle\int_t^{t+\omega} \dfrac{e_{\ominus(-a_P)}(s,t)}{1-\mu(s)a_P(s)}\,[k_P(s)-r_P(s)X_P(s)]\, G\big[w_P^1(s)X_P(s-\tau_P(s)) - w_N^1(s)X_N(s-\tau_N(s)) + I_P(s)\big]\,\Delta s$,

$X_N(t) = \dfrac{1}{e_{\ominus(-a_N)}(\omega,0)-1} \displaystyle\int_t^{t+\omega} \dfrac{e_{\ominus(-a_N)}(s,t)}{1-\mu(s)a_N(s)}\,[k_N(s)-r_N(s)X_N(s)]\, G\big[w_P^2(s)X_P(s-\tau_P(s)) - w_N^2(s)X_N(s-\tau_N(s)) + I_N(s)\big]\,\Delta s$. (12)

Proof. Let $Z(t) = (X_P(t), X_N(t))^\top$ be a solution of (1). Using $X(t) = X^\sigma(t) - \mu(t)X^\Delta(t)$, we can rewrite (1) as

$X_P^\Delta(t) + a_P(t)\big(X_P^\sigma(t) - \mu(t)X_P^\Delta(t)\big) = [k_P(t)-r_P(t)X_P(t)]\,G[\cdots_P]$, and similarly for $X_N$, (13)

where $[\cdots_P]$ and $[\cdots_N]$ abbreviate the arguments of $G$ in (1). This leads to

$X_P^\Delta(t) + \ominus(-a_P)(t)\,X_P^\sigma(t) = \dfrac{k_P(t)-r_P(t)X_P(t)}{1-\mu(t)a_P(t)}\,G[\cdots_P]$, and similarly for $X_N$. (14)

Multiplying both sides of these equalities by $e_{\ominus(-a_P)}(t,0)$ and $e_{\ominus(-a_N)}(t,0)$, respectively, we have

$\big[e_{\ominus(-a_P)}(t,0)\,X_P(t)\big]^\Delta = [k_P(t)-r_P(t)X_P(t)]\,G[\cdots_P]\,e_{\ominus(-a_P)}(\sigma(t),0)$, and similarly for $X_N$. (15)

Integrating both sides from $t$ to $t+\omega$, using $X_P(t+\omega) = X_P(t)$ and $X_N(t+\omega) = X_N(t)$, the identity

$\dfrac{e_{\ominus(-a_P)}(s,t)}{1-\mu(s)a_P(s)} = e_{\ominus(-a_P)}(\sigma(s),t)$ (17)

and the $\omega$-periodicity $a_P(t+\omega) = a_P(t)$, $a_N(t+\omega) = a_N(t)$, we obtain (12). The proof is completed.

3. Main Results

In this section, we prove the existence and uniqueness of the periodic solution to (1).
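The representation in Lemma 11 can be sanity-checked in the simplest setting, a scalar linear equation on $\mathbb{T} = \mathbb{Z}$ with constant decay rate, where $e_{\ominus(-a)}(t,s) = (1-a)^{-(t-s)}$. The sketch below is ours; the choices $a = 0.4$, $\omega = 6$ are illustrative only.

```python
import math

# Sanity check (ours) of the Lemma 11 representation in the scalar case on
# T = Z: x^Delta(t) = -a x(t) + f(t) with constant a in (0,1) and
# omega-periodic forcing f, i.e., x(t+1) = (1-a) x(t) + f(t).
a, omega = 0.4, 6
f = lambda t: 1.0 + math.sin(2.0 * math.pi * t / omega)   # omega-periodic forcing

def x_periodic(t):
    # Discrete analogue of (12):
    # x(t) = (e(omega,0) - 1)^{-1} * sum_{s=t}^{t+omega-1} e(sigma(s), t) f(s),
    # with e = e_{ominus(-a)} and e(u, v) = (1-a)^{-(u-v)}.
    e = lambda u, v: (1.0 - a) ** (-(u - v))
    return sum(e(s + 1, t) * f(s) for s in range(t, t + omega)) / (e(omega, 0) - 1.0)

# The formula produces an omega-periodic solution of the recursion:
for t in range(12):
    assert abs(x_periodic(t + 1) - ((1.0 - a) * x_periodic(t) + f(t))) < 1e-9
    assert abs(x_periodic(t + omega) - x_periodic(t)) < 1e-9
```

In the full system (12) the forcing term itself depends on the unknown state, which is why the paper passes through a fixed-point argument (Theorem 12) rather than an explicit formula.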
Theorem 12. Suppose $(A_1)$-$(A_2)$ hold and $\max\{\alpha, W\} < 1$. Then (1) has a unique $\omega$-periodic solution, where, with $\beta := I/(1-W)$,

$\alpha_1 := \dfrac{\omega \exp\big(\int_0^\omega |\xi_{\mu(\tau)}(\ominus(-a_P)(\tau))|\,\Delta\tau\big)\,(K + \beta R + RM/W)}{|e_{\ominus(-a_P)}(\omega,0)-1|\,|1-\bar{a}_P\bar{\mu}|}$,

$\alpha_2 := \dfrac{\omega \exp\big(\int_0^\omega |\xi_{\mu(\tau)}(\ominus(-a_N)(\tau))|\,\Delta\tau\big)\,(K + \beta R + RM/W)}{|e_{\ominus(-a_N)}(\omega,0)-1|\,|1-\bar{a}_N\bar{\mu}|}$, (19)

and $\alpha := \max\{\alpha_1, \alpha_2\}$.

Proof. Let $\mathbb{X} = \{Z(t) = (z_P(t), z_N(t))^\top \mid Z \in C_{rd}(\mathbb{T}, \mathbb{R}^2),\ Z(t+\omega) = Z(t)\}$ with the norm $\|Z\| = \sup_{t\in\mathbb{T}}\{|z_P(t)| + |z_N(t)|\}$; then $\mathbb{X}$ is a Banach space [14]. Define

$\mathcal{F}: \mathbb{X} \to \mathbb{X}$, $\quad (\mathcal{F}Z)(t) = ((\mathcal{F}Z)_P(t), (\mathcal{F}Z)_N(t))$, (20)

where $Z(t) = (z_P(t), z_N(t))^\top \in \mathbb{X}$ and

$(\mathcal{F}Z)_P(t) = \dfrac{1}{e_{\ominus(-a_P)}(\omega,0)-1} \displaystyle\int_t^{t+\omega} \dfrac{e_{\ominus(-a_P)}(s,t)}{1-\mu(s)a_P(s)}\,[k_P(s)-r_P(s)z_P(s)]\, G\big[w_P^1(s)z_P(s-\tau_P(s)) - w_N^1(s)z_N(s-\tau_N(s)) + I_P(s)\big]\,\Delta s$,

$(\mathcal{F}Z)_N(t) = \dfrac{1}{e_{\ominus(-a_N)}(\omega,0)-1} \displaystyle\int_t^{t+\omega} \dfrac{e_{\ominus(-a_N)}(s,t)}{1-\mu(s)a_N(s)}\,[k_N(s)-r_N(s)z_N(s)]\, G\big[w_P^2(s)z_P(s-\tau_P(s)) - w_N^2(s)z_N(s-\tau_N(s)) + I_N(s)\big]\,\Delta s$, (21)

for $t \in \mathbb{T}$. Note that, for $t \le s \le t+\omega$,

$e_{\ominus(-a_P)}(s,t) = e^{\int_t^s \xi_{\mu(\tau)}(\ominus(-a_P)(\tau))\,\Delta\tau} \le e^{\int_t^{t+\omega} |\xi_{\mu(\tau)}(\ominus(-a_P)(\tau))|\,\Delta\tau} = e^{\int_0^\omega |\xi_{\mu(\tau)}(\ominus(-a_P)(\tau))|\,\Delta\tau}$. (22)

Let $\Omega = \{Z \mid Z \in \mathbb{X},\ \|Z\| \le I/(1-W)\}$ and $\beta := I/(1-W)$. Obviously, $\Omega$ is a closed nonempty subset of $\mathbb{X}$. Firstly, we prove that the mapping $\mathcal{F}$ maps $\Omega$ into itself. In fact, for any $Z(t) \in \Omega$, using $(A_1)$-$(A_2)$ and (22) to bound the integrand of (21), we have

$|(\mathcal{F}Z)_P(t)| \le \alpha_1\Big(I + W \sup_{t\in\mathbb{T}}\big(|z_P(t)| + |z_N(t)|\big)\Big)$. (23)

Similarly, we have

$|(\mathcal{F}Z)_N(t)| \le \alpha_2\Big(I + W \sup_{t\in\mathbb{T}}\big(|z_P(t)| + |z_N(t)|\big)\Big)$. (24)

It follows from (23) and (24) that

$\|\mathcal{F}Z\| \le \alpha\big(I + W\|Z\|\big) \le \dfrac{I}{1-W}$. (25)

Hence, $\mathcal{F}Z \in \Omega$. Next, we prove that $\mathcal{F}$ is a contraction mapping. For any $Z(t) = (z_P(t), z_N(t))^\top \in \Omega$ and $Z'(t) = (z_P'(t), z_N'(t))^\top \in \Omega$, subtracting the corresponding expressions in (21) and using the Lipschitz property in $(A_2)$, we have

$|(\mathcal{F}Z)_P(t) - (\mathcal{F}Z')_P(t)| \le \alpha_1 W \sup_{t\in\mathbb{T}}\big[|z_P(t)-z_P'(t)| + |z_N(t)-z_N'(t)|\big]$. (26)

Similarly, we have

$|(\mathcal{F}Z)_N(t) - (\mathcal{F}Z')_N(t)| \le \alpha_2 W \sup_{t\in\mathbb{T}}\big[|z_P(t)-z_P'(t)| + |z_N(t)-z_N'(t)|\big]$. (27)
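The mechanism behind Lemma 10 and the estimates above can be seen numerically: Picard iteration of any contraction converges geometrically to its unique fixed point. The toy map below is ours and is not the integral operator $\mathcal{F}$ of Theorem 12; it merely has Lipschitz constant $0.3 < 1$ in the same sup-type norm.

```python
import math

# Generic illustration of the contraction mapping principle (Lemma 10):
# F below is a contraction on R^2 with Lipschitz constant q = 0.3 < 1
# (|tanh'| <= 1), so repeated application converges geometrically to its
# unique fixed point. This is a toy map, not the operator of Theorem 12.
def F(z):
    x, y = z
    return (0.3 * math.tanh(y) + 1.0, 0.3 * math.tanh(x) - 0.5)

def norm(z, w):
    # the ||.|| = sup(|z_P| + |z_N|)-style norm used in the proof
    return abs(z[0] - w[0]) + abs(z[1] - w[1])

z = (5.0, -5.0)
for _ in range(60):
    z = F(z)

# z is now (numerically) the unique fixed point: F(z) = z
assert norm(F(z), z) < 1e-12
```

In the proof, the role of $q$ is played by $\alpha W < 1$, and the fixed point of $\mathcal{F}$ is precisely the $\omega$-periodic solution of (1).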
From (26) and (27), we can get

$\|(\mathcal{F}Z) - (\mathcal{F}Z')\| \le \alpha W \|Z - Z'\|$. (28)

Note that $\alpha W < 1$. Thus, $\mathcal{F}$ is a contraction mapping. By the fixed point theorem in the Banach space, $\mathcal{F}$ possesses a unique fixed point. The proof is completed.

Theorem 13. Under the conditions of Theorem 12, suppose further the following.

$(A_3)$ There exist constants $\epsilon > 0$, $\xi > 0$, $\xi' > 0$ such that

$\Big(1 + \dfrac{\xi'}{\xi}\Big)\,\dfrac{(1+\epsilon\mu(t+\tau_0))(K+\beta R)W}{(\underline{a}_P - RM)(1+\epsilon\mu(t)) - \epsilon}\, e_\epsilon(t+\tau_0, t) < 1$,

$\Big(1 + \dfrac{\xi}{\xi'}\Big)\,\dfrac{(1+\epsilon\mu(t+\tau_0))(K+\beta R)W}{(\underline{a}_N - RM)(1+\epsilon\mu(t)) - \epsilon}\, e_\epsilon(t+\tau_0, t) < 1$; (29)

then the periodic solution of (1) is globally exponentially stable.

Proof. It follows from Theorem 12 that (1) has an $\omega$-periodic solution $Z^*(t) = (X_P^*(t), X_N^*(t))^\top$. Let $Z(t) = (X_P(t), X_N(t))^\top$ be any solution of (1). Subtracting the equations for $Z$ and $Z^*$, and estimating the difference of the right-hand sides by means of $(A_2)$ and the bound $\|Z\| \le \beta$, we arrive at the differential inequalities

$D^+|X_P(t)-X_P^*(t)|^\Delta \le -(\underline{a}_P - RM)\,|X_P(t)-X_P^*(t)| + (K+\beta R)W\big(|X_P(t-\tau_0)-X_P^*(t-\tau_0)| + |X_N(t-\tau_0)-X_N^*(t-\tau_0)|\big)$,

$D^+|X_N(t)-X_N^*(t)|^\Delta \le -(\underline{a}_N - RM)\,|X_N(t)-X_N^*(t)| + (K+\beta R)W\big(|X_P(t-\tau_0)-X_P^*(t-\tau_0)| + |X_N(t-\tau_0)-X_N^*(t-\tau_0)|\big)$. (31)

For any $\alpha \in [-\tau_0, 0]_{\mathbb{T}}$, construct the Lyapunov functional $V(t) = V_1(t) + V_2(t) + V_3(t) + V_4(t)$, where

$V_1(t) = \xi\, e_\epsilon(t,\alpha)\,|X_P(t)-X_P^*(t)|$, $\qquad V_3(t) = \xi'\, e_\epsilon(t,\alpha)\,|X_N(t)-X_N^*(t)|$,

$V_2(t) = \xi \displaystyle\int_{t-\tau_0}^{t} (1+\epsilon\mu(s+\tau_0))\,e_\epsilon(s+\tau_0,\alpha)(K+\beta R)W\,\big(|X_P(s)-X_P^*(s)| + |X_N(s)-X_N^*(s)|\big)\,\Delta s$,

$V_4(t) = \xi' \displaystyle\int_{t-\tau_0}^{t} (1+\epsilon\mu(s+\tau_0))\,e_\epsilon(s+\tau_0,\alpha)(K+\beta R)W\,\big(|X_P(s)-X_P^*(s)| + |X_N(s)-X_N^*(s)|\big)\,\Delta s$. (32)

Calculating $D^+V^\Delta(t)$ along (1), using (31), Lemma 9(ii), and collecting the delayed terms against $V_2$ and $V_4$, we obtain

$D^+V^\Delta(t)\big|_{(1)} \le \Big\{\xi\big[\epsilon - (\underline{a}_P - RM)(1+\epsilon\mu(t))\big] + (\xi+\xi')(1+\epsilon\mu(t+\tau_0))\,e_\epsilon(t+\tau_0,t)(K+\beta R)W\Big\}\, e_\epsilon(t,\alpha)\,|X_P(t)-X_P^*(t)|$
$\qquad + \Big\{\xi'\big[\epsilon - (\underline{a}_N - RM)(1+\epsilon\mu(t))\big] + (\xi+\xi')(1+\epsilon\mu(t+\tau_0))\,e_\epsilon(t+\tau_0,t)(K+\beta R)W\Big\}\, e_\epsilon(t,\alpha)\,|X_N(t)-X_N^*(t)|$. (37)

By assumption $(A_3)$, both braces are negative, so $D^+V^\Delta(t) \le 0$ and hence $V(t) \le V(0)$ for $t \in \mathbb{T}^+$. On the other hand, we have

$V(0) \le \Big[\xi\, e_\epsilon(0,\alpha) + (\xi+\xi')\displaystyle\int_{-\tau_0}^{0}(1+\epsilon\mu(s+\tau_0))\,e_\epsilon(s+\tau_0,\alpha)(K+\beta R)W\,\Delta s\Big] \sup_{s\in[-\tau_0,0]_{\mathbb{T}}}|X_P(s)-X_P^*(s)|$
$\qquad + \Big[\xi'\, e_\epsilon(0,\alpha) + (\xi+\xi')\displaystyle\int_{-\tau_0}^{0}(1+\epsilon\mu(s+\tau_0))\,e_\epsilon(s+\tau_0,\alpha)(K+\beta R)W\,\Delta s\Big] \sup_{s\in[-\tau_0,0]_{\mathbb{T}}}|X_N(s)-X_N^*(s)|$
$= \Gamma(\epsilon)\Big(\sup_{s\in[-\tau_0,0]_{\mathbb{T}}}|X_P(s)-X_P^*(s)| + \sup_{s\in[-\tau_0,0]_{\mathbb{T}}}|X_N(s)-X_N^*(s)|\Big)$, (38)
where $\Gamma(\epsilon) = \max\{\Delta_1, \Delta_2\}$,

$\Delta_1 = \xi\, e_\epsilon(0,\alpha) + (\xi+\xi')\displaystyle\int_{-\tau_0}^{0}(1+\epsilon\mu(s+\tau_0))\,e_\epsilon(s+\tau_0,\alpha)(K+\beta R)W\,\Delta s$,

$\Delta_2 = \xi'\, e_\epsilon(0,\alpha) + (\xi+\xi')\displaystyle\int_{-\tau_0}^{0}(1+\epsilon\mu(s+\tau_0))\,e_\epsilon(s+\tau_0,\alpha)(K+\beta R)W\,\Delta s$. (39)

It is obvious that

$\xi\, e_\epsilon(t,\alpha)\,|X_P(t)-X_P^*(t)| + \xi'\, e_\epsilon(t,\alpha)\,|X_N(t)-X_N^*(t)| \le V(t) \le V(0)$, (40)

which means that

$\min\{\xi,\xi'\}\, e_\epsilon(t,\alpha)\big(|X_P(t)-X_P^*(t)| + |X_N(t)-X_N^*(t)|\big) \le V(0)$. (41)

Thus, we finally get

$|X_P(t)-X_P^*(t)| + |X_N(t)-X_N^*(t)| \le \dfrac{\Gamma(\epsilon)}{\min\{\xi,\xi'\}}\, e_{\ominus\epsilon}(t,\alpha)\Big(\sup_{s\in[-\tau_0,0]_{\mathbb{T}}}|X_P(s)-X_P^*(s)| + \sup_{s\in[-\tau_0,0]_{\mathbb{T}}}|X_N(s)-X_N^*(s)|\Big)$. (42)

Therefore, the unique periodic solution of (1) is globally exponentially stable. The proof is completed.

4. Examples

In this section, two numerical examples are shown to verify the effectiveness of the results obtained in the previous section. Consider the following Wilson-Cowan neural network with delays on a time scale $\mathbb{T}$:

$X_P^\Delta(t) = -a_P(t)X_P(t) + [k_P(t)-r_P(t)X_P(t)]\,G\big[w_P^1(t)X_P(t-2) - w_N^1(t)X_N(t-1) + I_P(t)\big]$,

$X_N^\Delta(t) = -a_N(t)X_N(t) + [k_N(t)-r_N(t)X_N(t)]\,G\big[w_P^2(t)X_P(t-2) - w_N^2(t)X_N(t-1) + I_N(t)\big]$. (43)

Case 1. Consider $\mathbb{T} = \mathbb{R}$. Take $(a_P(t), a_N(t))^\top = (2+\sin t,\ 2+\cos t)^\top$. Obviously, $\underline{a}_P = \underline{a}_N = 1$ and

$\dfrac{\exp(\int_0^{2\pi} a_P(s)\,ds)}{\exp(\int_0^{2\pi} a_P(s)\,ds) - 1} = \dfrac{e^{4\pi}}{e^{4\pi}-1} = \dfrac{\exp(\int_0^{2\pi} a_N(s)\,ds)}{\exp(\int_0^{2\pi} a_N(s)\,ds) - 1}$. (44)

Take $(I_P(t), I_N(t))^\top = (-1+\sin t,\ \cos t)^\top$, $k_P(t) = k_N(t) = r_P(t) = r_N(t) = 0.01$, $w_P^1(t) = w_N^1(t) = w_P^2(t) = w_N^2(t) = 0.1$, and $G(x) = \frac{1}{2}(|x+1|-|x-1|)$. We have $L = 1$. Let $\xi = 1$, $\xi' = 2$. One can easily verify that

$\alpha_1 = \omega\Big(K+\beta R+\dfrac{RM}{W}\Big)\dfrac{\exp(\int_0^{2\pi} a_P(s)\,ds)}{\exp(\int_0^{2\pi} a_P(s)\,ds)-1} \approx 0.831 < 1$,

$\alpha_2 = \omega\Big(K+\beta R+\dfrac{RM}{W}\Big)\dfrac{\exp(\int_0^{2\pi} a_N(s)\,ds)}{\exp(\int_0^{2\pi} a_N(s)\,ds)-1} \approx 0.831 < 1$, (45)

$-\xi(\underline{a}_P - RM) + (\xi+\xi')(K+\beta R)W \approx -0.980 < 0$,

$-\xi'(\underline{a}_N - RM) + (\xi+\xi')(K+\beta R)W \approx -1.970 < 0$.

It follows from Theorems 12 and 13 that (43) has a unique $2\pi$-periodic solution which is globally exponentially stable (see Figure 1).

Figure 1: Globally exponentially stable periodic solution of (43); $X_P(t)$ and $X_N(t)$ plotted against time $t$. [Plot data not recoverable from the source.]

Case 2. Consider $\mathbb{T} = \mathbb{Z}$. Equation (43) reduces to the following difference equation:

$X_P(n+1) - X_P(n) = -a_P(n)X_P(n) + [k_P(n)-r_P(n)X_P(n)]\,G\big[w_P^1(n)X_P(n-2) - w_N^1(n)X_N(n-1) + I_P(n)\big]$,

$X_N(n+1) - X_N(n) = -a_N(n)X_N(n) + [k_N(n)-r_N(n)X_N(n)]\,G\big[w_P^2(n)X_P(n-2) - w_N^2(n)X_N(n-1) + I_N(n)\big]$, (46)

for $n \in \mathbb{Z}_0^+$. Take $(a_P(n), a_N(n))^\top = (1/2,\ 1/2)^\top$. Obviously, $\underline{a}_P = \underline{a}_N = 1/2$ and $\bar{a}_P = \bar{a}_N = 1/2$. Take $(I_P(n), I_N(n))^\top = (1+\sin(n\pi/3),\ \cos(n\pi/3))^\top$, $k_P = k_N = r_P = r_N = 0.01$, $w_P^1 = w_N^1 = w_P^2 = w_N^2 = 0.1$, and $G(x) = \frac{1}{2}(|x+1|-|x-1|)$. We have $L = 1$. Let $\xi = 1$, $\xi' = 2$. If $\mathbb{T} = \mathbb{Z}$ ($\mu(t) \equiv 1$), then $e_{\ominus(-a_P)}(\omega,0) = \prod_{k=0}^{\omega-1}(1-a_P(k))^{-1}$; choosing $\omega = 6$, by simple calculation, we have

$\alpha_1 = \alpha_2 = \Big(K+\beta R+\dfrac{RM}{W}\Big)\, \dfrac{\omega\exp\big(\sum_{k=0}^{\omega-1}|\mathrm{Log}(1-a_P(k))|\big)}{\big|1-\prod_{k=0}^{\omega-1}(1-a_P(k))\big|\prod_{k=0}^{\omega-1}(1-a_P(k))^{-1}\,(1-\bar{a}_P)} \approx 0.015 < 1$,

$-\xi(\underline{a}_P - RM) + (\xi+\xi')(K+\beta R)W \approx -0.480 < 0$,

$-\xi'(\underline{a}_N - RM) + (\xi+\xi')(K+\beta R)W \approx -0.970 < 0$. (47)

It follows from Theorems 12 and 13 that (46) has a unique 6-periodic solution which is globally exponentially stable (see Figure 2).

Figure 2: Globally exponentially stable periodic solution of (46); $X_P(n)$ and $X_N(n)$ plotted against $n$. [Plot data not recoverable from the source.]

5. Conclusion Remarks

In this paper, we studied the stability of delayed Wilson-Cowan networks on periodic time scales and obtained generalized results ensuring the existence, uniqueness, and global exponential stability of the periodic solution. These results give significant insight into the complex dynamical structure of Wilson-Cowan type models. The conditions are easily checked in practice by simple algebraic methods.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Acknowledgments

This research was supported by the National Natural Science Foundation of China (11101187 and 11361010), the Foundation for Young Professors of Jimei University, the Excellent Youth Foundation of Fujian Province (2012J06001 and NCETFJ JA11144), and the Foundation of Fujian Higher Education (JA10184 and JA11154).

References

[1] H. R. Wilson and J. D. Cowan, "Excitatory and inhibitory interactions in localized populations of model neurons," Biophysical Journal, vol. 12, no. 1, pp. 1–24, 1972.

[2] H. R. Wilson and J. D. Cowan, "A mathematical theory of the functional dynamics of cortical and thalamic nervous tissue," Kybernetik, vol. 13, no. 2, pp. 55–80, 1973.
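The qualitative claims of Case 2 can be checked by direct iteration of (46). The script below (ours, with arbitrary initial histories) starts two trajectories from different constant histories and observes both the merging of trajectories (global exponential stability) and the approach to a 6-periodic orbit.

```python
import math

# Numerical check (ours) of Case 2: iterate the difference system (46) on
# T = Z with the stated parameters a = 1/2, k = r = 0.01, w = 0.1,
# delays tau_P = 2 and tau_N = 1, and 6-periodic inputs I_P, I_N.
G = lambda x: 0.5 * (abs(x + 1.0) - abs(x - 1.0))
a, k, r, w = 0.5, 0.01, 0.01, 0.1
I_P = lambda n: 1.0 + math.sin(n * math.pi / 3.0)
I_N = lambda n: math.cos(n * math.pi / 3.0)

def step(hist, n):
    # hist[m] = (X_P(m), X_N(m))
    xP, xN = hist[n]
    uP = w * hist[n - 2][0] - w * hist[n - 1][1] + I_P(n)
    uN = w * hist[n - 2][0] - w * hist[n - 1][1] + I_N(n)
    return (xP - a * xP + (k - r * xP) * G(uP),
            xN - a * xN + (k - r * xN) * G(uN))

def run(x0, steps=300):
    hist = {-2: x0, -1: x0, 0: x0}   # constant initial history on [-2, 0]
    for n in range(steps):
        hist[n + 1] = step(hist, n)
    return hist

h1, h2 = run((0.5, -0.5)), run((-0.2, 0.3))
gap = lambda h, g, n: abs(h[n][0] - g[n][0]) + abs(h[n][1] - g[n][1])

assert gap(h1, h2, 300) < 1e-9 * gap(h1, h2, 0)   # trajectories merge
assert abs(h1[300][0] - h1[294][0]) < 1e-9        # attractor is 6-periodic
assert abs(h1[300][1] - h1[294][1]) < 1e-9
```

The fast merging is expected: the linear part contracts by a factor $1 - a = 1/2$ per step, and the delayed coupling through $G$ enters with the small weight $(k - r x) w \approx 10^{-3}$, so the effective contraction rate per step stays well below one.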
[3] A. Destexhe and T. J. Sejnowski, "The Wilson-Cowan model, 36 years later," Biological Cybernetics, vol. 101, no. 1, pp. 1–2, 2009.

[4] K. Mantere, J. Parkkinen, T. Jaaskelainen, and M. M. Gupta, "Wilson-Cowan neural-network model in image processing," Journal of Mathematical Imaging and Vision, vol. 2, no. 2-3, pp. 251–259, 1992.

[5] C. van Vreeswijk and H. Sompolinsky, "Chaos in neuronal networks with balanced excitatory and inhibitory activity," Science, vol. 274, no. 5293, pp. 1724–1726, 1996.

[6] L. H. A. Monteiro, M. A. Bussab, and J. G. Berlinck, "Analytical results on a Wilson-Cowan neuronal network modified model," Journal of Theoretical Biology, vol. 219, no. 1, pp. 83–91, 2002.

[7] S. Xie and Z. Huang, "Almost periodic solution for Wilson-Cowan type model with time-varying delays," Discrete Dynamics in Nature and Society, vol. 2013, Article ID 683091, 7 pages, 2013.

[8] V. W. Noonburg, D. Benardete, and B. Pollina, "A periodically forced Wilson-Cowan system," SIAM Journal on Applied Mathematics, vol. 63, no. 5, pp. 1585–1603, 2003.

[9] S. Hilger, "Analysis on measure chains—a unified approach to continuous and discrete calculus," Results in Mathematics, vol. 18, pp. 18–56, 1990.

[10] S. Hilger, "Differential and difference calculus—unified!," Nonlinear Analysis: Theory, Methods & Applications, vol. 30, no. 5, pp. 2683–2694, 1997.

[11] A. Chen and F. Chen, "Periodic solution to BAM neural network with delays on time scales," Neurocomputing, vol. 73, no. 1–3, pp. 274–282, 2009.

[12] Y. Li, X. Chen, and L. Zhao, "Stability and existence of periodic solutions to delayed Cohen-Grossberg BAM neural networks with impulses on time scales," Neurocomputing, vol. 72, no. 7–9, pp. 1621–1630, 2009.

[13] Z. Huang, Y. N. Raffoul, and C. Cheng, "Scale-limited activating sets and multiperiodicity for threshold-linear networks on time scales," IEEE Transactions on Cybernetics, vol. 44, no. 4, pp. 488–499, 2014.

[14] M. Bohner and A. Peterson, Dynamic Equations on Time Scales: An Introduction with Applications, Birkhäuser, Boston, Mass, USA, 2001.

[15] M. Bohner and A. Peterson, Advances in Dynamic Equations on Time Scales, Birkhäuser, Boston, Mass, USA, 2003.

[16] V. Lakshmikantham and A. S. Vatsala, "Hybrid systems on time scales," Journal of Computational and Applied Mathematics, vol. 141, no. 1-2, pp. 227–235, 2002.

[17] A. Ruffing and M. Simon, "Corresponding Banach spaces on time scales," Journal of Computational and Applied Mathematics, vol. 179, no. 1-2, pp. 313–326, 2005.
