The reason is as follows. The authors failed to show that the paper is unique and original. The authors wrote "... in recent years, the study of stability analysis of neural networks has been attracting the interest of a great number of researchers ...," but the only three papers in the reference list, from 1995, 2002, and 2007, cannot support that claim. The same goes for the phrase "... a great number of researchers [4-7]." On the other hand, the authors claim "So far, there are only a few papers that have taken stochastic phenomenon into account in neural networks [8-11]." However, the reviewer found TENS OF (not ONLY A FEW) such papers that are not in the authors' reference list. The reviewer read a number of those papers and found many similarities with the paper submitted to this journal. Hence we have to conclude that, at best, this paper claims nothing new. Although the reviewer believes this is simply due to a lack of survey, at worst it might be taken as plagiarism.

Minor editorial issues:
- Punctuation and spacing errors, e.g. "... neural networks .A new criteria ..." and "Therefore , in recent years ..."; also "just name it a few" should read "just to name a few".
- Inconsistent paragraph indentation.

Specific points on the survey:
- Three papers from 1995, 2002, and 2007 cannot justify "in recent years, ... great number of researchers [1-3]".
- The same applies to "extensively investigated [4-7]".
- "there are only a few papers ... [8-11]" is not true.
- At best, this reflects a lack of survey.

The authors' abstract reads: "We consider a class of uncertain stochastic neural networks .The main purpose of this paper is to study the robust exponential stability in mean square. By using Lyapunov.Krasovskii functional we obtain the sufficient conditions for robust exponential stability in mean square of stochastic neural networks, in terms of linear matrix inequality (LMI)." The example section begins: "In this section, one example is given to show the effectiveness of our theoretical results. Consider the uncertain stochastic with the following parameters". A Google search on terms such as "example", "effectiveness", "uncertain stochastic neural networks", and "LMIs" leads, for example, to Xia's paper below:

==================================================================================================
Jianwei Xia, Guangwu Meng, Xinhua Wang (Aug. 2009), "Delay-dependent exponential stability for a class of stochastic neural networks with distributed delays and polytopic uncertainties," International Conference on Mechatronics and Automation (ICMA 2009), pp. 3118-3123.

Abstract: The global robust exponential stability in mean square for a class of stochastic neural networks with distributed delays and polytopic uncertainties is investigated in this paper. Parameter-dependent Lyapunov-Krasovskii functionals and free-weighting matrices are employed to obtain sufficient conditions that guarantee the robust global exponential stability of the considered stochastic neural networks. The derived sufficient conditions are proposed in terms of a set of relaxed linear matrix inequalities (LMIs), which can be checked easily by recently developed algorithms solving LMIs. A numerical example is given to demonstrate the effectiveness of the proposed criteria.
==================================================================================================

Also, the authors wrote in the abstract: "The criteria can be checked easily by the LMI control toolbox in Matlab.A numerical example is given by LMI control toolbox in Matlab to demonstrate the effectiveness of our results." Yet no such Matlab check appears anywhere in the text afterwards.
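For reference, a feasibility check of an LMI criterion of this kind takes only a few lines in any SDP front end, so its absence is conspicuous. The sketch below is NOT the authors' criterion: it is a minimal illustration of what such a check looks like, assuming Python with cvxpy (and its bundled SCS solver) as an open-source stand-in for the Matlab LMI control toolbox, and using the classical Lyapunov inequality A'P + PA < 0 with a made-up 2x2 system matrix as a placeholder, since the submitted paper's actual LMI is not reproduced in this report.

# Minimal sketch, not the LMI from the submitted paper: checks feasibility of
# the classical Lyapunov inequality  A'P + PA < 0,  P > 0  for a hypothetical
# system matrix, using cvxpy in place of the Matlab LMI control toolbox.
import numpy as np
import cvxpy as cp

A = np.array([[-2.0,  0.5],
              [ 0.3, -1.5]])   # hypothetical system matrix, not taken from the paper
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [
    P >> eps * np.eye(n),                  # P positive definite
    A.T @ P + P @ A << -eps * np.eye(n),   # Lyapunov LMI
]
problem = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility problem
problem.solve()

print("LMI feasible:", problem.status == cp.OPTIMAL)
if problem.status == cp.OPTIMAL:
    print("P =\n", P.value)

If the abstract claims such a check, something of this kind (in Matlab or otherwise) should appear in the paper.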
This is quite fishy; it reads as if it was picked up from the following paper:

==================================================================================================
Yurong Liu, Zidong Wang, and Xiaohui Liu, "On delay-dependent robust exponential stability of stochastic neural networks with mixed time delays and Markovian switching," Nonlinear Dynamics (Springer Netherlands), Volume 54, Number 3 (November 2008), pp. 199-212.

Abstract: This paper deals with the global exponential stability analysis problem for a general class of uncertain stochastic neural networks with mixed time delays and Markovian switching. The mixed time delays under consideration comprise both the discrete time-varying delays and the distributed time-delays. The main purpose of this paper is to establish easily verifiable conditions under which the delayed stochastic neural network is robustly exponentially stable in the mean square in the presence of parameter uncertainties, mixed time delays, and Markovian switching. By employing new Lyapunov-Krasovskii functionals and conducting stochastic analysis, a linear matrix inequality (LMI) approach is developed to derive the criteria for the robust exponential stability,
--------------------------------------------------------------------------------------------------
which can be readily checked by using some standard numerical packages such as the Matlab LMI Toolbox.
--------------------------------------------------------------------------------------------------
The criteria derived are dependent on both the discrete time delay and the distributed time delay, and are therefore less conservative. A simple example is provided to demonstrate the effectiveness and applicability of the proposed testing criteria.

Keywords: Stochastic neural networks - Uncertain neural networks - Mixed time delays - Markovian switching - Global exponential stability - Linear matrix inequality
==================================================================================================
P. Balasubramaniam, R. Rakkiyappan, "Delay-dependent robust stability analysis of uncertain stochastic neural networks with discrete interval and distributed time-varying delays," Neurocomputing, Volume 72, Issues 13-15 (August 2009), pp. 3231-3237.

Abstract: This paper is concerned with the stability analysis problem for uncertain stochastic neural networks with discrete interval and distributed time-varying delays. The parameter uncertainties are assumed to be norm bounded, and the delay is assumed to be time-varying and to belong to a given interval, which means that the lower and upper bounds of the interval time-varying delays are available. Based on the new Lyapunov-Krasovskii functional and stochastic stability theory, delay-interval-dependent stability criteria are obtained in terms of linear matrix inequalities.
Some numerical examples and comparisons are provided to show that the proposed results significantly improve the allowable upper and lower bounds of delays over some existing results in the literature. Furthermore, the supplementary requirement that the time derivative of discrete time-varying delays must be smaller than one is not necessary to derive the results in this paper.
==================================================================================================
P. Balasubramaniam, S. Lakshmanan, R. Rakkiyappan, "Delay-interval dependent robust stability criteria for stochastic neural networks with linear fractional uncertainties," Neurocomputing, Volume 72, Issues 16-18 (October 2009), pp. 3675-3682.

Abstract: In this paper, we study the delay-interval dependent robust stability criteria for stochastic neural networks with linear fractional uncertainties. The time-varying delay is assumed to belong to an interval and is a fast time-varying function. The uncertainty under consideration includes linear fractional norm-bounded uncertainty. Based on the new Lyapunov-Krasovskii functional, some inequality techniques, and stochastic stability theory, delay-interval dependent stability criteria are obtained in terms of linear matrix inequalities. Finally, some numerical examples are provided to demonstrate the reduced conservatism and effectiveness of the proposed LMI conditions.
==================================================================================================
"Mean Square Exponential Stability for Uncertain Delayed Stochastic Neural Networks with Markovian Jump Parameters," Circuits, Systems, and Signal Processing (Birkhauser Boston), Volume 29, Number 2 (April 2010), pp. 331-348.

Abstract: This paper is concerned with the problem of delay-dependent mean square exponential stability for a class of delayed stochastic Hopfield neural networks with Markovian jump parameters. The delays here are time-varying delays. Based on a new Lyapunov-Krasovskii functional, delay-dependent stability conditions are derived by means of linear matrix inequalities (LMIs). It is shown that the proposed results can contain some existing stability conditions as a special case. Finally, three numerical examples are given to illustrate the effectiveness of the proposed method, and the simulations show that our results are less conservative than the existing ones.

Keywords: Neural networks - Stochastic systems - Markovian jump parameters - Time-varying delays - LMIs
==================================================================================================
Yonggang Chen, Weiping Bi, and Yuanyuan Wu, "Delay-Dependent Exponential Stability for Discrete-Time BAM Neural Networks with Time-Varying Delays," Discrete Dynamics in Nature and Society, Hindawi Publishing Corporation, Volume 2008 (2008).

Abstract: This paper considers the delay-dependent exponential stability for discrete-time BAM neural networks with time-varying delays. By constructing the new Lyapunov functional, the improved delay-dependent exponential stability criterion is derived in terms of a linear matrix inequality (LMI). Moreover, in order to reduce the conservativeness, some slack matrices are introduced in this paper. Two numerical examples are presented to show the effectiveness and less conservativeness of the proposed method.
==================================================================================================
H.Y. Li, B. Chen, Q. Zhou, S. Fang, "Robust exponential stability for uncertain stochastic neural networks with discrete and distributed time-varying delays," Physics Letters A, 2008, vol. 372, no. 19.

Abstract: This Letter deals with the problem of delay-dependent robust exponential stability in mean square for a class of uncertain stochastic Hopfield neural networks with discrete and distributed time-varying delays. Based on a Lyapunov-Krasovskii functional and the stochastic stability theory, delay-dependent stability criteria are obtained in terms of linear matrix inequalities (LMIs). Because some free-weighting matrices are introduced to develop the stability criteria, the proposed stability conditions have less conservatism. Numerical examples are given to illustrate the effectiveness of our results.

Keywords: neural networks - exponential stability - stochastic systems - uncertain systems - LMIs
==================================================================================================
Jie Fu, Huaguang Zhang, Tiedong Ma, "Delay-probability-distribution-dependent robust stability analysis for stochastic neural networks with time-varying delay," Progress in Natural Science, 2009, 19(10), p. 31.

Abstract: The delay-probability-distribution-dependent robust stability problem for a class of uncertain stochastic neural networks (SNNs) with time-varying delay is investigated. The information of the probability distribution of the time delay is considered and transformed into parameter matrices of the transferred SNNs model. Based on the Lyapunov-Krasovskii functional and a stochastic analysis approach, a delay-probability-distribution-dependent sufficient condition is obtained in the linear matrix inequality (LMI) format such that delayed SNNs are robustly globally asymptotically stable in the mean-square sense for all admissible uncertainties. An important feature of the results is that the stability conditions are dependent on the probability distribution of the delay and the upper bound of the delay derivative, and the upper bound is allowed to be greater than or equal to 1. Finally, numerical examples are given to illustrate the effectiveness and less conservativeness of the proposed method.

Keywords: time delay - neural networks - stability analysis - probability distribution - globally asymptotically stable - linear matrix inequality