
On global asymptotic stability of a class of nonlinear systems arising in neural network theory. (English) Zbl 0828.34039

The author considers nonlinear dynamical systems of the form (1) \(\dot x= -{\mathcal D}(x)+ {\mathcal T}(x)h(x)\), where \(x= (x_1,\dots, x_n)\in \mathbb{R}^n\), \({\mathcal D}(x)= ({\mathcal D}_1(x_1),\dots, {\mathcal D}_n(x_n))\) and \(h(x)= (h_1(x_1),\dots, h_n(x_n))\) are vector fields on \(\mathbb{R}^n\), and \({\mathcal T}(x)\) is an \(n\times n\) matrix. \(\mathcal D\), \(h\) and \(\mathcal T\) are assumed to be as regular as required to guarantee existence, uniqueness and continuous dependence of solutions. Moreover, \({\mathcal D}(0)= h(0)= 0\). This class of systems includes, for instance, the so-called additive neural network model, which is widely employed in real-time signal processing. Using a Lyapunov approach, and under appropriate assumptions, the author proves a number of theorems on global asymptotic stability of (1), which turn out to be more general than previous results on the same subject. Finally, the author discusses applications to neural network models.
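To make the additive special case of (1) concrete, the following is a minimal numerical sketch in which \({\mathcal D}(x)\) is taken as a linear decay \(\mathrm{diag}(d)\,x\), \({\mathcal T}\) as a constant matrix, and \(h_j=\tanh\); these particular choices, the forward-Euler integration, and the parameter values are illustrative assumptions and are not taken from the paper under review.

```python
import numpy as np

# Sketch of the additive neural network model, a special case of (1):
#   dx_i/dt = -d_i * x_i + sum_j T_ij * h_j(x_j)
# D(x) = diag(d) x and h_j = tanh are illustrative assumptions.

def simulate(T, d, x0, dt=1e-3, steps=20000):
    """Forward-Euler integration of the additive model."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x + dt * (-d * x + T @ np.tanh(x))
    return x

# Example: a 3-neuron network whose interconnection matrix is small
# relative to the decay rates, so that trajectories started from random
# initial states are attracted to the equilibrium x = 0.
rng = np.random.default_rng(0)
T = 0.2 * rng.standard_normal((3, 3))
d = np.ones(3)
print(simulate(T, d, rng.standard_normal(3)))
```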

MSC:

34D20 Stability of solutions to ordinary differential equations
92B20 Neural networks for/in biological studies, artificial life and related topics
Full Text: DOI