
Dynamical properties of background neural networks with uniform firing rate and background input. (English) Zbl 1127.92012

Summary: The dynamical properties of background neural networks with uniform firing rate and background input are investigated through a series of mathematical arguments, including nondivergence, global attractivity, and complete stability analyses. Moreover, it is shown that shifting the background level affects the existence and stability of the equilibrium point. As the background input increases or decreases, the network can undergo bifurcations and chaos. It may have one or two distinct stable firing levels; that is, the background neural network can exhibit not only monostability but also multistability.
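The dependence of the number of stable firing levels on the background input can be illustrated with a minimal one-dimensional firing-rate sketch. This is an assumption for illustration only, not the model analyzed in the paper: a single rate equation dr/dt = -r + f(w·r + b) with a sigmoidal gain f, recurrent weight w, and background input b. Scanning a grid for stable fixed points (sign changes of the right-hand side with negative local slope) shows one stable level for low or high b and two for intermediate b:

```python
import numpy as np

def f(x):
    # sigmoidal firing-rate function (illustrative choice, not the paper's exact model)
    return 1.0 / (1.0 + np.exp(-x))

def stable_fixed_points(w, b, n=20001):
    # fixed points of dr/dt = -r + f(w*r + b) located on a grid;
    # a fixed point is stable where the right-hand side crosses zero with negative slope
    r = np.linspace(-0.5, 1.5, n)
    g = -r + f(w * r + b)
    roots = []
    for i in range(n - 1):
        if g[i] == 0.0 or g[i] * g[i + 1] < 0.0:
            if g[i + 1] - g[i] < 0.0:  # downward crossing -> stable equilibrium
                roots.append(0.5 * (r[i] + r[i + 1]))
    return roots

# shifting the background input b changes how many stable firing levels coexist
for b in (-10.0, -6.0, -2.0):
    print(b, len(stable_fixed_points(w=12.0, b=b)))
```

With these (hypothetical) parameter values the network is monostable at b = -10 and b = -2 but bistable at b = -6, mirroring the summary's claim that raising or lowering the background level switches the network between one and two stable firing levels.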

MSC:

92C20 Neural biology
37N25 Dynamical systems in biology
92B20 Neural networks for/in biological studies, artificial life and related topics
34D20 Stability of solutions to ordinary differential equations
Full Text: DOI
