Saturated outputs for high-gain, self-exciting networks. (English) Zbl 0764.92005
Summary: We consider a broad class of continuous-time dynamical systems modeling a collection of processing units sending signals to each other. Each unit has an internal state variable \(x_ i\) and an output variable \(y_ i\), which is a nondecreasing function \(g_ i(x_ i)\). Certain outputs, called “forced”, are of the form \(\sigma_ j(Kx_ j)\), where \(\sigma_ j\) is a sigmoid and \(K>0\) is a parameter called the “gain”. The dynamics are given by a system of differential equations of the form \(dx/dt=H(x,y,t)\). The system is self-exciting: \(\partial H_ i/\partial y_ i\geq 0\), with strict inequality for the forced outputs.
We show that for sufficiently high gain, the forced outputs are close to the asymptotic limiting values of the sigmoids along any stable solution \(x(t)\) defined on a finite interval \(J\), for a proportion of \(t\in J\) that approaches 1 as \(K\to\infty\). This generalizes Hopfield’s Saturation Theorem about additive neural networks with symmetric weight matrices.
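The saturation result can be illustrated numerically. The sketch below, a hypothetical minimal instance rather than the paper's general setting, takes a single self-exciting unit \(dx/dt=-x+w\,\sigma(Kx)\) with \(w>0\) (so \(\partial H/\partial y=w>0\)) and measures the proportion of time at which the forced output \(\sigma(Kx)\) lies within \(\varepsilon\) of a limiting value of the sigmoid; the parameters \(w\), \(T\), and \(\varepsilon\) are illustrative choices:

```python
import math

def sigma(u):
    # Standard logistic sigmoid with limiting values 0 and 1.
    return 1.0 / (1.0 + math.exp(-u))

def saturated_fraction(K, w=2.0, x0=0.05, eps=0.05, dt=0.01, T=20.0):
    """Euler-integrate dx/dt = -x + w*sigma(K*x) for one self-exciting
    unit (dH/dy = w > 0) and return the proportion of time steps at
    which the forced output sigma(K*x) is within eps of a limiting
    value (0 or 1) of the sigmoid."""
    x = x0
    hits = 0
    steps = int(T / dt)
    for _ in range(steps):
        y = sigma(K * x)
        if y > 1.0 - eps or y < eps:
            hits += 1
        x += dt * (-x + w * y)  # self-excitation drives x upward
    return hits / steps
```

For this toy unit the fraction of saturated time grows toward 1 as the gain \(K\) increases, matching the statement of the theorem: at low gain the output hovers at intermediate values, while at high gain it is pinned near a limiting value for essentially the whole interval.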
MSC:
92B20  Neural networks for/in biological studies, artificial life and related topics
68T05  Learning and adaptive systems in artificial intelligence
37-XX  Dynamical systems and ergodic theory