
Stability and memory-loss go hand-in-hand: three results in dynamics and computation. (English) Zbl 1472.37088

Summary: The search for universal laws relating dynamics to computation is driven by recent expansionist initiatives in biologically inspired computing. A general setting in which to understand both such dynamics and computation is a driven dynamical system that responds to a temporal input. Surprisingly, we find that memory-loss — the tendency of driven systems to forget their internal states — provides unambiguous answers to the following fundamental stability questions, which have remained open for decades: what is necessary and sufficient for slightly different inputs to still lead to mostly similar responses? How does changing the driven system's parameters affect stability? What is the mathematical definition of the edge-of-criticality? We anticipate our results to be timely for understanding and designing biologically inspired computers, which are entering an era of dedicated hardware implementations for neuromorphic computing and state-of-the-art reservoir computing applications.
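The memory-loss property described above can be illustrated with a toy driven system (this is our own minimal sketch, not the paper's construction): a contracting update x_{t+1} = tanh(W x_t + w u_t), where the weight matrix W is scaled so the map is a contraction. Two different initial states driven by the same input sequence then converge to the same response, i.e. the system forgets its initial condition.

```python
import numpy as np

# Toy illustration of memory-loss in a driven system (our own example,
# not taken from the paper). The state update is
#     x_{t+1} = tanh(W x_t + w u_t),
# with W scaled so that its operator 2-norm is 0.5, making the update a
# contraction. Two trajectories started from different initial states but
# driven by the same input sequence then converge to each other.

rng = np.random.default_rng(0)
n = 20                                   # state dimension
W = rng.normal(size=(n, n))
W *= 0.5 / np.linalg.norm(W, 2)          # scale operator norm to 0.5
w = rng.normal(size=n)                   # input weights

u = rng.normal(size=200)                 # one shared input sequence

def drive(x, inputs):
    """Run the driven system forward from state x under the given inputs."""
    for u_t in inputs:
        x = np.tanh(W @ x + w * u_t)
    return x

x_a = drive(rng.normal(size=n), u)       # two different initial states,
x_b = drive(rng.normal(size=n), u)       # same input sequence
print(np.linalg.norm(x_a - x_b))         # near zero: the initial state is forgotten
```

Because the update is a 0.5-contraction, the distance between the two trajectories shrinks by at least half at every step, so after 200 steps the memory of the initial condition is numerically gone — the toy analogue of the echo state property discussed in the reservoir computing literature.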

MSC:

37N25 Dynamical systems in biology
37B55 Topological dynamics of nonautonomous systems
68T01 General topics in artificial intelligence
92B99 Mathematical biology in general
