Evolving carbon nanotube reservoir computers. (English) Zbl 1476.68095

Amos, Martyn (ed.) et al., Unconventional computation and natural computation. 15th international conference, UCNC 2016, Manchester, UK, July 11–15, 2016. Proceedings. Cham: Springer. Lect. Notes Comput. Sci. 9726, 49-61 (2016).
Summary: Reservoir computing is a useful general theoretical model for many dynamical systems. Here we show the first steps towards applying the reservoir model as a simple computational layer to extract exploitable information from physical substrates consisting of single-walled carbon nanotubes and polymer mixtures. We argue that many physical substrates can be represented and configured into working reservoirs given some pre-training through evolutionarily selected input-output mappings and targeted input stimuli.
For the entire collection see [Zbl 1339.68005].
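The summary describes the standard reservoir-computing recipe: a fixed, untrained dynamical system (here, a physical nanotube/polymer substrate) is driven by input signals, and only a simple readout layer is trained on the resulting states. A minimal sketch of that recipe, using a simulated echo state network in place of the physical substrate, is given below; all names and parameters (n_reservoir, spectral_radius, ridge, the delay task) are illustrative assumptions and are not taken from the paper.

# Minimal reservoir-computing sketch (not the authors' setup): a generic echo
# state network whose fixed random reservoir stands in for the physical
# substrate. Only the linear readout is trained, here by ridge regression.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_reservoir, n_out = 1, 100, 1
spectral_radius, ridge = 0.9, 1e-6

# Fixed random input and reservoir weights (never trained).
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_in))
W = rng.uniform(-0.5, 0.5, size=(n_reservoir, n_reservoir))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))  # scale toward the echo-state regime

def run_reservoir(u):
    """Drive the reservoir with the input sequence u and collect its states."""
    x = np.zeros(n_reservoir)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u_t))
        states.append(x.copy())
    return np.array(states)

# Toy task: reproduce the input delayed by 5 steps (a simple memory benchmark).
T, delay, washout = 2000, 5, 100
u = rng.uniform(-1, 1, T)
y = np.roll(u, delay)

X = run_reservoir(u)[washout:]
Y = y[washout:, None]

# Train the linear readout W_out by ridge regression on the collected states.
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ Y)

pred = X @ W_out
print("readout NMSE:", np.mean((pred - Y) ** 2) / np.var(Y))

In an in-materio setting such as the one summarized above, the simulated state update would be replaced by voltage measurements from the substrate, and the input mapping (which electrodes receive which signals) would be the part subject to evolutionary search rather than being fixed at random.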

MSC:

68Q09 Other nonclassical models of computation
68Q10 Modes of computation (nondeterministic, parallel, interactive, probabilistic, etc.)
