DEFM: delay-embedding-based forecast machine for time series forecasting by spatiotemporal information transformation. (English) Zbl 1540.37106

Summary: Making accurate forecasts for a complex system is a challenge in many practical applications. The major difficulty lies in the nonlinear spatiotemporal dynamics with time-varying characteristics. Takens’ delay embedding theory provides a way to transform high-dimensional spatial information into temporal information. In this work, by combining delay embedding theory and deep learning techniques, we propose a novel framework, the delay-embedding-based forecast machine (DEFM), to predict the future values of a target variable in a self-supervised and multistep-ahead manner based on high-dimensional observations. With a three-module spatiotemporal architecture, the DEFM leverages deep neural networks to effectively extract both the spatially and temporally associated information from the observed time series, even in the presence of time-varying parameters or additive noise. The DEFM accurately predicts future information by transforming spatiotemporal information into the delay embeddings of a target variable. The efficacy and precision of the DEFM are substantiated through applications to three spatiotemporally chaotic systems: a 90-dimensional (90D) coupled Lorenz system, the Lorenz 96 system, and the Kuramoto-Sivashinsky equation with inhomogeneity. Additionally, the performance of the DEFM is evaluated on six real-world datasets spanning various fields. Comparative experiments with five prediction methods illustrate the superiority and robustness of the DEFM and show its great potential in temporal information mining and forecasting.
©2024 American Institute of Physics
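The delay-embedding step at the core of the summary — reconstructing temporal structure from a scalar observable via Takens' theorem — can be sketched as follows. This is a generic illustration of delay-coordinate embedding, not the paper's DEFM implementation; the function name `delay_embed` and the choice of embedding dimension and lag are illustrative assumptions.

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens-style delay-coordinate embedding of a scalar series x.

    Row i of the result is (x[i], x[i+tau], ..., x[i+(dim-1)*tau]),
    i.e., the delay vector with embedding dimension `dim` and lag `tau`.
    """
    n = len(x) - (dim - 1) * tau  # number of complete delay vectors
    if n <= 0:
        raise ValueError("series too short for this (dim, tau)")
    # Stack the dim lagged copies of x as columns of the embedding matrix.
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Example: embed a sampled sine wave in 3 delay coordinates.
t = np.linspace(0, 20 * np.pi, 2000)
x = np.sin(t)
E = delay_embed(x, dim=3, tau=25)
print(E.shape)  # (1950, 3)
```

Each row of `E` is a point in the reconstructed state space; under Takens' theorem, for a generic observable of a smooth dynamical system and sufficiently large `dim`, this reconstruction is diffeomorphic to the underlying attractor, which is what lets spatial information be traded for temporal (delay) information.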

MSC:

37M10 Time series analysis of dynamical systems
62M10 Time series, auto-correlation, regression, etc. in statistics (GARCH)
65P10 Numerical methods for Hamiltonian systems including symplectic integrators
68T20 Problem solving in the context of artificial intelligence (heuristics, search strategies, etc.)
68T05 Learning and adaptive systems in artificial intelligence
68T07 Artificial neural networks and deep learning
