Solving evolutionary problems using recurrent neural networks. (English) Zbl 1518.93098

Summary: A technique for flexible control of induction baking of electrically non-conductive layers (paints, varnishes, resins, etc.) is presented, based on temperature prediction. Because the numerical solution of the full model of the process takes a long time, the model must be approximated by a suitable equivalent one. Here, recurrent neural networks (RNNs) prove to be a powerful tool: they solve the task practically online and provide the input data for controlling the field current fast enough. The methodology was first tested on the prediction of the current from the known voltage, which can be determined from the analytical solution of the ordinary differential equation describing the feeding circuit. Subsequently, the methodology was implemented on a system for baking non-conductive layers.
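
To illustrate the kind of surrogate described above, the following minimal sketch (not taken from the paper) trains an LSTM to reproduce the current response of an assumed series RL feeding circuit, L di/dt + R i = u(t), from the driving voltage. The circuit form, parameter values, and all names (simulate, SeqModel) are illustrative assumptions, and PyTorch is used only as one possible implementation of the sequence-to-sequence regression the summary mentions.

# Hypothetical sketch of the validation step described in the summary:
# learn the voltage-to-current map of an assumed series RL feeding circuit
# (L di/dt + R i = u) with an LSTM. All parameters are illustrative only.
import numpy as np
import torch
import torch.nn as nn

R, L_IND, DT, STEPS = 1.0, 0.1, 1e-3, 200   # assumed circuit and time grid

def simulate(u):
    """Current response of the assumed RL circuit (explicit Euler)."""
    i = np.zeros_like(u)
    for k in range(1, len(u)):
        i[k] = i[k - 1] + DT * (u[k - 1] - R * i[k - 1]) / L_IND
    return i

# Synthetic training data: random sinusoidal voltage excitations.
rng = np.random.default_rng(0)
t = np.arange(STEPS) * DT
U = np.stack([np.sin(2 * np.pi * rng.uniform(5, 50) * t) for _ in range(256)])
I = np.stack([simulate(u) for u in U])

X = torch.tensor(U, dtype=torch.float32).unsqueeze(-1)   # (batch, time, 1)
Y = torch.tensor(I, dtype=torch.float32).unsqueeze(-1)

class SeqModel(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.rnn = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):
        h, _ = self.rnn(x)     # hidden state at every time step
        return self.out(h)     # predicted current at every time step

model = SeqModel()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    opt.zero_grad()
    loss = loss_fn(model(X), Y)
    loss.backward()
    opt.step()

print(f"final training MSE: {loss.item():.3e}")

In the application summarized above, the same setup would presumably be fed temperature histories of the baked layer rather than circuit currents; the sketch only shows the structure of such a surrogate, not the authors' actual model.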

MSC:

93C95 Application models in control theory
68T07 Artificial neural networks and deep learning
Full Text: DOI

References:

[1] Dlouhy, M.; Kotlan, V.; Dolezel, I., Indirect induction hardening of thin electrically non-conductive layers, Proc. Conference ELEKTRO Online, 1-4 (2020)
[2] Mohajerin, N.; Waslander, S. L., Multistep prediction of dynamic systems with recurrent neural networks, IEEE Trans. Neural Netw. Learn. Syst., 30, 11, 3370-3383 (2019)
[3] Gerontitis, D.; Behera, R.; Tzekis, P.; Stanimirović, P., A family of varying-parameter finite-time zeroing neural networks for solving time-varying Sylvester equation and its application, J. Comput. Appl. Math., 403, Article 113826 pp. (2022) · Zbl 1490.65078
[4] Chen, Z.; Li, C.; Sun, W., Bitcoin price prediction using machine learning: An approach to sample dimension engineering, J. Comput. Appl. Math., 365, Article 112395 pp. (2020) · Zbl 1426.91314
[5] Huang, L.; Park, J. H.; Wu, G.-C.; Mo, Z.-W., Variable-order fractional discrete-time recurrent neural networks, J. Comput. Appl. Math., 370, Article 112633 pp. (2020) · Zbl 1432.39012
[6] Wang, H.; Fei, J., Nonsingular terminal sliding mode control for active power filter using recurrent neural network, IEEE Access, 6, 67819-67829 (2018)
[7] Li, S.; Shao, Z.; Guan, Y., A dynamic neural network approach for efficient control of manipulators, IEEE Trans. Syst. Man Cybern. A, 49, 5, 932-941 (2019)
[8] Karban, P.; Petrášová, I.; Doležel, I., The principle of prediction of complex time-dependent nonlinear problems using RNN, Proc. 23rd International Conference on Computational Problems of Electrical Engineering (CPEE), 1-4 (2022)
[9] Sehovac, L.; Grolinger, K., Deep learning for load forecasting: Sequence to sequence recurrent neural networks with attention, IEEE Access, 8, 36411-36426 (2020)
[10] Lin, L.; Xu, Y.; Liang, X.; Lai, J., Complex background subtraction by pursuing dynamic spatio-temporal models, IEEE Trans. Image Process., 23, 7, 3101-3202 (2014) · Zbl 1374.94216
[11] Yang, K.; Liu, Y.; Yao, Y.; Fan, S.; Mosleh, A., Operational time-series data modeling via LSTM network integrating principal component analysis based on human experience, J. Manuf. Syst., 61, 746-756 (2021)
[12] Pei, H.; Hu, C.; Si, X.; Zhang, J.; Pang, Z.; Zhang, P., Review of machine learning based remaining useful life prediction methods for equipment, J. Mech. Eng., 55, 8, 1-13 (2019)
[13] Yoon, S.; Yun, H.; Kim, Y.; Park, G. T.; Jung, K., Efficient transfer learning schemes for personalized language modeling using recurrent neural network (2017), https://arxiv.org/abs/1701.03578
[14] Vinyals, O.; Toshev, A.; Bengio, S.; Erhan, D., Show and tell: A neural image caption generator, Proc. IEEE Conf. Comput. Vis. Pattern Recog., 3156-3164 (2015)
[15] Sutskever, I.; Vinyals, O.; Le, Q. V., Sequence to sequence learning with neural networks, Proc. Adv. Neural Inf. Process. Syst., 3104-3112 (2014)
[16] Olah, C., Understanding LSTM networks (2015), https://colah.github.io/posts/2015-08-Understanding-LSTMs/