
Monte Carlo method for estimating eigenvalues using error balancing. (English) Zbl 1489.65010

Lirkov, Ivan (ed.) et al., Large-scale scientific computing. 13th international conference, LSSC 2021, Sozopol, Bulgaria, June 7–11, 2021. Revised selected papers. Cham: Springer. Lect. Notes Comput. Sci. 13127, 447-455 (2022).
Summary: Monte Carlo (MC) power iterations are successfully applied for estimating extremal eigenvalues, especially those of large sparse matrices. They use truncated Markov chain simulations to estimate matrix-vector products. The iterative MC methods contain two types of errors: a systematic (truncation) error and a stochastic (probable) error. The systematic error depends on the number of iterations, while the stochastic error stems from the probabilistic nature of the MC method. In this paper we propose a new version of the MC power iterations that balances the two errors to determine the optimal length of the chain. Numerical results for estimating the largest eigenvalue are also presented and discussed.
For the entire collection see [Zbl 1484.65002].
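As a rough illustration of the setting described in the summary, the following minimal Python sketch performs a power iteration in which every matrix-vector product is replaced by an importance-sampled Monte Carlo estimate. Here the number of power iterations (n_iters) governs the systematic (truncation) error, and the number of samples per row (n_samples, playing the role of the chain length) governs the stochastic error. The sketch does not reproduce the authors' error-balancing rule; the names mc_matvec, mc_power_iteration, n_iters and n_samples are assumptions introduced only for this illustration.

import numpy as np

def mc_matvec(A, v, n_samples, rng):
    # Unbiased Monte Carlo estimate of A @ v (illustrative sketch only).
    # For each row i, a column j is sampled with probability
    # p_ij = |a_ij| / sum_j |a_ij|, and the importance-sampling
    # estimator sign(a_ij) * (sum_j |a_ij|) * v_j is averaged.
    n = A.shape[0]
    abs_A = np.abs(A)
    row_sums = abs_A.sum(axis=1)          # normalising constants (assumed nonzero)
    probs = abs_A / row_sums[:, None]     # sampling probabilities per row
    y = np.zeros(n)
    for i in range(n):
        cols = rng.choice(n, size=n_samples, p=probs[i])
        y[i] = np.mean(np.sign(A[i, cols]) * row_sums[i] * v[cols])
    return y

def mc_power_iteration(A, n_iters=20, n_samples=5000, seed=0):
    # Estimate the dominant eigenvalue of a symmetric matrix A
    # using noisy Monte Carlo matrix-vector products.
    rng = np.random.default_rng(seed)
    v = np.ones(A.shape[0])
    v /= np.linalg.norm(v)
    lam = 0.0
    for _ in range(n_iters):
        w = mc_matvec(A, v, n_samples, rng)
        lam = v @ w                       # Rayleigh-quotient-type estimate
        v = w / np.linalg.norm(w)
    return lam

# Quick check on a small symmetric test matrix.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
print(mc_power_iteration(A), np.linalg.eigvalsh(A).max())

In the paper's terminology, error balancing would amount to choosing n_iters and n_samples so that the truncation and probable errors are of comparable size, rather than fixing them in advance as done in this sketch.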

MSC:

65C05 Monte Carlo methods
65F15 Numerical computation of eigenvalues and eigenvectors of matrices

Software:

Adam; ImageNet; AlexNet

References:

[1] Lobo, PDC; Mikhailov, M.; Ozisik, MN, On the complex eigenvalues of Luikov system of equations, Drying Technol., 5, 2, 273-286 (1987) · doi:10.1080/07373938708916540
[2] Alexandrov, V.; Davila, D.; Esquivel-Flores, O.; Karaivanova, A.; Gurov, T.; Atanassov, E.; Lirkov, I.; Margenov, S., On Monte Carlo and Quasi-Monte Carlo for matrix computations, Large-Scale Scientific Computing, 249-257 (2018), Cham: Springer, Cham · Zbl 1476.65006 · doi:10.1007/978-3-319-73441-5_26
[3] Alexandrov, V.; Esquivel-Flores, O.; Ivanovska, S.; Karaivanova, A.; Lirkov, I.; Margenov, SD; Waśniewski, J., On the preconditioned Quasi-Monte Carlo algorithm for matrix computations, Large-Scale Scientific Computing, 163-171 (2015), Cham: Springer, Cham · Zbl 07227062 · doi:10.1007/978-3-319-26520-9_17
[4] Dimov, IT; Karaivanova, AN; Vulkov, L.; Waśniewski, J.; Yalamov, P., Iterative Monte Carlo algorithms for linear algebra problems, Numerical Analysis and Its Applications, 150-160 (1997), Heidelberg: Springer, Heidelberg · Zbl 1540.65137 · doi:10.1007/3-540-62598-4_89
[5] Dimov, I.; Karaivanova, A., Parallel computations of eigenvalues based on a Monte Carlo approach, Monte Carlo Methods Appl., 4, 1, 33-52 (1998) · Zbl 0903.65032 · doi:10.1515/mcma.1998.4.1.33
[6] Dimov, IT; Philippe, B.; Karaivanova, A.; Weihrauch, C., Robustness and applicability of Markov chain Monte Carlo algorithms for eigenvalue problems, Appl. Math. Model., 32, 8, 1511-1529 (2008) · Zbl 1176.65003 · doi:10.1016/j.apm.2007.04.012
[7] Golub, GH; Van Loan, CF, Matrix Computations (1996), Baltimore: The Johns Hopkins University Press, Baltimore · Zbl 0865.65009
[8] Mascagni, M.; Karaivanova, A.; Vulkov, L.; Yalamov, P.; Waśniewski, J., Matrix computations using quasirandom sequences, Numerical Analysis and Its Applications, 552-559 (2001), Heidelberg: Springer, Heidelberg · Zbl 0978.65030 · doi:10.1007/3-540-45262-1_65
[9] Mascagni, M.; Karaivanova, A.; Sloot, PMA; Hoekstra, AG; Tan, CJK; Dongarra, JJ, A parallel Quasi-Monte Carlo method for solving systems of linear equations, Computational Science — ICCS 2002, 598-608 (2002), Heidelberg: Springer, Heidelberg · Zbl 1062.65005 · doi:10.1007/3-540-46080-2_62
[10] Sobol, IM, Monte Carlo Numerical Methods (1973), Moscow: Nauka, Moscow · Zbl 0289.65001
[11] Isaacson, E., Keller, H.B.: Analysis of Numerical Methods. Dover Publications, Mineola; Wiley, New York (1996). ISBN 0-486-68029-0 · Zbl 0168.13101
[12] Kingma, D.P., Ba, J.: Adam: a method for stochastic optimization. arXiv preprint arXiv:1412.6980 (2014)
[13] Krizhevsky, A., Sutskever, I., Hinton, G.E.: ImageNet classification with deep convolutional neural networks. In: Advances in Neural Information Processing Systems, pp. 1097-1105 (2012)
[14] Maas, A.L., Hannun, A.Y., Ng, A.Y.: Rectifier nonlinearities improve neural network acoustic models. In: Proceedings ICML, vol. 30, p. 3 (2013)
[15] Mou, L.; Ghamisi, P.; Zhu, XX, Deep recurrent neural networks for hyperspectral image classification, IEEE Trans. Geosci. Remote Sens., 55, 7, 3639-3655 (2017) · doi:10.1109/TGRS.2016.2636241
[16] Norouzzadeh, MS, Automatically identifying, counting, and describing wild animals in camera-trap images with deep learning, Proc. Natl. Acad. Sci., 115, 25, E5716-E5725 (2018) · doi:10.1073/pnas.1719367115
[17] Selfridge, O.G.: Pattern recognition and modern computers. In: Proceedings of the Western Joint Computer Conference, 1-3 March 1955, pp. 91-93 (1955)
[18] Simonyan, K., Zisserman, A.: Very deep convolutional networks for large-scale image recognition. arXiv preprint arXiv:1409.1556 (2014)
[19] Swanson, A., Kosmala, M., Lintott, C., Simpson, R., Smith, A., Packer, C.: Snapshot Serengeti, high-frequency annotated camera trap images of 40 mammalian species in an African savanna. Sci. Data 2, 150026 (2015)
[20] Szegedy, C., et al.: Going deeper with convolutions. In: Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 1-9 (2015)
[21] Verma, G.K., Gupta, P.: Wild animal detection using deep convolutional neural network. In: Chaudhuri, B.B., Kankanhalli, M.S., Raman, B. (eds.) Proceedings of 2nd International Conference on Computer Vision & Image Processing. AISC, vol. 704, pp. 327-338. Springer, Singapore (2018). doi:10.1007/978-981-10-7898-9_27
[22] Wang, Z.; Bovik, AC; Sheikh, HR; Simoncelli, EP, Image quality assessment: from error visibility to structural similarity, IEEE Trans. Image Process., 13, 4, 600-612 (2004) · doi:10.1109/TIP.2003.819861
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.