Machine learning approach to the Floquet-Lindbladian problem. (English) Zbl 07871103


MSC:

68Txx Artificial intelligence
68Qxx Theory of computing
90Cxx Mathematical programming

Software:

Scikit; AdaBoost.MH
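
Judging from the title and references [8, 9, 26, 33, 36, 37], the underlying task is a binary classification problem (does a given Floquet map admit a Lindbladian generator or not) handled with standard classifiers and evaluated via the Matthews correlation coefficient and ROC analysis. Below is a minimal sketch of such a pipeline with scikit-learn, the package listed under Software; it is not the authors' code, and the feature matrix, labels, and the choice of AdaBoostClassifier are placeholder assumptions for illustration only.

    # Minimal illustrative sketch (not from the reviewed paper): a binary
    # classifier deciding "Lindbladian generator exists" vs. "does not",
    # using scikit-learn and the evaluation metrics of [36] and [37].
    # X and y below are random placeholders; in practice they would come
    # from parametrized quantum channels and an exact test such as the
    # semidefinite-programming check discussed in [8], [9].
    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import matthews_corrcoef, roc_auc_score

    rng = np.random.default_rng(seed=0)
    X = rng.normal(size=(1000, 16))                 # placeholder channel features
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # placeholder binary labels

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

    clf = AdaBoostClassifier(n_estimators=200, random_state=0)
    clf.fit(X_tr, y_tr)

    y_pred = clf.predict(X_te)                      # hard labels for MCC
    y_score = clf.predict_proba(X_te)[:, 1]         # scores for ROC analysis

    print("MCC:    ", matthews_corrcoef(y_te, y_pred))   # cf. [36]
    print("ROC AUC:", roc_auc_score(y_te, y_score))      # cf. [37]

AdaBoost is only one of the classifier families appearing in the references ([10]-[14], [27]-[33], [35]); swapping in, e.g., RandomForestClassifier or SVC from the same package changes a single line of this sketch.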

References:

[1] Elfving, G., Zur Theorie der Markoffschen Ketten [On the theory of Markov chains], Acta Soc. Sci. Fenn., 2, 1-17, 1937 · JFM 63.0501.03
[2] Wolf, M.; Eisert, J.; Cubitt, T.; Cirac, J., Assessing non-Markovian quantum dynamics, Phys. Rev. Lett., 101, 150402, 2008 · Zbl 1225.82036 · doi:10.1103/PhysRevLett.101.150402
[3] Cubitt, T. S.; Eisert, J.; Wolf, M. M., The complexity of relating quantum channels to master equations, Commun. Math. Phys., 310, 383-418, 2012 · Zbl 1243.81100 · doi:10.1007/s00220-011-1402-y
[4] Garey, M. R. and Johnson, D. S., Computers and Intractability: A Guide to the Theory of NP-Completeness, Series of Books in the Mathematical Sciences, 1st ed. (W. H. Freeman, 1979). · Zbl 0411.68039
[5] Volokitin, V., Kozinov, E., Liniov, A., Yusipov, I., Veselov, S., Zolotykh, N., Ivanchenko, M., Meyerov, I., and Denisov, S., “Is there a Lindbladian? Implementation of the test,” (unpublished) (2022).
[6] Holthaus, M., Floquet engineering with quasienergy bands of periodically driven optical lattices, J. Phys. B: At. Mol. Opt. Phys., 49, 013001, 2015 · doi:10.1088/0953-4075/49/1/013001
[7] Bukov, M.; D’Alessio, L.; Polkovnikov, A., Universal high-frequency behavior of periodically driven systems: From dynamical stabilization to Floquet engineering, Adv. Phys., 64, 139-226, 2015 · doi:10.1080/00018732.2015.1055918
[8] Schnell, A.; Eckardt, A.; Denisov, S., Is there a Floquet Lindbladian?, Phys. Rev. B, 101, 100301, 2020 · doi:10.1103/PhysRevB.101.100301
[9] Yusipov, I. I.; Volokitin, V. D.; Liniov, A. V.; Ivanchenko, M. V.; Meyerov, I. B.; Denisov, S. V., Machine learning versus semidefinite programming approach to a particular problem of the theory of open quantum systems, Lobachevskii J. Math., 42, 1622-1629, 2021 · Zbl 1470.81039 · doi:10.1134/S199508022107026X
[10] Zhang, Z., Introduction to machine learning: K-nearest neighbors, Ann. Transl. Med., 4, 218, 2016 · doi:10.21037/atm.2016.03.37
[11] Kramer, O., “K-nearest neighbors,” in Dimensionality Reduction with Unsupervised Nearest Neighbors (Springer, Berlin, 2013), pp. 13-23. · Zbl 1283.68016
[12] Suykens, J. A. K.; Vandewalle, J., Least squares support vector machine classifiers, Neural Process. Lett., 9, 293-300, 1999 · doi:10.1023/A:1018628609742
[13] Amari, S.; Wu, S., Improving support vector machine classifiers by modifying kernel functions, Neural Netw., 12, 783-789, 1999 · doi:10.1016/S0893-6080(99)00032-5
[14] Dagher, I., Quadratic kernel-free non-linear support vector machine, J. Glob. Optim., 41, 15-30, 2008 · Zbl 1216.62003 · doi:10.1007/s10898-007-9162-0
[15] Breuer, H.-P., Genuine quantum trajectories for non-Markovian processes, Phys. Rev. A, 70, 012106, 2004 · doi:10.1103/PhysRevA.70.012106
[16] Breuer, H.-P.; Laine, E.-M.; Piilo, J., Measure for the degree of non-Markovian behavior of quantum processes in open systems, Phys. Rev. Lett., 103, 210401, 2009 · doi:10.1103/PhysRevLett.103.210401
[17] Gorini, V.; Kossakowski, A.; Sudarshan, E. C. G., Completely positive dynamical semigroups of N-level systems, J. Math. Phys., 17, 821, 1976 · Zbl 1446.47009 · doi:10.1063/1.522979
[18] Lindblad, G., On the generators of quantum dynamical semigroups, Commun. Math. Phys., 48, 119-130, 1976 · Zbl 0343.47031 · doi:10.1007/BF01608499
[19] Khachiyan, L. and Porkolab, L., “Computing integral points in convex semi-algebraic sets,” in Proceedings of the 38th Annual Symposium on Foundations of Computer Science (IEEE Computer Society, 1997), pp. 162-171.
[20] Flum, J. and Grohe, M., Parameterized Complexity Theory, Texts in Theoretical Computer Science. An EATCS Series (Springer-Verlag, Berlin, 2006). · Zbl 1143.68016
[21] Carleo, G.; Cirac, I.; Cranmer, K.; Daudet, L.; Schuld, M.; Tishby, N.; Vogt-Maranto, L.; Zdeborová, L., Machine learning and the physical sciences, Rev. Mod. Phys., 91, 045002, 2019 · doi:10.1103/RevModPhys.91.045002
[22] Choi, M.-D., Completely positive linear maps on complex matrices, Linear Algebra Appl., 10, 285-290, 1975 · Zbl 0327.15018 · doi:10.1016/0024-3795(75)90075-0
[23] Życzkowski, K.; Bengtsson, I., On duality between quantum states and quantum maps, Open Syst. Inf. Dyn., 11, 3-42, 2004 · Zbl 1052.81011 · doi:10.1023/B:OPSY.0000024753.05661.c2
[24] Boyd, S.; El Ghaoui, L.; Feron, E.; Balakrishnan, V., Linear Matrix Inequalities in System and Control Theory, 1994, Society for Industrial and Applied Mathematics · Zbl 0816.93004
[25] Ramana, M.; Goldman, A. J., Some geometric results in semidefinite programming, J. Glob. Optim., 7, 33-50, 1995 · Zbl 0839.90093 · doi:10.1007/BF01100204
[26] Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; Vanderplas, J.; Passos, A.; Cournapeau, D.; Brucher, M.; Perrot, M.; Duchesnay, E., Scikit-learn: Machine learning in Python, J. Mach. Learn. Res., 12, 2825-2830, 2011 · Zbl 1280.68189
[27] Safavian, S.; Landgrebe, D., A survey of decision tree classifier methodology, IEEE Trans. Syst. Man Cybern., 21, 660-674, 1991 · doi:10.1109/21.97458
[28] Breiman, L.; Friedman, J. H.; Olshen, R. A.; Stone, C. J., Classification and Regression Trees, 1984, Wadsworth International Group, Belmont, CA · Zbl 0541.62042
[29] Cutler, A., Cutler, D. R., and Stevens, J. R., “Random forests,” in Ensemble Machine Learning: Methods and Applications, edited by C. Zhang and Y. Ma (Springer US, Boston, MA, 2012), pp. 157-175.
[30] Breiman, L., Random forests, Mach. Learn., 45, 5-32, 2001 · Zbl 1007.68152 · doi:10.1023/A:1010933404324
[31] Sanger, T. D., Optimal unsupervised learning in a single-layer linear feedforward neural network, Neural Netw., 2, 459-473, 1989 · doi:10.1016/0893-6080(89)90044-0
[32] Glorot, X. and Bengio, Y., “Understanding the difficulty of training deep feedforward neural networks,” in Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, JMLR Proceedings, Vol. 9, edited by Y. W. Teh and D. M. Titterington (JMLR.org, 2010), pp. 249-256.
[33] Freund, Y.; Schapire, R. E., A decision-theoretic generalization of on-line learning and an application to boosting, J. Comput. Syst. Sci., 55, 119-139, 1997 · Zbl 0880.68103 · doi:10.1006/jcss.1997.1504
[34] Fukunaga, K., Introduction to Statistical Pattern Recognition, 2013, Elsevier
[35] Domingos, P.; Pazzani, M., On the optimality of the simple Bayesian classifier under zero-one loss, Mach. Learn., 29, 103-130, 1997 · Zbl 0892.68076 · doi:10.1023/A:1007413511361
[36] Chicco, D.; Jurman, G., The advantages of the Matthews correlation coefficient (MCC) over F1 score and accuracy in binary classification evaluation, BMC Genom., 21, 6, 2020 · doi:10.1186/s12864-019-6413-7
[37] Fawcett, T., An introduction to ROC analysis, Pattern Recognit. Lett., 27, 861-874, 2006 · doi:10.1016/j.patrec.2005.10.010
[38] Davies, A., et al., Advancing mathematics by guiding human intuition with AI, Nature, 600, 70-74, 2021 · Zbl 1505.57001 · doi:10.1038/s41586-021-04086-x
[39] Gorban, A.; Golubkov, A.; Grechuk, B.; Mirkes, E.; Tyukin, I., Correction of AI systems by linear discriminants: Probabilistic foundations, Inf. Sci., 466, 303-322, 2018 · Zbl 1441.68201 · doi:10.1016/j.ins.2018.07.040
[40] Tyukin, I. Y.; Gorban, A. N.; Green, S.; Prokhorov, D., Fast construction of correcting ensembles for legacy artificial intelligence systems: Algorithms and a case study, Inf. Sci., 485, 230-247, 2019 · Zbl 1448.68370 · doi:10.1016/j.ins.2018.11.057
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases these data have been complemented or enhanced with data from zbMATH Open. The list attempts to reflect the references in the original paper as accurately as possible, without claiming completeness or a perfect matching.