
Compact data encoding for data re-uploading quantum classifier. (English) Zbl 1508.81016

Summary: In the realm of quantum machine learning, various genres of quantum classifiers have been designed to classify classical data. Recently, a quantum classifier was proposed that re-uploads the sample to be classified many times along the quantum circuit. Data re-uploading circumvents the limitations imposed by the no-cloning theorem. This quantum classifier has great potential in the NISQ era because its special data encoding scheme requires very few qubits: previous work showed that even a single qubit can constitute an effective classifier for problems with up to four dimensions. In this work, we focus on the data encoding scheme of this quantum classifier. We propose an alternative way to encode the input sample that halves the number of learnable parameters of the quantum circuit and simplifies the computation, so the training time can be greatly shortened. Numerical results show that the new data encoding method achieves higher accuracy for high-dimensional data while using fewer parameters.
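The single-qubit data re-uploading scheme the summary refers to can be sketched as a small classical simulation. This is a minimal sketch of the original re-uploading idea (each layer applies a general rotation whose angles mix trainable biases with the data, angle-encoding the sample anew at every layer), not of the paper's compact encoding; the layer count `L`, the 3-D sample, and all parameter values below are illustrative assumptions:

```python
import numpy as np

def ry(t):
    # single-qubit rotation about the y axis
    return np.array([[np.cos(t / 2), -np.sin(t / 2)],
                     [np.sin(t / 2),  np.cos(t / 2)]], dtype=complex)

def rz(t):
    # single-qubit rotation about the z axis
    return np.array([[np.exp(-1j * t / 2), 0],
                     [0, np.exp(1j * t / 2)]], dtype=complex)

def layer_unitary(angles):
    # general single-qubit rotation U(a, b, c) = Rz(c) Ry(b) Rz(a)
    a, b, c = angles
    return rz(c) @ ry(b) @ rz(a)

def classify(x, thetas, weights):
    """Single-qubit data re-uploading circuit.

    Each of the L layers re-encodes the 3-D input x with angles
    theta_l + w_l * x (elementwise), so the sample is "uploaded"
    once per layer; the final |0>/|1> populations act as class scores.
    """
    state = np.array([1.0 + 0j, 0.0 + 0j])  # start in |0>
    for theta_l, w_l in zip(thetas, weights):
        state = layer_unitary(theta_l + w_l * x) @ state
    p0 = abs(state[0]) ** 2
    return 0 if p0 >= 0.5 else 1

rng = np.random.default_rng(0)
L = 4                                       # number of re-uploading layers (assumed)
thetas = rng.uniform(0, 2 * np.pi, (L, 3))  # trainable bias angles
weights = rng.uniform(-1, 1, (L, 3))        # trainable data scalings
x = np.array([0.3, -0.7, 0.1])              # a hypothetical 3-D sample
label = classify(x, thetas, weights)
```

Note that `thetas` and `weights` together contribute 6 trainable numbers per layer; an encoding that merges the data scaling into the bias angles is the kind of change that would halve this count, as the paper proposes.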

MSC:

81-08 Computational methods for problems pertaining to quantum theory
68T10 Pattern recognition, speech recognition
