
Efficiently updating and tracking the dominant kernel principal components. (English) Zbl 1111.68101

Summary: The dominant set of eigenvectors of the symmetric kernel Gram matrix is used in many important kernel methods in machine learning, e.g. kernel principal component analysis, feature approximation, denoising, compression, and prediction. Yet in the case of dynamic and/or large-scale data, the batch nature and computational demands of the eigendecomposition limit these methods in numerous applications. In this paper we present an efficient incremental approach for fast calculation of the dominant kernel eigenbasis, which allows us to track the kernel eigenspace dynamically. Experiments show that our updating scheme delivers a numerically stable and accurate approximation of the eigenvalues and eigenvectors at every iteration, compared with the batch algorithm.
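The paper's exact updating scheme is not reproduced in this record. As a rough illustration of the general idea it describes, the following is a minimal sketch (the function names and the rank-one growth step are this sketch's own assumptions, not the authors' algorithm): instead of re-diagonalising the full (n+1)×(n+1) Gram matrix when a new point arrives, one keeps only the dominant r eigenpairs, projects the grown Gram matrix onto the span of the current eigenbasis plus the new coordinate direction, and solves a small (r+1)×(r+1) eigenproblem.

```python
import numpy as np

def top_eig(K, r):
    # Dominant r eigenpairs of a symmetric matrix, sorted descending.
    w, V = np.linalg.eigh(K)
    idx = np.argsort(w)[::-1][:r]
    return w[idx], V[:, idx]

def update_eigenbasis(w, V, k_new, k_nn, r):
    """One incremental step (a simplified sketch, NOT the paper's scheme):
    w, V   -- current dominant eigenvalues/eigenvectors (n points, rank r)
    k_new  -- kernel evaluations between the new point and the old points
    k_nn   -- kernel evaluation of the new point with itself
    Re-diagonalises a small (r+1) x (r+1) compression instead of the
    full (n+1) x (n+1) Gram matrix."""
    n = V.shape[0]
    # Orthonormal basis of span(V, e_{n+1}) inside the (n+1)-dim space.
    Q = np.zeros((n + 1, r + 1))
    Q[:n, :r] = V
    Q[n, r] = 1.0
    # Grown Gram matrix, with the old block remembered only at rank r.
    K_new = np.zeros((n + 1, n + 1))
    K_new[:n, :n] = V @ np.diag(w) @ V.T
    K_new[:n, n] = k_new
    K_new[n, :n] = k_new
    K_new[n, n] = k_nn
    # Compress, solve the small eigenproblem, lift the basis back up.
    S = Q.T @ K_new @ Q
    w_s, U = top_eig(S, r)
    return w_s, Q @ U
```

When the Gram matrix truly has rank at most r (e.g. a linear kernel on r-dimensional data), the compression loses nothing and the incremental eigenvalues coincide with the batch ones; in general the update is an approximation whose quality depends on how dominant the retained eigenspace is.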

MSC:

68T05 Learning and adaptive systems in artificial intelligence

Software:

UCI-ml
Full Text: DOI

References:

[1] Badeau, R.; Richard, G.; David, B., Sliding window adaptive SVD algorithms, IEEE Transactions on Signal Processing, 52, 1, 1-10 (2004) · Zbl 1369.94078
[2] Basseville, M.; Nikiforov, I., Detection of abrupt changes — theory and application (1993), Prentice-Hall, Inc. · Zbl 1407.62012
[3] Blake, C., & Merz, C. (1998). UCI repository of machine learning databases. URL http://www.ics.uci.edu/~mlearn/MLRepository.html
[4] Bunch, J. R.; Nielsen, J. P., Updating the singular value decomposition, Numerische Mathematik, 31, 111-129 (1978) · Zbl 0421.65028
[5] Businger, P. (1970). Updating a singular value decomposition. BIT 10. pp. 376-385
[6] Chandrasekaran, S.; Manjunath, B. S.; Wang, Y. F.; Winkeler, J.; Zhang, H., An eigenspace update algorithm for image analysis, Graphical Models and Image Processing: GMIP, 59, 5, 321-332 (1997)
[7] Golub, G. H.; Van Loan, C. F., Matrix computations (1989), The Johns Hopkins University Press: Baltimore, MD · Zbl 0733.65016
[8] Guyon, I.; Poujaud, I.; Personnaz, L.; Dreyfus, G., Comparing different neural network architectures for classifying handwritten digits, Proceedings of the international joint conference on neural networks, Vol. II (1989), IEEE Press: IEEE Press New York, Washington, DC
[9] Jade, A. M.; Srikanth, B.; Jayaraman, V.; Kulkarni, B.; Jog, J.; Priya, L., Feature extraction and denoising using kernel PCA, Chemical Engineering Science, 58, 19, 4441-4448 (2003)
[10] Kim, K. I., Franz, M., & Schölkopf, B. (2003). Kernel Hebbian algorithm for iterative kernel principal component analysis. Tech. rep. 109. Tübingen: Max-Planck-Institut für biologische Kybernetik
[11] Kuh, A. (2001). Adaptive kernel methods for CDMA systems. In Proc. of the international joint conference on neural networks (pp. 1404-1409)
[12] Levy, A.; Lindenbaum, M., Sequential Karhunen-Loeve basis extraction, IEEE Transactions on Image Processing, 9, 8, 1371-1374 (2000) · Zbl 1001.68586
[13] Mika, S.; Schölkopf, B.; Smola, A.; Müller, K.; Scholz, M.; Rätsch, G., Kernel PCA and de-noising in feature spaces, Advances in Neural Information Processing Systems, 11, 536-542 (1999)
[14] Moonen, M.; Van Dooren, P.; Vandewalle, J., An SVD updating algorithm for subspace tracking, SIAM Journal on Matrix Analysis and Applications, 13, 4, 1015-1038 (1992) · Zbl 0759.65017
[15] Rosipal, R.; Girolami, M., An expectation-maximization approach to nonlinear component analysis, Neural Computation, 13, 3, 505-510 (2001) · Zbl 1085.68646
[16] Rosipal, R.; Girolami, M.; Trejo, L.; Cichocki, A., Kernel PCA for feature extraction and de-noising in nonlinear regression, Neural Computing & Applications, 10, 3, 231-243 (2001) · Zbl 0989.68112
[17] Schölkopf, B.; Mika, S.; Burges, C.; Knirsch, P.; Müller, K.; Rätsch, G., Input space versus feature space in kernel-based methods, IEEE Transactions On Neural Networks, 10, 5, 1000-1017 (1999)
[18] Schölkopf, B.; Mika, S.; Smola, A.; Rätsch, G.; Müller, K., Kernel PCA pattern reconstruction via approximate pre-images, (Niklasson, L.; Bodén, M.; Ziemke, T., Perspectives in neural computing (1998), Springer: Springer Berlin, Germany), 147-152
[19] Schölkopf, B.; Smola, A., Learning with kernels (2002), MIT Press: MIT Press Cambridge, MA
[20] Schölkopf, B.; Smola, A.; Müller, K., Nonlinear component analysis as a kernel eigenvalue problem, Neural Computation, 10, 5, 1299-1319 (1998)
[21] Suykens, J. A.K.; Van Gestel, T.; De Brabanter, J.; De Moor, B.; Vandewalle, J., Least squares support vector machines (2002), World Scientific: World Scientific Singapore · Zbl 1003.68146
[22] Twining, C. J.; Taylor, C. J., The use of kernel principal component analysis to model data distributions, Pattern Recognition, 36, 1, 217-227 (2003) · Zbl 1010.68206
[23] Vapnik, V., The nature of statistical learning theory (1995), Springer-Verlag: Springer-Verlag Berlin · Zbl 0833.62008
[24] Wahba, G., Spline models for observational data, (CBMS-NSF regional conference series in applied mathematics, Vol. 59 (1990), SIAM: SIAM Philadelphia, PA) · Zbl 0813.62001