
Stable local dimensionality reduction approaches. (English) Zbl 1177.68171

Summary: Dimensionality reduction is a major challenge in many areas. A large number of local approaches, stemming from statistics or geometry, have been developed. In practice, however, these local approaches often lack robustness: unlike maximum variance unfolding, which explicitly unfolds the manifold, they merely characterize the local geometric structure, and the eigenproblems they lead to are hard to solve. We propose a unified framework that explicitly unfolds the manifold and reformulates these local approaches as semi-definite programs instead of the above-mentioned eigenproblems. Three well-known algorithms, locally linear embedding, Laplacian eigenmaps and local tangent space alignment, are reinterpreted and improved within this framework. Several experiments demonstrate the potential of the framework and the improvements to these local algorithms.
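To make the contrast concrete, the following is a minimal sketch (not the authors' code; function and parameter names are illustrative) of the classical locally linear embedding pipeline that such frameworks take as a starting point: local reconstruction weights are fitted per point, and the embedding is then obtained from an eigenproblem on (I - W)^T (I - W), which is exactly the kind of eigenproblem the summary says is replaced by a semi-definite program.

import numpy as np
from scipy.linalg import eigh
from scipy.spatial import cKDTree

def lle_embedding(X, n_neighbors=10, n_components=2, reg=1e-3):
    # Classical LLE: the final eigenproblem here is the step that an SDP
    # reformulation would replace.
    n = X.shape[0]
    _, idx = cKDTree(X).query(X, k=n_neighbors + 1)
    idx = idx[:, 1:]                                  # drop the point itself

    # Step 1: local reconstruction weights from a small regularized linear system.
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[idx[i]] - X[i]                          # neighbours centred at x_i
        C = Z @ Z.T                                   # local Gram matrix
        C += reg * np.trace(C) * np.eye(n_neighbors)  # regularization for stability
        w = np.linalg.solve(C, np.ones(n_neighbors))
        W[i, idx[i]] = w / w.sum()                    # weights sum to one

    # Step 2: embedding from the bottom eigenvectors of M = (I - W)^T (I - W),
    # discarding the constant eigenvector with eigenvalue ~ 0.
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    _, vecs = eigh(M)
    return vecs[:, 1:n_components + 1]

Usage, e.g., Y = lle_embedding(np.random.rand(200, 5)) returns a 2-dimensional embedding of 200 points sampled in 5 dimensions; the SDP-based variants described in the paper would replace Step 2 while keeping the local weight construction.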

MSC:

68T10 Pattern recognition, speech recognition
68T05 Learning and adaptive systems in artificial intelligence

Software:

CSDP
Full Text: DOI

References:

[1] de Silva, V.; Tenenbaum, J. B., Global versus local methods in nonlinear dimensionality reduction, (Becker, S.; Thrun, S.; Obermayer, K., Proceedings of NIPS, vol. 15 (2003)), 721-728
[2] M. Collins, S. Dasgupta, R. Schapire, A generalization of principal component analysis to the exponential family, in: NIPS 13, 2001.
[3] Cox, T. F.; Cox, M. A.A., Multidimensional Scaling (1994), Chapman & Hall: Chapman & Hall London · Zbl 0853.62047
[4] Tenenbaum, J. B.; de Silva, V.; Langford, J. C., A global geometric framework for nonlinear dimensionality reduction, Science, 290, 2319-2323 (2000)
[5] Weinberger, K. Q.; Saul, L. K., Unsupervised learning of image manifolds by semidefinite programming, (Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR-04).2 (2004)), 988-995
[6] Weinberger, K. Q.; Packer, B. D.; Saul, L. K., Nonlinear dimensionality reduction by semidefinite programming and kernel matrix factorization, (Ghahramani, Z.; Cowell, R., Proceedings of the Tenth International Workshop on Artificial Intelligence and Statistics (AISTATS-05) (2005)), 381-388
[7] Roweis, S. T.; Saul, L. K., Nonlinear dimensionality reduction by locally linear embedding, Science, 290, 2323-2326 (2000)
[8] Saul, L. K.; Roweis, S. T., Think globally, fit locally: unsupervised learning of low dimensional manifolds, Journal of Machine Learning Research, 4, 119-155 (2003) · Zbl 1093.68089
[9] Belkin, M.; Niyogi, P., Laplacian eigenmaps and spectral techniques for embedding and clustering, Advances in Neural Information Processing System, 14, 585-591 (2001)
[10] Belkin, M.; Niyogi, P., Laplacian eigenmaps for dimensionality reduction and data representation, Neural Computation, 15, 1373-1396 (2003) · Zbl 1085.68119
[11] Zhang, Z. Y.; Zha, H. Y., Principal manifolds and nonlinear dimensionality reduction via tangent space alignment, SIAM Journal on Scientific Computing, 26, 1, 313-338 (2004) · Zbl 1077.65042
[12] Chen, H. F.; Jiang, G. F.; Yoshihira, K., Robust nonlinear dimensionality reduction for manifold learning, (Proceedings of the 18th International Conference on Pattern Recognition (ICPR) (2006)), 447-450
[13] D. De Ridder, V. Franc, Robust manifold learning, Technical Report CTU-CMP-2003-08, Center for Machine Perception, Department of Cybernetics, Faculty of Electrical Engineering, Czech Technical University, Prague, pp. 1-36, 2003.
[14] L.J.P. Maaten, E.O. Postma, H.J. Herik, Dimensionality reduction: a comparative review. Available at: http://www.cs.unimaas.nl/l.vandermaaten/dr/DR_draft.pdf
[15] Chang, H.; Yeung, D. Y., Robust locally linear embedding, Pattern Recognition, 39, 6, 1053-1065 (2006) · Zbl 1096.68713
[16] J. Wang, Z.Y. Zhang, H.Y. Zha, Adaptive manifold learning, in: Advances in Neural Information Processing Systems 17, MIT Press, Cambridge, MA, 2005, pp. 1473-1480.
[17] F. Lu, S. Keles, et al., Kernel regularization and dimensionality reduction, Technical Report No. 1119, Department of Statistics, University of Wisconsin, 2006.
[18] Tikhonov, A. N., Regularization of incorrectly posed problems, Soviet Mathematics, 4, 1624-1627 (1963) · Zbl 0183.11601
[19] M. Belkin, P. Niyogi, V. Sindhwani, Manifold regularization: a geometric framework for learning from examples, Technical Report TR-2004-06, Department of Computer Science, University of Chicago. Available at: http://www.cs.uchicago.edu/research/publications/techreports/TR-2004-06 · Zbl 1222.68144
[20] Borchers, B., CSDP, a C library for semidefinite programming, Optimization Methods and Software, 11, 613-623 (1999) · Zbl 0973.90524
[21] Mika, S.; Scholkopf, B., Kernel PCA and de-noising in feature space, Advances in Neural Information Processing Systems 11 (1999), MIT Press: MIT Press Cambridge, MA
[22] Lavalle, S. M.; Branicky, M. S., On the relationship between classical grid search and probabilistic roadmaps, Algorithmic Foundations of Robotics V. STAR, 7, 59-75 (2004)
[23] Weinberger, K. Q.; Sha, F.; Zhu, Q.; Saul, L. K., Graph Laplacian regularization for large-scale semidefinite programming, (Schoelkopf, B.; Platt, J.; Hofmann, T., Advances in Neural Information Processing Systems 19 (2007), MIT Press: MIT Press Cambridge, MA), Available at: http://www.cs.ucsd.edu/~saul/papers/graph_nips06.pdf
[24] Ham, J. H.; Lee, D. D.; Saul, L. K., Semisupervised alignment of manifolds, (Proceedings of the Tenth International Workshop on Artificial Intelligence and Statistics (AISTATS 05) (2005)), 120-127
[25] The Wisconsin Diagnostic Breast Cancer dataset. Available at: http://archive.ics.uci.edu/ml/datasets/Breast+Cancer+Wisconsin+(Diagnostic)
[26] The teapot database. Available at: http://www.weinbergerweb.net/Downloads/Data.html
[27] The USPS dataset. Available at: http://www.cs.toronto.edu/roweis/data.html