Learning theory: from regression to classification. (English) Zbl 1205.62084

Jetter, Kurt (ed.) et al., Topics in multivariate approximation and interpolation. Amsterdam: Elsevier (ISBN 978-0-444-51844-6). Studies in Computational Mathematics 12, 257-290 (2005).
Summary: We give a brief survey of regularization schemes in learning theory for the purposes of regression and classification, from an approximation theory point of view. First, the classical method of empirical risk minimization is reviewed for regression with a general convex loss function. Next, we explain ideas and methods for the error analysis of regression algorithms generated by Tikhonov regularization schemes associated with reproducing kernel Hilbert spaces. Then binary classification algorithms given by regularization schemes are described with emphasis on support vector machines and noise conditions for distributions. Finally, we mention further topics and some open problems in learning theory.
For the entire collection, see [K. Jetter (ed.), M. Buhmann (ed.), W. Haussmann (ed.), R. Schaback (ed.), J. Stoeckler (ed.), Topics in multivariate approximation and interpolation. Studies in Computational Mathematics 12. Amsterdam: Elsevier. (2005; Zbl 1205.41002)].

MSC:

62J02 General nonlinear regression
62H30 Classification and discrimination; cluster analysis (statistical aspects)
68T05 Learning and adaptive systems in artificial intelligence

Citations:

Zbl 1205.41002
Full Text: DOI