Abstract
We establish learning rates to the Bayes risk for support vector machines (SVMs) using a regularization sequence \(\lambda_n = n^{-\alpha}\), where \(\alpha \in (0,1)\) is arbitrary. Under a noise condition recently proposed by Tsybakov, these rates can become faster than \(n^{-1/2}\). In order to deal with the approximation error, we present a general concept called the approximation error function, which describes how well the infinite-sample versions of the considered SVMs approximate the data-generating distribution. In addition, we discuss in some detail the relation between the "classical" approximation error and the approximation error function. Finally, for distributions satisfying a geometric noise assumption, we establish learning rates when the underlying RKHS is a Sobolev space.
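As a small illustration of the regularization sequence named in the abstract (not code from the paper itself), the following sketch computes \(\lambda_n = n^{-\alpha}\) for a few sample sizes; the function name and parameter checks are our own:

```python
# Illustrative sketch only: the regularization sequence lambda_n = n**(-alpha)
# from the abstract, with alpha restricted to the open interval (0, 1).

def regularization_sequence(n: int, alpha: float) -> float:
    """Return lambda_n = n**(-alpha) for sample size n and exponent alpha in (0, 1)."""
    if not (0.0 < alpha < 1.0):
        raise ValueError("alpha must lie in the open interval (0, 1)")
    return float(n) ** (-alpha)

# lambda_n decays to zero as the sample size n grows; a larger alpha
# means faster decay, i.e. weaker regularization for large samples.
lams = [regularization_sequence(n, alpha=0.5) for n in (1, 100, 10000)]
```

For \(\alpha = 0.5\) this yields \(\lambda_1 = 1\), \(\lambda_{100} = 0.1\), and \(\lambda_{10000} = 0.01\), showing the monotone decay that the analysis in the paper exploits.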
References
Devroye, L., Györfi, L., Lugosi, G.: A Probabilistic Theory of Pattern Recognition. Springer, New York (1996)
Yang, Y.: Minimax nonparametric classification—part I and II. IEEE Trans. Inform. Theory 45, 2271–2292 (1999)
Wu, Q., Zhou, D.X.: Analysis of support vector machine classification. Tech. Report, City University of Hong Kong (2003)
Mammen, E., Tsybakov, A.: Smooth discrimination analysis. Ann. Statist. 27, 1808–1829 (1999)
Tsybakov, A.: Optimal aggregation of classifiers in statistical learning. Ann. Statist. 32, 135–166 (2004)
Schölkopf, B., Smola, A.: Learning with Kernels. MIT Press, Cambridge (2002)
Steinwart, I., Scovel, C.: Fast rates for support vector machines using Gaussian kernels. Ann. Statist. submitted (2004), http://www.c3.lanl.gov/~ingo/publications/ann-04a.pdf
Steinwart, I.: Consistency of support vector machines and other regularized kernel machines. IEEE Trans. Inform. Theory 51, 128–142 (2005)
Steinwart, I., Scovel, C.: Fast rates to Bayes for kernel machines. In: Saul, L.K., Weiss, Y., Bottou, L. (eds.) Advances in Neural Information Processing Systems, vol. 17, pp. 1345–1352. MIT Press, Cambridge (2005)
Edmunds, D., Triebel, H.: Function Spaces, Entropy Numbers, Differential Operators. Cambridge University Press, Cambridge (1996)
Zhang, T.: Statistical behaviour and consistency of classification methods based on convex risk minimization. Ann. Statist. 32, 56–134 (2004)
Rockafellar, R.: Convex Analysis. Princeton University Press, Princeton (1970)
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Steinwart, I., Scovel, C. (2005). Fast Rates for Support Vector Machines. In: Auer, P., Meir, R. (eds.) Learning Theory. COLT 2005. Lecture Notes in Computer Science, vol. 3559. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11503415_19
Print ISBN: 978-3-540-26556-6
Online ISBN: 978-3-540-31892-7