
Analysis of perceptron-based active learning. (English) Zbl 1137.68529

Auer, Peter (ed.) et al., Learning theory. 18th annual conference on learning theory, COLT 2005, Bertinoro, Italy, June 27–30, 2005. Proceedings. Berlin: Springer (ISBN 3-540-26556-2/pbk). Lecture Notes in Computer Science 3559. Lecture Notes in Artificial Intelligence, 249-263 (2005).
Summary: We start by showing that in an active learning setting, the Perceptron algorithm needs \(\Omega(\frac{1}{\epsilon^2})\) labels to learn linear separators within generalization error \(\epsilon\). We then present a simple selective sampling algorithm for this problem, which combines a modification of the perceptron update with an adaptive filtering rule for deciding which points to query. For data distributed uniformly over the unit sphere, we show that our algorithm reaches generalization error \(\epsilon\) after asking for just \(\tilde O(d \log \frac{1}{\epsilon})\) labels. This exponential improvement over the usual sample complexity of supervised learning has previously been demonstrated only for the computationally more complex query-by-committee algorithm.
For the entire collection see [Zbl 1076.68003].
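The summary describes the algorithm only at a high level. The following is a minimal Python/NumPy sketch of a selective-sampling loop in that spirit, not the authors' exact procedure: the function names, the initial query threshold \(1/\sqrt d\), the run length run_length, and all constants are illustrative placeholders. The learner queries only points near its current decision boundary, applies a norm-preserving (reflection-style) perceptron update on mistakes, and halves the query threshold after a long error-free run of queried points.

import numpy as np

def sample_uniform_sphere(d, rng):
    # Draw one point uniformly at random from the unit sphere in R^d.
    x = rng.standard_normal(d)
    return x / np.linalg.norm(x)

def selective_sampling_perceptron(target_u, n_rounds=20000, run_length=50, seed=0):
    # target_u: unit vector defining the true separator sign(target_u . x).
    # Returns the learned weight vector and the number of labels queried.
    rng = np.random.default_rng(seed)
    d = target_u.shape[0]

    # Initialise with a single labelled point.
    x = sample_uniform_sphere(d, rng)
    y = 1.0 if target_u @ x >= 0 else -1.0
    w = y * x
    labels_used = 1

    s = 1.0 / np.sqrt(d)   # query threshold (placeholder initial value)
    streak = 0             # queried points in a row with no prediction error

    for _ in range(n_rounds):
        x = sample_uniform_sphere(d, rng)
        margin = w @ x
        if abs(margin) >= s:
            continue                             # far from the boundary: skip
        y = 1.0 if target_u @ x >= 0 else -1.0   # label oracle
        labels_used += 1
        if np.sign(margin) != y:
            # Modified perceptron update: reflect w in the hyperplane
            # orthogonal to x; this preserves the norm of w.
            w = w - 2.0 * margin * x
            streak = 0
        else:
            streak += 1
            if streak >= run_length:
                s /= 2.0                         # adaptive filtering: shrink query region
                streak = 0
    return w, labels_used

# Example use: for the uniform distribution on the sphere, the generalization
# error of w is proportional to the angle between w and the target, so the
# printed angle serves as a rough proxy for epsilon.
rng = np.random.default_rng(1)
u = rng.standard_normal(20)
u /= np.linalg.norm(u)
w, n_labels = selective_sampling_perceptron(u)
angle = np.arccos(np.clip((w @ u) / np.linalg.norm(w), -1.0, 1.0))
print(n_labels, float(angle))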

MSC:

68T05 Learning and adaptive systems in artificial intelligence
Full Text: DOI