Affiliation:
1. NEC Research Institute, 4 Independence Way, Princeton, NJ 08540 USA
Abstract
Within the context of Valiant's protocol for learning, the perceptron algorithm is shown to learn an arbitrary half-space in time O(n²/ε³) if D, the probability distribution of examples, is taken uniform over the unit sphere Sⁿ. Here ε is the accuracy parameter. This is surprisingly fast, as "standard" approaches involve the solution of a linear programming problem with Ω(n/ε) constraints in n dimensions. A modification of Valiant's distribution-independent protocol for learning is proposed in which the distribution and the function to be learned may be chosen by adversaries; however, these adversaries may not communicate. It is argued that this definition is more reasonable and more applicable to real-world learning than Valiant's. Under this definition, the perceptron algorithm is shown to be a distribution-independent learning algorithm. In an appendix we show that, for uniform distributions, some classes of infinite Vapnik-Chervonenkis dimension, including convex sets and a class of nested differences of convex sets, are learnable.
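The setting the abstract describes can be illustrated with a minimal sketch of the classic perceptron update in the paper's setting: examples drawn uniformly from the unit sphere, labeled by a hidden half-space through the origin. All names and parameters below (sample count, epoch limit) are illustrative choices, not taken from the paper.

```python
import math
import random

def perceptron(samples, labels, max_epochs=100):
    """Classic perceptron: on each misclassified example, add y*x to w."""
    n = len(samples[0])
    w = [0.0] * n
    for _ in range(max_epochs):
        mistakes = 0
        for x, y in zip(samples, labels):
            s = sum(wi * xi for wi, xi in zip(w, x))
            if y * s <= 0:  # wrong side of the hyperplane (or on it)
                w = [wi + y * xi for wi, xi in zip(w, x)]
                mistakes += 1
        if mistakes == 0:  # all training examples classified correctly
            break
    return w

def random_unit_vector(n, rng):
    """Uniform point on the unit sphere via normalized Gaussians."""
    v = [rng.gauss(0.0, 1.0) for _ in range(n)]
    norm = math.sqrt(sum(c * c for c in v))
    return [c / norm for c in v]

rng = random.Random(0)
n = 5
target = random_unit_vector(n, rng)  # hidden half-space through the origin
X = [random_unit_vector(n, rng) for _ in range(500)]
Y = [1 if sum(t * x for t, x in zip(target, xi)) > 0 else -1 for xi in X]

w = perceptron(X, Y)
errors = sum(1 for xi, yi in zip(X, Y)
             if yi * sum(wi * c for wi, c in zip(w, xi)) <= 0)
```

Since the labels are noise-free and linearly separable, the perceptron convergence theorem guarantees a finite number of updates; the paper's contribution is to bound the *time* to reach accuracy ε when the examples are uniform on the sphere, rather than relying on a worst-case margin.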
Subject
Cognitive Neuroscience, Arts and Humanities (miscellaneous)
Cited by
39 articles.
1. MeFirst ranking and multiple dichotomies: Via Linear Programming and Neural Networks;2022 26th International Conference on Pattern Recognition (ICPR);2022-08-21
2. Risk-Based Breast Cancer Prognosis Using Minimal Patient Characteristics;2022 IEEE 10th International Conference on Healthcare Informatics (ICHI);2022-06
3. The power of localization for efficiently learning linear separators with noise;Proceedings of the forty-sixth annual ACM symposium on Theory of computing;2014-05-31
4. The regularized least squares algorithm and the problem of learning halfspaces;Information Processing Letters;2011-03
5. Learning Geometric Concepts via Gaussian Surface Area;2008 49th Annual IEEE Symposium on Foundations of Computer Science;2008-10