Publication
STOC 1996
Conference paper
Noise-tolerant distribution-free learning of general geometric concepts
Abstract
We present an efficient algorithm for PAC-learning a very general class of geometric concepts over R^d for fixed d. More specifically, let T be any set of s halfspaces. Let x = (x_1, ..., x_d) be an arbitrary point in R^d. With each t ∈ T we associate a boolean indicator function I_t(x) which is 1 if and only if x is in the halfspace t. The concept class, C_s^d, that we study consists of all concepts formed by any boolean function over I_{t_1}, ..., I_{t_s} for t_i ∈ T. This concept class is much more general than any geometric concept class known to be PAC-learnable. Our results can easily be extended to efficiently learn any boolean combination of a polynomial number of concepts selected from any concept class C over R^d, provided that the VC-dimension of C depends only on d (and is thus constant for any constant d) and that there is a polynomial-time algorithm to determine whether some concept from C is consistent with a given set of labeled examples. We also present a statistical query version of our algorithm that can tolerate random classification noise at any noise rate strictly less than 1/2. Finally, we present a generalization of the standard ε-net result of Haussler and Welzl [25] and apply it to give an alternative noise-tolerant algorithm for d = 2 based on geometric subdivisions. © 1996 ACM.
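To make the definition of the concept class concrete, the following sketch (not the paper's algorithm; halfspace parameters and the chosen boolean combination are illustrative assumptions) evaluates a concept in C_s^d: each halfspace t contributes an indicator I_t(x), and the concept is an arbitrary boolean function of those indicators.

```python
# Illustrative sketch of a concept in C_s^d: a boolean function over
# halfspace indicator functions. The specific halfspaces and the XOR
# combination below are hypothetical examples, not from the paper.

def halfspace_indicator(w, b, x):
    """I_t(x) = 1 iff x lies in the halfspace {x : w . x >= b}."""
    return sum(wi * xi for wi, xi in zip(w, x)) >= b

def concept(x):
    # Two halfspaces t1, t2 in R^2 (s = 2, d = 2):
    t1 = halfspace_indicator((1.0, 0.0), 0.0, x)   # x_1 >= 0
    t2 = halfspace_indicator((0.0, 1.0), 0.0, x)   # x_2 >= 0
    # Any boolean function over the indicators defines a concept;
    # here we pick XOR, which already yields a non-convex region.
    return t1 != t2

print(concept((1.0, -1.0)))  # in exactly one halfspace -> True
print(concept((1.0, 1.0)))   # in both halfspaces -> False
```

The XOR example shows why this class is broad: even for s = 2 the concept need not be convex or connected, unlike simpler geometric classes such as single halfspaces or axis-aligned rectangles.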