Publication
Information and Computation
Paper
Results on learnability and the Vapnik-Chervonenkis dimension
Abstract
We consider the problem of learning a concept from examples in Valiant's distribution-free model. (An essentially equivalent model, if one ignores issues of computational difficulty, was studied by Vapnik and Chervonenkis.) We introduce the notion of dynamic sampling, wherein the number of examples examined may increase with the complexity of the target concept. This method is used to establish the learnability of various concept classes of infinite Vapnik-Chervonenkis (VC) dimension. We also discuss an important variation on the problem of learning from examples, called approximating from examples. Here we do not assume that the target concept T is a member of the concept class C from which approximations are chosen. This problem takes on particular interest when the VC dimension of C is infinite. Finally, we discuss the problem of computing the VC dimension of a finite concept class defined on a finite domain and consider the structure of classes of fixed small dimension. © 1991.
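To make the last problem concrete, the sketch below computes the VC dimension of a finite concept class over a finite domain by brute force, directly from the standard definition (a set S is shattered if every subset of S is realized as S ∩ C for some concept C). This is only an illustrative implementation of the definition, not the algorithmic results of the paper; the representation of concepts as explicit subsets of the domain is an assumption made here for simplicity.

```python
from itertools import combinations

def vc_dimension(domain, concepts):
    """Brute-force VC dimension of a finite concept class over a finite domain.

    `domain` is a list of points; `concepts` is a collection of sets, each
    concept given as the subset of `domain` it labels positive.
    """
    concepts = [frozenset(c) for c in concepts]
    best = 0
    for d in range(1, len(domain) + 1):
        shattered_some_set = False
        for S in combinations(domain, d):
            S = frozenset(S)
            # Distinct labelings (dichotomies) the class induces on S.
            dichotomies = {S & c for c in concepts}
            if len(dichotomies) == 2 ** d:  # every subset of S is realized
                shattered_some_set = True
                break
        if shattered_some_set:
            best = d
        else:
            # If no set of size d is shattered, no larger set can be either.
            break
    return best

# Example: closed intervals on a 4-point domain have VC dimension 2,
# since any two points can be shattered but no three can.
domain = [1, 2, 3, 4]
intervals = [set(range(a, b + 1)) for a in domain for b in domain if a <= b] + [set()]
print(vc_dimension(domain, intervals))  # -> 2
```

The running time is exponential in the size of the domain, which is consistent with the paper treating the exact computation of the VC dimension of a finite class as a nontrivial problem in its own right.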