Publication | Open Access

Learnability and the Vapnik-Chervonenkis dimension

Year: 1989 · Citations: 1.8K · References: 51
Topics: Artificial Intelligence, Engineering, Machine Learning, Algorithmic Learning, Computational Complexity, Classification Method, Vapnik-Chervonenkis Dimension, Pattern Recognition, Learning Problem, Instance-based Learning, Computational Learning Theory, Distribution-free Convergence, Knowledge Discovery, Computer Science, Statistical Learning Theory, Learnability Model, Automated Reasoning, Feasible Learnability, Learning Theory
Valiant's learnability model is extended to learning classes of concepts defined by regions in Euclidean space E^n. The methods in this paper lead to a unified treatment of some of Valiant's results, along with previous results on the distribution-free convergence of certain pattern recognition algorithms. It is shown that the essential condition for distribution-free learnability is finiteness of the Vapnik-Chervonenkis dimension, a simple combinatorial parameter of the class of concepts to be learned. Using this parameter, the complexity and closure properties of learnable classes are analyzed, and necessary and sufficient conditions are given for feasible learnability.
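The Vapnik-Chervonenkis dimension mentioned in the abstract is the size of the largest point set that the concept class can *shatter*, i.e. label in all possible ways. As a toy illustration (not taken from the paper), the shattering condition can be checked by brute force for a small concept class; the `shatters` helper and the threshold class below are hypothetical names chosen for this sketch:

```python
def shatters(concepts, points):
    """True iff the concept class induces all 2^n labelings of `points`."""
    labelings = {tuple(c(p) for p in points) for c in concepts}
    return len(labelings) == 2 ** len(points)

# Toy concept class: thresholds on the real line, h_t(x) = (x >= t).
# A finite grid of thresholds suffices to realize every labeling that
# the full class can induce on a small finite sample in [0, 1].
thresholds = [t / 10 for t in range(-20, 21)]
concepts = [lambda x, t=t: x >= t for t in thresholds]

print(shatters(concepts, [0.5]))        # True: both labelings of one point occur
print(shatters(concepts, [0.3, 0.7]))   # False: the labeling (1, 0) is impossible
```

Since every single point is shattered but no pair is, the VC dimension of threshold concepts is 1; by the paper's main result, finiteness of this parameter is exactly what makes the class learnable in the distribution-free sense.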
| Year | Citations |
|---|---|
| 2011 | 10.1K |
| 1970 | 6.1K |
| 1984 | 4.8K |
| 1974 | 4.5K |
| 1984 | 4.2K |
| 1984 | 3.2K |
| 1976 | 3K |
| 1971 | 2.4K |
| 1985 | 2.2K |
| 1974 | 2.2K |