Abstract
We present a systematic framework for classifying, comparing, and defining models of PAC learnability. Apart from the obvious "uniformity" parameters, we present a novel "solid learnability" notion that indicates when the class in question can be successfully learned by the most straightforward algorithms, namely, any consistent algorithm. We analyze known models in terms of our new parameterization scheme and investigate the relative strength of notions of learnability that correspond to different parameter values. In addition, we consider "proximity" between concept classes. We define notions of "covering" one class by another and show that, with respect to learnability, they play a role similar to the role of reductions in computational complexity; the learnability of a class implies the learnability of any class it covers. We apply the covering technique to resolve some open questions raised by Benedek and Itai (1991, Theoret. Comput. Sci. 86, 377-389; 1989, Inform. and Comput. 82, 247-261) and Linial et al. (1991, Inform. and Comput. 90, 33-49). The notions we discuss are information-theoretic: we concentrate on the question of learnability rather than the computational complexity of the learning process.
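As an illustration of the "consistent algorithm" notion mentioned above, the following sketch (not from the paper; the concept class and all names are our own choices for the example) learns the class of threshold concepts c_t(x) = 1 iff x >= t over the real line by returning an arbitrary hypothesis consistent with the sample — which is exactly the kind of straightforward learner the "solid learnability" notion concerns.

```python
# Hedged sketch: a consistent learner for threshold concepts on the reals.
# Any threshold that agrees with every labeled example is an acceptable output.

def consistent_threshold_learner(sample):
    """sample: list of (x, label) pairs, label in {0, 1}, drawn from some
    target threshold concept. Returns a threshold t with (x >= t) == label
    for every example, or raises if no such threshold exists."""
    positives = [x for x, y in sample if y == 1]
    negatives = [x for x, y in sample if y == 0]
    lo = max(negatives) if negatives else float("-inf")  # largest negative point
    hi = min(positives) if positives else float("inf")   # smallest positive point
    if lo >= hi:
        raise ValueError("sample is not consistent with any threshold")
    # Any t in (lo, hi] is consistent with the sample; pick one.
    if hi == float("inf"):
        return lo + 1.0       # all examples negative: place t above them
    if lo == float("-inf"):
        return hi             # all examples positive: t at the smallest one
    return (lo + hi) / 2.0    # otherwise, the midpoint of the gap
```

Because this class has VC dimension 1, standard sample-size bounds guarantee that any such consistent hypothesis is, with high probability, approximately correct once the sample is large enough.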
[1] David Haussler, et al. Equivalence of models for polynomial learnability. COLT '88, 1988.
[2] Temple F. Smith. Occam's razor. Nature, 1980.
[3] Balas K. Natarajan, et al. On learning Boolean functions. STOC, 1987.
[4] Leslie G. Valiant, et al. Computational limitations on learning from examples. JACM, 1988.
[5] Nathan Linial, et al. Results on learnability and the Vapnik-Chervonenkis dimension. 29th Annual Symposium on Foundations of Computer Science (FOCS), 1988.
[6] Leslie G. Valiant, et al. A general lower bound on the number of examples needed for learning. COLT '88, 1988.
[7] David Haussler, et al. Classifying learnable geometric concepts with the Vapnik-Chervonenkis dimension. STOC '86, 1986.