Results on learnability and the Vapnik-Chervonenkis dimension
From MaRDI portal
Publication:751859
DOI: 10.1016/0890-5401(91)90058-A · zbMath: 0715.68071 · OpenAlex: W2122735222 · MaRDI QID: Q751859
Nathan Linial, Yishay Mansour, Ronald L. Rivest
Publication date: 1991
Published in: Information and Computation
Full work available at URL: https://doi.org/10.1016/0890-5401(91)90058-a
Related Items
- On universal learning algorithms
- Vapnik-Chervonenkis dimension and (pseudo-)hyperplane arrangements
- On data classification by iterative linear partitioning
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
- Quasipolynomiality of the Smallest Missing Induced Subgraph
- PAC learning, VC dimension, and the arithmetic hierarchy
- From undecidability of non-triviality and finiteness to undecidability of learnability
- Complexity of computing Vapnik-Chervonenkis dimension and some generalized dimensions
- The functions of finite support: a canonical learning problem
- A general lower bound on the number of examples needed for learning
- The complexity of theory revision
Cites Work
- On finding a minimum dominating set in a tournament
- Inferring decision trees using the minimum description length principle
- Modeling by shortest data description
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- A general lower bound on the number of examples needed for learning
- Learnability and the Vapnik-Chervonenkis dimension
- A theory of the learnable
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Convergence of stochastic processes