Learnability with respect to fixed distributions
Publication: 809614
DOI: 10.1016/0304-3975(91)90026-X
zbMath: 0733.68066
MaRDI QID: Q809614
Publication date: 1991
Published in: Theoretical Computer Science
Related Items (19)
- Nonuniform learnability
- On universal learning algorithms
- Sample Complexity Bounds on Differentially Private Learning via Communication Complexity
- Inductive inference in the limit of empirically adequate theories
- A sufficient condition for polynomial distribution-dependent learnability
- Learning distributions by their density levels: A paradigm for learning without a teacher
- Supervised learning and co-training
- Smart PAC-learners
- Learning under \(p\)-tampering poisoning attacks
- Using the doubling dimension to analyze the generalization of learning algorithms
- Rigorous learning curve bounds from statistical mechanics
- A fixed-distribution PAC learning theory for neural FIR models
- An upper bound on the sample complexity of PAC-learning halfspaces with respect to the uniform distribution
- When are epsilon-nets small?
- Prediction, learning, uniform convergence, and scale-sensitive dimensions
- Sample size lower bounds in PAC learning by Algorithmic Complexity Theory
- A general lower bound on the number of examples needed for learning
- Smart PAC-Learners
- Improved lower bounds for learning from noisy examples: An information-theoretic approach
Cites Work
- Occam's razor
- Fast probabilistic algorithms for Hamiltonian circuits and matchings
- Deductive learning
- Learnability and the Vapnik-Chervonenkis dimension
- A theory of the learnable
- Computational limitations on learning from examples
- Probably Approximate Learning over Classes of Distributions
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities