Exact lower bounds for the agnostic probably-approximately-correct (PAC) machine learning model
DOI: 10.1214/18-AOS1766 · zbMath: 1447.62070 · arXiv: 1606.08920 · MaRDI QID: Q2328061
Iosif Pinelis, Leonid (Aryeh) Kontorovich
Publication date: 9 October 2019
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1606.08920
Keywords: classification; binomial distribution; generalization error; empirical estimators; minimax decision rules; Bayes decision rules; PAC learning theory
MSC: Classification and discrimination; cluster analysis (statistical aspects) (62H30) · Minimax procedures in statistical decision theory (62C20) · Learning and adaptive systems in artificial intelligence (68T05)
Related Items (4)
Cites Work
- Optimal binomial, Poisson, and normal left-tail domination for sums of nonnegative random variables
- On general minimax theorems
- Criterion for complete determinacy for concave-convexlike games
- Decision theoretic generalizations of the PAC model for neural net and other learning applications
- Sharper bounds for Gaussian and empirical processes
- Efficient distribution-free learning of probabilistic concepts
- Toward efficient agnostic learning
- Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension
- General bounds on the number of examples needed for learning probabilistic concepts
- The complexity of learning according to two models of a drifting environment
- Fast learning rates in statistical inference through aggregation
- A convexity property of positive matrices
- A theory of the learnable
- Active Nearest-Neighbor Learning in Metric Spaces
- Neural Network Learning
- Understanding Machine Learning
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities