Improved lower bounds for learning from noisy examples: An information-theoretic approach
DOI: 10.1006/inco.2000.2919
zbMath: 1007.68078
OpenAlex: W1995305717
MaRDI QID: Q1854425
David P. Helmbold, Claudio Gentile
Publication date: 14 January 2003
Published in: Information and Computation
Full work available at URL: https://doi.org/10.1006/inco.2000.2919
Computational learning theory (68Q32)
Learning and adaptive systems in artificial intelligence (68T05)
Information theory (general) (94A15)
Related Items (1)
Cites Work
- Four types of noise in data for PAC learning
- On learning from queries and counterexamples in the presence of noise
- Learnability with respect to fixed distributions
- Learning regular sets from queries and counterexamples
- Fast probabilistic algorithms for Hamiltonian circuits and matchings
- Estimation of dependences based on empirical data. Transl. from the Russian by Samuel Kotz
- Sample size lower bounds in PAC learning by Algorithmic Complexity Theory
- Predicting \(\{ 0,1\}\)-functions on randomly drawn points
- Malicious omissions and errors in answers to membership queries
- On the complexity of learning from drifting distributions
- Mutual information, metric entropy and cumulative relative entropy risk
- A general lower bound on the number of examples needed for learning
- Apple tasting
- General bounds on the number of examples needed for learning probabilistic concepts
- Queries and concept learning
- Learning in the Presence of Malicious Errors
- Learnability and the Vapnik-Chervonenkis dimension
- A theory of the learnable
- Lower bounds on expected redundancy for nonparametric classes
- General formulation of Shannon’s main theorem in information theory