Mathematical Research Data Initiative

Bounds on the sample complexity of Bayesian learning using information theory and the VC dimension

From MaRDI portal
Publication:1314506

zbMath: 0798.68145 · MaRDI QID: Q1314506

David Haussler, Michael Kearns, Robert E. Schapire

Publication date: 3 March 1994

Published in: Machine Learning


zbMATH Keywords

Bayesian learning, VC dimension, statistical physics, information theory, learning curves, average-case learning


Mathematics Subject Classification ID

Learning and adaptive systems in artificial intelligence (68T05)


Related Items

Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension
Learning from a population of hypotheses
Characterizing rational versus exponential learning curves
Mutual information, metric entropy and cumulative relative entropy risk
MetaBayes: Bayesian Meta-Interpretative Learning Using Higher-Order Stochastic Refinement
Rigorous learning curve bounds from statistical mechanics
Learning a priori constrained weighted majority votes
QG/GA: a stochastic search for Progol
Sample size lower bounds in PAC learning by Algorithmic Complexity Theory
Bayesian predictiveness, exchangeability and sufficientness in bacterial taxonomy
Query by committee, linear separation and random walks



Retrieved from "https://portal.mardi4nfdi.de/w/index.php?title=Publication:1314506&oldid=13432665"
This page was last edited on 31 January 2024, at 13:00.