Predicting \(\{ 0,1\}\)-functions on randomly drawn points
From MaRDI portal
Publication: 1342520
DOI: 10.1006/inco.1994.1097 · zbMath: 0938.68785 · OpenAlex: W2066688546 · MaRDI QID: Q1342520
Manfred K. Warmuth, David Haussler, Nicholas Littlestone
Publication date: 16 February 1995
Published in: Information and Computation
Full work available at URL: https://doi.org/10.1006/inco.1994.1097
Analysis of algorithms and problem complexity (68Q25); Learning and adaptive systems in artificial intelligence (68T05)
Related Items
Efficient algorithms for learning functions with bounded variation
Sphere packing numbers for subsets of the Boolean \(n\)-cube with bounded Vapnik-Chervonenkis dimension
Learning intersection-closed classes with signatures
A new PAC bound for intersection-closed concept classes
Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
Characterizing rational versus exponential learning curves
Localization of VC classes: beyond local Rademacher complexities
Efficient learning of typical finite automata from random walks
Risk bounds for statistical learning
Ten More Years of Error Rate Research
On the complexity of learning from drifting distributions
Learning Weighted Automata
Learning nested differences in the presence of malicious noise
On partial cubes, well-graded families and their duals with some applications in graphs
Compression schemes for concept classes induced by three types of discrete undirected graphical models
The true sample complexity of active learning
Using the doubling dimension to analyze the generalization of learning algorithms
Combinatorial bounds of overfitting for threshold classifiers
Equivalence of models for polynomial learnability
Classic learning
A sharp concentration inequality with applications
Decision theoretic generalizations of the PAC model for neural net and other learning applications
Improved bounds on the sample complexity of learning
(Machine) learning parameter regions
Two proofs for shallow packings
Sharpness estimation of combinatorial generalization ability bounds for threshold decision rules
The power of random counterexamples
Bounding Embeddings of VC Classes into Maximum Classes
Shifting: one-inclusion mistake bounds and sample compression
Theory of Classification: a Survey of Some Recent Advances
Sharpening Occam's razor
An upper bound on the sample complexity of PAC-learning halfspaces with respect to the uniform distribution
An extension of play against the random past strategy. Choosing the right experts on IBM forecasts
Learning commutative deterministic finite state automata in polynomial time
When are epsilon-nets small?
Prediction-preserving reducibility
Prediction, learning, uniform convergence, and scale-sensitive dimensions
Learning with unreliable boundary queries
On density of subgraphs of halved cubes
Approximating hyper-rectangles: Learning and pseudorandom sets
A general lower bound on the number of examples needed for learning
Learning with a Drifting Target Concept
Two-dimensional partial cubes
Optimality of SVM: novel proofs and tighter bounds
Apple tasting.
Improved lower bounds for learning from noisy examples: An information-theoretic approach
Exploiting random walks for learning
Relative expected instantaneous loss bounds
Distribution-free robust linear regression