Surrogate losses in passive and active learning
Publication: 2008623
DOI: 10.1214/19-EJS1635
zbMath: 1433.62158
arXiv: 1207.3772
OpenAlex: W1807095418
MaRDI QID: Q2008623
Publication date: 26 November 2019
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1207.3772
Keywords: classification, statistical learning theory, active learning, selective sampling, sequential design, surrogate loss functions
Mathematics Subject Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
- Sequential statistical design (62L05)
Related Items (2)
Uses Software
Cites Work
- Learning noisy linear classifiers via adaptive and selective sampling
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Fast learning rates for plug-in classifiers
- Rates of growth and sample moduli for weighted empirical processes indexed by sets
- Universal Donsker classes and metric entropy
- Decision theoretic generalizations of the PAC model for neural net and other learning applications
- Central limit theorems for empirical measures
- Toward efficient agnostic learning
- A decision-theoretic generalization of on-line learning and an application to boosting
- Smooth discrimination analysis
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Optimal aggregation of classifiers in statistical learning.
- Support-vector networks
- Weak convergence and empirical processes. With applications to statistics
- A local maximal inequality under uniform entropy
- The true sample complexity of active learning
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Rates of convergence in active learning
- Concentration inequalities and asymptotic results for ratio type empirical processes
- Agnostic active learning
- Local Rademacher complexities
- Efficient noise-tolerant learning from statistical queries
- Agnostically Learning Halfspaces
- Minimax Bounds for Active Learning
- Rademacher penalties and structural risk minimization
- DOI: 10.1162/1532443041424319
- DOI: 10.1162/153244303321897690
- Theory of Disagreement-Based Active Learning
- Information-Based Complexity, Feedback and Dynamics in Convex Programming
- Plug-in Approach to Active Learning
- Activized Learning: Transforming Passive to Active with Improved Label Complexity
- Teaching Dimension and the Complexity of Active Learning
- On the Uniform Convergence of Relative Frequencies of Events to Their Probabilities
- Convexity, Classification, and Risk Bounds
- Convergence of stochastic processes
- Introduction to nonparametric estimation