An improved analysis of the Rademacher data-dependent bound using its self bounding property
Publication: 459446
DOI: 10.1016/j.neunet.2013.03.017
zbMATH: 1296.68137
OpenAlex: W2031690114
Wikidata: Q30616773 (Scholia: Q30616773)
MaRDI QID: Q459446
Davide Anguita, Sandro Ridella, Luca Oneto, Alessandro Ghio
Publication date: 9 October 2014
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2013.03.017
Mathematics Subject Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items (4)
- Tikhonov, Ivanov and Morozov regularization for support vector machine learning
- Local Rademacher complexity: sharper risk bounds with and without unlabeled samples
- An improved analysis of the Rademacher data-dependent bound using its self bounding property
- Percolation centrality via Rademacher Complexity
Uses Software
Cites Work
- An improved analysis of the Rademacher data-dependent bound using its self bounding property
- Some limit theorems for empirical processes (with discussion)
- Concentration inequalities using the entropy method
- Local Rademacher complexity: sharper risk bounds with and without unlabeled samples
- Aspects of discrete mathematics and probability in the theory of machine learning
- Local Rademacher complexities
- A sharp concentration inequality with applications
- Rademacher penalties and structural risk minimization
- Structural risk minimization over data-dependent hierarchies
- Learning pattern classification-a survey
- DOI: 10.1162/153244303321897690
- Model selection and error estimation