Adaptive learning rates for support vector machines working on data with low intrinsic dimension
From MaRDI portal
Publication:2073699
DOI: 10.1214/21-AOS2078
zbMath: 1486.62107
arXiv: 2003.06202
OpenAlex: W3010957709
MaRDI QID: Q2073699
Publication date: 7 February 2022
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/2003.06202
Mathematics Subject Classification:
- Nonparametric regression and quantile regression (62G08)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items (1)
Cites Work
- Bayesian manifold regression
- A tree-based regressor that adapts to intrinsic dimension
- Covering numbers of Gaussian reproducing kernel Hilbert spaces
- Estimating conditional quantiles with the help of the pinball loss
- Learning rates for kernel-based expectile regression
- Adaptive Bayesian estimation using a Gaussian random field with inverse gamma bandwidth
- Learning and approximation by Gaussians on Riemannian manifolds
- Theory of function spaces II
- Fast rates for support vector machines using Gaussian kernels
- Fast learning rates for plug-in classifiers
- On the concept of attractor
- Strange attractors
- The analysis of linear partial differential operators. I: Distribution theory and Fourier analysis
- Smooth discrimination analysis
- Improved classification rates under refined margin conditions
- Über einige Kreisüberdeckungen [On some circle coverings]
- Optimal global rates of convergence for nonparametric regression
- A distribution-free theory of nonparametric regression
- New approaches to statistical learning theory
- Optimal aggregation of classifiers in statistical learning
- Optimal regression rates for SVMs using Gaussian kernels
- Optimal learning with anisotropic Gaussian SVMs
- SVM learning and Lp approximation by Gaussians on Riemannian manifolds
- Support Vector Machines
- Minimax-optimal classification with dyadic decision trees
- Rates of convergence of nearest neighbor estimation under arbitrary sampling
This page was built for publication: Adaptive learning rates for support vector machines working on data with low intrinsic dimension