Refined Generalization Bounds of Gradient Learning over Reproducing Kernel Hilbert Spaces
From MaRDI portal
Publication: 5380250
DOI: 10.1162/NECO_a_00739 · zbMath: 1473.68153 · Wikidata: Q50594021 · Scholia: Q50594021 · MaRDI QID: Q5380250
Publication date: 4 June 2019
Published in: Neural Computation
Mathematics Subject Classification
- Nonparametric estimation (62G05)
- Learning and adaptive systems in artificial intelligence (68T05)
- Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces) (46E22)
Cites Work
- An empirical feature-based learning algorithm producing sparse approximations
- Learning sparse gradients for variable selection and dimension reduction
- Learning gradients on manifolds
- Learning gradients via an early stopping gradient descent method
- Limit theorems for U-processes
- Geometry on probability spaces
- Component selection and smoothing in multivariate nonparametric regression
- High-dimensional additive modeling
- Least angle regression. (With discussion)
- Weak convergence and empirical processes. With applications to statistics
- Ranking and empirical minimization of U-statistics
- Local Rademacher complexities
- On the mathematical foundations of learning
- Better Subset Regression Using the Nonnegative Garrote
- Support Vector Machines
- A new concentration result for regularized risk minimizers
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sparse Additive Models
- An Adaptive Estimation of Dimension Reduction Space
- Model-Free Variable Selection
- Refined Rademacher Chaos Complexity Bounds with Applications to the Multikernel Learning Problem
- U-Processes and Preference Learning
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data