Just interpolate: kernel "ridgeless" regression can generalize
DOI: 10.1214/19-AOS1849 · zbMATH: 1453.68155 · arXiv: 1808.00387 · OpenAlex: W3104969455 · MaRDI QID: Q2196223
Tengyuan Liang, Alexander Rakhlin
Publication date: 28 August 2020
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1808.00387
Keywords: reproducing kernel Hilbert spaces; high dimensionality; kernel methods; implicit regularization; data-dependent bounds; minimum-norm interpolation; spectral decay
MSC classifications:
- Nonparametric regression and quantile regression (62G08)
- Learning and adaptive systems in artificial intelligence (68T05)
- Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces) (46E22)
Related Items (39)
Uses Software
Cites Work
- Model selection for regularized least-squares algorithm in learning theory
- The spectrum of kernel random matrices
- On the limit of the largest eigenvalue of the large dimensional sample covariance matrix
- A distribution-free theory of nonparametric regression
- Best choices for regularization parameters in learning theory: on the bias-variance problem.
- Regularization networks and support vector machines
- Optimal rates for the regularized least-squares algorithm
- On early stopping in gradient descent learning
- Kernels for Vector-Valued Functions: A Review
- Generalized Cross-Validation as a Method for Choosing a Good Ridge Parameter
- DOI: 10.1162/153244303321897690
- Kernel Ridge Regression
- Learning Theory
- The origins of kriging