For interpolating kernel machines, minimizing the norm of the ERM solution maximizes stability
From MaRDI portal
Publication: 5873932
DOI: 10.1142/S0219530522400115 · OpenAlex: W4309259897 · MaRDI QID: Q5873932
Lorenzo Rosasco, Tomaso Poggio, Akshay Rangamani
Publication date: 10 February 2023
Published in: Analysis and Applications
Full work available at URL: https://doi.org/10.1142/s0219530522400115
Keywords: kernel regression · high-dimensional statistics · overparameterization · algorithmic stability · minimum-norm interpolation
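The keywords above point to the paper's central object: the minimum-RKHS-norm interpolating solution of kernel regression. A minimal sketch of that estimator (an illustration of the setting, not the paper's own code; the Gaussian kernel and all variable names are assumptions for the example):

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))   # training inputs (assumed synthetic data)
y = rng.standard_normal(20)        # training targets

# Among all interpolants f(x) = sum_i alpha_i k(x, x_i), the coefficients
# alpha = K^+ y (Moore-Penrose pseudoinverse of the kernel matrix) give the
# one of minimum RKHS norm.
K = rbf_kernel(X, X)
alpha = np.linalg.pinv(K) @ y

def f(Xnew):
    return rbf_kernel(Xnew, X) @ alpha

# When K is invertible the solution interpolates the training data exactly.
print(np.allclose(f(X), y, atol=1e-6))
```

For distinct inputs the Gaussian kernel matrix is positive definite, so the pseudoinverse coincides with the inverse and the fit passes through every training point; the paper studies the stability of exactly this minimum-norm choice among interpolants.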
Cites Work
- Statistics for high-dimensional data. Methods, theory and applications.
- Interpolation of scattered data: distance matrices and conditionally positive definite functions
- The spectrum of kernel random matrices
- A revisitation of formulae for the Moore-Penrose inverse of modified matrices.
- Surprises in high-dimensional ridgeless least squares interpolation
- Just interpolate: kernel "ridgeless" regression can generalize
- Learning theory: stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization
- Theory of Classification: a Survey of Some Recent Advances
- Support Vector Machines
- DOI: 10.1162/153244302760200704
- The Generalization Error of Random Features Regression: Precise Asymptotics and the Double Descent Curve
- Benign overfitting in linear regression
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- Understanding Machine Learning
- Distribution of eigenvalues for some sets of random matrices
- Generalized Inversion of Modified Matrices