Generalization error rates in kernel regression: the crossover from the noiseless to noisy regime*
From MaRDI portal
Publication:5055412
DOI: 10.1088/1742-5468/ac9829
OpenAlex: W3213347999
MaRDI QID: Q5055412
Lenka Zdeborová, Bruno Loureiro, Florent Krzakala, Hugo Cui
Publication date: 13 December 2022
Published in: Journal of Statistical Mechanics: Theory and Experiment
Full work available at URL: https://arxiv.org/abs/2105.15004
Cites Work
- Eigenvectors of some large sample covariance matrix ensembles
- High-dimensional asymptotics of prediction: ridge regression and classification
- Concentration of measure and isoperimetric inequalities in product spaces
- Bayesian learning for neural networks
- High-dimensional dynamics of generalization error in neural networks
- Surprises in high-dimensional ridgeless least squares interpolation
- Optimal rates for spectral algorithms with least-squares regression over Hilbert spaces
- Optimal rates for the regularized least-squares algorithm
- Acceleration of Stochastic Approximation by Averaging
- Precise Error Analysis of Regularized $M$-Estimators in High Dimensions
- When do neural networks outperform kernel methods?*
- Two Models of Double Descent for Weak Features
- Benign overfitting in linear regression
- Asymptotic learning curves of kernel methods: empirical data versus teacher–student paradigm
- Support vector machines learning noisy polynomial rules
- Ridge regression and asymptotic minimax estimation over spheres of growing dimension