scientific article; zbMATH DE number 7306870
From MaRDI portal
Publication:5148955
Jonathan Lomond, Benoit Sanchez, Dmitry Kobak
Publication date: 5 February 2021
Full work available at URL: https://arxiv.org/abs/1805.10939
Title: unavailable (zbMATH Open Web Interface contents unavailable due to conflicting licenses)
Related Items
- Surprises in high-dimensional ridgeless least squares interpolation
- Generalization error rates in kernel regression: the crossover from the noiseless to noisy regime
- Binary Classification of Gaussian Mixtures: Abundance of Support Vectors, Benign Overfitting, and Regularization
- Double data piling leads to perfect classification
Uses Software
Cites Work
- High-dimensional asymptotics of prediction: ridge regression and classification
- High-dimensional dynamics of generalization error in neural networks
- Just interpolate: kernel "ridgeless" regression can generalize
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussion and rejoinder)
- Atomic Decomposition by Basis Pursuit
- An Introduction to Statistical Learning
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- Ridge Regression: Biased Estimation for Nonorthogonal Problems
- A jamming transition from under- to over-parametrization affects generalization in deep learning