A jamming transition from under- to over-parametrization affects generalization in deep learning
From MaRDI portal
Publication:5872795
DOI: 10.1088/1751-8121/ab4c8b
OpenAlex: W3100156752
MaRDI QID: Q5872795
Levent Sagun, Stéphane D'Ascoli, Mario Geiger, Matthieu Wyart, Stefano Spigler, Giulio Biroli
Publication date: 4 January 2023
Published in: Journal of Physics A: Mathematical and Theoretical
Full work available at URL: https://arxiv.org/abs/1810.09665
Related Items (15)
Double Double Descent: On Generalization Errors in Transfer Learning between Linear Regression Tasks
Deep learning: a statistical viewpoint
Fit without fear: remarkable mathematical phenomena of deep learning through the prism of interpolation
Surprises in high-dimensional ridgeless least squares interpolation
Loss landscapes and optimization in over-parameterized non-linear systems and neural networks
Learning curves of generic features maps for realistic datasets with a teacher-student model*
Gradient descent dynamics and the jamming transition in infinite dimensions
Landscape and training regimes in deep learning
On the stability and generalization of neural networks with VC dimension and fuzzy feature encoders
A statistician teaches deep learning
Geometric compression of invariant manifolds in neural networks
Unnamed Item
Triple descent and the two kinds of overfitting: where and why do they appear?*
Generalisation error in learning with random features and the hidden manifold model*
Two Models of Double Descent for Weak Features
Uses Software
Cites Work
This page was built for publication: A jamming transition from under- to over-parametrization affects generalization in deep learning