Infinite-width limit of deep linear neural networks

arXiv: 2211.16980
MaRDI QID: Q6419054

Lénaïc Chizat, Alessio Figalli, Maria Colombo, Xavier Fernández-Real

Publication date: 29 November 2022

Abstract: This paper studies the infinite-width limit of deep linear neural networks initialized with random parameters. We show that, as the number of neurons diverges, the training dynamics converge (in a precise sense) to the dynamics obtained from gradient descent on an infinitely wide deterministic linear neural network. Moreover, even though the weights remain random, we obtain their precise law along the training dynamics and prove a quantitative convergence result for the linear predictor in terms of the number of neurons. Finally, we study the continuous-time limit obtained for infinitely wide linear networks and show that the linear predictors of the neural network converge at an exponential rate to the minimal ℓ2-norm minimizer of the risk.

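The abstract's final claim (convergence of the predictor to the minimal ℓ2-norm risk minimizer) can be illustrated numerically. Below is a minimal sketch, not the authors' code (see the companion repository linked below for the actual experiments): it trains a three-layer linear network f(x) = W3 W2 W1 x by plain gradient descent on an underdetermined least-squares problem and compares the end-to-end predictor with the minimal ℓ2-norm solution pinv(X) @ y. The depth, widths, step size, iteration count, and small initialization scale are all illustrative assumptions, not values from the paper.

```python
# Minimal numerical sketch (illustrative only, not the paper's code):
# gradient descent on a three-layer linear network for an
# underdetermined least-squares problem, compared against the minimal
# l2-norm risk minimizer pinv(X) @ y. Problem sizes, step size,
# iteration count, and initialization scale are arbitrary demo choices.
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 10, 30, 50            # n samples < d features: many exact minimizers
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

scale = 0.1                     # small initialization scale (assumption)
W1 = scale * rng.standard_normal((m, d)) / np.sqrt(d)
W2 = scale * rng.standard_normal((m, m)) / np.sqrt(m)
W3 = scale * rng.standard_normal((1, m)) / np.sqrt(m)

lr = 0.05
for _ in range(20000):
    beta = (W3 @ W2 @ W1).ravel()     # end-to-end linear predictor
    g = X.T @ (X @ beta - y) / n      # gradient of the risk w.r.t. beta
    # chain rule: gradient of the risk w.r.t. each layer
    G1 = (W3 @ W2).T @ g[None, :]
    G2 = W3.T @ g[None, :] @ W1.T
    G3 = g[None, :] @ (W2 @ W1).T
    W1 -= lr * G1
    W2 -= lr * G2
    W3 -= lr * G3

beta = (W3 @ W2 @ W1).ravel()
beta_star = np.linalg.pinv(X) @ y     # minimal l2-norm least-squares solution
print("empirical risk:", 0.5 * np.mean((X @ beta - y) ** 2))
print("distance to min-norm solution:", np.linalg.norm(beta - beta_star))
```

With a small initialization scale the printed distance should be close to zero and shrinks further as the scale decreases; the pseudoinverse solution is the minimal ℓ2-norm minimizer of the empirical risk in this finite-sample setup, playing the role of the limit object described in the abstract.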

Companion code repository: https://github.com/lchizat/2022-wide-linear-nn