Self-consistent dynamical field theory of kernel evolution in wide neural networks
Publication: 6611447
DOI: 10.1088/1742-5468/AD01B0
MaRDI QID: Q6611447
Blake Bordelon, Cengiz Pehlevan
Publication date: 26 September 2024
Published in: Journal of Statistical Mechanics: Theory and Experiment
Cites Work
- Title not available in the MaRDI record (4 entries)
- Reaction models in stochastic field theory
- Statistical field theory for neural networks
- High-dimensional dynamics of generalization error in neural networks
- Cugliandolo-Kurchan equations for dynamics of spin-glasses
- Cubic regularization of Newton method and its global performance
- A mean field view of the landscape of two-layer neural networks
- The effective noise of stochastic gradient descent
- Unified field theoretical approach to deep and recurrent neuronal networks
- Trainability and Accuracy of Artificial Neural Networks: An Interacting Particle System Approach
- Out-of-equilibrium dynamical mean-field equations for the perceptron model
- Disentangling feature and lazy training in deep neural networks
- Wide neural networks of any depth evolve as linear models under gradient descent