Mean Field Analysis of Neural Networks: A Law of Large Numbers

From MaRDI portal
Publication: 5219306

DOI: 10.1137/18M1192184
zbMath: 1440.60008
arXiv: 1805.01053
OpenAlex: W3010825589
Wikidata: Q114847156
Scholia: Q114847156
MaRDI QID: Q5219306

Justin A. Sirignano, Konstantinos V. Spiliopoulos

Publication date: 11 March 2020

Published in: SIAM Journal on Applied Mathematics

Full work available at URL: https://arxiv.org/abs/1805.01053




Related Items (35)

Nonlocal cross-diffusion systems for multi-species populations and networks
Mehler's Formula, Branching Process, and Compositional Kernels of Deep Neural Networks
The Continuous Formulation of Shallow Neural Networks as Wasserstein-Type Gradient Flows
Deep learning: a statistical viewpoint
Surprises in high-dimensional ridgeless least squares interpolation
Two-Layer Neural Networks with Values in a Banach Space
Sparse optimization on measures with over-parameterized gradient descent
Mean Field Analysis of Deep Neural Networks
Asymptotics of Reinforcement Learning with Neural Networks
Large Sample Mean-Field Stochastic Optimization
A rigorous framework for the mean field limit of multilayer neural networks
A class of dimension-free metrics for the convergence of empirical measures
Sharp uniform-in-time propagation of chaos
Continuous limits of residual neural networks in case of large input data
Online parameter estimation for the McKean-Vlasov stochastic differential equation
A blob method for inhomogeneous diffusion with applications to multi-agent control and sampling
Non-mean-field Vicsek-type models for collective behavior
Stochastic gradient descent with noise of machine learning type. II: Continuous time analysis
Normalization effects on deep neural networks
Gradient descent on infinitely wide neural networks: global convergence and generalization
Unnamed Item
Landscape and training regimes in deep learning
Mean Field Limits for Interacting Diffusions with Colored Noise: Phase Transitions and Spectral Numerical Methods
Fast Non-mean-field Networks: Uniform in Time Averaging
Unnamed Item
A selective overview of deep learning
Reinforcement learning and stochastic optimisation
Normalization effects on shallow neural networks and related asymptotic expansions
Mean-field Langevin dynamics and energy landscape of neural networks
Supervised learning from noisy observations: combining machine-learning techniques with data assimilation
Propagation of chaos: a review of models, methods and applications. I: Models and methods
Propagation of chaos: a review of models, methods and applications. II: Applications
Asymptotic properties of one-layer artificial neural networks with sparse connectivity
Suboptimal Local Minima Exist for Wide Neural Networks with Smooth Activations
Representation formulas and pointwise properties for Barron functions


Uses Software


Cites Work


This page was built for publication: Mean Field Analysis of Neural Networks: A Law of Large Numbers