Mean field analysis of neural networks: a central limit theorem
From MaRDI portal
Publication:2301498
DOI: 10.1016/j.spa.2019.06.003 · zbMath: 1441.60022 · arXiv: 1808.09372 · OpenAlex: W2963791871 · MaRDI QID: Q2301498
Konstantinos V. Spiliopoulos, Justin A. Sirignano
Publication date: 24 February 2020
Published in: Stochastic Processes and their Applications
Full work available at URL: https://arxiv.org/abs/1808.09372
Mathematics Subject Classification:
- Central limit and other weak theorems (60F05)
- Interacting random processes; statistical mechanics type models; percolation theory (60K35)
- Functional limit theorems; invariance principles (60F17)
- Neural nets and related approaches to inference from stochastic processes (62M45)
Related Items (26)
- Machine learning from a continuous viewpoint. I
- Align, then memorise: the dynamics of learning with feedback alignment*
- Particle dual averaging: optimization of mean field neural network with global convergence rate analysis*
- Mean-field and kinetic descriptions of neural differential equations
- Mean Field Analysis of Deep Neural Networks
- Asymptotics of Reinforcement Learning with Neural Networks
- Unbiased Deep Solvers for Linear Parametric PDEs
- Training Neural Networks as Learning Data-adaptive Kernels: Provable Representation and Approximation Benefits
- Continuous limits of residual neural networks in case of large input data
- Unnamed Item
- A comparative analysis of optimization and generalization properties of two-layer neural network and random feature models under gradient descent dynamics
- High-dimensional limit theorems for SGD: Effective dynamics and critical scaling
- Benign Overfitting and Noisy Features
- Normalization effects on deep neural networks
- Mirror descent algorithms for minimizing interacting free energy
- Unnamed Item
- Optimization for deep learning: an overview
- Landscape and training regimes in deep learning
- Plateau Phenomenon in Gradient Descent Training of RELU Networks: Explanation, Quantification, and Avoidance
- Machine Learning and Computational Mathematics
- Analysis of a two-layer neural network via displacement convexity
- Linearized two-layers neural networks in high dimension
- Mean Field Analysis of Neural Networks: A Law of Large Numbers
- Normalization effects on shallow neural networks and related asymptotic expansions
- Dynamics of stochastic gradient descent for two-layer neural networks in the teacher–student setup*
- Asymptotic properties of one-layer artificial neural networks with sparse connectivity
Cites Work
- Unnamed Item
- Unnamed Item
- Fluctuation analysis for the loss from default
- Heterogeneous credit portfolios and the dynamics of the aggregate losses
- Lectures on empirical processes. Theory and statistical applications.
- Large portfolio losses: A dynamic contagion model
- Asymptotic dynamics, non-critical and critical fluctuations for a geometric long-range interacting model
- Distribution function inequalities for martingales
- Semigroups of conditioned shifts and approximation of Markov processes
- McKean-Vlasov limit for interacting random processes in random media.
- Smooth bump functions and the geometry of Banach spaces. A brief survey
- A stochastic McKean-Vlasov equation for absorbing diffusions on the half-line
- Large deviations and mean-field theory for asymmetric random recurrent neural networks
- Default clustering in large portfolios: typical events
- A Hilbertian approach for fluctuations on the McKean-Vlasov model
- Particle systems with a singular mean-field self-excitation. Application to neuronal networks
- Propagation of chaos in neural fields
- A stochastic evolution equation arising from the fluctuations of a class of interacting particle systems
- Fluctuations for mean-field interacting age-dependent Hawkes processes
- Mean-Field Limit of a Stochastic Particle System Smoothly Interacting Through Threshold Hitting-Times and Applications to Neural Networks with Dendritic Component
- A mean field view of the landscape of two-layer neural networks
- Mean Field Analysis of Deep Neural Networks
- Large portfolio asymptotics for loss from default
- Systemic Risk in Interbanking Networks
This page was built for publication: Mean field analysis of neural networks: a central limit theorem