Asymptotic properties of one-layer artificial neural networks with sparse connectivity
From MaRDI portal
Publication: 2105365
DOI: 10.1016/j.spl.2022.109698
OpenAlex: W3215286853
MaRDI QID: Q2105365
Matthias Neumann, Christian Hirsch, Volker Schmidt
Publication date: 8 December 2022
Published in: Statistics & Probability Letters
Full work available at URL: https://arxiv.org/abs/2112.00732
Keywords: weak convergence; law of large numbers; artificial neural network; stochastic gradient descent; random network; sparse connectivity
MSC classification: Geometric probability and stochastic geometry (60D05); Artificial neural networks and deep learning (68T07); Point processes (e.g., Poisson, Cox, Hawkes processes) (60G55)
Cites Work
- Principles and theory for data mining and machine learning
- Hedonic housing prices and the demand for clean air
- Reaction-diffusion models: from particle systems to SDE's
- Fluctuation theory in the Boltzmann-Grad limit
- Mean field analysis of neural networks: a central limit theorem
- A mean field view of the landscape of two-layer neural networks
- Trainability and Accuracy of Artificial Neural Networks: An Interacting Particle System Approach
- Mean Field Analysis of Neural Networks: A Law of Large Numbers