scientific article; zbMATH DE number 7625201
From MaRDI portal
Publication:5054655
Stephen J. Wright, Shi Chen, Qin Li, Zhiyan Ding
Publication date: 29 November 2022
Full work available at URL: https://arxiv.org/abs/2105.14417
Title: unavailable in the zbMATH Open web interface due to conflicting licenses (see the arXiv URL above for the full work).
Related Items (2)
- Stationary Density Estimation of Itô Diffusions Using Deep Learning
- Optimization problems for PDEs in weak space-time form. Abstracts from the workshop held March 5--10, 2023
Cites Work
- Machine learning from a continuous viewpoint. I
- Gradient descent optimizes over-parameterized deep ReLU networks
- Theoretical Insights Into the Optimization Landscape of Over-Parameterized Shallow Neural Networks
- A mean field view of the landscape of two-layer neural networks
- Mean Field Analysis of Deep Neural Networks
- Convex Formulation of Overparameterized Deep Neural Networks
- Gradient Descent with Identity Initialization Efficiently Learns Positive-Definite Linear Transformations by Deep Residual Networks
- Mean Field Analysis of Neural Networks: A Law of Large Numbers