Revisiting Landscape Analysis in Deep Neural Networks: Eliminating Decreasing Paths to Infinity
Publication: 5051381
DOI: 10.1137/19M1299074
OpenAlex: W4309184205
MaRDI QID: Q5051381
No author found.
Publication date: 23 November 2022
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1912.13472
MSC classification:
- Applications of mathematical programming (90C90)
- Nonconvex programming, global optimization (90C26)
- Nonlinear programming (90C30)
Cites Work
- A geometric analysis of phase retrieval
- Gradient descent optimizes over-parameterized deep ReLU networks
- Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval
- Phase Retrieval via Wirtinger Flow: Theory and Algorithms
- Interference Alignment Using Finite and Dependent Channel Extensions: The Single Beam Case
- Theoretical Insights Into the Optimization Landscape of Over-Parameterized Shallow Neural Networks
- A mean field view of the landscape of two-layer neural networks
- Convergence Results for Neural Networks via Electrodynamics
- Mean Field Analysis of Deep Neural Networks
- Effect of Depth and Width on Local Minima in Deep Learning
- Learning ReLU Networks on Linearly Separable Data: Algorithm, Optimality, and Generalization
- Matrix Completion From a Few Entries
- Convergence of a block coordinate descent method for nondifferentiable minimization