On the omnipresence of spurious local minima in certain neural network training problems
From MaRDI portal
Publication: 6629537
DOI: 10.1007/s00365-023-09658-w · MaRDI QID: Q6629537
Unnamed Author, Constantin Christof
Publication date: 30 October 2024
Published in: Constructive Approximation
Keywords: stability analysis; best approximation; Hadamard well-posedness; training problem; loss landscape; deep artificial neural network; local affine linearity; spurious local minimum
MSC classifications: Artificial neural networks and deep learning (68T07); Sensitivity, stability, well-posedness (49K40); Sensitivity, stability, parametric optimization (90C31); Variants of convex sets (star-shaped, \((m, n)\)-convex, etc.) (52A30)
Related Items (1)
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Lower bounds for approximation by MLP neural networks
- Topological properties of the set of functions generated by neural networks of fixed size
- On the nonuniqueness and instability of solutions of tracking-type optimal control problems
- Inverse and ill-posed problems. Theory and applications.
- Almost convex and Chebyshev sets
- On a Property of Metric Projections Onto Closed Subsets of Hilbert Spaces
- An Introduction to Banach Space Theory
- Training a Single Sigmoidal Neuron Is Hard
- On the Benefit of Width for Neural Networks: Disappearance of Basins
- Introduction to Functional Analysis
- Plateau Phenomenon in Gradient Descent Training of RELU Networks: Explanation, Quantification, and Avoidance
- Spurious Valleys in Two-layer Neural Network Optimization Landscapes
- Real Analysis
- Approximation by superpositions of a sigmoidal function
- Continuity of approximation by neural networks in \(L_p\) spaces