Global convergence in learning fully-connected ReLU networks via un-rectifying based on the augmented Lagrangian approach
From MaRDI portal
Publication:6536825
DOI: 10.1007/s10915-024-02548-8 | zbMATH Open: 1546.90213 | MaRDI QID: Q6536825
Ming-Yu Chung, Shih-Shuo Tung, Jinn Ho, Wen-Liang Hwang
Publication date: 14 May 2024
Published in: Journal of Scientific Computing
Cites Work
- On gradients of functions definable in o-minimal structures
- On semi- and subanalytic geometry
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- Global convergence of ADMM in nonconvex nonsmooth optimization
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- A proposal on machine learning via dynamical systems
- A Globally Convergent Augmented Lagrangian Algorithm for Optimization with General Constraints and Simple Bounds
- Optimization Methods for Large-Scale Machine Learning
- Un-Rectifying Non-Linear Networks for Signal Representation
- ADMM for multiaffine constrained optimization
- Learning representations by back-propagating errors
- On the Convergence of Alternating Direction Lagrangian Methods for Nonconvex Structured Optimization Problems
- A Stochastic Approximation Method