Convergence properties of proximal (sub)gradient methods without convexity or smoothness of any of the functions
From MaRDI portal
Publication: 6663109
DOI: 10.1137/23m1592158
MaRDI QID: Q6663109
Publication date: 14 January 2025
Published in: SIAM Journal on Optimization
Mathematics Subject Classification (MSC):
- Numerical mathematical programming methods (65K05)
- Nonlinear programming (90C30)
- Complementarity and equilibrium problems and variational inequalities (finite dimensions) (aspects of mathematical programming) (90C33)
- Methods of successive quadratic programming type (90C55)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Incremental gradient algorithms with stepsizes bounded away from zero
- Error stability properties of generalized gradient-type algorithms
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
- Stochastic subgradient method converges on tame functions
- The direct Lyapunov method in investigating the attraction of trajectories of finite-difference inclusions
- First-Order Methods in Optimization
- Convergence Rates of Proximal Gradient Methods via the Convex Conjugate
- Stochastic Model-Based Minimization of Weakly Convex Functions
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
- Optimization Methods for Large-Scale Machine Learning
- Prox-regular functions in variational analysis
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
- A Convergent Incremental Gradient Method with a Constant Step Size
- Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity
- Numerical optimization. Theoretical and practical aspects. Transl. from the French