Linearized proximal algorithms with adaptive stepsizes for convex composite optimization with applications
DOI: 10.1007/s00245-022-09957-x · OpenAlex: W4360985903 · MaRDI QID: Q2694483
Chong Li, Linglingzhi Zhu, Jin-Hua Wang, Yao-Hua Hu, Xiao Qi Yang
Publication date: 3 April 2023
Published in: Applied Mathematics and Optimization
Full work available at URL: https://doi.org/10.1007/s00245-022-09957-x
Keywords: quadratic convergence; convex composite optimization; convex inclusion problem; adaptive stepsize; linearized proximal algorithm
MSC classification: Numerical mathematical programming methods (65K05); Nonconvex programming, global optimization (90C26); Numerical methods based on nonlinear programming (49M37)
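The linearized proximal (prox-linear) scheme named in the title minimizes a convex composite objective f(x) = h(c(x)) by linearizing the smooth inner map c and adding a proximal term. A minimal one-dimensional sketch, assuming h = |·| so the subproblem has a closed form via soft-thresholding (the example problem c(x) = x² − 2, the backtracking rule, and all parameter names are illustrative assumptions, not the paper's exact method):

```python
import math

def prox_linear_abs(c, grad_c, x0, t0=1.0, rho=0.5, tol=1e-10, max_iter=100):
    """Minimize f(x) = |c(x)| for a smooth scalar c via linearized proximal
    steps with a simple backtracking (adaptive) stepsize.

    Each step solves, in closed form by soft-thresholding,
        d = argmin_d |c(x) + c'(x) d| + d^2 / (2 t).
    """
    x = x0
    for _ in range(max_iter):
        cv, g = c(x), grad_c(x)
        if abs(cv) < tol:
            break
        t = t0
        while True:
            # Substituting u = c(x) + g*d turns the subproblem into a
            # soft-thresholding of c(x) with threshold t * g^2.
            thr = t * g * g
            u = math.copysign(max(abs(cv) - thr, 0.0), cv)
            d = (u - cv) / g  # recover the step (assumes g != 0 here)
            # Accept t if the actual decrease matches a fraction rho of the
            # proximal model decrease; otherwise halve the stepsize.
            model_dec = abs(cv) - (abs(u) + d * d / (2 * t))
            if abs(c(x + d)) <= abs(cv) - rho * model_dec or t < 1e-12:
                break
            t *= 0.5
        x += d
    return x

# Illustrative use: |x^2 - 2| has a sharp minimum at sqrt(2).
root = prox_linear_abs(lambda x: x * x - 2.0, lambda x: 2.0 * x, x0=2.0)
```

On problems with weak sharp minima such as this one, the iterates settle onto the solution after only a few steps, which is the fast-convergence behavior the quadratic-convergence keyword refers to.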
Cites Work
- A proximal method for composite minimization
- Strong KKT conditions and weak sharp solutions in convex-composite optimization
- Methods of constructing certain stochastic matrices. II
- Error bounds in mathematical programming
- A Riemannian inexact Newton-CG method for constructing a nonnegative matrix with prescribed realizable spectrum
- A Gauss-Newton method for convex composite optimization
- Global error bounds for piecewise convex polynomials
- A nonsmooth version of Newton's method
- Optimization theory and methods. Nonlinear programming
- Inexact subgradient methods for quasi-convex optimization problems
- Methods of constructing certain stochastic matrices
- On Convergence Rates of Linearized Proximal Algorithms for Convex Composite Optimization with Applications
- Convergence of the Gauss–Newton Method for Convex Composite Optimization under a Majorant Condition
- Weak Sharp Minima in Mathematical Programming
- Structured inverse eigenvalue problems
- Weak Sharp Minima for Semi-infinite Optimization Problems with Applications
- Majorizing Functions and Convergence of the Gauss–Newton Method for Convex Composite Optimization
- Local properties of algorithms for minimizing nonsmooth composite functions
- Descent methods for composite nondifferentiable optimization problems
- First- and Second-Order Epi-Differentiability in Nonlinear Programming
- An Exact Penalization Viewpoint of Constrained Optimization
- Inverse Eigenvalue Problems
- Second-order Sufficiency and Quadratic Growth for Nonisolated Minima
- Weak Sharp Minima: Characterizations and Sufficient Conditions
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- On convergence of the Gauss-Newton method for convex composite optimization