Complexity of a Quadratic Penalty Accelerated Inexact Proximal Point Method for Solving Linearly Constrained Nonconvex Composite Programs
DOI: 10.1137/18M1171011 · zbMath: 1504.90101 · arXiv: 1802.03504 · Wikidata: Q127022372 · Scholia: Q127022372 · MaRDI QID: Q5237309
Weiwei Kong, Renato D. C. Monteiro, Jefferson G. Melo
Publication date: 17 October 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1802.03504
Keywords: iteration complexity · inexact proximal point method · quadratic penalty method · first-order accelerated gradient method · composite nonconvex program
MSC classification: Abstract computational complexity for mathematical programming problems (90C60) · Nonconvex programming, global optimization (90C26) · Nonlinear programming (90C30) · Numerical optimization and variational techniques (65K10) · Variational and other types of inclusions (47J22)
Cites Work
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- Gradient methods for minimizing composite functions
- Adaptive inexact fast augmented Lagrangian methods for constrained convex optimization
- A block coordinate variable metric forward-backward algorithm
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- Introductory lectures on convex optimization. A basic course.
- A hybrid approximate extragradient-proximal point algorithm using the enlargement of a maximal monotone operator
- Structured nonconvex and nonsmooth optimization: algorithms and iteration complexity analysis
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Iteration-complexity of first-order penalty methods for convex programming
- Alternating forward-backward splitting for linearly constrained optimization problems
- Efficiency of minimizing compositions of convex functions and smooth maps
- Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- An Accelerated Hybrid Proximal Extragradient Method for Convex Optimization and Its Implications to Second-Order Methods
- A First-Order Smoothed Penalty Method for Compressed Sensing
- On the Complexity of the Hybrid Proximal Extragradient Method for the Iterates and the Ergodic Mean
- Accelerating Block-Decomposition First-Order Methods for Solving Composite Saddle-Point and Two-Player Nash Equilibrium Problems
- An Accelerated HPE-Type Algorithm for a Class of Composite Convex-Concave Saddle-Point Problems
- Accelerated Methods for Nonconvex Optimization
- An accelerated non-Euclidean hybrid proximal extragradient-type algorithm for convex–concave saddle-point problems
- Complexity of first-order inexact Lagrangian and penalty methods for conic convex programming
- An Accelerated Linearized Alternating Direction Method of Multipliers
- Iteration-complexity of first-order augmented Lagrangian methods for convex programming