Generalized Nesterov's accelerated proximal gradient algorithms with convergence rate of order \(o(1/k^2)\)
From MaRDI portal
Publication:2082553
DOI: 10.1007/S10589-022-00401-Y
zbMath: 1502.90134
OpenAlex: W4292651679
MaRDI QID: Q2082553
Publication date: 4 October 2022
Published in: Computational Optimization and Applications
Full work available at URL: https://doi.org/10.1007/s10589-022-00401-y
convex optimization; subdifferential; forward-backward method; proximal mapping; Nesterov accelerated gradient method
Numerical mathematical programming methods (65K05) Convex programming (90C25) Nonsmooth analysis (49J52) Numerical methods based on nonlinear programming (49M37)
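The keywords above (forward-backward method, proximal mapping, Nesterov acceleration) refer to the accelerated proximal gradient scheme that this paper generalizes. As background only, here is a minimal sketch of the classical FISTA-style iteration for the composite problem \(\min_x \tfrac12\|Ax-b\|^2 + \lambda\|x\|_1\); this is the standard method with the usual momentum rule, not the paper's specific generalized variant, and all names (`fista`, `soft_threshold`) are illustrative.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal mapping of t * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(A, b, lam, n_iter=500):
    """Accelerated proximal gradient (FISTA) sketch for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    # Lipschitz constant of the gradient of the smooth part: ||A||_2^2.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)                        # forward (gradient) step
        x_new = soft_threshold(y - grad / L, lam / L)   # backward (proximal) step
        t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2        # Nesterov momentum parameter
        y = x_new + ((t - 1) / t_new) * (x_new - x)     # extrapolation step
        x, t = x_new, t_new
    return x
```

The classical analysis gives an \(O(1/k^2)\) rate in function values for this iteration; the paper's contribution is a generalized scheme with the sharper little-o rate \(o(1/k^2)\).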
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Optimized first-order methods for smooth convex minimization
- Gradient methods for minimizing composite functions
- Universal gradient methods for convex optimization problems
- Convergence of inertial dynamics and proximal algorithms governed by maximally monotone operators
- Fast convergence of inertial dynamics and algorithms with asymptotic vanishing viscosity
- Linear convergence of first order methods for non-strongly convex optimization
- The Rate of Convergence of Nesterov's Accelerated Forward-Backward Method is Actually Faster Than $1/k^2$
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- An Optimal First Order Method Based on Optimal Quadratic Averaging
- An iterative thresholding algorithm for linear inverse problems with a sparsity constraint
- Convex programming in Hilbert space
- Weak convergence of the sequence of successive approximations for nonexpansive mappings
- Convex analysis and monotone operator theory in Hilbert spaces