Accelerated first-order methods for large-scale convex optimization: nearly optimal complexity under strong convexity
DOI: 10.1007/s00186-019-00674-w · zbMath: 1426.90196 · OpenAlex: W2953995000 · Wikidata: Q127616544 (Scholia: Q127616544) · MaRDI QID: Q2311123
Publication date: 10 July 2019
Published in: Mathematical Methods of Operations Research
Full work available at URL: https://doi.org/10.1007/s00186-019-00674-w
Keywords: strong convexity; optimal complexity; structured nonsmooth convex optimization; estimation sequence; first-order black-box oracle
MSC classifications: Numerical mathematical programming methods (65K05); Convex programming (90C25); Abstract computational complexity for mathematical programming problems (90C60); Numerical methods based on nonlinear programming (49M37)
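As a point of reference for the complexity claim in the title (not stated on this page, but standard in the first-order black-box oracle model of the cited Introductory lectures on convex optimization): for an \(L\)-smooth, \(\mu\)-strongly convex objective, the optimal number of first-order oracle calls to reach an \(\varepsilon\)-accurate solution is

\[
O\!\left(\sqrt{L/\mu}\,\log\frac{1}{\varepsilon}\right),
\]

a rate attained by accelerated gradient methods; "nearly optimal" in the title presumably means matching this benchmark up to lower-order factors.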
Cites Work
- Unnamed Item
- Unnamed Item
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- OSGA: a fast subgradient algorithm with optimal complexity
- Gradient methods for minimizing composite functions
- First-order methods of smooth convex optimization with inexact oracle
- An optimal method for stochastic composite optimization
- Universal gradient methods for convex optimization problems
- Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming
- Introductory lectures on convex optimization. A basic course.
- Optimal subgradient algorithms for large-scale convex optimization in simple domains
- Conditional gradient type methods for composite nonlinear and stochastic optimization
- Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\)
- Templates for convex cone problems with applications to sparse signal recovery
- Complexity bounds for primal-dual methods minimizing the model of objective function
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- Fine tuning Nesterov's steepest descent algorithm for differentiable convex programming
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- On efficiency of nonmonotone Armijo-type line searches
- Optimal subgradient methods: computational properties for large-scale linear inverse problems
- Generalized uniformly optimal methods for nonlinear programming
- An optimal subgradient algorithm for large-scale bound-constrained convex optimization
- An inexact line search approach using modified nonmonotone strategy for unconstrained optimization
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- A simple nearly optimal restart scheme for speeding up first-order methods
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
- Optimal methods of smooth convex minimization
- Solving Ill-Conditioned and Singular Linear Systems: A Tutorial on Regularization
- Introduction to Numerical Analysis
- Sparse Reconstruction by Separable Approximation
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Optimal Primal-Dual Methods for a Class of Saddle Point Problems
- An Accelerated Linearized Alternating Direction Method of Multipliers
- Excessive Gap Technique in Nonsmooth Convex Minimization
- An Optimal Algorithm for Constrained Differentiable Convex Optimization
- Interior Gradient and Proximal Methods for Convex and Conic Optimization
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- An acceleration procedure for optimal first-order methods
- A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima