A family of subgradient-based methods for convex optimization problems in a unifying framework
DOI: 10.1080/10556788.2016.1182165 · zbMath: 1355.90065 · arXiv: 1403.6526 · OpenAlex: W2275286187 · MaRDI QID: Q2829570
Publication date: 8 November 2016
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1403.6526
Keywords: complexity bounds; non-smooth/smooth convex optimization; structured convex optimization; subgradient/gradient-based proximal method; mirror-descent method; dual-averaging method
MSC classifications: Analysis of algorithms and problem complexity (68Q25); Convex programming (90C25); Numerical methods based on nonlinear programming (49M37); Methods of reduced gradient type (90C52)
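As background for the keyword list above, the sketch below illustrates a generic subgradient method for a nonsmooth convex function; it is a minimal textbook variant with diminishing steps, not the specific unified family proposed in the paper, and the test function and step-size rule are illustrative choices.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, steps=2000):
    """Plain subgradient method with diminishing steps t_k = 1/sqrt(k+1).

    Subgradient methods are not descent methods, so we track and return
    the best iterate seen rather than the last one.
    """
    x = x0.copy()
    best_x, best_f = x.copy(), f(x)
    for k in range(steps):
        g = subgrad(x)                 # any element of the subdifferential
        x = x - g / np.sqrt(k + 1)     # diminishing, non-summable step size
        fx = f(x)
        if fx < best_f:
            best_f, best_x = fx, x.copy()
    return best_x, best_f

# Example: minimize the nonsmooth f(x) = ||x - c||_1, whose minimizer is x = c.
c = np.array([1.0, -2.0, 0.5])
f = lambda x: np.sum(np.abs(x - c))
subgrad = lambda x: np.sign(x - c)     # a valid subgradient of the l1 term
x_star, f_star = subgradient_method(f, subgrad, np.zeros(3))
```

With this step-size rule the best objective value converges at the classical O(1/sqrt(k)) rate for nonsmooth convex problems; the cited works study sharper rates for structured and smoothed variants.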
Related Items (3)
Cites Work
- Primal-dual subgradient methods for convex problems
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- First-order methods of smooth convex optimization with inexact oracle
- An optimal method for stochastic composite optimization
- Universal gradient methods for convex optimization problems
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- Quasi-monotone subgradient methods for nonsmooth convex minimization
- Smoothing and First Order Methods: A Unified Framework
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
- Robust Stochastic Approximation Approach to Stochastic Programming
- A generalized proximal point algorithm for certain non-convex minimization problems
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- On Stochastic Subgradient Mirror-Descent Algorithm with Weighted Averaging
- Excessive Gap Technique in Nonsmooth Convex Minimization
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization, II: Shrinking Procedures and Optimal Algorithms
- Interior Gradient and Proximal Methods for Convex and Conic Optimization