An incremental mirror descent subgradient algorithm with random sweeping and proximal step
From MaRDI portal
Publication:4613984
DOI: 10.1080/02331934.2018.1482491
zbMath: 1405.90099
OpenAlex: W2808520903
Wikidata: Q64327694 · Scholia: Q64327694
MaRDI QID: Q4613984
Publication date: 28 January 2019
Published in: Optimization
Full work available at URL: https://doi.org/10.1080/02331934.2018.1482491
nonsmooth convex minimization; global rate of convergence; incremental mirror descent algorithm; random sweeping
Convex programming (90C25) Large-scale problems in mathematical programming (90C06) Applications of mathematical programming (90C90)
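The keywords above name the ingredients of the method: mirror descent subgradient steps applied incrementally to a sum of nonsmooth convex components, with the order of components chosen at random ("random sweeping"). As a rough illustration only — not the algorithm of the cited paper, whose proximal step and step-size rules differ — here is a minimal sketch of incremental mirror descent on the probability simplex with an entropic mirror map, where the component order is randomly permuted at each outer pass; all function and parameter names are illustrative:

```python
import numpy as np

def incremental_entropic_md(subgrads, x0, passes, step_size, seed=None):
    """Sketch of incremental mirror descent on the probability simplex.

    subgrads: list of callables; subgrads[i](x) returns a subgradient
    of the i-th convex component at x. Each outer pass visits the
    components in a freshly randomized order (random sweeping).
    The entropic mirror map turns each step into a multiplicative
    update followed by renormalization onto the simplex.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(passes):
        for i in rng.permutation(len(subgrads)):
            g = subgrads[i](x)
            # entropic mirror step: x <- x * exp(-eta * g), renormalized
            x = x * np.exp(-step_size * g)
            x /= x.sum()
    return x

# Toy usage: minimize f(x) = <c1, x> + <c2, x> over the simplex.
# The minimizer puts all mass on the coordinate with the smallest
# total cost c1 + c2 = [1, 3, 5], i.e. coordinate 0.
c1, c2 = np.array([1.0, 2.0, 3.0]), np.array([0.0, 1.0, 2.0])
x = incremental_entropic_md(
    [lambda x: c1, lambda x: c2],
    x0=np.full(3, 1 / 3), passes=50, step_size=0.1, seed=0,
)
```

For linear components the subgradients are constant, so the iterates concentrate geometrically on the best coordinate; for general nonsmooth components a diminishing step size is the standard choice.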
Cites Work
- Primal-dual subgradient methods for convex problems
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- Nonsmooth steepest descent method by proximal subdifferentials in Hilbert spaces
- On proximal subgradient splitting method for minimizing the sum of two nonsmooth convex functions
- The Ordered Subsets Mirror Descent Optimization Method with Applications to Tomography
- Incremental Subgradient Methods for Nondifferentiable Optimization
- Online Learning and Online Convex Optimization
- Barrier Operators and Associated Gradient-Like Dynamical Systems for Constrained Minimization Problems
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- Stochastic Quasi-Fejér Block-Coordinate Fixed Point Iterations with Random Sweeping
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications