Fastest rates for stochastic mirror descent methods
Publication: 2044496
DOI: 10.1007/s10589-021-00284-5
zbMath: 1473.90113
arXiv: 1803.07374
OpenAlex: W3168993225
MaRDI QID: Q2044496
Filip Hanzely, Peter Richtárik
Publication date: 9 August 2021
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1803.07374
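As background, a minimal sketch of a generic stochastic mirror descent step (the notation below is illustrative and is not necessarily the exact scheme analyzed in this paper): given a mirror map \(h\) with Bregman divergence \(D_h(x,y) = h(x) - h(y) - \langle \nabla h(y),\, x - y\rangle\), a stepsize \(\gamma_k > 0\), and an unbiased stochastic gradient \(g_k\) of the objective at \(x_k\), the iterate is updated over the feasible set \(X\) as
\[
x_{k+1} \in \operatorname*{arg\,min}_{x \in X} \big\{ \gamma_k \langle g_k, x \rangle + D_h(x, x_k) \big\}.
\]
With \(h(x) = \tfrac{1}{2}\|x\|_2^2\) this reduces to projected stochastic gradient descent, since then \(D_h(x,y) = \tfrac{1}{2}\|x - y\|_2^2\).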
Related Items (6)
- Bregman proximal gradient algorithms for deep matrix factorization
- Adaptivity of Stochastic Gradient Methods for Nonconvex Optimization
- Stochastic composition optimization of functions without Lipschitz continuous gradient
- Stochastic incremental mirror descent algorithms with Nesterov smoothing
- A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems with application to symmetric nonnegative matrix tri-factorization
- Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity
Uses Software
Cites Work
- Parallel coordinate descent methods for big data optimization
- On optimal probabilities in stochastic coordinate descent methods
- Proportional response dynamics in the Fisher market
- Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming
- Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems
- Introductory lectures on convex optimization. A basic course.
- Mirror descent and nonlinear projected subgradient methods for convex optimization.
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- MM Optimization Algorithms
- Coordinate descent with arbitrary sampling I: algorithms and complexity
- Coordinate descent with arbitrary sampling II: expected separable overapproximation
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization
- Robust Stochastic Approximation Approach to Stochastic Programming
- Image deblurring with Poisson data: from cells to galaxies
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
- On the complexity of parallel coordinate descent
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- On Stochastic Subgradient Mirror-Descent Algorithm with Weighted Averaging
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- Understanding Machine Learning
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications
- A Stochastic Approximation Method