Unifying framework for accelerated randomized methods in convex optimization
DOI: 10.1007/978-3-031-30114-8_15
arXiv: 1707.08486
OpenAlex: W3112239486
MaRDI QID: Q6200198
Authors: Vladimir Zholobov, A. V. Gasnikov, Pavel Dvurechensky, Alexander Tyurin
Publication date: 22 March 2024
Published in: Foundations of Modern Statistics
Full work available at URL: https://arxiv.org/abs/1707.08486
Keywords: complexity; convex optimization; inexact oracle; first-order methods; zero-order methods; accelerated gradient descent methods; accelerated random block-coordinate descent; accelerated random derivative-free method; accelerated random directional search
Cites Work
- Inexact coordinate descent: complexity and preconditioning
- First-order methods of smooth convex optimization with inexact oracle
- Gradient-free proximal methods with inexact oracle for convex stochastic nonsmooth optimization problems on the simplex
- Lectures on convex optimization
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
- Universal method for stochastic composite optimization problems
- An optimal randomized incremental gradient method
- An accelerated directional derivative method for smooth stochastic convex optimization
- Oracle complexity separation in convex optimization
- A theoretical and empirical comparison of gradient approximations in derivative-free optimization
- Zeroth-order methods for noisy Hölder-gradient functions
- Gradient methods for problems with inexact model of the objective
- Accelerated and unaccelerated stochastic gradient descent in model generality
- Fast gradient descent for convex minimization problems with an oracle producing a \((\delta, L)\)-model of function at the requested point
- Accelerated gradient-free optimization methods with a non-Euclidean proximal operator
- Stochastic online optimization. Single-point and multi-point non-linear multi-armed bandits. Convex and strongly-convex case
- Random gradient-free minimization of convex functions
- Stochastic intermediate gradient method for convex optimization problems
- A stable alternative to Sinkhorn's algorithm for regularized optimal transport
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
- Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization
- Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
- Optimal Rates for Zero-Order Convex Optimization: The Power of Two Function Evaluations
- Smooth Optimization with Approximate Gradient
- Accelerated, Parallel, and Proximal Coordinate Descent
- Introduction to Derivative-Free Optimization
- Katyusha: the first direct acceleration of stochastic gradient methods
- Global Convergence Rate Analysis of a Generic Line Search Algorithm with Noise
- An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization
- Derivative-free optimization methods
- On Solving Large-Scale Polynomial Convex Problems by Randomized First-Order Algorithms
- An Optimal Algorithm for Bandit and Zero-Order Convex Optimization with Two-Point Feedback
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Inexact model: a framework for optimization and variational inequalities
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization