An Accelerated Method for Derivative-Free Smooth Stochastic Convex Optimization
From MaRDI portal
Publication:5081777
DOI: 10.1137/19M1259225 | zbMath: 1494.90058 | arXiv: 1802.09022 | OpenAlex: W2788216258 | MaRDI QID: Q5081777
Pavel Dvurechensky, Eduard Gorbunov, Alexander V. Gasnikov
Publication date: 17 June 2022
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1802.09022
Keywords: acceleration, smoothness, derivative-free optimization, zeroth-order optimization, stochastic convex optimization
MSC classification: Convex programming (90C25); Derivative-free methods and methods using generalized derivatives (90C56); Stochastic programming (90C15)
Related Items
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
- Accelerated gradient methods with absolute and relative noise in the gradient
- First-order methods for convex optimization
- Unifying framework for accelerated randomized methods in convex optimization
- Recent theoretical advances in decentralized distributed convex optimization
- Recent Theoretical Advances in Non-Convex Optimization
- Stochastic Three Points Method for Unconstrained Smooth Minimization
- Derivative-free optimization methods
Cites Work
- Smooth minimization of non-smooth functions
- First-order methods of smooth convex optimization with inexact oracle
- An optimal method for stochastic composite optimization
- Gradient-free proximal methods with inexact oracle for convex stochastic nonsmooth optimization problems on the simplex
- Stochastic intermediate gradient method for convex problems with stochastic inexact oracle
- Optimal order of accuracy of search algorithms in stochastic optimization
- Algorithms for approximate calculation of the minimum of a convex function from its values
- Introductory lectures on convex optimization. A basic course.
- Accelerated randomized stochastic optimization.
- Gradient-free two-point methods for solving stochastic nonsmooth convex optimization problems with small non-random noises
- An accelerated directional derivative method for smooth stochastic convex optimization
- A theoretical and empirical comparison of gradient approximations in derivative-free optimization
- Zeroth-order methods for noisy Hölder-gradient functions
- Accelerated gradient-free optimization methods with a non-Euclidean proximal operator
- Stochastic online optimization. Single-point and multi-point non-linear multi-armed bandits. Convex and strongly-convex case
- Random gradient-free minimization of convex functions
- Stochastic intermediate gradient method for convex optimization problems
- Lectures on Modern Convex Optimization
- Optimization of Convex Functions with Random Pursuit
- Optimal Rates for Zero-Order Convex Optimization: The Power of Two Function Evaluations
- Introduction to Derivative-Free Optimization
- Robust Stochastic Approximation Approach to Stochastic Programming
- Introduction to Stochastic Search and Optimization
- Derivative-Free Optimization of Noisy Functions via Quasi-Newton Methods
- About the Power Law of the PageRank Vector Component Distribution. Part 2. The Buckley–Osthus Model, Verification of the Power Law for This Model, and Setup of Real Search Engines
- Parallel Algorithms and Probability of Large Deviation for Stochastic Convex Optimization Problems
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
- Gradient-Free Methods with Inexact Oracle for Convex-Concave Stochastic Saddle-Point Problem
- Kernel-based methods for bandit convex optimization
- Derivative-free optimization methods
- Stochastic Convex Optimization with Bandit Feedback
- An Optimal Algorithm for Bandit and Zero-Order Convex Optimization with Two-Point Feedback
- Regret Analysis of Stochastic and Nonstochastic Multi-armed Bandit Problems
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Stable signal recovery from incomplete and inaccurate measurements
- Stochastic Approximation of Minima with Improved Asymptotic Speed
- Universal intermediate gradient method for convex problems with inexact oracle
- Compressed sensing
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization