Analysis of biased stochastic gradient descent using sequential semidefinite programs
DOI: 10.1007/s10107-020-01486-1
zbMath: 1465.90052
arXiv: 1711.00987
OpenAlex: W3013479538
MaRDI QID: Q2020610
Laurent Lessard, Bin Hu, Peter Seiler
Publication date: 23 April 2021
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1711.00987
Keywords: convergence rates; convex optimization; first-order methods; biased stochastic gradient; robustness to inexact gradient
MSC: Analysis of algorithms and problem complexity (68Q25); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Nonlinear programming (90C30); Stochastic programming (90C15); Stochastic approximation (62L20)
Cites Work
- First-order methods of smooth convex optimization with inexact oracle
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Minimizing finite sums with the stochastic average gradient
- On the worst-case complexity of the gradient method with exact line search for smooth strongly convex functions
- Performance of first-order methods for smooth convex minimization: a novel approach
- Guaranteed Matrix Completion via Non-Convex Factorization
- Graph Implementations for Nonsmooth Convex Programs
- Large-Scale Machine Learning with Stochastic Gradient Descent
- Smooth Optimization with Approximate Gradient
- Analysis and Design of Optimization Algorithms via Integral Quadratic Constraints
- Optimization Methods for Large-Scale Machine Learning
- Information-Theoretic Lower Bounds on the Oracle Complexity of Stochastic Convex Optimization
- Exact Worst-Case Performance of First-Order Methods for Composite Convex Optimization
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- A Stochastic Approximation Method
- Stochastic gradient descent, weighted sampling, and the randomized Kaczmarz algorithm