Asynchronous variance-reduced block schemes for composite non-convex stochastic optimization: block-specific steplengths and adapted batch-sizes
From MaRDI portal
Publication:5038180
DOI: 10.1080/10556788.2020.1746963
zbMath: 1501.90056
arXiv: 1808.02543
OpenAlex: W3017142145
MaRDI QID: Q5038180
Publication date: 29 September 2022
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1808.02543
Keywords: stochastic gradient methods; non-smooth non-convex stochastic optimization; asynchronous variance-reduced scheme; limited block coordination; rate and complexity
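To make the keywords concrete, here is a minimal sketch of the kind of scheme they describe: a block proximal stochastic gradient loop combining block-specific steplengths with growing ("adapted") mini-batch sizes as the variance-reduction device. The toy composite problem, the constants, and the batch schedule below are assumptions for illustration only, not the paper's actual algorithm.

```python
import numpy as np

# Toy composite problem: minimize 0.5*||A x - b||^2 + lam*||x||_1, where the
# gradient of the smooth part is observed under zero-mean noise whose
# variance shrinks like 1/batch (mimicking a mini-batch average).
rng = np.random.default_rng(0)
n, n_blocks = 8, 2
blocks = np.split(np.arange(n), n_blocks)   # two coordinate blocks
A = rng.standard_normal((n, n))
b = rng.standard_normal(n)
lam = 0.1

def noisy_grad(x, batch):
    """Mini-batch gradient estimate: true gradient + averaged noise."""
    g = A.T @ (A @ x - b)
    return g + rng.standard_normal(n) / np.sqrt(batch)  # variance ~ 1/batch

def prox_l1(v, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(n)
steps = [0.01, 0.005]                 # block-specific steplengths (assumed)
for k in range(200):
    batch = k + 1                     # adapted (growing) mini-batch size
    i = rng.integers(n_blocks)        # block chosen at random
    g = noisy_grad(x, batch)
    idx = blocks[i]
    # Proximal stochastic gradient step restricted to the chosen block:
    x[idx] = prox_l1(x[idx] - steps[i] * g[idx], steps[i] * lam)
```

The growing batch size drives the gradient-estimate variance to zero without decaying the steplengths, which is what allows per-block constant steps of different sizes.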
Related Items
- A dual-based stochastic inexact algorithm for a class of stochastic nonsmooth convex composite problems
- A unified convergence analysis of stochastic Bregman proximal gradient and extragradient methods
Uses Software
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Accelerated gradient methods for nonconvex nonlinear and stochastic programming
- A minimization method for the sum of a convex function and a continuously differentiable function
- Coordinate-friendly structures, algorithms and applications
- On stochastic mirror-prox algorithms for stochastic Cartesian variational inequalities: randomized block coordinate and optimal averaging schemes
- On variance reduction for stochastic smooth convex optimization with multiplicative noise
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Convergence of stochastic proximal gradient algorithm
- Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Distributed Algorithms for Aggregative Games on Graphs
- A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization
- A Block Coordinate Descent Method for Regularized Multiconvex Optimization with Applications to Nonnegative Tensor Factorization and Completion
- Proximal Splitting Methods in Signal Processing
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization
- Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization
- Accelerated, Parallel, and Proximal Coordinate Descent
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Monotone Operators and the Proximal Point Algorithm
- Convergence Rates in Forward--Backward Splitting
- Scalable Solvers of Random Quadratic Equations via Stochastic Truncated Amplitude Flow
- A Modified Forward-Backward Splitting Method for Maximal Monotone Mappings
- Distributed Variable Sample-Size Gradient-Response and Best-Response Schemes for Stochastic Nash Equilibrium Problems
- Asynchronous Schemes for Stochastic and Misspecified Potential Games and Nonconvex Optimization
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization