Block coordinate type methods for optimization and learning
From MaRDI portal
Publication: 5889894
DOI: 10.1142/S021953052250018X
OpenAlex: W4310002330
MaRDI QID: Q5889894
Publication date: 27 April 2023
Published in: Analysis and Applications
Full work available at URL: https://doi.org/10.1142/s021953052250018x
Keywords: stochastic optimization; zeroth-order method; coordinate descent; mirror descent; regularized learning; conditional gradient
Mathematics Subject Classification: Analysis of algorithms (68W40), Nonconvex programming, global optimization (90C26), Number-theoretic algorithms; complexity (11Y16), Optimization problems in optics and electromagnetic theory (78M50)
Cites Work
- An optimal method for stochastic composite optimization
- On the complexity analysis of randomized block-coordinate descent methods
- A coordinate gradient descent method for nonsmooth separable minimization
- Moreau envelope augmented Lagrangian method for nonconvex optimization with linear constraints
- Distributed kernel gradient descent algorithm for minimum error entropy principle
- Convergence of online mirror descent
- Random gradient-free minimization of convex functions
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Differentially private SGD with non-smooth losses
- Iterative Regularization for Learning with Convex Loss Functions
- Conditional Gradient Sliding for Convex Optimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Stochastic Block Mirror Descent Methods for Nonsmooth and Stochastic Optimization
- The Cyclic Block Conditional Gradient Method for Convex Optimization Problems
- Robust Stochastic Approximation Approach to Stochastic Programming
- Distributed Randomized Gradient-Free Mirror Descent Algorithm for Constrained Optimization
- Regularization schemes for minimum error entropy principle
- Analysis of Online Composite Mirror Descent Algorithm
- On the Convergence of Block Coordinate Descent Type Methods
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming
- Mini-batch stochastic approximation methods for nonconvex stochastic composite optimization