A generic coordinate descent solver for non-smooth convex optimisation
From MaRDI portal
Publication:5865339
DOI: 10.1080/10556788.2019.1658758
zbMath: 1494.90077
arXiv: 1812.00628
OpenAlex: W2970475689
Wikidata: Q127312507 (Scholia: Q127312507)
MaRDI QID: Q5865339
Publication date: 13 June 2022
Published in: Optimization Methods and Software
Full work available at URL: https://arxiv.org/abs/1812.00628
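The publication concerns a generic coordinate descent solver for composite problems of the form f(x) + Σ_j g_j(x_j), where f is smooth and the g_j are non-smooth but proximable. As a hedged illustration of the basic technique (this is a textbook randomized proximal coordinate descent for the lasso, not the paper's solver or its API), the core update replaces one coordinate at a time using its partial gradient and the soft-thresholding proximal operator:

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * |.| (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_coordinate_descent(A, b, lam, n_epochs=100, seed=0):
    """Randomized proximal coordinate descent for the lasso:
        min_x 0.5 * ||A x - b||^2 + lam * ||x||_1.
    Each step updates one coordinate with step size 1/L_j,
    where L_j = ||A[:, j]||^2 is the coordinate-wise Lipschitz constant.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    r = A @ x - b                 # residual, maintained incrementally
    L = (A ** 2).sum(axis=0)      # coordinate-wise Lipschitz constants
    for _ in range(n_epochs * d):
        j = rng.integers(d)       # uniform coordinate sampling
        if L[j] == 0.0:
            continue              # column of zeros: coordinate inactive
        g = A[:, j] @ r           # partial gradient along coordinate j
        x_new = soft_threshold(x[j] - g / L[j], lam / L[j])
        if x_new != x[j]:
            r += (x_new - x[j]) * A[:, j]   # cheap residual update
            x[j] = x_new
    return x
```

Maintaining the residual `r` incrementally is what makes each coordinate step O(n) rather than O(nd); the cited works on randomized and parallel coordinate descent build on exactly this kind of cheap per-coordinate update.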
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Unnamed Item
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Parallel coordinate descent methods for big data optimization
- On optimal probabilities in stochastic coordinate descent methods
- Dual coordinate descent methods for logistic regression and maximum entropy models
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- On the complexity analysis of randomized block-coordinate descent methods
- A coordinate gradient descent method for nonsmooth separable minimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- On the convergence of the coordinate descent method for convex differentiable minimization
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- Accelerated primal-dual proximal block coordinate updating methods for constrained convex optimization
- On the convergence of the block nonlinear Gauss-Seidel method under convex constraints
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Randomized primal-dual proximal block coordinate updates
- Smooth minimization of nonsmooth functions with parallel coordinate descent methods
- Coordinate descent algorithms
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Pathwise coordinate optimization
- Distributed Coordinate Descent Method for Learning with Big Data
- CVXPY: A Python-Embedded Modeling Language for Convex Optimization
- A Unified Convergence Analysis of Block Successive Minimization Methods for Nonsmooth Optimization
- A Block Coordinate Descent Method for Regularized Multiconvex Optimization with Applications to Nonnegative Tensor Factorization and Completion
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- On the Iteration Complexity of Cyclic Coordinate Gradient Descent Methods
- SparseNet: Coordinate Descent With Nonconvex Penalties
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- A Block Successive Upper-Bound Minimization Method of Multipliers for Linearly Constrained Convex Optimization
- Accelerated, Parallel, and Proximal Coordinate Descent
- A Smooth Primal-Dual Optimization Framework for Nonsmooth Composite Convex Minimization
- A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
- A New Randomized Block-Coordinate Primal-Dual Proximal Algorithm for Distributed Optimization
- Minimizing Certain Convex Functions
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- On the Convergence of Block Coordinate Descent Type Methods
- Stochastic Quasi-Fejér Block-Coordinate Fixed Point Iterations with Random Sweeping
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization