Accelerated, Parallel, and Proximal Coordinate Descent
Publication: 3449571
DOI: 10.1137/130949993 · zbMath: 1327.65108 · arXiv: 1312.5799 · OpenAlex: W1947202642 · Wikidata: Q61661693 · Scholia: Q61661693 · MaRDI QID: Q3449571
Peter Richtárik, Olivier Fercoq
Publication date: 4 November 2015
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1312.5799
Keywords: complexity, convex optimization, acceleration, parallel methods, big data, partial separability, proximal methods, randomized coordinate descent
MSC classification: Numerical mathematical programming methods (65K05) · Convex programming (90C25) · Parallel numerical computation (65Y05) · Randomized algorithms (68W20)
Related Items (71)
- A new randomized primal-dual algorithm for convex optimization with fast last iterate convergence rates
- On the optimal order of worst case complexity of direct search
- An attention algorithm for solving large scale structured \(l_0\)-norm penalty estimation problems
- Asynchronous variance-reduced block schemes for composite non-convex stochastic optimization: block-specific steplengths and adapted batch-sizes
- Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
- An accelerated coordinate gradient descent algorithm for non-separable composite optimization
- Oracle complexity separation in convex optimization
- On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- A flexible coordinate descent method
- Randomized Iterative Methods for Linear Systems
- MAGMA: Multilevel Accelerated Gradient Mirror Descent Algorithm for Large-Scale Convex Composite Minimization
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
- A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization
- On the Convergence of Stochastic Primal-Dual Hybrid Gradient
- Nearly linear-time packing and covering LP solvers, achieving width-independence and \(O(1/\varepsilon)\)-convergence
- A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications
- Accelerated First-Order Methods for Convex Optimization with Locally Lipschitz Continuous Gradient
- Accelerating block coordinate descent methods with identification strategies
- A random block-coordinate Douglas-Rachford splitting method with low computational complexity for binary logistic regression
- Cyclic Coordinate Dual Averaging with Extrapolation
- Accelerated randomized mirror descent algorithms for composite non-strongly convex optimization
- Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
- Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
- A modified stochastic quasi-Newton algorithm for summing functions problem in machine learning
- On the convergence analysis of asynchronous SGD for solving consistent linear systems
- Additive Schwarz Methods for Convex Optimization as Gradient Methods
- Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory
- First-order methods for convex optimization
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Local linear convergence of proximal coordinate descent algorithm
- Variance reduction for root-finding problems
- Unifying framework for accelerated randomized methods in convex optimization
- Adaptive Catalyst for Smooth Convex Optimization
- Primal-dual block-proximal splitting for a class of non-convex problems
- Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
- A Randomized Exchange Algorithm for Computing Optimal Approximate Designs of Experiments
- On the complexity of parallel coordinate descent
- A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions
- Two Symmetrized Coordinate Descent Methods Can Be $O(n^2)$ Times Slower Than the Randomized Version
- An Optimal Algorithm for Decentralized Finite-Sum Optimization
- An introduction to continuous optimization for imaging
- Unnamed Item
- Unnamed Item
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
- An accelerated directional derivative method for smooth stochastic convex optimization
- Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- A parallel line search subspace correction method for composite convex optimization
- An Efficient Inexact ABCD Method for Least Squares Semidefinite Programming
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- Accelerating Nonnegative Matrix Factorization Algorithms Using Extrapolation
- A remark on accelerated block coordinate descent for computing the proximity operators of a sum of convex functions
- Markov chain block coordinate descent
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Accelerated First-Order Primal-Dual Proximal Methods for Linearly Constrained Composite Convex Programming
- An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization
- Coordinate descent with arbitrary sampling I: algorithms and complexity
- Coordinate descent with arbitrary sampling II: expected separable overapproximation
- Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
- Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
- Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone
- Unnamed Item
- The Supporting Halfspace--Quadratic Programming Strategy for the Dual of the Best Approximation Problem
- Kalman-Based Stochastic Gradient Method with Stop Condition and Insensitivity to Conditioning
- Convergence Analysis of Inexact Randomized Iterative Methods
- An accelerated communication-efficient primal-dual optimization framework for structured machine learning
- A generic coordinate descent solver for non-smooth convex optimisation
- Distributed Stochastic Optimization with Large Delays
- Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems
- On the efficiency of a randomized mirror descent algorithm in online optimization problems
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Parallel coordinate descent methods for big data optimization
- Inexact coordinate descent: complexity and preconditioning
- On optimal probabilities in stochastic coordinate descent methods
- On the complexity analysis of randomized block-coordinate descent methods
- Smooth minimization of nonsmooth functions with parallel coordinate descent methods
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Coordinate descent algorithms for lasso penalized regression
- Distributed Coordinate Descent Method for Learning with Big Data
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- Safe Feature Elimination in Sparse Supervised Learning
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
This page was built for publication: Accelerated, Parallel, and Proximal Coordinate Descent