Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
DOI: 10.1007/978-3-319-17689-5_11
zbMath: 1330.65089
arXiv: 1406.0238
OpenAlex: W1594004154
MaRDI QID: Q3462314
Jakub Mareček, Peter Richtárik, Martin Takáč
Publication date: 5 January 2016
Published in: Numerical Analysis and Optimization
Full work available at URL: https://arxiv.org/abs/1406.0238
convex optimization; numerical examples; communication complexity; support vector machine; iteration complexity; big data optimization; composite objective; expected separable over-approximation; huge-scale optimization; partial separability; empirical risk minimization; distributed coordinate descent
Numerical mathematical programming methods (65K05); Convex programming (90C25); Complexity and performance of numerical algorithms (65Y20)
Related Items (11)
Uses Software
Cites Work
- Parallel coordinate descent methods for big data optimization
- Inexact coordinate descent: complexity and preconditioning
- On optimal probabilities in stochastic coordinate descent methods
- On the complexity analysis of randomized block-coordinate descent methods
- Pegasos: primal estimated sub-gradient solver for SVM
- A note on the complexity of \(L_p\) minimization
- A MapReduce-based distributed SVM algorithm for automatic image annotation
- A coordinate gradient descent method for nonsmooth separable minimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- On the convergence of the coordinate descent method for convex differentiable minimization
- Introductory lectures on convex optimization. A basic course.
- Smooth minimization of nonsmooth functions with parallel coordinate descent methods
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Distributed Coordinate Descent Method for Learning with Big Data
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Separable approximations and decomposition methods for the augmented Lagrangian
- Random coordinate descent methods for \(\ell_0\) regularized convex optimization
- Accelerated, Parallel, and Proximal Coordinate Descent
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- On the complexity of parallel coordinate descent
- Sparse Approximate Solutions to Linear Systems
- Efficient Serial and Parallel Coordinate Descent Methods for Huge-Scale Truss Topology Design
- On the Nonasymptotic Convergence of Cyclic Coordinate Descent Methods
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- An Asynchronous Parallel Stochastic Coordinate Descent Algorithm
- Convergence of a block coordinate descent method for nondifferentiable minimization