
Accelerated, Parallel, and Proximal Coordinate Descent

From MaRDI portal
Publication:3449571

DOI: 10.1137/130949993
zbMath: 1327.65108
arXiv: 1312.5799
OpenAlex: W1947202642
Wikidata: Q61661693
Scholia: Q61661693
MaRDI QID: Q3449571

Peter Richtárik, Olivier Fercoq

Publication date: 4 November 2015

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1312.5799
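
The publication concerns composite minimization, \(\min_x f(x) + \psi(x)\) with a smooth \(f\) and a (block-)separable, possibly nonsmooth \(\psi\), solved by randomly sampled coordinate updates. As context only, the sketch below shows plain randomized proximal coordinate descent for this setting in Python; the names grad_i and prox_i, the uniform single-coordinate sampling, and the Lasso usage are illustrative assumptions, and the sketch omits the acceleration and parallel block sampling that the publication's method adds.

import numpy as np

def prox_coordinate_descent(grad_i, prox_i, L, x0, n_iters=2000, rng=None):
    # Minimal randomized proximal coordinate descent for min_x f(x) + sum_i psi_i(x_i):
    # pick a coordinate i uniformly at random, take a gradient step of size 1/L[i]
    # on that coordinate, then apply the prox of psi_i.  (Sketch only; the paper's
    # method additionally uses acceleration and parallel block sampling.)
    rng = np.random.default_rng() if rng is None else rng
    x = x0.astype(float).copy()
    for _ in range(n_iters):
        i = rng.integers(x.size)                      # uniformly sampled coordinate
        step = 1.0 / L[i]                             # coordinate-wise Lipschitz step size
        x[i] = prox_i(x[i] - step * grad_i(x, i), i, step)
    return x

# Illustrative use (assumed problem): Lasso, f(x) = 0.5*||A x - b||^2, psi_i(x_i) = lam*|x_i|.
A = np.random.default_rng(0).standard_normal((50, 20))
b = A @ np.ones(20) + 0.1 * np.random.default_rng(1).standard_normal(50)
lam = 0.1
L = (A ** 2).sum(axis=0)                              # L[i] = ||A[:, i]||^2
grad_i = lambda x, i: A[:, i] @ (A @ x - b)           # partial derivative of f at coordinate i
prox_i = lambda v, i, step: np.sign(v) * max(abs(v) - step * lam, 0.0)  # soft-thresholding
x_hat = prox_coordinate_descent(grad_i, prox_i, L, np.zeros(20))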




Related Items (71)

A new randomized primal-dual algorithm for convex optimization with fast last iterate convergence rates
On the optimal order of worst case complexity of direct search
An attention algorithm for solving large scale structured \(l_0\)-norm penalty estimation problems
Asynchronous variance-reduced block schemes for composite non-convex stochastic optimization: block-specific steplengths and adapted batch-sizes
Parallel random block-coordinate forward-backward algorithm: a unified convergence analysis
An accelerated coordinate gradient descent algorithm for non-separable composite optimization
Oracle complexity separation in convex optimization
On the Complexity Analysis of the Primal Solutions for the Accelerated Randomized Dual Coordinate Ascent
An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
A flexible coordinate descent method
Randomized Iterative Methods for Linear Systems
MAGMA: Multilevel Accelerated Gradient Mirror Descent Algorithm for Large-Scale Convex Composite Minimization
Distributed Block Coordinate Descent for Minimizing Partially Separable Functions
A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization
On the Convergence of Stochastic Primal-Dual Hybrid Gradient
Nearly linear-time packing and covering LP solvers, achieving width-independence and \(O(1/\varepsilon)\)-convergence
A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications
Accelerated First-Order Methods for Convex Optimization with Locally Lipschitz Continuous Gradient
Accelerating block coordinate descent methods with identification strategies
A random block-coordinate Douglas-Rachford splitting method with low computational complexity for binary logistic regression
Cyclic Coordinate Dual Averaging with Extrapolation
Accelerated randomized mirror descent algorithms for composite non-strongly convex optimization
Asynchronous Stochastic Coordinate Descent: Parallelism and Convergence Properties
Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
A modified stochastic quasi-Newton algorithm for summing functions problem in machine learning
On the convergence analysis of asynchronous SGD for solving consistent linear systems
Additive Schwarz Methods for Convex Optimization as Gradient Methods
Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory
First-order methods for convex optimization
Random Coordinate Descent Methods for Nonseparable Composite Optimization
Local linear convergence of proximal coordinate descent algorithm
Variance reduction for root-finding problems
Unifying framework for accelerated randomized methods in convex optimization
Adaptive Catalyst for Smooth Convex Optimization
Primal-dual block-proximal splitting for a class of non-convex problems
Worst-case complexity of cyclic coordinate descent: \(O(n^2)\) gap with randomized version
A Randomized Exchange Algorithm for Computing Optimal Approximate Designs of Experiments
On the complexity of parallel coordinate descent
A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions
Two Symmetrized Coordinate Descent Methods Can Be $O(n^2)$ Times Slower Than the Randomized Version
An Optimal Algorithm for Decentralized Finite-Sum Optimization
An introduction to continuous optimization for imaging
Unnamed Item
Unnamed Item
Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
An accelerated directional derivative method for smooth stochastic convex optimization
Stochastic Primal-Dual Hybrid Gradient Algorithm with Arbitrary Sampling and Imaging Applications
Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
A parallel line search subspace correction method for composite convex optimization
An Efficient Inexact ABCD Method for Least Squares Semidefinite Programming
Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
Accelerating Nonnegative Matrix Factorization Algorithms Using Extrapolation
A remark on accelerated block coordinate descent for computing the proximity operators of a sum of convex functions
Markov chain block coordinate descent
Restarting the accelerated coordinate descent method with a rough strong convexity estimate
Accelerated First-Order Primal-Dual Proximal Methods for Linearly Constrained Composite Convex Programming
An inexact proximal augmented Lagrangian framework with arbitrary linearly convergent inner solver for composite convex optimization
Coordinate descent with arbitrary sampling I: algorithms and complexity
Coordinate descent with arbitrary sampling II: expected separable overapproximation
Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
Linear Coupling: An Ultimate Unification of Gradient and Mirror Descent
Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone
Unnamed Item
The Supporting Halfspace--Quadratic Programming Strategy for the Dual of the Best Approximation Problem
Kalman-Based Stochastic Gradient Method with Stop Condition and Insensitivity to Conditioning
Convergence Analysis of Inexact Randomized Iterative Methods
An accelerated communication-efficient primal-dual optimization framework for structured machine learning
A generic coordinate descent solver for non-smooth convex optimisation
Distributed Stochastic Optimization with Large Delays
Linear Convergence of Random Dual Coordinate Descent on Nonpolyhedral Convex Problems
On the efficiency of a randomized mirror descent algorithm in online optimization problems


Uses Software


Cites Work


This page was built for publication: Accelerated, Parallel, and Proximal Coordinate Descent