Efficiency of the Accelerated Coordinate Descent Method on Structured Optimization Problems
From MaRDI portal
Publication: 2957979
DOI: 10.1137/16M1060182 · zbMath: 1359.90073 · OpenAlex: W2336104608 · MaRDI QID: Q2957979
Sebastian U. Stich, Yu. E. Nesterov
Publication date: 31 January 2017
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/16m1060182
Keywords: convex optimization; structural optimization; complexity bounds; fast gradient methods; coordinate descent methods
MSC classification: Analysis of algorithms and problem complexity (68Q25); Convex programming (90C25); Large-scale problems in mathematical programming (90C06); Minimax problems in mathematical programming (90C47)
Related Items
- Greedy randomized and maximal weighted residual Kaczmarz methods with oblique projection
- Oracle complexity separation in convex optimization
- Adaptive Gradient-Free Method for Stochastic Optimization
- Exact gradient methods with memory
- A New Homotopy Proximal Variable-Metric Framework for Composite Convex Minimization
- Accelerated meta-algorithm for convex optimization problems
- A block symmetric Gauss-Seidel decomposition theorem for convex composite quadratic programming and its applications
- Cyclic Coordinate Dual Averaging with Extrapolation
- Randomness and permutations in coordinate descent methods
- Factor-\(\sqrt{2}\) acceleration of accelerated gradient methods
- Conjugate gradients acceleration of coordinate descent for linear systems
- On multi-step greedy randomized coordinate descent method for solving large linear least-squares problems
- Near-linear convergence of the random Osborne algorithm for matrix balancing
- First-order methods for convex optimization
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Unifying framework for accelerated randomized methods in convex optimization
- Adaptive Catalyst for Smooth Convex Optimization
- A Randomized Coordinate Descent Method with Volume Sampling
- An Optimal Algorithm for Decentralized Finite-Sum Optimization
- An accelerated directional derivative method for smooth stochastic convex optimization
- Decentralized and parallel primal and dual accelerated methods for stochastic convex programming problems
- Gauss-Seidel method with oblique direction
- Computing the Best Approximation over the Intersection of a Polyhedral Set and the Doubly Nonnegative Cone
- Distributed multi-step subgradient optimization for multi-agent system
- On Adaptive Sketch-and-Project for Solving Linear Systems
- Accelerated proximal envelopes: application to componentwise methods
- On the computational efficiency of catalyst accelerated coordinate descent
Cites Work
- Smooth minimization of non-smooth functions
- Universal gradient methods for convex optimization problems
- Introductory lectures on convex optimization. A basic course.
- Distributed Coordinate Descent Method for Learning with Big Data
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems