Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems
From MaRDI portal
Publication:2133414
DOI: 10.1007/s10107-020-01599-7
zbMath: 1491.90130
arXiv: 1906.10053
OpenAlex: W2990601391
MaRDI QID: Q2133414
Andreas Themelis, Puya Latafat, Panagiotis Patrinos
Publication date: 29 April 2022
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1906.10053
Convex programming (90C25)
Large-scale problems in mathematical programming (90C06)
Nonconvex programming, global optimization (90C26)
Nonsmooth analysis (49J52)
Set-valued and variational analysis (49J53)
Related Items
- Random Coordinate Descent Methods for Nonseparable Composite Optimization
- Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity
Cites Work
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- Douglas-Rachford splitting for nonconvex optimization with application to nonconvex feasibility problems
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- A random coordinate descent algorithm for optimization problems with composite objective function and linear coupled constraints
- Minimizing finite sums with the stochastic average gradient
- Iteration complexity analysis of block coordinate descent methods
- Incremental proximal methods for large scale convex optimization
- A coordinate gradient descent method for linearly constrained smooth optimization and support vector machines training
- A block coordinate variable metric forward-backward algorithm
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- A coordinate gradient descent method for nonsmooth separable minimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- On gradients of functions definable in o-minimal structures
- On semi- and subanalytic geometry
- Introductory lectures on convex optimization. A basic course.
- A globally convergent algorithm for nonconvex optimization based on block coordinate update
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Splitting methods with variable metric for Kurdyka-Łojasiewicz functions and general convergence rates
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- A Class of Randomized Primal-Dual Algorithms for Distributed Optimization
- ARock: An Algorithmic Framework for Asynchronous Parallel Coordinate Updates
- A Block Coordinate Descent Method for Regularized Multiconvex Optimization with Applications to Nonnegative Tensor Factorization and Completion
- iPiano: Inertial Proximal Algorithm for Nonconvex Optimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- A Coordinate Descent Primal-Dual Algorithm and Application to Distributed Asynchronous Optimization
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- An Accelerated Randomized Proximal Coordinate Gradient Method and its Application to Regularized Empirical Risk Minimization
- Clarke Subgradients of Stratifiable Functions
- Relaxation methods for problems with strictly convex separable costs and linear constraints
- A generalized proximal point algorithm for certain non-convex minimization problems
- Variational Analysis
- Forward-Backward Envelope for the Sum of Two Nonconvex Functions: Further Properties and Nonmonotone Linesearch Algorithms
- First-Order Methods in Optimization
- Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate
- A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- A New Randomized Block-Coordinate Primal-Dual Proximal Algorithm for Distributed Optimization
- Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning
- Cyclic Coordinate-Update Algorithms for Fixed-Point Problems: Analysis and Applications
- Random Coordinate Descent Algorithms for Multi-Agent Convex Optimization Over Networks
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- On the Convergence of Block Coordinate Descent Type Methods
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- Stochastic Quasi-Fejér Block-Coordinate Fixed Point Iterations with Random Sweeping
- Convex analysis and monotone operator theory in Hilbert spaces
- Convergence of a block coordinate descent method for nondifferentiable minimization