A Randomized Nonmonotone Block Proximal Gradient Method for a Class of Structured Nonlinear Programming
From MaRDI portal
Publication: 4596724
DOI: 10.1137/16M1110182 · zbMath: 1386.65157 · arXiv: 1306.5918 · OpenAlex: W2963775200 · MaRDI QID: Q4596724
Publication date: 8 December 2017
Published in: SIAM Journal on Numerical Analysis
Full work available at URL: https://arxiv.org/abs/1306.5918
Keywords: randomized algorithms; nonmonotone line search; nonconvex composite optimization; block coordinate gradient method
Numerical mathematical programming methods (65K05) · Large-scale problems in mathematical programming (90C06) · Nonlinear programming (90C30)
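The title and keywords name a randomized block proximal gradient method for structured composite problems of the form f(x) + ψ(x), where ψ is block-separable. A minimal sketch of that basic scheme appears below; this is an illustration only, not the paper's exact algorithm — the nonmonotone line search is omitted, a fixed step size is used, and ψ is assumed for concreteness to be an ℓ1 penalty (all names here, e.g. `randomized_block_prox_grad`, are hypothetical):

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def randomized_block_prox_grad(grad_f, x0, n_blocks, step, lam,
                               iters=1000, seed=0):
    # Sketch of a randomized block proximal gradient iteration:
    # at each step, pick one block uniformly at random and apply a
    # proximal gradient update to that block only, leaving the rest
    # of x unchanged. The regularizer is assumed to be lam * ||.||_1.
    rng = np.random.default_rng(seed)
    x = x0.copy()
    blocks = np.array_split(np.arange(x.size), n_blocks)
    for _ in range(iters):
        b = blocks[rng.integers(n_blocks)]   # random block index set
        g = grad_f(x)[b]                     # partial gradient on the block
        x[b] = soft_threshold(x[b] - step * g, step * lam)
    return x
```

For example, applied to a small ℓ1-regularized least-squares objective f(x) = ½‖Ax − b‖², the iterates drive the composite objective down while touching only one coordinate block per iteration.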
Related Items (5)
- Scalable subspace methods for derivative-free nonlinear least-squares optimization
- Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
- On Synchronous, Asynchronous, and Randomized Best-Response Schemes for Stochastic Nash Games
- Convergence of a Class of Nonmonotone Descent Methods for Kurdyka–Łojasiewicz Optimization Problems
- Nonmonotone Enhanced Proximal DC Algorithms for a Class of Structured Nonsmooth DC Programming
Uses Software
Cites Work
- Parallel coordinate descent methods for big data optimization
- Inexact coordinate descent: complexity and preconditioning
- Gradient methods for minimizing composite functions
- Efficient random coordinate descent algorithms for large-scale structured nonconvex optimization
- On the complexity analysis of randomized block-coordinate descent methods
- Iteration complexity analysis of block coordinate descent methods
- A coordinate gradient descent method for \(\ell_{1}\)-regularized convex minimization
- Fixed point and Bregman iterative methods for matrix rank minimization
- An augmented Lagrangian approach for sparse principal component analysis
- A coordinate gradient descent method for nonsmooth separable minimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- On the convergence of the coordinate descent method for convex differentiable minimization
- Nonmonotone curvilinear line search methods for unconstrained optimization
- Coordinate descent optimization for \(l^{1}\) minimization with application to compressed sensing; a greedy algorithm
- Efficient block-coordinate descent algorithms for the group Lasso
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Coordinate descent algorithms for lasso penalized regression
- Block Coordinate Descent Methods for Semidefinite Programming
- Accelerated Block-coordinate Relaxation for Regularized Optimization
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Randomized Methods for Linear Constraints: Convergence Rates and Conditioning
- Probing the Pareto Frontier for Basis Pursuit Solutions
- Two-Point Step Size Gradient Methods
- Nonmonotone Spectral Projected Gradient Methods on Convex Sets
- Sparse Reconstruction by Separable Approximation
- A Nonmonotone Line Search Technique and Its Application to Unconstrained Optimization
- A Nonmonotone Line Search Technique for Newton’s Method
- Efficient Serial and Parallel Coordinate Descent Methods for Huge-Scale Truss Topology Design
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- Convergence of a block coordinate descent method for nondifferentiable minimization
- Adaptive two-point stepsize gradient algorithm
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization