Linear convergence of proximal incremental aggregated gradient method for nonconvex nonsmooth minimization problems
DOI: 10.1080/00036811.2020.1849634 · zbMath: 1489.90138 · OpenAlex: W3107941028 · MaRDI QID: Q5865360
No author found.
Publication date: 13 June 2022
Published in: Applicable Analysis
Full work available at URL: https://doi.org/10.1080/00036811.2020.1849634
Keywords: linear convergence; nonconvex nonsmooth minimization; metrical subregularity; proximal incremental aggregated gradient
MSC classifications: Numerical mathematical programming methods (65K05); Large-scale problems in mathematical programming (90C06); Nonconvex programming, global optimization (90C26); Nonlinear programming (90C30); Stochastic programming (90C15)
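For context on the method this entry catalogues: the proximal incremental aggregated gradient (PIAG) iteration minimizes a finite sum of smooth terms plus a nonsmooth regularizer by refreshing one component gradient per step while reusing stale gradients for the rest, then applying a proximal step. Below is a minimal illustrative sketch on a lasso-type problem; the quadratic components, step size, and soft-thresholding prox are assumptions for the example, not details taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (applies when h is the l1 norm).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def piag_lasso(A, b, lam, step, n_iters=2000):
    """Sketch of PIAG for min_x 0.5 * sum_i (a_i^T x - b_i)^2 + lam * ||x||_1.

    Each smooth component is f_i(x) = 0.5 * (a_i^T x - b_i)^2. One component
    gradient is recomputed per iteration; the others remain aggregated
    (possibly stale), which keeps the per-iteration cost low.
    """
    m, n = A.shape
    x = np.zeros(n)
    # Stored per-component gradients and their running sum (the aggregate).
    grads = np.array([(A[i] @ x - b[i]) * A[i] for i in range(m)])
    agg = grads.sum(axis=0)
    for k in range(n_iters):
        i = k % m                      # cyclic component selection
        new_grad = (A[i] @ x - b[i]) * A[i]
        agg += new_grad - grads[i]     # update the aggregate in O(n)
        grads[i] = new_grad
        x = soft_threshold(x - step * agg, step * lam)
    return x
```

With `A` the identity the fixed point is the componentwise soft-threshold of `b`, which makes the linear-convergence behavior easy to observe numerically.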
Related Items (2)
Uses Software
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Minimizing finite sums with the stochastic average gradient
- A coordinate gradient descent method for nonsmooth separable minimization
- Error bounds in mathematical programming
- Accelerating incremental gradient optimization with curvature information
- Nonconvex proximal incremental aggregated gradient method with linear convergence
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- Linear Convergence of Proximal Gradient Algorithm with Extrapolation for a Class of Nonconvex Nonsmooth Minimization Problems
- An Asynchronous Mini-Batch Algorithm for Regularized Stochastic Optimization
- Robust Truncated Hinge Loss Support Vector Machines
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
- Optimization Methods for Large-Scale Machine Learning
- Sparsity and Smoothness Via the Fused Lasso
- Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- Model Selection and Estimation in Regression with Grouped Variables
- A Convergent Incremental Gradient Method with a Constant Step Size
- Simultaneous Regression Shrinkage, Variable Selection, and Supervised Clustering of Predictors with OSCAR
- Convex Analysis
- An introduction to continuous optimization for imaging