Variable smoothing incremental aggregated gradient method for nonsmooth nonconvex regularized optimization
From MaRDI portal
Publication: 2047203
DOI: 10.1007/s11590-021-01723-2
zbMath: 1475.90069
OpenAlex: W3141090426
MaRDI QID: Q2047203
Publication date: 19 August 2021
Published in: Optimization Letters
Full work available at URL: https://doi.org/10.1007/s11590-021-01723-2
Related Items (2)
- Proximal variable smoothing method for three-composite nonconvex nonsmooth minimization with a linear operator
- Inertial proximal incremental aggregated gradient method with linear convergence guarantees
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Variable smoothing for weakly convex composite functions
- Accelerating incremental gradient optimization with curvature information
- Variable smoothing for convex optimization problems using stochastic gradients
- Nonconvex proximal incremental aggregated gradient method with linear convergence
- Incremental Subgradient Methods for Nondifferentiable Optimization
- Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach
- Robust Truncated Hinge Loss Support Vector Machines
- Variational Analysis
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- The Convergence Guarantees of a Non-Convex Approach for Sparse Recovery
- Stochastic Model-Based Minimization of Weakly Convex Functions
- Nonconvex Sparse Logistic Regression With Weakly Convex Regularization
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods
- Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate
- Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions
- Convergence Rate of Incremental Gradient and Incremental Newton Methods
- On the Convergence Rate of Incremental Aggregated Gradient Algorithms
- A Convergent Incremental Gradient Method with a Constant Step Size
- Linear convergence of proximal incremental aggregated gradient method for nonconvex nonsmooth minimization problems