Linear convergence of prox-SVRG method for separable non-smooth convex optimization problems under bounded metric subregularity
From MaRDI portal
DOI: 10.1007/s10957-021-01978-w
zbMath: 1487.90530
OpenAlex: W4205999994
MaRDI QID: Q2115253
Publication date: 15 March 2022
Published in: Journal of Optimization Theory and Applications
Full work available at URL: https://doi.org/10.1007/s10957-021-01978-w
Keywords: linear convergence; calmness; bounded metric subregularity; proximal stochastic variance-reduced gradient; randomized block-coordinate proximal gradient
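The keywords name the proximal stochastic variance-reduced gradient (prox-SVRG) method for composite problems of the form min f(x) + g(x) with f a finite-sum smooth term and g a separable nonsmooth regularizer. Below is a minimal illustrative sketch of the standard prox-SVRG update on an l1-regularized least-squares instance; the step size, epoch length, and last-iterate output are assumed choices for illustration, not the paper's exact setup or analysis.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_svrg(A, b, lam, eta=0.01, epochs=50, seed=0):
    """Minimize (1/2n)||Ax - b||^2 + lam * ||x||_1 with prox-SVRG.

    Sketch only: eta (step size) and the one-pass inner-loop length
    are arbitrary assumptions; returns the last inner iterate.
    """
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(epochs):
        snapshot = x.copy()
        # Full gradient of the smooth part at the snapshot.
        full_grad = A.T @ (A @ snapshot - b) / n
        for _ in range(n):
            i = rng.integers(n)
            gi = A[i] * (A[i] @ x - b[i])              # component grad at x
            gi_snap = A[i] * (A[i] @ snapshot - b[i])  # same grad at snapshot
            v = gi - gi_snap + full_grad               # variance-reduced direction
            x = soft_threshold(x - eta * v, eta * lam) # proximal step
    return x
```

The variance-reduced direction `v` is an unbiased estimate of the full gradient whose variance vanishes as the iterates approach the snapshot, which is what permits a constant step size and, under error-bound conditions such as bounded metric subregularity, linear convergence.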
Cites Work
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Sparse regression using mixed norms
- A coordinate gradient descent method for nonsmooth separable minimization
- A unified approach to error bounds for structured convex optimization problems
- Regularity and conditioning of solution mappings in variational analysis
- Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems
- Random block coordinate descent methods for linearly constrained optimization over networks
- Iteration complexity of randomized block-coordinate descent methods for minimizing a composite function
- Techniques of variational analysis
- Optimization in High Dimensions via Accelerated, Parallel, and Proximal Coordinate Descent
- Efficiency of Coordinate Descent Methods on Huge-Scale Optimization Problems
- Lipschitz Behavior of Solutions to Convex Minimization Problems
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- Recovery Algorithms for Vector-Valued Data with Joint Sparsity Constraints
- The Group Lasso for Logistic Regression
- Strongly Regular Generalized Equations
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- Stability Theory for Systems of Inequalities. Part I: Linear Systems
- Variational Analysis
- Necessary Optimality Conditions for Optimization Problems with Variational Inequality Constraints
- Metric Subregularity of Piecewise Linear Multifunctions and Applications to Piecewise Linear Multiobjective Optimization
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Metric subregularity of the convex subdifferential in Banach spaces
- Model Selection and Estimation in Regression with Grouped Variables
- Convex Analysis
- New Constraint Qualifications for Mathematical Programs with Equilibrium Constraints via Variational Analysis