RSG: Beating Subgradient Method without Smoothness and Strong Convexity
Publication: 4558142
zbMath: 1444.90092 · arXiv: 1512.03107 · MaRDI QID: Q4558142
Publication date: 21 November 2018
Full work available at URL: https://arxiv.org/abs/1512.03107
Related Items
- Subgradient methods for sharp weakly convex functions
- On finite termination of an inexact proximal point algorithm
- Radial duality. II: Applications and algorithms
- Faster first-order primal-dual methods for linear programming using restarts and sharpness
- On optimal universal first-order methods for minimizing heterogeneous sums
- A simple nearly optimal restart scheme for speeding up first-order methods
- General Hölder smooth convergence rates follow from specialized rates assuming growth bounds
- The gradient projection algorithm for a proximally smooth set and a function with Lipschitz continuous gradient
- New nonasymptotic convergence rates of stochastic proximal point algorithm for stochastic convex optimization
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- General convergence analysis of stochastic first-order methods for composite optimization
- Randomized smoothing variance reduction method for large-scale non-smooth convex optimization
- Faster subgradient methods for functions with Hölderian growth
- Accelerate stochastic subgradient method by leveraging local growth condition
- The method of codifferential descent for convex and global piecewise affine optimization
Cites Work
- Primal-dual subgradient methods for convex problems
- Smooth minimization of non-smooth functions
- Sparse inverse covariance estimation with the graphical lasso
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- First-order algorithm with \({\mathcal{O}(\ln(1/\epsilon))}\) convergence for \({\epsilon}\)-equilibrium in two-person zero-sum games
- Characterization of the equivalence of robustification and regularization in linear and matrix regression
- Higher-order metric subregularity and its applications
- Error bounds and Hölder metric subregularity
- Weak sharp minima revisited. III: Error bounds for differentiable convex inclusions
- A coordinate gradient descent method for nonsmooth separable minimization
- Block-coordinate gradient descent method for linearly constrained nonsmooth separable optimization
- Finite termination of the proximal point algorithm
- Error bounds and convergence analysis of feasible descent methods: A general approach
- On the convergence of the coordinate descent method for convex differentiable minimization
- Error bounds in mathematical programming
- Introductory lectures on convex optimization. A basic course.
- New computational guarantees for solving convex optimization problems with first order methods, via a function growth condition measure
- From error bounds to the complexity of first-order descent methods for convex functions
- A unified approach to error bounds for structured convex optimization problems
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Global error bounds for piecewise convex polynomials
- An efficient primal dual prox method for non-smooth optimization
- Restricted strong convexity and its applications to convergence analysis of gradient-type methods in convex optimization
- Weak sharp minima revisited. II: Application to linear regularity and error bounds
- Deterministic and stochastic primal-dual subgradient algorithms for uniformly convex minimization
- An optimal algorithm for stochastic strongly-convex optimization
- Convergence Results for Projected Line-Search Methods on Varieties of Low-Rank Matrices Via Łojasiewicz Inequality
- On the Asymptotically Well Behaved Functions and Global Error Bound for Convex Polynomials
- Weak Sharp Minima in Mathematical Programming
- Proximal Alternating Minimization and Projection Methods for Nonconvex Problems: An Approach Based on the Kurdyka-Łojasiewicz Inequality
- Parallel Random Coordinate Descent Method for Composite Minimization: Convergence Analysis and Error Bounds
- Clarke Subgradients of Stratifiable Functions
- Robust Stochastic Approximation Approach to Stochastic Programming
- Error Bounds for Convex Polynomials
- On the Linear Convergence of Descent Methods for Convex Essentially Smooth Minimization
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- Non-asymptotic convergence analysis of inexact gradient methods for machine learning without strong convexity
- Hölder Metric Subregularity with Applications to Proximal Point Method
- Weak Sharp Minima: Characterizations and Sufficient Conditions
- Second-order growth, tilt stability, and metric regularity of the subdifferential
- Error Bounds, Quadratic Growth, and Linear Convergence of Proximal Methods
- Robust Regression and Lasso
- Learning with Submodular Functions: A Convex Optimization Perspective
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization, II: Shrinking Procedures and Optimal Algorithms
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- “Efficient” Subgradient Methods for General Convex Optimization
- Convex Analysis