Linear convergence of epsilon-subgradient descent methods for a class of convex functions
From MaRDI portal
Publication: 1806023
DOI: 10.1007/s101070050078
zbMath: 1029.90056
OpenAlex: W1979459040
MaRDI QID: Q1806023
Publication date: 1 February 2004
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: http://pure.iiasa.ac.at/id/eprint/4985/1/WP-96-041.pdf
Keywords: convex functions; bundle method; resolvent method; proximal point method; linear convergence rate; bundle-trust region method; epsilon-subgradient descent methods
MSC classifications: Convex programming (90C25); Nonlinear programming (90C30); Nonsmooth analysis (49J52); Numerical methods based on nonlinear programming (49M37)
Related Items
Survey Descent: A Multipoint Generalization of Gradient Descent for Nonsmooth Optimization
A Unified Analysis of Descent Sequences in Weakly Convex Optimization, Including Convergence Rates for Bundle Methods
Comparing different nonsmooth minimization methods and software
Computational efficiency of the simplex embedding method in convex nondifferentiable optimization
Generalized Eckstein-Bertsekas proximal point algorithm involving \((H,\eta)\)-monotonicity framework
New approach to the \(\eta\)-proximal point algorithm and nonlinear variational inclusion problems
On the convergence of primal-dual hybrid gradient algorithms for total variation image restoration
Gradient-based method with active set strategy for $\ell_1$ optimization
A coordinate gradient descent method for nonsmooth separable minimization
Randomized smoothing variance reduction method for large-scale non-smooth convex optimization
Generalized Eckstein-Bertsekas proximal point algorithm based on a-maximal monotonicity design
Scaling Techniques for $\epsilon$-Subgradient Methods
On Rockafellar's theorem using proximal point algorithm involving \(H\)-maximal monotonicity framework
Super-relaxed \((\eta)\)-proximal point algorithms, relaxed \((\eta)\)-proximal point algorithms, linear convergence analysis, and nonlinear variational inclusions
Subgradient and Bundle Methods for Nonsmooth Optimization
Convergence rates of subgradient methods for quasi-convex optimization problems
Nonmonotone spectral gradient method for sparse recovery