Incremental subgradient algorithms with dynamic step sizes for separable convex optimizations
Publication: 6140717
DOI: 10.1002/mma.8958
MaRDI QID: Q6140717
Publication date: 2 January 2024
Published in: Mathematical Methods in the Applied Sciences
Keywords: separable convex optimization; diminishing step size; dynamic step size; incremental subgradient algorithm
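The record itself does not reproduce the paper's algorithm, but as a rough illustration of the method named in the title and keywords, the following is a minimal Python sketch of an incremental subgradient iteration for a separable objective min_x Σ_i f_i(x). The Polyak-type dynamic rule (used when a bound f_star on the optimal value is available), the diminishing fallback gamma/(k+1), and all function names are illustrative assumptions, not the paper's exact step-size scheme.

```python
import numpy as np

def incremental_subgradient(components, subgradients, x0,
                            f_star=None, gamma=1.0, n_iters=200):
    """Sketch of an incremental subgradient method for min_x sum_i f_i(x).

    Each outer iteration sweeps the components sequentially, taking a
    subgradient step after every component. The step size is a dynamic
    (Polyak-type) rule when a lower bound f_star on the optimal value is
    known, and a diminishing rule gamma / (k + 1) otherwise. This is an
    assumed textbook variant, not the paper's specific rule.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(n_iters):
        for g_i in subgradients:
            g = g_i(x)
            norm2 = float(g @ g)
            if norm2 == 0.0:
                continue  # zero subgradient: this component is stationary here
            if f_star is not None:
                # Dynamic (Polyak-type) step using the full objective value.
                f_val = sum(f(x) for f in components)
                alpha = gamma * max(f_val - f_star, 0.0) / norm2
            else:
                # Diminishing step size, square-summable up to the sweep count.
                alpha = gamma / (k + 1)
            x = x - alpha * g
    return x

# Hypothetical usage: minimize sum_i |x - a_i|; the minimizer is the median.
a = [1.0, 3.0, 7.0]
components = [lambda x, ai=ai: abs(float(x[0]) - ai) for ai in a]
subgradients = [lambda x, ai=ai: np.array([np.sign(float(x[0]) - ai)])
                for ai in a]
x = incremental_subgradient(components, subgradients, x0=[0.0], n_iters=500)
print(x)  # should approach the median, 3.0
```

With the diminishing rule, the iterates oscillate around the minimizer with an amplitude shrinking like the step size, which is the standard behavior this class of methods trades for processing one component at a time.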
Cites Work
- Primal-dual subgradient methods for convex problems
- Convergence of a generalized subgradient method for nondifferentiable convex optimization
- Incremental gradient algorithms with stepsizes bounded away from zero
- Convergence of a simple subgradient level method
- Characterizations of strict local minima and necessary conditions for weak sharp minima
- On the convergence of conditional \(\varepsilon\)-subgradient methods for convex programs and convex-concave saddle-point problems
- The incremental Gauss-Newton algorithm with adaptive stepsize rule
- An inexact modified subgradient algorithm for primal-dual problems via augmented Lagrangians
- Incremental Subgradient Methods for Nondifferentiable Optimization
- Solving integer minimum cost flows with separable convex cost objective polynomially
- A New Class of Incremental Gradient Methods for Least Squares Problems
- An Incremental Gradient(-Projection) Method with Momentum Term and Adaptive Stepsize Rule
- Weak Sharp Minima: Characterizations and Sufficient Conditions
- Convergence Analysis of Gradient Algorithms on Riemannian Manifolds without Curvature Constraints and Application to Riemannian Mass
- A merit function approach to the subgradient method with averaging
- Convex Analysis