Analogues of Switching Subgradient Schemes for Relatively Lipschitz-Continuous Convex Programming Problems
From MaRDI portal
Publication: 4965108
DOI: 10.1007/978-3-030-58657-7_13
zbMath: 1460.90137
arXiv: 2003.09147
OpenAlex: W3161275408
MaRDI QID: Q4965108
Seydamet S. Ablaev, Mohammad S. Alkousa, Alexander A. Titov, Fedor S. Stonyakin, Alexander V. Gasnikov
Publication date: 25 February 2021
Published in: Mathematical Optimization Theory and Operations Research
Full work available at URL: https://arxiv.org/abs/2003.09147
Keywords: convex programming problem, online optimization problem, inexact model, relative Lipschitz-continuity, stochastic mirror descent, switching subgradient scheme
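As background on the "switching subgradient scheme" named in the title and keywords, here is a minimal sketch of the classical (non-adaptive) Polyak-type switching scheme for min f(x) subject to g(x) ≤ 0; the function names, the toy one-dimensional problem, and the step-size choice are illustrative assumptions, not taken from the paper (which treats the relatively Lipschitz-continuous setting).

```python
# Sketch of the classical switching subgradient scheme, assuming a
# one-dimensional problem min f(x) s.t. g(x) <= 0.  All names and the
# toy instance below are hypothetical illustrations.

def switching_subgradient(f, f_sub, g, g_sub, x0, eps, steps):
    """If the constraint is eps-satisfied, step along the objective
    subgradient (a "productive" step); otherwise step along the
    constraint subgradient.  Step size eps / |d|^2 moves a distance
    eps / |d| per iteration."""
    x = x0
    productive = []           # eps-feasible iterates
    for _ in range(steps):
        if g(x) <= eps:
            d = f_sub(x)      # productive step: objective subgradient
            productive.append(x)
        else:
            d = g_sub(x)      # non-productive step: constraint subgradient
        if d != 0:
            x -= (eps / (d * d)) * d
    # report the best eps-feasible iterate seen
    return min(productive, key=f) if productive else x

# Toy instance: min (x + 2)^2 subject to x >= 1  (optimum x* = 1)
f = lambda x: (x + 2.0) ** 2
f_sub = lambda x: 2.0 * (x + 2.0)
g = lambda x: 1.0 - x
g_sub = lambda x: -1.0

x_best = switching_subgradient(f, f_sub, g, g_sub,
                               x0=5.0, eps=0.05, steps=2000)
```

The returned point is approximately eps-feasible and near-optimal; the paper's contribution is to extend such schemes beyond the standard Lipschitz assumption to relative Lipschitz-continuity with respect to a Bregman divergence.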
Related Items (2)
- Stochastic incremental mirror descent algorithms with Nesterov smoothing
- Some adaptive first-order methods for variational inequalities with relatively strongly monotone operators and generalized smoothness
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- A weighted mirror descent algorithm for nonsmooth convex optimization problem
- Gradient methods for minimizing composite functions
- First-order methods of smooth convex optimization with inexact oracle
- A generalized online mirror descent with applications to classification and regression
- The CoMirror algorithm for solving nonsmooth constrained convex problems
- Mirror descent and nonlinear projected subgradient methods for convex optimization
- Mirror descent and constrained online optimization problems
- Fast gradient descent for convex minimization problems with an oracle producing a \(( \delta, L)\)-model of function at the requested point
- Convergence of online mirror descent
- Stochastic online optimization. Single-point and multi-point non-linear multi-armed bandits. Convex and strongly-convex case
- Mirror descent and convex optimization problems with non-smooth inequality constraints
- An optimal algorithm for stochastic strongly-convex optimization
- Relatively Smooth Convex Optimization by First-Order Methods, and Applications
- Adaptive Mirror Descent Algorithms for Convex and Strongly Convex Optimization Problems with Functional Constraints
- A Descent Lemma Beyond Lipschitz Gradient Continuity: First-Order Methods Revisited and Applications