Universal Conditional Gradient Sliding for Convex Optimization
Publication: 6071883
DOI: 10.1137/21m1406234
zbMath: 1529.90059
arXiv: 2103.11026
MaRDI QID: Q6071883
Publication date: 29 November 2023
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/2103.11026
Keywords: convex optimization; first-order method; conditional gradient method; conditional gradient sliding; universal gradient method
Cites Work
- Gradient sliding for composite optimization
- First-order methods of smooth convex optimization with inexact oracle
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- Universal gradient methods for convex optimization problems
- Lectures on convex optimization
- Introductory lectures on convex optimization. A basic course.
- Conditional gradient type methods for composite nonlinear and stochastic optimization
- Complexity bounds for primal-dual methods minimizing the model of objective function
- First-order and stochastic optimization methods for machine learning
- On the convergence properties of non-Euclidean extragradient methods for variational inequalities with generalized monotone operators
- Fast bundle-level methods for unconstrained and ball-constrained convex optimization
- Bundle-level type methods uniformly optimal for smooth and nonsmooth convex optimization
- Conditional Gradient Sliding for Convex Optimization
- First-Order Methods in Optimization
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework
- New analysis and results for the Frank-Wolfe method