Improved complexities for stochastic conditional gradient methods under interpolation-like conditions
Publication: Q2670499
DOI: 10.1016/j.orl.2022.01.015
OpenAlex: W4225727086
Tesi Xiao, Saeed Ghadimi, Krishnakumar Balasubramanian
Publication date: 11 March 2022
Published in: Operations Research Letters
Full work available at URL: https://arxiv.org/abs/2006.08167
Cites Work
- Conditional gradient algorithms for norm-regularized smooth convex optimization
- The gap function of a convex program
- Conditional gradient type methods for composite nonlinear and stochastic optimization
- Just interpolate: kernel "ridgeless" regression can generalize
- Random gradient-free minimization of convex functions
- Conditional Gradient Sliding for Convex Optimization
- An Extended Frank–Wolfe Method with “In-Face” Directions, and Its Application to Low-Rank Matrix Completion
- Optimal Rates for Zero-Order Convex Optimization: The Power of Two Function Evaluations
- Stochastic First- and Zeroth-Order Methods for Nonconvex Stochastic Programming