Faster One-Sample Stochastic Conditional Gradient Method for Composite Convex Minimization
MaRDI QID: Q6392238
arXiv: 2202.13212
Author name not available
Publication date: 26 February 2022
Abstract: We propose a stochastic conditional gradient method (CGM) for minimizing convex finite-sum objectives formed as a sum of smooth and non-smooth terms. Existing CGM variants for this template either suffer from slow convergence rates or require carefully increasing the batch size over the course of the algorithm's execution, which leads to computing full gradients. In contrast, the proposed method, equipped with a stochastic average gradient (SAG) estimator, requires only one sample per iteration. Nevertheless, it guarantees fast convergence rates on par with more sophisticated variance reduction techniques. In applications, we put special emphasis on problems with a large number of separable constraints. Such problems are prevalent among semidefinite programming (SDP) formulations arising in machine learning and theoretical computer science. We provide numerical experiments on matrix completion, unsupervised clustering, and sparsest-cut SDPs.
Has companion code repository: https://github.com/ratschlab/faster-hcgm-composite
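As a rough illustration of the one-sample idea described in the abstract, the following minimal sketch combines a SAG-style averaged gradient estimator with a conditional gradient (Frank-Wolfe) update on a toy smooth finite-sum problem over the unit simplex. It is not the authors' implementation (see the companion repository above for that); the least-squares components f_i and the simplex constraint set are illustrative assumptions, and the non-smooth part of the composite template is omitted.

# Sketch only: one-sample stochastic CGM with a SAG-style gradient estimator
# on the toy problem  min_{x in simplex} (1/n) * sum_i 0.5 * (A[i] @ x - b[i])**2.
import numpy as np

def lmo_simplex(grad):
    # Linear minimization oracle over the unit simplex: argmin_{s in C} <grad, s>
    s = np.zeros_like(grad)
    s[np.argmin(grad)] = 1.0
    return s

def one_sample_sag_cgm(A, b, iters=5000, seed=0):
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.full(d, 1.0 / d)                 # feasible starting point in the simplex
    grad_table = np.zeros((n, d))           # per-sample gradient memory (SAG)
    grad_avg = np.zeros(d)                  # running average of the stored gradients
    for t in range(iters):
        i = rng.integers(n)                 # draw a single sample per iteration
        g_new = (A[i] @ x - b[i]) * A[i]    # gradient of the i-th component at x
        grad_avg += (g_new - grad_table[i]) / n   # incremental update of the average
        grad_table[i] = g_new
        s = lmo_simplex(grad_avg)           # CGM direction from the linear oracle
        gamma = 2.0 / (t + 2)               # classical conditional-gradient step size
        x = (1.0 - gamma) * x + gamma * s
    return x

# Example usage on synthetic data (hypothetical):
# A = np.random.default_rng(1).standard_normal((200, 10)); b = A @ np.full(10, 0.1)
# x_hat = one_sample_sag_cgm(A, b)

The point of the sketch is that the gradient estimate fed to the linear minimization oracle is the running SAG average, so each iteration touches only one component function rather than a growing batch or a full gradient.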