Fast Stochastic Composite Minimization and an Accelerated Frank-Wolfe Algorithm under Parallelization

From MaRDI portal
Publication:6400148

arXiv: 2205.12751 · MaRDI QID: Q6400148

Author name not available

Publication date: 25 May 2022

Abstract: We consider the problem of minimizing the sum of two convex functions. One of those functions has Lipschitz-continuous gradients, and can be accessed via stochastic oracles, whereas the other is "simple". We provide a Bregman-type algorithm with accelerated convergence in function values to a ball containing the minimum. The radius of this ball depends on problem-dependent constants, including the variance of the stochastic oracle. We further show that this algorithmic setup naturally leads to a variant of Frank-Wolfe achieving acceleration under parallelization. More precisely, when minimizing a smooth convex function on a bounded domain, we show that one can achieve an ε primal-dual gap (in expectation) in Õ(1/√ε) iterations, by only accessing gradients of the original function and a linear maximization oracle with O(1/√ε) computing units in parallel. We illustrate this fast convergence on synthetic numerical experiments.
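For context, the Frank-Wolfe method referenced in the abstract relies only on gradient evaluations and a linear minimization oracle (LMO) over the feasible domain. Below is a minimal sketch of the classical (non-accelerated, sequential) Frank-Wolfe method on the probability simplex; it is not the paper's accelerated parallel variant, and the quadratic objective, the `lmo_simplex` oracle, and the step-size schedule are illustrative assumptions.

```python
import numpy as np

def lmo_simplex(grad):
    # Linear minimization oracle over the probability simplex:
    # argmin over the simplex of <grad, s> is the vertex (standard
    # basis vector) at the coordinate with the smallest gradient entry.
    s = np.zeros_like(grad)
    s[np.argmin(grad)] = 1.0
    return s

def frank_wolfe(grad_f, x0, n_iters=1000):
    # Classical Frank-Wolfe: stay feasible by taking convex
    # combinations of the current iterate and an LMO vertex.
    x = x0.copy()
    for t in range(n_iters):
        s = lmo_simplex(grad_f(x))
        gamma = 2.0 / (t + 2)       # standard open-loop step size
        x = (1 - gamma) * x + gamma * s
    return x

# Illustrative example: minimize f(x) = 0.5 * ||x - b||^2 over the
# simplex, where b itself lies in the simplex (so the minimizer is b).
b = np.array([0.2, 0.5, 0.3])
x_star = frank_wolfe(lambda x: x - b, np.ones(3) / 3)
```

The classical variant above converges at an O(1/t) rate in the primal gap; the paper's contribution is reaching an ε gap in Õ(1/√ε) iterations by running O(1/√ε) LMO calls in parallel per step.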

Has companion code repository: https://github.com/bpauld/pfw

This page was built for publication: Fast Stochastic Composite Minimization and an Accelerated Frank-Wolfe Algorithm under Parallelization
