SAGA: A Fast Incremental Gradient Method With Support for Non-Strongly Convex Composite Objectives
From MaRDI portal
Publication: 6252788
arXiv: 1407.0202
MaRDI QID: Q6252788
Aaron Defazio, Francis Bach, Simon Lacoste-Julien
Publication date: 1 July 2014
Abstract: In this work we introduce a new optimisation method called SAGA in the spirit of SAG, SDCA, MISO and SVRG, a set of recently proposed incremental gradient algorithms with fast linear convergence rates. SAGA improves on the theory behind SAG and SVRG, with better theoretical convergence rates, and has support for composite objectives where a proximal operator is used on the regulariser. Unlike SDCA, SAGA supports non-strongly convex problems directly, and is adaptive to any inherent strong convexity of the problem. We give experimental results showing the effectiveness of our method.
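The abstract's core idea can be illustrated with a short sketch: SAGA keeps a table of the last gradient seen for each component function, forms an unbiased variance-reduced gradient estimate from it, and applies the proximal operator of the regulariser after each step. The helper below is a minimal illustration under assumed names (`saga`, `grad_i`, `prox`), not the authors' reference implementation from the linked repository.

```python
import numpy as np

def saga(grad_i, n, w0, step, iters, prox=lambda w: w, seed=0):
    """Minimal SAGA sketch (hypothetical helper, not the paper's code).

    grad_i(i, w): gradient of the i-th component f_i at w.
    prox: proximal operator of the regulariser (identity if none).
    """
    rng = np.random.default_rng(seed)
    w = w0.astype(float).copy()
    # Table of stored gradients f_i'(phi_i), initialised at w0.
    table = np.array([grad_i(i, w) for i in range(n)])
    avg = table.mean(axis=0)  # running average of the stored gradients
    for _ in range(iters):
        j = rng.integers(n)
        g_new = grad_i(j, w)
        # Unbiased variance-reduced estimate, then a proximal step.
        w = prox(w - step * (g_new - table[j] + avg))
        avg += (g_new - table[j]) / n  # keep the average in sync
        table[j] = g_new
    return w

# Toy least-squares demo: minimise (1/n) sum_i (a_i^T w - b_i)^2 / 2.
rng = np.random.default_rng(1)
A = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
b = A @ w_true
g = lambda i, w: (A[i] @ w - b[i]) * A[i]
w_hat = saga(g, 50, np.zeros(3), step=0.02, iters=5000)
```

On this noiseless problem the iterate converges to `w_true`; adding a prox (e.g. soft-thresholding for an L1 regulariser) covers the composite case the abstract refers to.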
Has companion code repository: https://github.com/adefazio/point-saga