Parameter-free Stochastic Optimization of Variationally Coherent Functions

From MaRDI portal

Publication: Q6359428 · arXiv: 2102.00236

Author name not available

Publication date: 30 January 2021

Abstract: We design and analyze an algorithm for first-order stochastic optimization of a large class of functions on $\mathbb{R}^d$. In particular, we consider the \emph{variationally coherent} functions, which can be convex or non-convex. The iterates of our algorithm on variationally coherent functions converge almost surely to the global minimizer $x^\star$. Additionally, the very same algorithm with the same hyperparameters, after $T$ iterations, guarantees on convex functions that the expected suboptimality gap is bounded by $\widetilde{O}(\|x_0 - x^\star\| T^{-1/2+\epsilon})$ for any $\epsilon>0$. It is the first algorithm to achieve both of these properties at the same time. Moreover, the rate for convex functions essentially matches the performance of parameter-free algorithms. Our algorithm is an instance of Follow The Regularized Leader with the added twist of using \emph{rescaled gradients} and time-varying linearithmic regularizers.
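The algorithmic recipe named in the abstract (Follow The Regularized Leader with rescaled gradients and a time-varying linearithmic regularizer) can be illustrated with a toy one-dimensional sketch. This is not the paper's actual construction: the regularizer scale $\alpha\sqrt{t}$, the specific linearithmic form $|x|\log(1+|x|)$, and the bisection solver below are all illustrative assumptions.

```python
import math
import random

def ftrl_rescaled(grad, T, alpha=1.0):
    """Toy 1-D FTRL sketch with rescaled gradients (illustrative only).

    Iterate: x_{t+1} = argmin_x theta_t * x + psi_t(x), where theta_t is
    the running sum of gradients rescaled to unit magnitude, and
    psi_t(x) = alpha * sqrt(t) * |x| * log(1 + |x|)  (linearithmic in |x|).
    Returns the running average of the iterates.
    """
    # phi(r) is the derivative of r*log(1+r); the FTRL minimizer has
    # magnitude r solving beta * phi(r) = |theta|, found by bisection.
    phi = lambda r: math.log1p(r) + r / (1.0 + r)

    x, theta, avg = 0.0, 0.0, 0.0
    for t in range(1, T + 1):
        g = grad(x)
        if g != 0.0:
            theta += g / abs(g)          # rescaled gradient: unit magnitude
        beta = alpha * math.sqrt(t)      # time-varying regularizer scale
        target = abs(theta) / beta
        lo, hi = 0.0, 1.0
        while phi(hi) < target:          # bracket the root
            hi *= 2.0
        for _ in range(60):              # bisection to machine precision
            mid = 0.5 * (lo + hi)
            if phi(mid) < target:
                lo = mid
            else:
                hi = mid
        r = 0.5 * (lo + hi)
        x = -math.copysign(r, theta) if theta != 0.0 else 0.0
        avg += (x - avg) / t             # running average of iterates
    return avg

# Example: noisy gradients of f(x) = (x - 3)^2, global minimizer x* = 3.
random.seed(0)
noisy_grad = lambda x: 2.0 * (x - 3.0) + random.gauss(0.0, 0.1)
print(ftrl_rescaled(noisy_grad, 5000))
```

Note that the only tuning knob, `alpha`, is not scaled to the problem; the abstract's point is that the same hyperparameters serve both the almost-sure convergence and the convex-rate guarantees.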

Has companion code repository: https://github.com/bremen79/parameterfree

