AGGLIO: Global Optimization for Locally Convex Functions

From MaRDI portal
Publication: Q6382386

arXiv: 2111.03932 · MaRDI QID: Q6382386

Author name not available

Publication date: 6 November 2021

Abstract: This paper presents AGGLIO (Accelerated Graduated Generalized LInear-model Optimization), a stage-wise, graduated optimization technique that offers global convergence guarantees for non-convex optimization problems whose objectives are only locally convex and may fail to be even quasi-convex at a global scale. In particular, this includes learning problems that use popular activation functions such as sigmoid, softplus and SiLU, which yield non-convex training objectives. AGGLIO can be readily implemented using single-point as well as mini-batch SGD updates and offers provable convergence to the global optimum under general conditions. In experiments, AGGLIO outperformed several recently proposed optimization techniques for non-convex and locally convex objectives, both in convergence rate and in accuracy at convergence. AGGLIO relies on a graduation technique for generalized linear models, as well as a novel proof strategy, both of which may be of independent interest.
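To make the stage-wise graduation idea concrete, here is a minimal sketch in plain NumPy. It is not the authors' implementation (see the companion repository below for that); it only illustrates the general scheme the abstract describes: fit a sigmoid-activated generalized linear model with mini-batch SGD, starting from a tempered activation sigmoid(beta * z) with small beta (nearly linear, hence a benign objective) and sharpening beta toward 1 across stages, warm-starting the parameters at each stage. The temperature schedule, learning rate, and squared loss are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Synthetic GLM data: y = sigmoid(<w_star, x>) + small noise (illustrative only).
n, d = 2000, 5
w_star = rng.normal(size=d)
X = rng.normal(size=(n, d))
y = sigmoid(X @ w_star) + 0.01 * rng.normal(size=n)

def sgd_stage(w, beta, lr=0.5, epochs=5, batch=32):
    """Mini-batch SGD on squared loss with tempered activation sigmoid(beta * z)."""
    for _ in range(epochs):
        idx = rng.permutation(n)
        for s in range(0, n, batch):
            b = idx[s:s + batch]
            pred = sigmoid(beta * (X[b] @ w))
            grad_act = beta * pred * (1.0 - pred)      # d/dz sigmoid(beta * z)
            grad = X[b].T @ ((pred - y[b]) * grad_act) / len(b)
            w = w - lr * grad
    return w

# Stage-wise graduation: solve a sequence of progressively sharper problems,
# warm-starting each stage from the previous stage's solution.
w = np.zeros(d)
for beta in [0.1, 0.25, 0.5, 0.75, 1.0]:
    w = sgd_stage(w, beta)

print(np.linalg.norm(w - w_star))  # distance to the planted parameter
```

The point of the schedule is that the early, heavily tempered stages have well-behaved objectives whose minimizers land inside the region where the next, sharper objective is locally convex, so plain SGD suffices at every stage.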

Has companion code repository: https://github.com/purushottamkar/agglio

This page was built for publication: AGGLIO: Global Optimization for Locally Convex Functions