Proximal Adam: Robust Adaptive Update Scheme for Constrained Optimization

Publication:6327693

arXiv: 1910.10094
MaRDI QID: Q6327693

Author name not available

Publication date: 22 October 2019

Abstract: We implement the adaptive step size scheme from the optimization methods AdaGrad and Adam in a novel variant of the Proximal Gradient Method (PGM). Our algorithm, dubbed AdaProx, avoids the need for explicit computation of the Lipschitz constants or additional line searches and thus reduces per-iteration cost. In test cases for constrained matrix factorization, we demonstrate the advantages of AdaProx in fidelity and performance over PGM, while still allowing for arbitrary penalty functions. The Python implementation of the algorithm presented here is available as an open-source package at https://github.com/pmelchior/proxmin.
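
The update described in the abstract combines Adam's bias-corrected moment estimates with a proximal step. Below is a minimal sketch of that scheme under stated assumptions: it is not the proxmin API, the names adaprox_sketch and prox_nonnegative are hypothetical, and it applies an ordinary prox after the Adam step, whereas the paper handles the proximal operator under the coordinate-wise adaptive metric more carefully.

    import numpy as np

    def prox_nonnegative(x, step):
        # Example penalty: indicator of the nonnegative orthant,
        # whose prox is the projection max(x, 0). The step argument
        # is unused here but kept so other proxes can depend on it.
        return np.maximum(x, 0.0)

    def adaprox_sketch(grad, prox, x0, alpha=1e-2, b1=0.9, b2=0.999,
                       eps=1e-8, n_iter=1000):
        # Illustrative proximal update with Adam-style adaptive step
        # sizes: no Lipschitz constant or line search is required,
        # matching the abstract's claim of reduced per-iteration cost.
        x = x0.astype(float)
        m = np.zeros_like(x)   # first-moment (mean) estimate
        v = np.zeros_like(x)   # second-moment estimate
        for t in range(1, n_iter + 1):
            g = grad(x)
            m = b1 * m + (1 - b1) * g
            v = b2 * v + (1 - b2) * g**2
            m_hat = m / (1 - b1**t)                # bias corrections
            v_hat = v / (1 - b2**t)
            step = alpha / (np.sqrt(v_hat) + eps)  # per-coordinate step size
            x = prox(x - step * m_hat, step)       # proximal (projection) step
        return x

A hypothetical usage, solving a small nonnegative least-squares problem (the smooth part is 0.5*||Ax - b||^2, the penalty is the nonnegativity constraint):

    rng = np.random.default_rng(0)
    A = rng.normal(size=(50, 10))
    b = A @ np.abs(rng.normal(size=10))
    x_est = adaprox_sketch(lambda x: A.T @ (A @ x - b),
                           prox_nonnegative, np.zeros(10))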

Has companion code repository: https://github.com/pmelchior/proxmin

This page was built for publication: Proximal Adam: Robust Adaptive Update Scheme for Constrained Optimization
