Optimization without Backpropagation


arXiv: 2209.06302 · MaRDI QID: Q6410660

Author name not available.

Publication date: 13 September 2022

Abstract: Forward gradients have recently been introduced to bypass backpropagation in automatic differentiation while retaining unbiased estimators of the true gradients. We derive an optimality condition for obtaining best-approximating forward gradients, which leads us to mathematical insights suggesting that optimization in high dimension is challenging with forward gradients. Our extensive experiments on test functions support this claim.
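The estimator the abstract refers to can be made concrete. In the standard forward-gradient construction, one draws a random tangent direction v with E[vvᵀ] = I (for example v ~ N(0, I)) and forms g(θ) = (∇f(θ)·v) v. A single forward-mode pass (a Jacobian-vector product) yields the directional derivative ∇f(θ)·v without storing a backward tape, and E[g(θ)] = E[vvᵀ]∇f(θ) = ∇f(θ), so the estimator is unbiased. Below is a minimal sketch in JAX under these assumptions; the function names and the quadratic test function are illustrative and are not taken from the companion repository.

```python
import jax
import jax.numpy as jnp

def forward_gradient(f, theta, key):
    """Unbiased gradient estimate from a single forward-mode (JVP) pass."""
    # Random tangent direction with E[v v^T] = I.
    v = jax.random.normal(key, theta.shape)
    # jax.jvp returns f(theta) and the directional derivative
    # grad f(theta) . v in one forward pass, with no backward tape.
    _, dir_deriv = jax.jvp(f, (theta,), (v,))
    # (grad f(theta) . v) v is an unbiased estimator of grad f(theta).
    return dir_deriv * v

def descend(f, theta0, steps=2000, lr=1e-2, seed=0):
    """Plain gradient descent driven only by forward gradients."""
    theta = theta0
    key = jax.random.PRNGKey(seed)
    for _ in range(steps):
        key, subkey = jax.random.split(key)
        theta = theta - lr * forward_gradient(f, theta, subkey)
    return theta

# Illustrative test function: a d-dimensional quadratic bowl.
d = 100
f = lambda x: jnp.sum(x ** 2)
theta = descend(f, jnp.ones(d))
print(float(f(theta)))
```

The per-step cost is a single forward pass, but the variance of the estimator grows with the dimension d, which is one way to see why the abstract reports that optimization in high dimension is challenging with forward gradients.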

Companion code repository: https://github.com/gbelouze/forward-gradient
