Interpolation Technique to Speed Up Gradients Propagation in Neural ODEs

Publication: 6336544

arXiv: 2003.05271
MaRDI QID: Q6336544

Author name not available

Publication date: 11 March 2020

Abstract: We propose a simple interpolation-based method for the efficient approximation of gradients in neural ODE models. We compare it with the reverse dynamic method (known in the literature as the "adjoint method") for training neural ODEs on classification, density estimation, and inference approximation tasks. We also give a theoretical justification of our approach using the logarithmic norm formalism. As a result, our method allows faster model training than the reverse dynamic method, as confirmed by extensive numerical experiments on several standard benchmarks.
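
The abstract contrasts the proposed interpolation scheme with the reverse dynamic (adjoint) method, in which the state must be re-integrated backward alongside the adjoint variables. Below is a minimal sketch of that core idea, not the paper's implementation (see the IRDM repository linked below for that): the toy right-hand side tanh(W z), the Chebyshev sampling nodes, SciPy's integrators, and barycentric interpolation as a stand-in for the paper's interpolation scheme are all illustrative assumptions.

```python
# Minimal sketch, not the paper's implementation: interpolate the forward
# trajectory so the backward (adjoint) pass never re-solves the state ODE.
# Assumptions: toy model dz/dt = tanh(W z), Chebyshev nodes, SciPy solvers.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.interpolate import BarycentricInterpolator

rng = np.random.default_rng(0)
W = rng.normal(scale=0.5, size=(2, 2))   # toy "network" parameters
z0, T = np.array([1.0, -0.5]), 4.0

def f(z):
    return np.tanh(W @ z)                # right-hand side f(z; W)

# Forward pass: solve once, sampling z(t) at Chebyshev nodes on [0, T].
n = 64
nodes = 0.5 * T * (1.0 - np.cos(np.pi * np.arange(n + 1) / n))
fwd = solve_ivp(lambda t, z: f(z), (0.0, T), z0,
                t_eval=nodes, rtol=1e-10, atol=1e-10)
z_hat = BarycentricInterpolator(nodes, fwd.y.T)   # interpolant of z(t)

# For the loss L = 0.5 * ||z(T)||^2 the adjoint starts at a(T) = z(T).
aT = fwd.y[:, -1].copy()

def adjoint_rhs(t, a):
    # da/dt = -(df/dz)^T a, with z(t) supplied by the cheap interpolant
    # instead of integrating the state backward (reverse dynamic method).
    z = z_hat(t)
    u = np.tanh(W @ z)
    J = (1.0 - u ** 2)[:, None] * W      # Jacobian df/dz at z_hat(t)
    return -J.T @ a

# Backward pass: only the adjoint ODE is solved, from t = T down to 0.
bwd = solve_ivp(adjoint_rhs, (T, 0.0), aT, rtol=1e-10, atol=1e-10)
print("dL/dz(0) ~", bwd.y[:, -1])
```

The point illustrated here is that the backward pass integrates only the adjoint ODE: wherever the Jacobian needs z(t), the precomputed interpolant is evaluated instead of re-solving the state in reverse. The parameter gradient dL/dW would follow from the same backward pass via the usual adjoint quadrature; it is omitted to keep the sketch short.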

Has companion code repository: https://github.com/Daulbaev/IRDM