UnICORNN: A recurrent model for learning very long time dependencies


arXiv: 2103.05487
MaRDI QID: Q6362486

Author name not available

Publication date: 9 March 2021

Abstract: The design of recurrent neural networks (RNNs) that accurately process sequential inputs with long-time dependencies is very challenging on account of the exploding and vanishing gradient problem. To overcome this, we propose a novel RNN architecture based on a structure-preserving discretization of a Hamiltonian system of second-order ordinary differential equations that models networks of oscillators. The resulting RNN is fast, invertible (in time), and memory efficient, and we derive rigorous bounds on the hidden-state gradients to prove the mitigation of the exploding and vanishing gradient problem. A suite of experiments is presented to demonstrate that the proposed RNN provides state-of-the-art performance on a variety of learning tasks with (very) long-time dependencies.
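To make the abstract's description concrete, here is a minimal NumPy sketch of a symplectic-Euler-style (structure-preserving) discretization of a second-order oscillator network, together with the exact inverse step that makes such an RNN invertible in time and hence memory efficient. This is an illustrative assumption, not the authors' implementation: the variable names (dt, alpha, w, V, b, c) and the precise update rule are guesses for exposition; the official code lives in the companion repository linked below.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward_step(y, z, u, w, V, b, c, alpha=0.9, dt=0.5):
    """One hidden-state update: velocity z is updated first, then position y,
    which is the structure-preserving (symplectic Euler) ordering."""
    scale = dt * sigmoid(c)                           # learnable per-unit time step (assumed form)
    z_new = z - scale * (np.tanh(w * y + V @ u + b) + alpha * y)
    y_new = y + scale * z_new
    return y_new, z_new

def inverse_step(y_new, z_new, u, w, V, b, c, alpha=0.9, dt=0.5):
    """Exactly undoes forward_step: previous states can be reconstructed
    instead of stored, which is what makes the recurrence memory efficient."""
    scale = dt * sigmoid(c)
    y = y_new - scale * z_new
    z = z_new + scale * (np.tanh(w * y + V @ u + b) + alpha * y)
    return y, z

# Tiny usage example on random data (hypothetical sizes).
rng = np.random.default_rng(0)
d_in, d_hid = 3, 5
w, b, c = rng.normal(size=d_hid), np.zeros(d_hid), np.zeros(d_hid)
V = rng.normal(size=(d_hid, d_in))
y = z = np.zeros(d_hid)
for t in range(100):                                  # unroll over a sequence
    u = rng.normal(size=d_in)
    y, z = forward_step(y, z, u, w, V, b, c)
y_prev, z_prev = inverse_step(y, z, u, w, V, b, c)    # recovers the previous state exactly
```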




Has companion code repository: https://github.com/tk-rusch/unicornn







