Constraining Chaos: Enforcing dynamical invariants in the training of recurrent neural networks

From MaRDI portal
Publication: Q6434279

arXiv: 2304.12865
MaRDI QID: Q6434279

Author name not available

Publication date: 23 April 2023

Abstract: Drawing on ergodic theory, we introduce a novel training method for machine-learning-based forecasting of chaotic dynamical systems. The training enforces dynamical invariants of the systems of interest, such as the Lyapunov exponent spectrum and the fractal dimension, enabling longer and more stable forecasts when operating with limited data. The technique is demonstrated in detail using the recurrent neural network architecture of reservoir computing. Results are given for the Lorenz 1996 chaotic dynamical system and a spectral quasi-geostrophic model, both typical test cases for numerical weather prediction.
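The paper's contribution is constraining the training with dynamical invariants; the sketch below shows only the plain reservoir computing (echo state network) baseline that such a method builds on, applied to the Lorenz 1996 system mentioned in the abstract. All hyperparameters (reservoir size, spectral radius, input scaling, ridge penalty) are illustrative assumptions, not values from the paper or its companion repository.

```python
import numpy as np

def lorenz96(x, F=8.0):
    # Lorenz 1996 tendency: dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def integrate(x0, dt, n_steps, F=8.0):
    # Fourth-order Runge-Kutta integration of the Lorenz 96 system
    traj = np.empty((n_steps, x0.size))
    x = x0.copy()
    for t in range(n_steps):
        k1 = lorenz96(x, F)
        k2 = lorenz96(x + 0.5 * dt * k1, F)
        k3 = lorenz96(x + 0.5 * dt * k2, F)
        k4 = lorenz96(x + dt * k3, F)
        x = x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[t] = x
    return traj

class Reservoir:
    """Minimal echo state network: random fixed recurrence, trained linear readout."""

    def __init__(self, n_in, n_res=500, rho=0.9, sigma=0.1, ridge=1e-6, seed=0):
        rng = np.random.default_rng(seed)
        A = rng.uniform(-1, 1, (n_res, n_res))
        # scale recurrent matrix to the desired spectral radius rho
        A *= rho / np.max(np.abs(np.linalg.eigvals(A)))
        self.A = A
        self.W_in = rng.uniform(-sigma, sigma, (n_res, n_in))
        self.ridge = ridge

    def run(self, U):
        # drive the reservoir with an input sequence U of shape (T, n_in)
        r = np.zeros(self.A.shape[0])
        R = np.empty((U.shape[0], r.size))
        for t, u in enumerate(U):
            r = np.tanh(self.A @ r + self.W_in @ u)
            R[t] = r
        return R

    def train(self, U, Y):
        # ridge-regression readout W_out so that R @ W_out approximates Y
        R = self.run(U)
        self.W_out = np.linalg.solve(
            R.T @ R + self.ridge * np.eye(R.shape[1]), R.T @ Y)
        return R

    def forecast(self, r0, n_steps):
        # autonomous prediction: feed each output back in as the next input
        r = r0.copy()
        preds = np.empty((n_steps, self.W_out.shape[1]))
        for t in range(n_steps):
            y = r @ self.W_out
            preds[t] = y
            r = np.tanh(self.A @ r + self.W_in @ y)
        return preds
```

A typical use is to integrate a short Lorenz 96 trajectory, train the readout to predict one step ahead, then close the loop with `forecast` and compare against the true trajectory; the invariant-enforcing terms described in the abstract would enter as additional penalties during the readout training stage.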




Has companion code repository: https://github.com/japlatt/basicreservoircomputing

