Learning continuous chaotic attractors with a reservoir computer
Publication: 6557996
DOI: 10.1063/5.0075572
zbMATH Open: 1548.37141
MaRDI QID: Q6557996
Jason Z. Kim, Dani S. Bassett, Zhixin Lu, Lindsay M. Smith
Publication date: 18 June 2024
Published in: Chaos
Neural biology (92C20)
Dynamical systems in biology (37N25)
Neural networks for/in biological studies, artificial life and related topics (92B20)
Strange attractors, chaotic dynamics of systems with hyperbolic behavior (37D45)
Computational methods for ergodic theory (approximation of invariant measures, computation of Lyapunov exponents, entropy, etc.) (37M25)
Cites Work
- Reservoir computing approaches to recurrent neural network training
- The Lyapunov dimension of strange attractors
- Dimension, entropy and Lyapunov exponents
- Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data
- Deterministic Nonperiodic Flow
- Synchronization in chaotic systems
- Invertible generalized synchronization: A putative mechanism for implicit learning in neural systems
- Neural networks and physical systems with emergent collective computational abilities.
Related Items (1)