Reducing network size and improving prediction stability of reservoir computing
Publication: 5119469
DOI: 10.1063/5.0006869
zbMath: 1440.37076
arXiv: 2003.03178
OpenAlex: W3100551378
Wikidata: Q96949365
Scholia: Q96949365
MaRDI QID: Q5119469
Joschka Herteux, Alexander Haluszczynski, Jonas Aumeier, Christoph Räth
Publication date: 4 September 2020
Published in: Chaos: An Interdisciplinary Journal of Nonlinear Science
Full work available at URL: https://arxiv.org/abs/2003.03178
Related Items (2)
- Seeking optimal parameters for achieving a lightweight reservoir computing: a computational endeavor
- Breaking symmetries of the reservoir equations in echo state networks
Cites Work
- Reservoir computing approaches to recurrent neural network training
- Determining Lyapunov exponents from a time series
- Measuring the strangeness of strange attractors
- Chaos in fractional-order autonomous nonlinear systems
- A practical method for calculating largest Lyapunov exponents from small data sets
- An experimental unification of reservoir computing methods
- An equation for continuous chaos
- The double scroll
- Chaos in models of double convection
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- YET ANOTHER CHAOTIC ATTRACTOR
- Using machine learning to replicate chaotic attractors and calculate Lyapunov exponents from data
- Deterministic Nonperiodic Flow
- Good and bad predictions: Assessing and improving the replication of chaotic attractors by means of reservoir computing
- Creation of a complex butterfly attractor using a novel Lorenz-Type system
- DETERMINISTIC CHAOS SEEN IN TERMS OF FEEDBACK CIRCUITS: ANALYSIS, SYNTHESIS, "LABYRINTH CHAOS"
- Ridge Regression: Biased Estimation for Nonorthogonal Problems