Memory in linear recurrent neural networks in continuous time
From MaRDI portal
Publication: 1784560
DOI: 10.1016/j.neunet.2009.08.008 · zbMath: 1396.68093 · OpenAlex: W2062004915 · Wikidata: Q39900109 · Scholia: Q39900109 · MaRDI QID: Q1784560
Michiel Hermans, Benjamin Schrauwen
Publication date: 27 September 2018
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2009.08.008
Related Items (9)
- Unnamed Item
- Synthesis of recurrent neural networks for dynamical system simulation
- Memristor Models for Machine Learning
- Short-Term Memory Capacity in Networks via the Restricted Isometry Property
- A Theory of Sequence Indexing and Working Memory in Recurrent Neural Networks
- Unnamed Item
- Dimension reduction in recurrent networks by canonicalization
- Unnamed Item
- Memory and forecasting capacities of nonlinear recurrent networks
Cites Work
- Unnamed Item
- Unnamed Item
- On the computational power of circuits of spiking neurons
- Principal component analysis.
- Edge of chaos and prediction of computational performance for neural circuit models
- Optimization and applications of echo state networks with leaky-integrator neurons
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Spiking Neuron Models
- Real-Time Computation at the Edge of Chaos in Recurrent Neural Networks
- Neural networks and physical systems with emergent collective computational abilities.
- Analysis and Design of Echo State Networks