A Theory of Sequence Indexing and Working Memory in Recurrent Neural Networks
From MaRDI portal
Publication:5157184
DOI: 10.1162/neco_a_01084
zbMath: 1472.68055
arXiv: 1803.00412
OpenAlex: W2963650979
Wikidata: Q52590061 (Scholia: Q52590061)
MaRDI QID: Q5157184
E. Paxon Frady, Friedrich T. Sommer, Denis Kleyko
Publication date: 12 October 2021
Published in: Neural Computation
Full work available at URL: https://arxiv.org/abs/1803.00412
Related Items (2)
- Resonator Networks, 1: An Efficient Solution for Factoring High-Dimensional, Distributed Representations of Data Structures
- A Theoretical Perspective on Hyperdimensional Computing
Cites Work
- Reservoir computing approaches to recurrent neural network training
- Analytic study of the memory storage capacity of a neural network
- Memory in linear recurrent neural networks in continuous time
- Asymptotics for some fundamental \(q\)-functions
- Binding and Normalization of Binary Sparse Distributed Representations by Context-Dependent Thinning
- Noise Tolerance of Attractor and Feedforward Memory Models
- A table of normal integrals
- Real-Time Computing Without Stable States: A New Framework for Neural Computation Based on Perturbations
- Neural networks and physical systems with emergent collective computational abilities
- Distributed Sequence Memory of Multidimensional Inputs in Recurrent Networks
- Randomly Connected Networks Have Short Temporal Memory
- Representing Objects, Relations, and Sequences
- Short-Term Memory Capacity in Networks via the Restricted Isometry Property
- Probability of error, equivocation, and the Chernoff bound
- On Information and Sufficiency
- A Measure of Asymptotic Efficiency for Tests of a Hypothesis Based on the Sum of Observations