Timescale Separation in Recurrent Neural Networks
From MaRDI portal
Publication:5380251
DOI: 10.1162/NECO_a_00740
zbMath: 1473.68142
OpenAlex: W2157135189
Wikidata: Q50594023 (Scholia: Q50594023)
MaRDI QID: Q5380251
Publication date: 4 June 2019
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_00740
Learning and adaptive systems in artificial intelligence (68T05)
Neural networks for/in biological studies, artificial life and related topics (92B20)
Related Items (2)
Forward sensitivity analysis for contracting stochastic systems
A persistent adjoint method with dynamic time-scaling and an application to mass action kinetics
Cites Work
- Reservoir computing approaches to recurrent neural network training
- On nonlinear difference and differential equations
- Connectionist learning of belief networks
- On contraction analysis for non-linear systems
- Dynamic behaviors of memristor-based recurrent neural networks with time-varying delays
- On partial contraction analysis for coupled nonlinear oscillators
- An Efficient Learning Procedure for Deep Boltzmann Machines
- Dynamic properties of neural networks with adapting synapses
- On the convergence of Markovian stochastic algorithms with rapidly decreasing ergodicity rates
- Differential-algebraic equations and singular perturbation methods in recurrent neural learning
- Gradient Convergence in Gradient Methods with Errors
- Nonlinear Systems Analysis
- Contractive Systems with Inputs
- A Contraction Theory Approach to Singularly Perturbed Systems