Computational Advantages of Reverberating Loops for Sensorimotor Learning
Publication: 2885109
DOI: 10.1162/NECO_a_00237
zbMath: 1237.92013
OpenAlex: W2111737714
Wikidata: Q51496263 (Scholia: Q51496263)
MaRDI QID: Q2885109
Kristen Fortney, Douglas B. Tweed
Publication date: 21 May 2012
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_00237
MSC classification: Probabilistic models, generic numerical methods in probability and statistics (65C20); Neural biology (92C20)
Related Items (1)
Cites Work
- Multilayer feedforward networks are universal approximators
- On the mathematical foundations of learning
- Sensitivity Derivatives for Flexible Sensorimotor Learning
- Convergence and performance analysis of the normalized LMS algorithm with uncorrelated Gaussian data
- The Kernel Recursive Least-Squares Algorithm
- Learning representations by back-propagating errors
- Some theorems in least squares