Sample complexity for learning recurrent perceptron mappings
Publication: 4896826
DOI: 10.1109/18.532888
zbMath: 0858.68081
OpenAlex: W2156582723
MaRDI QID: Q4896826
Publication date: 22 October 1996
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://semanticscholar.org/paper/3db6753e4134f1cc9d47e768fd433c2f4719e4a3
MSC classification: Learning and adaptive systems in artificial intelligence (68T05); Neural networks for/in biological studies, artificial life and related topics (92B20)
Related Items (6)
Improving Generalization Capabilities of Dynamic Neural Networks
Complete controllability of continuous-time recurrent neural networks
Vapnik-Chervonenkis dimension of recurrent neural networks
The complexity of model classes, and smoothing noisy data
A learning result for continuous-time recurrent neural networks
Compressive sensing and neural networks from a statistical learning perspective