Sparsity through evolutionary pruning prevents neuronal networks from overfitting
DOI: 10.1016/J.NEUNET.2020.05.007 · OpenAlex: W3021119228 · Wikidata: Q95850337 · Scholia: Q95850337 · MaRDI QID: Q1982444
Richard C. Gerum, André Erpenbeck, Patrick Krauss, Achim Schilling
Publication date: 8 September 2021
Published in: Neural Networks
Full work available at URL: https://arxiv.org/abs/1911.10988
Keywords: evolution; artificial neural networks; evolutionary algorithm; overfitting; biological plausibility; maze task
MSC classifications: Artificial neural networks and deep learning (68T07); Problems related to evolution (92D15); Neural networks for/in biological studies, artificial life and related topics (92B20)
Cites Work
- An experimental unification of reservoir computing methods
- A neural algorithm for a fundamental computing problem
- Real-Time Computation at the Edge of Chaos in Recurrent Neural Networks
- The Vanishing Gradient Problem During Learning Recurrent Neural Nets and Problem Solutions
- Collective dynamics of ‘small-world’ networks