A tight upper bound on the generalization error of feedforward neural networks
Publication: 1982395
DOI: 10.1016/j.neunet.2020.04.001
zbMath: 1468.68167
OpenAlex: W3015482436
Wikidata: Q91866110
Scholia: Q91866110
MaRDI QID: Q1982395
Publication date: 8 September 2021
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2020.04.001
Cites Work
- Robustness and generalization
- Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks
- Three fundamental concepts of the capacity of learning machines
- Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers
- Learning theory: stability is sufficient for generalization and necessary and sufficient for consistency of empirical risk minimization
- Local Rademacher complexities
- Distribution-free performance bounds for potential function rules
- Neural Nets with Superlinear VC-Dimension
- Scale-sensitive dimensions, uniform convergence, and learnability
- The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network
- Robust Large Margin Deep Neural Networks
- DOI: 10.1162/153244302760200704
- DOI: 10.1162/153244303321897690
- Estimation of Dependences Based on Empirical Data