An analysis of training and generalization errors in shallow and deep networks
From MaRDI portal
Publication: 2185668
DOI: 10.1016/j.neunet.2019.08.028
zbMath: 1434.68513
arXiv: 1802.06266
OpenAlex: W2972277540
Wikidata: Q90416114 (Scholia: Q90416114)
MaRDI QID: Q2185668
Tomaso Poggio, Hrushikesh N. Mhaskar
Publication date: 5 June 2020
Published in: Neural Networks
Full work available at URL: https://arxiv.org/abs/1802.06266
Related Items (2)
- Applied harmonic analysis and data processing. Abstracts from the workshop held March 25--31, 2018
- A direct approach for function approximation on data defined manifolds
Cites Work
- Minimum Sobolev norm interpolation with trigonometric polynomials on the torus
- On the approximation of a periodic function and its successive derivatives by a trigonometric polynomial and its successive derivatives [Sur l'approximation d'une fonction périodique et de ses dérivées successives par un polynôme trigonométrique et par ses dérivées successives]
- Eignets for function approximation on manifolds
- Function approximation with zonal function networks with activation functions analogous to the rectified linear unit functions
- Degree of approximation by neural and translation networks with a single hidden layer
- On some convergence properties of the interpolation polynomials
- Deep vs. shallow networks: An approximation theory perspective
- Robust Large Margin Deep Neural Networks
- Localized Linear Polynomial Operators and Quadrature Formulas on the Sphere
- Applications of classical approximation theory to periodic basis function networks and computational harmonic analysis
- Approximation with interpolatory constraints