Proof of the theory-to-practice gap in deep learning via sampling complexity bounds for neural network approximation spaces
DOI: 10.1007/s10208-023-09607-w · MaRDI QID: Q6592113
Philipp Grohs, Felix Voigtlaender
Publication date: 24 August 2024
Published in: Foundations of Computational Mathematics
Keywords: approximation spaces, Gelfand numbers, randomized approximation, information-based complexity, deep neural networks, theory-to-computational gaps
MSC classifications: Artificial neural networks and deep learning (68T07) · Abstract approximation theory (approximation in normed linear spaces and other abstract spaces) (41A65) · Approximation by arbitrary nonlinear expressions; widths and entropy (41A46)
Cites Work
- The Deep Ritz Method: a deep learning-based numerical algorithm for solving variational problems
- Approximation spaces of deep neural networks
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- A proof that rectified deep neural networks overcome the curse of dimensionality in the numerical approximation of semilinear heat equations
- Physics-informed neural networks: a deep learning framework for solving forward and inverse problems involving nonlinear partial differential equations
- On the mathematical foundations of learning
- High-Dimensional Probability
- The Gap between Theory and Practice in Function Approximation with Deep Neural Networks
- Deep Neural Network Approximation Theory
- Optimal Approximation with Sparsely Connected Deep Neural Networks
- Analysis of the Generalization Error: Empirical Risk Minimization over Deep Artificial Neural Networks Overcomes the Curse of Dimensionality in the Numerical Approximation of Black--Scholes Partial Differential Equations
- Solving inverse problems using data-driven models
- Understanding Machine Learning
- Neural network approximation
- (8 further cited works; titles not available)
Related Items (1)