Deep ReLU Networks Overcome the Curse of Dimensionality for Generalized Bandlimited Functions
From MaRDI portal
Publication: 5079533
DOI: 10.4208/jcm.2007-m2019-0239 · zbMath: 1504.41041 · arXiv: 1903.00735 · OpenAlex: W3210425550 · MaRDI QID: Q5079533
Haizhao Yang, Hadrien Montanelli, Qiang Du
Publication date: 27 May 2022
Published in: Journal of Computational Mathematics
Full work available at URL: https://arxiv.org/abs/1903.00735
Keywords: Chebyshev polynomials, approximation theory, curse of dimensionality, machine learning, bandlimited functions, deep ReLU networks
Mathematics Subject Classification: Artificial neural networks and deep learning (68T07) · Multidimensional problems (41A63) · Algorithms for approximation of functions (65D15)
Related Items (13)
- Stationary Density Estimation of Itô Diffusions Using Deep Learning
- SelectNet: self-paced learning for high-dimensional partial differential equations
- ReLU deep neural networks from the hierarchical basis perspective
- The Discovery of Dynamics via Linear Multistep Methods and Deep Learning: Error Estimation
- A note on the applications of one primary function in deep neural networks
- Approximation bounds for norm constrained neural networks with applications to regression and GANs
- Deep Neural Networks for Solving Large Linear Systems Arising from High-Dimensional Problems
- Friedrichs Learning: Weak Solutions of Partial Differential Equations via Deep Learning
- Active learning based sampling for high-dimensional nonlinear partial differential equations
- Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class
- Neural network approximation and estimation of classifiers with classification boundary in a Barron class
- Approximation in shift-invariant spaces with deep ReLU neural networks
- Optimal approximation rate of ReLU networks in terms of width and depth
Cites Work
- Approximation by superposition of sigmoidal and radial basis functions
- Multilayer feedforward networks are universal approximators
- Provable approximation properties for deep neural networks
- Optimal nonlinear approximation
- Approximation properties of a multilayered feedforward artificial neural network
- Exponential convergence of the deep neural network approximation for analytic functions
- Error bounds for deep ReLU networks using the Kolmogorov-Arnold superposition theorem
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Nonlinear approximation via compositions
- Error bounds for approximations with deep ReLU networks
- Universal approximation bounds for superpositions of a sigmoidal function
- New Error Bounds for Deep ReLU Networks Using Sparse Grids
- A note on the expressive power of deep rectified linear unit networks in high‐dimensional spaces
- Breaking the Curse of Dimensionality with Convex Neural Networks
- Approximation by superpositions of a sigmoidal function