Error bounds for ReLU networks with depth and width parameters
DOI: 10.1007/s13160-022-00515-0 · OpenAlex: W4281936743 · MaRDI QID: Q2111556
Publication date: 17 January 2023
Published in: Japan Journal of Industrial and Applied Mathematics
Full work available at URL: https://doi.org/10.1007/s13160-022-00515-0
Mathematics Subject Classification:
- Learning and adaptive systems in artificial intelligence (68T05)
- Rate of convergence, degree of approximation (41A25)
Cites Work
- Functions, spaces, and expansions. Mathematical tools in physics and engineering
- Approximation and estimation bounds for artificial neural networks
- Multilayer feedforward networks are universal approximators
- Theory of deep convolutional neural networks: downsampling
- Nonparametric regression using deep neural networks with ReLU activation function
- Error bounds for approximations with deep ReLU networks
- Universality of deep convolutional neural networks
- On deep learning as a remedy for the curse of dimensionality in nonparametric regression
- Deep vs. shallow networks: An approximation theory perspective
- New Error Bounds for Deep ReLU Networks Using Sparse Grids
- Approximation by superpositions of a sigmoidal function