Approximating functions with multi-features by deep convolutional neural networks
DOI: 10.1142/S0219530522400085
MaRDI QID: Q5873927
No author found.
Publication date: 10 February 2023
Published in: Analysis and Applications
Keywords: feature extraction; curse of dimensionality; rates of approximation; convolutional neural networks; deep learning
MSC classification: Computational learning theory (68Q32); Artificial neural networks and deep learning (68T07); Rate of convergence, degree of approximation (41A25)
Related Items
- Some new inequalities and numerical results of bivariate Bernstein-type operator including Bézier basis and its GBS operator
- Approximation of nonlinear functionals using deep ReLU networks
Cites Work
- Approximation by superposition of sigmoidal and radial basis functions
- On best approximation by ridge functions
- Fundamentality of ridge functions
- Multilayer feedforward networks are universal approximators
- Approximation properties of a multilayered feedforward artificial neural network
- Theory of deep convolutional neural networks. II: Spherical analysis
- Theory of deep convolutional neural networks: downsampling
- Nonparametric regression using deep neural networks with ReLU activation function
- Error bounds for approximations with deep ReLU networks
- Universality of deep convolutional neural networks
- On deep learning as a remedy for the curse of dimensionality in nonparametric regression
- Deep vs. shallow networks: An approximation theory perspective
- Learning Theory
- Universal approximation bounds for superpositions of a sigmoidal function
- Neural Networks for Localized Approximation
- Deep distributed convolutional neural networks: Universality
- Approximation by Combinations of ReLU and Squared ReLU Ridge Functions with $\ell^1$ and $\ell^0$ Controls
- A Mathematical Theory of Deep Convolutional Neural Networks for Feature Extraction
- New Error Bounds for Deep ReLU Networks Using Sparse Grids
- Equivalence of approximation by convolutional neural networks and fully-connected networks
- Nonparametric Regression Based on Hierarchical Interaction Models
- Breaking the Curse of Dimensionality with Convex Neural Networks
- A Fast Learning Algorithm for Deep Belief Nets
- Approximation by superpositions of a sigmoidal function
- Theory of deep convolutional neural networks. III: Approximating radial functions