Two-layer networks with the \(\text{ReLU}^k\) activation function: Barron spaces and derivative approximation
Publication: 6191372
DOI: 10.1007/s00211-023-01384-6
OpenAlex: W4388928889
MaRDI QID: Q6191372
Authors: Sergei V. Pereverzyev, Peter Mathé, Shuai Lu, Yuan-Yuan Li
Publication date: 9 February 2024
Published in: Numerische Mathematik
Full work available at URL: https://doi.org/10.1007/s00211-023-01384-6
MSC classification: Rate of convergence, degree of approximation (41A25); Algorithms for approximation of functions (65D15)
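For context, the \(\text{ReLU}^k\) activation in the title is the rectified power unit \(\sigma_k(t) = \max(0, t)^k\) (for \(k = 1\) this is the plain ReLU), and a two-layer network of width \(n\) takes the form \(f(x) = \sum_{i=1}^n a_i \, \sigma_k(w_i \cdot x + b_i)\). Below is a minimal sketch of this object in Python; the function names, shapes, and random parameters are illustrative assumptions, not taken from the publication.

```python
import numpy as np

def relu_k(t, k=2):
    """Rectified power unit: max(0, t)**k (k=1 recovers the plain ReLU)."""
    return np.maximum(0.0, t) ** k

def two_layer_network(x, W, b, a, k=2):
    """Shallow network f(x) = sum_i a_i * relu_k(w_i . x + b_i).

    x: input of shape (d,); W: inner weights (n, d); b: biases (n,);
    a: outer coefficients (n,). All names are illustrative only.
    """
    return a @ relu_k(W @ x + b, k)

# Example: a width-8 network with k=2 on a 3-dimensional input.
rng = np.random.default_rng(0)
W, b, a = rng.normal(size=(8, 3)), rng.normal(size=8), rng.normal(size=8)
print(two_layer_network(rng.normal(size=3), W, b, a, k=2))
```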
Cites Work
- Regularization theory for ill-posed problems. Selected topics
- Inverse problems and high-dimensional estimation. Stats in the Château summer school, Paris, France, August 31 -- September 4, 2009.
- Optimal recovery of functions and their derivatives from inaccurate information about the spectrum and inequalities for derivatives
- Complexity estimates based on integral transforms induced by computational units
- Representation formulas and pointwise properties for Barron functions
- Approximation spaces of deep neural networks
- The Barron space and the flow-induced function spaces for neural network models
- High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions
- Error bounds for approximations with deep ReLU networks
- Universality of deep convolutional neural networks
- Inverse Problems Light: Numerical Differentiation
- Numerical differentiation from a viewpoint of regularization theory
- Universal approximation bounds for superpositions of a sigmoidal function
- Approximation by Combinations of ReLU and Squared ReLU Ridge Functions With \(\ell^1\) and \(\ell^0\) Controls
- A numerical differentiation method and its application to reconstruction of discontinuity
- Numerical solution of inverse problems by weak adversarial networks
- Better Approximations of High Dimensional Smooth Functions by Deep Neural Networks with Rectified Power Units
- Finite Neuron Method and Convergence Analysis
- Theory of Reproducing Kernels
- Characterization of the variation spaces corresponding to shallow neural networks
- Integral representations of shallow neural network with Rectified Power Unit activation function