Construction and approximation rate for feedforward neural network operators with sigmoidal functions
From MaRDI portal
Publication: 6591537
DOI: 10.1016/j.cam.2024.116150
zbMATH Open: 1545.41006
MaRDI QID: Q6591537
Publication date: 22 August 2024
Published in: Journal of Computational and Applied Mathematics
Classification (MSC): Artificial neural networks and deep learning (68T07); Rate of convergence, degree of approximation (41A25); Approximation by other special function classes (41A30)
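To illustrate the kind of operator the title refers to, the following is a minimal sketch of a classical quasi-interpolation neural network operator activated by a sigmoidal function, in the style common in this literature (e.g. the cited Costarelli-Spigler-type constructions). It is an assumption that this matches the spirit of the publication; the names `phi` and `nn_operator` and the choice of the logistic sigmoid are illustrative, not taken from the paper.

```python
import math

def sigma(x):
    """Logistic sigmoidal activation."""
    return 1.0 / (1.0 + math.exp(-x))

def phi(x):
    """Centered 'bell' generated by the sigmoid: phi(x) = (sigma(x+1) - sigma(x-1)) / 2.

    For this phi the integer translates sum to 1, so the operator below
    reproduces constant functions exactly.
    """
    return 0.5 * (sigma(x + 1.0) - sigma(x - 1.0))

def nn_operator(f, n, x):
    """Neural network operator F_n(f)(x) = sum_k f(k/n) phi(nx - k) / sum_k phi(nx - k),
    with nodes k/n for k = -n, ..., n, evaluated for x in [-1, 1]."""
    num = sum(f(k / n) * phi(n * x - k) for k in range(-n, n + 1))
    den = sum(phi(n * x - k) for k in range(-n, n + 1))
    return num / den

# As n grows, F_n(f) converges to f, at a rate governed by f's smoothness.
f = math.cos
err = max(abs(nn_operator(f, 50, x / 100) - f(x / 100)) for x in range(-100, 101))
```

The single hidden layer here is explicit: each term `phi(n*x - k)` is a fixed combination of two sigmoid neurons with prescribed weights and shifts, so `nn_operator` is a feedforward network whose coefficients are sampled values of `f` rather than trained parameters.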
Cites Work
- Title not available
- Title not available
- Title not available
- Title not available
- Title not available
- Error estimates for the modified truncations of approximate approximation with Gaussian kernels
- Multivariate sigmoidal neural network approximation
- Global errors for approximate approximations with Gaussian kernels on compact intervals
- Constructive approximate interpolation by neural networks
- Error estimates for approximate approximations with Gaussian kernels on compact intervals
- The construction and approximation of feedforward neural network with hyperbolic tangent function
- The approximation operators with sigmoidal functions
- On approximation by polynomials and rational functions in Orlicz spaces
- Approximation by superposition of sigmoidal and radial basis functions
- Rate of convergence of some neural network operators to the unit-univariate case
- Neural network operators: constructive interpolation of multivariate functions
- Multilayer feedforward networks are universal approximators
- Approximation properties of a multilayered feedforward artificial neural network
- The construction and approximation of some neural network operators
- Rates of approximation by neural network interpolation operators
- High-order approximation rates for shallow neural networks with cosine and \(\mathrm{ReLU}^k\) activation functions
- Multivariate neural network interpolation operators
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Interpolation by neural network operators activated by ramp functions
- Error bounds for approximations with deep ReLU networks
- Simultaneous approximations of multivariate functions and their derivatives by neural networks with one hidden layer
- Quantitative estimates for neural network operators implied by the asymptotic behaviour of the sigmoidal activation functions
- Universal approximation bounds for superpositions of a sigmoidal function
- Approximation by Combinations of ReLU and Squared ReLU Ridge Functions with \(\ell^1\) and \(\ell^0\) Controls
- Density results by deep neural network operators with integer weights
- Neural network interpolation operators activated by smooth ramp functions
- On Approximation by Neural Networks with Optimized Activation Functions and Fixed Weights
- Approximation by superpositions of a sigmoidal function
- Rates of approximation by ReLU shallow neural networks
- Neural network interpolation operators optimized by Lagrange polynomial
- Approximation by a class of neural network operators on scattered data
- Neural network interpolation operators of multivariate functions
Related Items (1)