Asymptotic expansion for neural network operators of the Kantorovich type and high order of approximation
DOI: 10.1007/s00009-021-01717-5
zbMath: 1469.41006
OpenAlex: W3133433943
MaRDI QID: Q2023320
Gianluca Vinti, Danilo Costarelli, Marco Cantarini
Publication date: 3 May 2021
Published in: Mediterranean Journal of Mathematics
Full work available at URL: https://doi.org/10.1007/s00009-021-01717-5
asymptotic formula; sigmoidal functions; Voronovskaja formula; neural network operators; high-order convergence; truncated algebraic moment
Linear operator approximation theory (47A58); Interpolation in approximation theory (41A05); Rate of convergence, degree of approximation (41A25); Approximation by other special function classes (41A30)
Related Items (1)
Cites Work
- The new forms of Voronovskaya's theorem in weighted spaces
- Asymptotic formulae for linear combinations of generalized sampling operators
- On the approximation by neural networks with bounded number of neurons in hidden layers
- A unifying approach to convergence of linear sampling type operators in Orlicz spaces
- Approximation properties for linear combinations of moment type operators
- On sharpness of error bounds for univariate approximation by single hidden layer feedforward neural networks
- The approximation operators with sigmoidal functions
- Uniform approximation by neural networks
- Approximation by neural networks with a bounded number of nodes at each level
- The universal approximation capabilities of cylindrical approximate identity neural networks
- Saturation classes for MAX-product neural network operators activated by sigmoidal functions
- Inverse results of approximation and the saturation order for the sampling Kantorovich series
- Quantitative generalized Voronovskaja's formulae for Bernstein polynomials
- Some inequalities in classical spaces with mixed norms
- On the near optimality of the stochastic approximation of smooth functions by neural networks
- Voronovskaja type theorems and high-order convergence neural network operators with sigmoidal functions
- On the approximation by single hidden layer feedforward neural networks with fixed weights
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- An unsupervised parameter learning model for RVFL neural network
- Convergence of a family of neural network operators of the Kantorovich type
- Approximation by series of sigmoidal functions with applications to neural networks
- The Bernstein Voronovskaja-type theorem for positive linear approximation operators
- Approximate solutions of Volterra integral equations by an interpolation method based on ramp functions
- Approximation of discontinuous signals by sampling Kantorovich series
- Pointwise and uniform approximation by multivariate neural network operators of the max-product type
- Detection of thermal bridges from thermographic images by means of image processing approximation algorithms
- Learning theory estimates via integral operators and their approximations
- Deep vs. shallow networks: An approximation theory perspective
- Convergence for a family of neural network operators in Orlicz spaces
- q-Voronovskaya type theorems for q-Baskakov operators
- An Integral Upper Bound for Neural Network Approximation
- Learning Theory
- Another look at Voronovskaja type formulas
- Convergence results for a family of Kantorovich max-product neural network operators in a multivariate setting
- Approximation by truncated max‐product operators of Kantorovich‐type based on generalized (ϕ,ψ)‐kernels
- On the Approximation of the Cut and Step Functions by Logistic and Gompertz Functions
- Approximation by superpositions of a sigmoidal function