Quantitative estimates for neural network operators implied by the asymptotic behaviour of the sigmoidal activation functions
From MaRDI portal
Publication: 2675943
DOI: 10.1007/s00009-022-02138-8
OpenAlex: W4294571876
Wikidata: Q114232203
Scholia: Q114232203
MaRDI QID: Q2675943
Danilo Costarelli, Uğur Kadak, Lucian C. Coroianu
Publication date: 26 September 2022
Published in: Mediterranean Journal of Mathematics
Full work available at URL: https://doi.org/10.1007/s00009-022-02138-8
Mathematics Subject Classification:
- Approximation by rational functions (41A20)
- Rate of convergence, degree of approximation (41A25)
- Approximation by operators (in particular, by integral operators) (41A35)
Cites Work
- Unnamed Item
- Unnamed Item
- Approximation results for neural network operators activated by sigmoidal functions
- Multivariate neural network operators with sigmoidal activation functions
- Multivariate hyperbolic tangent neural network approximation
- Multivariate sigmoidal neural network approximation
- On sharpness of error bounds for univariate approximation by single hidden layer feedforward neural networks
- Approximative compactness of linear combinations of characteristic functions
- The approximation operators with sigmoidal functions
- Rate of convergence of some neural network operators to the unit-univariate case
- Saturation classes for MAX-product neural network operators activated by sigmoidal functions
- Approximation results in Orlicz spaces for sequences of Kantorovich MAX-product neural network operators
- Approximation with neural networks activated by ramp sigmoids
- The max-product generalized sampling operators: convergence and quantitative estimates
- Asymptotic expansion for neural network operators of the Kantorovich type and high order of approximation
- Approximation by exponential sampling type neural network operators
- Fractional type multivariate sampling operators
- Rates of approximation by neural network interpolation operators
- Modified neural network operators and their convergence properties with summability methods
- Convergence of a family of neural network operators of the Kantorovich type
- Error bounds for approximations with deep ReLU networks
- Universality of deep convolutional neural networks
- On the hexagonal Shepard method
- Intelligent systems II. Complete approximation by neural network operators
- Max-product neural network and quasi-interpolation operators activated by sigmoidal functions
- Quantitative estimates involving K-functionals for neural network-type operators
- Neural Approximations for Optimal Control and Decision
- Approximation by max-product sampling Kantorovich operators with generalized kernels
- Approximation by superpositions of a sigmoidal function