The approximation operators with sigmoidal functions
Publication: 980024
DOI: 10.1016/j.camwa.2009.05.001
zbMath: 1189.41014
OpenAlex: W2038266964
MaRDI QID: Q980024
Publication date: 28 June 2010
Published in: Computers & Mathematics with Applications
Full work available at URL: https://doi.org/10.1016/j.camwa.2009.05.001
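As background for this record: the publication concerns operators that approximate continuous functions by finite superpositions of a sigmoidal function (cf. the cited work "Approximation by superpositions of a sigmoidal function"). The Python sketch below is only a generic, illustrative quasi-interpolation built from differences of shifted logistic sigmoids; the logistic activation, the equally spaced nodes k/n, the gain parameter, and the normalisation are assumptions chosen for the example and are not the specific operators constructed in this paper.

```python
# Illustrative sketch only: a generic approximation operator built from
# a sigmoidal function. It is NOT the operator defined in the cited paper.
import numpy as np

def sigmoid(t):
    """Logistic sigmoidal function, written via tanh for numerical stability."""
    return 0.5 * (1.0 + np.tanh(0.5 * t))

def sigmoidal_operator(f, n, x, gain=10.0):
    """Approximate f on [0, 1] at the points x from the n + 1 samples f(k / n).

    Differences of shifted, scaled sigmoids serve as bump-like basis
    functions centred at the nodes k / n (an assumed, generic choice).
    """
    k = np.arange(0, n + 1)                     # node indices 0, ..., n
    nodes = k / n                               # equally spaced nodes in [0, 1]
    t = n * x[:, None] - k[None, :]             # scaled distance to each node
    basis = sigmoid(gain * (t + 0.5)) - sigmoid(gain * (t - 0.5))
    basis /= basis.sum(axis=1, keepdims=True)   # normalise to a partition of unity
    return basis @ f(nodes)                     # weighted combination of samples

if __name__ == "__main__":
    f = lambda x: np.sin(2 * np.pi * x) + 0.5 * x
    x = np.linspace(0.0, 1.0, 201)
    for n in (10, 40, 160):
        err = np.max(np.abs(sigmoidal_operator(f, n, x) - f(x)))
        print(f"n = {n:4d}  max error = {err:.4f}")
```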
Related Items (66)
- On Approximation by Neural Networks with Optimized Activation Functions and Fixed Weights
- On the Hausdorff distance between the Heaviside step function and Verhulst logistic function
- Approximation theorems for a family of multivariate neural network operators in Orlicz-type spaces
- Approximation by max-product neural network operators of Kantorovich type
- Approximation by network operators with logistic activation functions
- Global Mittag-Leffler stability of complex valued fractional-order neural network with discrete and distributed delays
- Max-product neural network and quasi-interpolation operators activated by sigmoidal functions
- DENSITY RESULTS BY DEEP NEURAL NETWORK OPERATORS WITH INTEGER WEIGHTS
- Fractional neural network approximation
- A sigmoid method for some nonlinear Fredholm integral equations of the second kind
- Neural network interpolation operators activated by smooth ramp functions
- Neural network operators: constructive interpolation of multivariate functions
- Solving polynomial systems using a fast adaptive back propagation-type neural network algorithm
- Approximation of discontinuous signals by sampling Kantorovich series
- Approximation by neural networks with sigmoidal functions
- Voronovskaja type theorems and high-order convergence neural network operators with sigmoidal functions
- On the approximation by single hidden layer feedforward neural networks with fixed weights
- Asymptotic expansions and Voronovskaja type theorems for the multivariate neural network operators
- Saturation classes for MAX-product neural network operators activated by sigmoidal functions
- A note on the applications of one primary function in deep neural networks
- Modified neural network operators and their convergence properties with summability methods
- The construction and approximation of some neural network operators
- Pointwise and uniform approximation by multivariate neural network operators of the max-product type
- Quantitative estimates for neural network operators implied by the asymptotic behaviour of the sigmoidal activation functions
- \(q\)-deformed hyperbolic tangent based Banach space valued ordinary and fractional neural network approximations
- Construction and approximation for a class of feedforward neural networks with sigmoidal function
- MULTIVARIATE FUZZY APPROXIMATION BY NEURAL NETWORK OPERATORS ACTIVATED BY A GENERAL SIGMOID FUNCTION
- Neural network interpolation operators optimized by Lagrange polynomial
- Brownian motion approximation by parametrized and deformed neural networks
- Approximation by a class of neural network operators on scattered data
- Approximation error for neural network operators by an averaged modulus of smoothness
- Hyperbolic tangent like relied Banach space valued neural network multivariate approximations
- Richards's curve induced Banach space valued ordinary and fractional neural network approximation
- Fuzzy fractional more sigmoid function activated neural network approximations revisited
- Voronovskaya Type Asymptotic Expansions for Perturbed Neural Network Operators
- Neural network interpolation operators of multivariate functions
- Multiple general sigmoids based Banach space valued neural network multivariate approximation
- Fractional type multivariate neural network operators
- The construction and approximation of feedforward neural network with hyperbolic tangent function
- Convergence for a family of neural network operators in Orlicz spaces
- Richards's curve induced Banach space valued multivariate neural network approximation
- Unnamed Item
- Unnamed Item
- Approximation results for neural network operators activated by sigmoidal functions
- Multivariate neural network operators with sigmoidal activation functions
- Solutions of integral equations by reproducing kernel Hilbert space method
- Estimates for the neural network operators of the max-product type with continuous and \(p\)-integrable functions
- Approximation results in Orlicz spaces for sequences of Kantorovich MAX-product neural network operators
- Interpolation by neural network operators activated by ramp functions
- Convergence of a family of neural network operators of the Kantorovich type
- Approximation by series of sigmoidal functions with applications to neural networks
- The max-product generalized sampling operators: convergence and quantitative estimates
- Univariate hyperbolic tangent neural network approximation
- Multivariate hyperbolic tangent neural network approximation
- Multivariate sigmoidal neural network approximation
- Asymptotic expansion for neural network operators of the Kantorovich type and high order of approximation
- Quantitative estimates involving K-functionals for neural network-type operators
- The new approximation operators with sigmoidal functions
- Quantitative approximation by perturbed Kantorovich-Choquet neural network operators
- Rates of approximation by neural network interpolation operators
- Approximate solutions of Volterra integral equations by an interpolation method based on ramp functions
- The construction and approximation of ReLU neural network operators
- The construction and approximation of the neural network with two weights
- Approximation by perturbed neural network operators
- Approximation by max-product sampling Kantorovich operators with generalized kernels
- Approximations by multivariate perturbed neural network operators
Cites Work
- The essential order of approximation for neural networks
- Approximation by Ridge functions and neural networks with one hidden layer
- Approximation by superposition of sigmoidal and radial basis functions
- Uniform approximation by neural networks
- Rate of convergence of some neural network operators to the unit-univariate case
- Multilayer feedforward networks are universal approximators
- An approximation by neural networks with a fixed weight
- Degree of approximation by neural and translation networks with a single hidden layer
- Universal approximation bounds for superpositions of a sigmoidal function
- Approximation by superpositions of a sigmoidal function
- Unnamed Item
- Unnamed Item
- Unnamed Item