Lower estimation of approximation rate for neural networks
From MaRDI portal
Publication: 848260
DOI: 10.1007/S11432-009-0027-7 · zbMath: 1192.68506 · OpenAlex: W1999902962 · MaRDI QID: Q848260
Feilong Cao, Yongquan Zhang, Zong Ben Xu
Publication date: 3 March 2010
Published in: Science in China. Series F
Full work available at URL: https://doi.org/10.1007/s11432-009-0027-7
Cites Work
- The essential order of approximation for neural networks
- Approximation by Ridge functions and neural networks with one hidden layer
- Approximation by superposition of sigmoidal and radial basis functions
- On best approximation by ridge functions
- Approximation problems in system identification with neural networks
- Multilayer feedforward networks are universal approximators
- Degree of approximation by neural and translation networks with a single hidden layer
- The essential order of approximation for nearly exponential type neural networks
- Simultaneous approximations of multivariate functions and their derivatives by neural networks with one hidden layer
- Simultaneous \(\mathbf L^p\)-approximation order for neural networks
- Universal approximation bounds for superpositions of a sigmoidal function
- Dimension-independent bounds on the degree of approximation by neural networks
- Bounds on rates of variable-basis and neural-network approximation
- Comparison of worst case errors in linear and neural network approximation
- Improved rates and asymptotic normality for nonparametric neural network estimators
- Efficient estimation of neural weights by polynomial approximation
- Approximation by superpositions of a sigmoidal function