On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
DOI: 10.1016/0893-6080(94)90040-X · zbMath: 0817.62031 · OpenAlex: W2051688774 · MaRDI QID: Q1345261
Authors: Adam Krzyżak, Alan L. Yuille, Lei Xu
Publication date: 2 July 1995
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/0893-6080(94)90040-x
Keywords: upper bounds; consistency; least squares estimator; universal approximation; kernel regression estimators; RBF nets; best consistent estimator; convergence rates of the approximation error; Parzen window estimator; radial basis function nets; receptive field size
MSC classes: Density estimation (62G07); Asymptotic properties of nonparametric inference (62G20); Learning and adaptive systems in artificial intelligence (68T05)
Related Items
- Learning and Convergence of the Normalized Radial Basis Functions Networks
- On-line RBFNN based identification of rapidly time-varying nonlinear systems with optimal structure-adaptation.
- Flexible regression modeling
- On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
- On Different Facets of Regularization Theory
- A heteroscedasticity diagnostic of a regression analysis with copula dependent random variables
- Pattern recognition with ordered labels
- On Learning and Convergence of RBF Networks in Regression Estimation and Classification
- Random Projection RBF Nets for Multidimensional Density Estimation
- Sensitivity analysis applied to the construction of radial basis function networks
Cites Work
- Networks and the best approximation property
- Distribution-free pointwise consistency of kernel regression estimate
- On the almost everywhere convergence of nonparametric regression function estimates
- An equivalence theorem for \(L_1\) convergence of the kernel regression estimate
- On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
- Multilayer feedforward networks are universal approximators
- The pointwise rate of convergence of the kernel regression estimate
- Regularization Algorithms for Learning That Are Equivalent to Multilayer Networks
- Probability Inequalities for the Sum of Independent Random Variables
- On exponential bounds on the Bayes risk of the kernel classification rule
- The rates of convergence of kernel regression estimates and classification rules
- Probability Inequalities for Sums of Bounded Random Variables