Density problem and approximation error in learning theory
From MaRDI portal
Publication:2319010
DOI: 10.1155/2013/715683
zbMath: 1470.68212
OpenAlex: W2108280195
Wikidata: Q58916719 (Scholia: Q58916719)
MaRDI QID: Q2319010
Publication date: 16 August 2019
Published in: Abstract and Applied Analysis
Full work available at URL: https://doi.org/10.1155/2013/715683
Classification (MSC):
- Learning and adaptive systems in artificial intelligence (68T05)
- Interpolation in approximation theory (41A05)
Related Items
- On the K-functional in learning theory
- Convergence analysis for kernel-regularized online regression associated with an RKHS
- A simpler approach to coefficient regularized support vector machines regression
Cites Work
- Interpolation of scattered data: distance matrices and conditionally positive definite functions
- Networks and the best approximation property
- Eigenvalues of Euclidean distance matrices
- Norm estimates for the inverses of a general class of scattered-data radial-function interpolation matrices
- Lower bounds for norms of inverses of interpolation matrices for radial basis functions
- Strictly positive definite functions on a real inner product space
- Positive definite dot product kernels in learning theory
- The covering number in learning theory
- Regularization networks and support vector machines
- Norms of inverses and condition numbers for matrices associated with scattered data
- On the mathematical foundations of learning
- DOI: 10.1162/153244302760185252
- Capacity of reproducing kernel spaces in learning theory
- Local error estimates for radial basis function interpolation of scattered data
- Practical approximate solutions to linear operator equations when the data are noisy
- Error estimates for scattered data interpolation on spheres
- Estimating the approximation error in learning theory
- Theory of reproducing kernels
- Choosing multiple parameters for support vector machines