Learning rates of least-square regularized regression with polynomial kernels
DOI: 10.1007/S11425-008-0137-5 · zbMath: 1186.68374 · OpenAlex: W2134876593 · MaRDI QID: Q1041518
Publication date: 2 December 2009
Published in: Science in China. Series A
Full work available at URL: https://doi.org/10.1007/s11425-008-0137-5
Mathematics Subject Classification: General nonlinear regression (62J02); Learning and adaptive systems in artificial intelligence (68T05)
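For orientation, the estimator analyzed in this publication is least-square regularized regression (kernel ridge regression) over the reproducing kernel Hilbert space of a polynomial kernel. Below is a minimal illustrative sketch of that estimator in Python/NumPy on synthetic data; the kernel parameters (`degree`, `c`), the regularization constant `lam`, and the toy data set are assumptions chosen for illustration, not details taken from the paper.

```python
# Minimal sketch of least-square regularized regression with a polynomial kernel.
# Illustrative only; parameter choices and data are assumptions, not from the paper.
import numpy as np

def polynomial_kernel(X, Y, degree=3, c=1.0):
    """Polynomial kernel K(x, y) = (x . y + c)^degree."""
    return (X @ Y.T + c) ** degree

def fit_krr(X, y, lam=0.1, degree=3, c=1.0):
    """Minimize (1/n) sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2,
    which reduces to solving (K + lam * n * I) alpha = y."""
    n = X.shape[0]
    K = polynomial_kernel(X, X, degree, c)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict(X_train, X_test, alpha, degree=3, c=1.0):
    """Evaluate the fitted function f(x) = sum_i alpha_i K(x_i, x)."""
    return polynomial_kernel(X_test, X_train, degree, c) @ alpha

# Toy usage: regress a smooth target from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.normal(size=50)
alpha = fit_krr(X, y, lam=0.01, degree=5)
X_test = np.linspace(-1, 1, 5).reshape(-1, 1)
print(predict(X, X_test, alpha, degree=5))
```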
Related Items (3)
- On the need for structure modelling in sequence prediction
- The convergence rates of Shannon sampling learning algorithms
- Optimal rate of the regularized regression learning algorithm
Cites Work
- Fast rates for support vector machines using Gaussian kernels
- Rate of convergence of Bernstein polynomials revisited
- An interpolation theorem and its applications to positive operators
- The covering number in learning theory
- Regularization networks and support vector machines
- Approximation with polynomial kernels and SVM classifiers
- Learning rates of least-square regularized regression
- On the mathematical foundations of learning
- Learning Theory
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Theory of Reproducing Kernels