Optimal rate of the regularized regression learning algorithm
From MaRDI portal
Publication: 3008355
DOI: 10.1080/00207160.2010.516821 · zbMath: 1218.68141 · OpenAlex: W2138350487 · MaRDI QID: Q3008355
Yongquan Zhang, Feilong Cao, Zong-Ben Xu
Publication date: 15 June 2011
Published in: International Journal of Computer Mathematics
Full work available at URL: https://doi.org/10.1080/00207160.2010.516821
- General nonlinear regression (62J02)
- Learning and adaptive systems in artificial intelligence (68T05)
- Inequalities in approximation (Bernstein, Jackson, Nikol'skiĭ-type inequalities) (41A17)
Related Items (1)
Cites Work
- Learning rates for regularized classifiers using multivariate polynomial kernels
- Learning rates of least-square regularized regression with polynomial kernels
- A note on different covering numbers in learning theory
- The covering number in learning theory
- Optimal rates for the regularized least-squares algorithm
- Learning gradients by a gradient descent algorithm
- Approximation with polynomial kernels and SVM classifiers
- Learning rates of least-square regularized regression
- On the mathematical foundations of learning
- Learning Theory: From Regression to Classification
- The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network
- Estimating the approximation error in learning theory
- Generalization performance of regularization networks and support vector machines via entropy numbers of compact operators
- Covering numbers for support vector machines
- Shannon sampling and function reconstruction from point values