Learning rates for regularized classifiers using multivariate polynomial kernels
DOI: 10.1016/j.jco.2008.05.008 · zbMath: 1169.68043 · OpenAlex: W2021340030 · MaRDI QID: Q958248
Hongzhi Tong, Di-Rong Chen, Li Zhong Peng
Publication date: 3 December 2008
Published in: Journal of Complexity
Full work available at URL: https://doi.org/10.1016/j.jco.2008.05.008
Keywords: reproducing kernel Hilbert spaces; polynomial kernels; learning rates; Bernstein-Durrmeyer polynomials; regularized classifiers
Classification and discrimination; cluster analysis (statistical aspects) (62H30); Learning and adaptive systems in artificial intelligence (68T05); Pattern recognition, speech recognition (68T10)
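As illustrative context for the publication's topic (not taken from the paper itself), a regularized classifier built on a multivariate polynomial kernel can be sketched as follows: fit a kernel expansion by regularized least squares over the reproducing kernel Hilbert space induced by the kernel \((x \cdot y + c)^d\), then classify by the sign of the resulting function. All names, the toy data, and the choice of square loss are assumptions for illustration; the paper's analysis concerns learning rates for such schemes, not this particular implementation.

```python
import numpy as np

def poly_kernel(X, Y, degree=2, c=1.0):
    # Multivariate polynomial kernel K(x, y) = (x . y + c)^degree
    return (X @ Y.T + c) ** degree

def fit_regularized_classifier(X, y, lam=0.1, degree=2):
    # Regularized least-squares fit in the RKHS of the polynomial kernel:
    # solve (K + lam * n * I) alpha = y for the expansion coefficients.
    n = X.shape[0]
    K = poly_kernel(X, X, degree)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def predict(X_train, alpha, X_new, degree=2):
    # The classifier is the sign of the kernel expansion sum_i alpha_i K(x_i, x)
    return np.sign(poly_kernel(X_new, X_train, degree) @ alpha)

# Toy example (hypothetical data): labels in {-1, +1} given by the first coordinate
X = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0], [1.0, 0.0]])
y = np.array([-1.0, 1.0, -1.0, 1.0])
alpha = fit_regularized_classifier(X, y)
print(predict(X, alpha, X))  # reproduces the training labels on this toy set
```

The regularization parameter `lam` controls the bias-variance trade-off that the cited works on regularization parameters and learning rates analyze; the paper's contribution is quantifying how fast such classifiers converge as the sample size grows.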
Related Items (13)
Cites Work
- Multi-kernel regularized classifiers
- Learning and approximation by Gaussians on Riemannian manifolds
- Analysis of support vector machines regression
- Bernstein-Durrmeyer polynomials on a simplex
- \(K\)-moduli, moduli of smoothness, and Bernstein polynomials on a simplex
- Best choices for regularization parameters in learning theory: on the bias-variance problem
- Statistical behavior and consistency of classification methods based on convex risk minimization
- Support-vector networks
- Regularization networks and support vector machines
- Approximation with polynomial kernels and SVM classifiers
- Learning rates of least-square regularized regression
- On the mathematical foundations of learning
- The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network
- Chaos control using least-squares support vector machines
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Learning Theory
- Theory of Reproducing Kernels