Classification with polynomial kernels and \(l^1\)-coefficient regularization
Publication: 514786
DOI: 10.11650/tjm.18.2014.3929
zbMath: 1359.62265
OpenAlex: W1984344762
MaRDI QID: Q514786
Hongzhi Tong, Di-Rong Chen, Fenghong Yang
Publication date: 9 March 2017
Published in: Taiwanese Journal of Mathematics
Full work available at URL: https://doi.org/10.11650/tjm.18.2014.3929
Keywords: classification; coefficient regularization; polynomial kernels; Bernstein-Kantorovich polynomial; learning rates
MSC: Classification and discrimination; cluster analysis (statistical aspects) (62H30); General nonlinear regression (62J02); Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Learning by nonsymmetric kernels with data dependent spaces and \(\ell^1\)-regularizer
- Multi-kernel regularized classifiers
- Support vector machines regression with \(l^1\)-regularizer
- Statistical behavior and consistency of classification methods based on convex risk minimization.
- Learning with sample dependent hypothesis spaces
- Approximation with polynomial kernels and SVM classifiers
- Local polynomial reproduction and moving least squares approximation
- On the mathematical foundations of learning
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- The sample complexity of pattern classification with neural networks: the size of the weights is more important than the size of the network
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming