Convergence rate of the semi-supervised greedy algorithm
Publication: 459432
DOI: 10.1016/j.neunet.2013.03.001
zbMath: 1296.68123
OpenAlex: W2041028606
Wikidata: Q43837122
Scholia: Q43837122
MaRDI QID: Q459432
Yicong Zhou, Zhibin Pan, Yuan Yan Tang, Luoqing Li, Hong Chen
Publication date: 9 October 2014
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2013.03.001
Related Items
- Optimality of the rescaled pure greedy learning algorithms
- Kernel-based sparse regression with the correntropy-induced loss
- The convergence rate of semi-supervised regression with quadratic loss
- The performance of semi-supervised Laplacian regularized regression with the least square loss
- Generalization Analysis of Fredholm Kernel Regularized Classifiers
- Convergence rate of SVM for kernel-based robust regression
- Performance analysis of the LapRSSLG algorithm in learning theory
Cites Work
- Learning by nonsymmetric kernels with data dependent spaces and \(\ell^1\)-regularizer
- Concentration estimates for learning with \(\ell ^{1}\)-regularizer and data dependent hypothesis spaces
- Least square regression with indefinite kernels and coefficient regularization
- Semi-supervised learning on Riemannian manifolds
- Multi-kernel regularized classifiers
- Semi-supervised learning based on high density region estimation
- The generalization performance of ERM algorithm with strongly mixing observations
- Sparse regularization for semi-supervised classification
- Learning with sample dependent hypothesis spaces
- Error bounds of multi-graph regularized semi-supervised classification
- Approximation and learning by greedy algorithms
- On the mathematical foundations of learning
- Learning Theory
- Graph-Based Semi-Supervised Learning and Spectral Kernel Design
- DOI: 10.1162/jmlr.2003.3.4-5.781
- Approximation Bounds for Some Sparse Kernel Regression Algorithms