On the convergence rate of kernel-based sequential greedy regression
From MaRDI portal
Publication:1938256
DOI: 10.1155/2012/619138 | zbMath: 1256.68137 | OpenAlex: W2061751626 | Wikidata: Q58695704 | Scholia: Q58695704 | MaRDI QID: Q1938256
Xiaoyin Wang, Zhibin Pan, Xiaoyan Wei
Publication date: 4 February 2013
Published in: Abstract and Applied Analysis
Full work available at URL: https://doi.org/10.1155/2012/619138
General nonlinear regression (62J02); Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Consistency analysis of spectral regularization algorithms
- Error bounds for \(l^p\)-norm multiple kernel learning with least square loss
- Learning by nonsymmetric kernels with data dependent spaces and \(\ell^1\)-regularizer
- Concentration estimates for learning with \(\ell ^{1}\)-regularizer and data dependent hypothesis spaces
- Unified approach to coefficient-based regularized regression
- The convergence rate of a regularized ranking algorithm
- Multi-kernel regularized classifiers
- Learning rates of multi-kernel regression by orthogonal greedy algorithm
- Concentration estimates for learning with unbounded sampling
- Learning with sample dependent hypothesis spaces
- Error bounds of multi-graph regularized semi-supervised classification
- Approximation and learning by greedy algorithms
- On the mathematical foundations of learning
- Learning Theory
- Sequential greedy approximation for certain convex optimization problems
- Approximation Bounds for Some Sparse Kernel Regression Algorithms
- DOI: 10.1162/153244304773936108