Optimal learning rates of \(l^p\)-type multiple kernel learning under general conditions
From MaRDI portal
Publication:526680
DOI: 10.1016/j.ins.2014.09.011
zbMath: 1360.68691
OpenAlex: W2089590069
MaRDI QID: Q526680
Publication date: 15 May 2017
Published in: Information Sciences
Full work available at URL: https://doi.org/10.1016/j.ins.2014.09.011
Keywords: generalization ability, correlation measure, multiple kernel learning, local Rademacher complexity, kernel learning
Related Items (2)
- An efficient kernel learning algorithm for semisupervised regression problems
- A high-order norm-product regularized multiple kernel learning framework for kernel optimization
Cites Work
- A fast algorithm for manifold learning by posing it as a symmetric diagonally dominant linear system
- Sparsity in multiple kernel learning
- Evolutionary combination of kernels for nonlinear feature transformation
- Sparsity in penalized empirical risk minimization
- Multi-kernel regularized classifiers
- Convex multi-task feature learning
- Optimal rates for the regularized least-squares algorithm
- Statistical performance of support vector machines
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Learning rates of least-square regularized regression
- Local Rademacher complexities
- On the mathematical foundations of learning
- Support Vector Machines
- A Reproducing Kernel Hilbert Space Framework for Spike Train Signal Processing
- Advanced Lectures on Machine Learning
- Choosing multiple parameters for support vector machines