The learning rates of regularized regression based on reproducing kernel Banach spaces
From MaRDI portal
Publication: 2318985
DOI: 10.1155/2013/694181 · zbMath: 1470.68174 · OpenAlex: W2124978107 · Wikidata: Q58916646 · Scholia: Q58916646 · MaRDI QID: Q2318985
Publication date: 16 August 2019
Published in: Abstract and Applied Analysis
Full work available at URL: https://doi.org/10.1155/2013/694181
General nonlinear regression (62J02) ⋮ Learning and adaptive systems in artificial intelligence (68T05) ⋮ Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces) (46E22)
Related Items (5)
Error analysis on Hermite learning with gradient data ⋮ On the K-functional in learning theory ⋮ Convergence rate of SVM for kernel-based robust regression ⋮ Solutions of nonlinear systems by reproducing kernel method ⋮ Error analysis of the kernel regularized regression based on refined convex losses and RKBSs
Cites Work
- Approximation by multivariate Bernstein-Durrmeyer operators and learning rates of least-squares regularized regression with multivariate polynomial kernels
- Uniform convergence of Bernstein-Durrmeyer operators with respect to arbitrary measure
- Error bounds for \(l^p\)-norm multiple kernel learning with least square loss
- Minimization of the Tikhonov functional in Banach spaces smooth and convex of power type by steepest descent in the dual
- Marcinkiewicz-Zygmund measures on manifolds
- The covering number for some Mercer kernel Hilbert spaces on the unit sphere
- Regularized learning in Banach spaces as an optimization problem: representer theorems
- Minimization of Tikhonov functionals in Banach spaces
- Reproducing kernel Hilbert spaces associated with analytic translation-invariant Mercer kernels
- Learning rates for regularized classifiers using multivariate polynomial kernels
- Multivariate Bernstein-Durrmeyer operators with arbitrary weight functions
- Characteristic inequalities of uniformly convex and uniformly smooth Banach spaces
- Best choices for regularization parameters in learning theory: on the bias-variance problem.
- The covering number in learning theory
- Sampling and reconstruction of signals in a reproducing kernel subspace of \(L^p(\mathbb R^d)\)
- The covering number for some Mercer kernel Hilbert spaces
- Approximation with polynomial kernels and SVM classifiers
- Modulus of continuity conditions for Jacobi series
- Generalized semi-inner products with applications to regularized learning
- On the mathematical foundations of learning
- Covering Numbers for Convex Functions
- Learning Theory
- Inequalities in Banach spaces with applications
- Learning Theory
- Theory of Reproducing Kernels