Ivanov-Regularised Least-Squares Estimators over Large RKHSs and Their Interpolation Spaces
zbMATH 1441.62178 · arXiv 1706.03678 · MaRDI QID Q5214209
Stephen Page, Steffen Grünewälder
Publication date: 7 February 2020
Full work available at URL: https://arxiv.org/abs/1706.03678
Keywords: interpolation space; regression; reproducing kernel Hilbert space (RKHS); Ivanov regularisation; training and validation
MSC classification: Computational methods for problems pertaining to statistics (62-08); Estimation in multivariate analysis (62H12); General nonlinear regression (62J02); Numerical interpolation (65D05); Hilbert spaces with reproducing kernels (= (proper) functional Hilbert spaces, including de Branges-Rovnyak and other structured spaces) (46E22)
Related Items (3)
Cites Work
- On the optimal estimation of probability measures in weak and strong topologies
- Tikhonov, Ivanov and Morozov regularization for support vector machine learning
- Mercer's theorem on general domains: on the interaction between measures, kernels, and RKHSs
- Regularization in kernel learning
- Weak convergence and empirical processes. With applications to statistics
- Measurable diagonalization of positive definite matrices
- Optimal rates for the regularized least-squares algorithm
- Learning theory estimates via integral operators and their approximations
- Mathematical Foundations of Infinite-Dimensional Statistical Models
- Support Vector Machines
- A new concentration result for regularized risk minimizers
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY