Iterative kernel regression with preconditioning
DOI: 10.1142/S0219530524500131
MaRDI QID: Q6587596
Publication date: 14 August 2024
Published in: Analysis and Applications (Singapore)
Keywords: convergence analysis; preconditioned conjugate gradient method; Nyström subsampling; regularized kernel regression
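The keywords describe solving regularized kernel regression with a preconditioned conjugate gradient method, where the preconditioner is built from a Nyström subsample of the kernel matrix. A minimal sketch of that idea follows; this is an illustration under standard assumptions, not the algorithm of the paper — the Gaussian kernel, the landmark count `m`, the regularization `lam`, and uniform landmark sampling are all illustrative choices.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

def gaussian_kernel(X, Y, sigma=0.5):
    # Pairwise squared distances, then the Gaussian (RBF) kernel
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma**2))

rng = np.random.default_rng(0)
n, m, lam = 200, 20, 1e-3            # sample size, landmarks, regularization (illustrative)
X = rng.uniform(-1.0, 1.0, (n, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * rng.standard_normal(n)

K = gaussian_kernel(X, X)
c = n * lam
A = K + c * np.eye(n)                # regularized kernel system (K + n*lam*I)

# Nyström approximation of K from m uniformly sampled landmark columns
idx = rng.choice(n, size=m, replace=False)
Knm, Kmm = K[:, idx], K[np.ix_(idx, idx)]
L = np.linalg.cholesky(Kmm + 1e-10 * np.eye(m))  # small jitter for stability
B = np.linalg.solve(L, Knm.T).T      # B @ B.T == Knm @ Kmm^{-1} @ Knm.T ≈ K

# Preconditioner (B B^T + c I)^{-1}, applied cheaply via the Woodbury identity
M_small = c * np.eye(m) + B.T @ B
def apply_precond(v):
    return (v - B @ np.linalg.solve(M_small, B.T @ v)) / c
P = LinearOperator((n, n), matvec=apply_precond)

# Preconditioned CG solve of (K + n*lam*I) alpha = y
alpha, info = cg(A, y, M=P)          # info == 0 signals convergence
```

Because the preconditioner inverts only an m×m system per iteration, each CG step costs O(nm) on top of the kernel matvec, and a good Nyström approximation makes the preconditioned system well conditioned, so few iterations are needed.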
Cites Work
- Learning with coefficient-based regularization and \(\ell^1\)-penalty
- Concentration estimates for learning with \(\ell ^{1}\)-regularizer and data dependent hypothesis spaces
- Least square regression with indefinite kernels and coefficient regularization
- Optimal rates for regularization of statistical inverse learning problems
- On regularization algorithms in learning theory
- Regularization networks with indefinite kernels
- Learning theory estimates for coefficient-based regularized regression
- Optimal rates for coefficient-based regularized regression
- Optimal rates for the regularized least-squares algorithm
- Learning with sample dependent hypothesis spaces
- On some extensions of Bernstein's inequality for self-adjoint operators
- On early stopping in gradient descent learning
- Divide and conquer kernel ridge regression: a distributed algorithm with minimax optimal rates
- Learning Theory
- Support Vector Machines
- Estimating the approximation error in learning theory
- Faster Kernel Ridge Regression Using Sketching and Preconditioning
- Nyström subsampling method for coefficient-based regularized regression
- Regularized Nyström subsampling in regression and ranking problems under general smoothness assumptions
- Learning rates of \(l^q\) coefficient regularization learning with Gaussian kernel
- Analysis of regularized Nyström subsampling for regression functions of low smoothness
- Distributed learning with indefinite kernels
- Nyström type subsampling analyzed as a regularized projection
- Learning theory of distributed spectral algorithms
- Kernelized Elastic Net Regularization: Generalization Bounds, and Sparse Recovery
- An Introduction to Matrix Concentration Inequalities
- Theory of Reproducing Kernels
- Scattered Data Approximation