Kernel ridge vs. principal component regression: minimax bounds and the qualification of regularization operators
DOI: 10.1214/17-EJS1258
zbMATH: 1362.62087
arXiv: 1605.08839
OpenAlex: W2597846060
MaRDI QID: Q521337
Lee H. Dicker, Dean P. Foster, Daniel Hsu
Publication date: 7 April 2017
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1605.08839
Mathematics Subject Classification:
Nonparametric regression and quantile regression (62G08)
Factor analysis and principal components; correspondence analysis (62H25)
Ridge regression; shrinkage estimators (Lasso) (62J07)
Related Items
On principal components regression, random projections, and column subsampling
Functional principal subspace sampling for large scale functional data analysis
Spectral algorithms for learning with dependent observations
Nonasymptotic analysis of robust regression with modified Huber's loss
Kernel regression, minimax rates and effective dimensionality: Beyond the regular case
On the Improved Rates of Convergence for Matérn-Type Kernel Ridge Regression with Application to Calibration of Computer Models
Kernel conjugate gradient methods with random projections
Distributed kernel-based gradient descent algorithms
On the predictive potential of kernel principal components
Optimal rates for spectral algorithms with least-squares regression over Hilbert spaces
Sparse principal component regression via singular value decomposition approach
Optimal Rates for Multi-pass Stochastic Gradient Methods
Kernel partial least squares for stationary data
Thresholded spectral algorithms for sparse approximations