$\sigma$-Ridge: group regularized ridge regression via empirical Bayes noise level cross-validation
From MaRDI portal
Publication: Q6352590
arXiv: 2010.15817
MaRDI QID: Q6352590
Nikolaos Ignatiadis, Panagiotis Lolas
Publication date: 29 October 2020
Abstract: Features in predictive models are not exchangeable, yet common supervised models treat them as such. Here we study ridge regression when the analyst can partition the features into $K$ groups based on external side-information. For example, in high-throughput biology, features may represent gene expression, protein abundance or clinical data, and so each feature group represents a distinct modality. The analyst's goal is to choose optimal regularization parameters $\lambda = (\lambda_1, \dotsc, \lambda_K)$ -- one for each group. In this work, we study the impact of $\lambda$ on the predictive risk of group-regularized ridge regression by deriving limiting risk formulae under a high-dimensional random effects model with $p \asymp n$ as $n \to \infty$. Furthermore, we propose a data-driven method for choosing $\lambda$ that attains the optimal asymptotic risk: the key idea is to interpret the residual noise variance $\sigma^2$ as a regularization parameter to be chosen through cross-validation. An empirical Bayes construction maps the one-dimensional parameter $\sigma$ to the $K$-dimensional vector of regularization parameters, i.e., $\sigma \mapsto \widehat{\lambda}(\sigma)$. Beyond its theoretical optimality, the proposed method is practical and runs as fast as cross-validated ridge regression without feature groups ($K=1$).
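The workflow described in the abstract -- cross-validate the one-dimensional noise level $\sigma$, mapping each candidate $\sigma$ through an empirical Bayes construction to a vector of group-wise ridge penalties -- can be sketched as follows. This is a minimal illustration, not the paper's exact estimator: the map `sigma_to_lambda` uses the standard random-effects heuristic $\lambda_k \propto p_k \sigma^2 / (n \alpha_k^2)$ with a crude method-of-moments plug-in for the group signal strength $\alpha_k^2$, and all function names are hypothetical.

```python
import numpy as np

def group_ridge_fit(X, y, groups, lams):
    """Group-regularized ridge: feature j in group k gets penalty lams[k].
    Solves (X'X/n + diag(penalty)) beta = X'y/n."""
    n, p = X.shape
    pen = np.array([lams[g] for g in groups], dtype=float)
    A = X.T @ X / n + np.diag(pen)
    return np.linalg.solve(A, X.T @ y / n)

def sigma_to_lambda(sigma, X, y, groups, K):
    """Illustrative empirical-Bayes map sigma -> lambda-hat(sigma):
    lam_k = p_k * sigma^2 / (n * alpha_k^2), where alpha_k^2 is a
    simple moment-based signal estimate (an assumption, not the
    paper's construction)."""
    n, p = X.shape
    lams = np.empty(K)
    for k in range(K):
        idx = np.where(groups == k)[0]
        p_k = len(idx)
        # crude signal-strength estimate for group k, floored at a tiny value
        alpha2 = max(np.sum((X[:, idx].T @ y / n) ** 2)
                     - sigma**2 * p_k / n, 1e-8)
        lams[k] = p_k * sigma**2 / (n * alpha2)
    return lams

def sigma_ridge_cv(X, y, groups, sigma_grid, n_folds=5, seed=0):
    """Pick sigma on a 1-D grid by K-fold cross-validation; refit on the
    full data with the selected sigma. Returns (best_sigma, beta)."""
    n = X.shape[0]
    K = int(groups.max()) + 1
    rng = np.random.default_rng(seed)
    fold = rng.integers(0, n_folds, size=n)
    best_sigma, best_err = None, np.inf
    for sigma in sigma_grid:
        err = 0.0
        for f in range(n_folds):
            tr, te = fold != f, fold == f
            lams = sigma_to_lambda(sigma, X[tr], y[tr], groups, K)
            beta = group_ridge_fit(X[tr], y[tr], groups, lams)
            err += np.mean((y[te] - X[te] @ beta) ** 2)
        if err < best_err:
            best_sigma, best_err = sigma, err
    lams = sigma_to_lambda(best_sigma, X, y, groups, K)
    return best_sigma, group_ridge_fit(X, y, groups, lams)
```

Because only the scalar $\sigma$ is tuned, the grid search costs roughly as much as ordinary cross-validated ridge regression, regardless of the number of groups; the authors' Julia implementation in the companion repository above is the authoritative version.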
Has companion code repository: https://github.com/nignatiadis/SigmaRidgeRegression.jl