Outlier-robust sparse/low-rank least-squares regression and robust matrix completion
Publication: Q6355891
arXiv: 2012.06750
MaRDI QID: Q6355891
Author name not available
Publication date: 12 December 2020
Abstract: We study high-dimensional least-squares regression within a subgaussian statistical learning framework with heterogeneous noise. It includes $s$-sparse and $r$-low-rank least-squares regression when a fraction $\epsilon$ of the labels are adversarially contaminated. We also present a novel theory of trace-regression with matrix decomposition based on a new application of the product process. For these problems, we show novel near-optimal "subgaussian" estimation rates of the form $r(n, d_e) + \sqrt{\log(1/\delta)/n} + \epsilon\log(1/\epsilon)$, valid with probability at least $1 - \delta$. Here, $r(n, d_e)$ is the optimal uncontaminated rate as a function of the effective dimension $d_e$ but independent of the failure probability $\delta$. These rates are valid uniformly in $\delta$; i.e., the estimators' tuning does not depend on $\delta$. Lastly, we consider noisy robust matrix completion with non-uniform sampling. If only the low-rank matrix is of interest, we present a novel near-optimal rate that is independent of the corruption level $\epsilon$. Our estimators are tractable and based on a new "sorted" Huber-type loss. No information on $(s, r, \epsilon)$ is needed to tune these estimators. Our analysis makes use of novel $\delta$-optimal concentration inequalities for the multiplier and product processes, which could be useful elsewhere. For instance, they imply novel sharp oracle inequalities for the Lasso and Slope with optimal dependence on $\delta$. Numerical simulations confirm our theoretical predictions. In particular, "sorted" Huber regression can outperform classical Huber regression.
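The "sorted" Huber-type loss mentioned in the abstract can be read as a Huber loss in which the usual $\ell_1$ penalty on per-sample corruptions is replaced by a sorted-$\ell_1$ (Slope) penalty. Below is a minimal sketch under that assumed formulation, $L(r) = \min_\theta \frac{1}{2}\|r - \theta\|^2 + \sum_i \omega_i |\theta|_{(i)}$ with non-increasing weights $\omega_1 \ge \dots \ge \omega_n \ge 0$; it is not the paper's or the companion repository's API, and the function names and weight sequence are illustrative. The sorted-$\ell_1$ prox uses the stack-based pool-adjacent-violators scheme of Bogdan et al. (2015).

# Illustrative sketch (assumed formulation): evaluate a "sorted" Huber-type
# loss L(r) = min_theta 0.5*||r - theta||^2 + sum_i w_i * |theta|_(i),
# where |theta|_(1) >= ... >= |theta|_(n) and the weights w are non-increasing.
# Names (prox_sorted_l1, sorted_huber_loss) are hypothetical, not the paper's.
import numpy as np

def prox_sorted_l1(y, w):
    """Prox of the sorted-l1 (Slope) norm with non-increasing weights w,
    via the stack-based pool-adjacent-violators scheme."""
    sign = np.sign(y)
    abs_y = np.abs(y)
    order = np.argsort(-abs_y)           # sort |y| in decreasing order
    z = abs_y[order] - w                 # shift by the decreasing weights
    # Merge adjacent blocks until block averages are strictly decreasing.
    sums, sizes = [], []
    for v in z:
        sums.append(v); sizes.append(1)
        while len(sums) > 1 and sums[-1] / sizes[-1] >= sums[-2] / sizes[-2]:
            s, n = sums.pop(), sizes.pop()
            sums[-1] += s; sizes[-1] += n
    x_sorted = np.concatenate(
        [np.full(n, max(s / n, 0.0)) for s, n in zip(sums, sizes)])
    x = np.empty_like(x_sorted)
    x[order] = x_sorted                  # undo the sort
    return sign * x

def sorted_huber_loss(r, w):
    """Sorted Huber-type loss of a residual vector r: partial minimization
    over the per-sample corruption vector theta."""
    theta = prox_sorted_l1(r, w)         # optimal corruption estimates
    penalty = np.sum(np.sort(np.abs(theta))[::-1] * w)
    return 0.5 * np.sum((r - theta) ** 2) + penalty

# With constant weights w_i = lam, the penalty is lam*||theta||_1 and the
# loss reduces to the classical Huber loss with threshold lam, so outliers
# with |r_i| > lam are down-weighted linearly instead of quadratically.
rng = np.random.default_rng(0)
r = rng.normal(size=10)
r[:2] += 8.0                             # two "contaminated" residuals
n = r.size
w_dec = 2.0 * np.sqrt(np.log(2 * n / (1 + np.arange(n))))  # decreasing weights
print(sorted_huber_loss(r, w_dec))                  # sorted Huber variant
print(sorted_huber_loss(r, np.full(n, 2.0)))        # classical Huber case

In a full estimator one would add this loss over residuals $y - Xb$ and a sparsity- or rank-promoting penalty on $b$, but that outer optimization is omitted here.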
Has companion code repository: https://github.com/philipthomp/Outlier-robust-regression