Sparse recovery under matrix uncertainty
Publication: 605921
DOI: 10.1214/10-AOS793 · zbMath: 1373.62357 · arXiv: 0812.2818 · MaRDI QID: Q605921
Mathieu Rosenbaum, Alexandre B. Tsybakov
Publication date: 15 November 2010
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/0812.2818
Keywords: measurement error; missing data; portfolio selection; sparsity; oracle inequalities; matrix uncertainty; errors-in-variables model; MU-selector; portfolio replication; restricted eigenvalue assumption; sign consistency
Mathematics Subject Classification
- Asymptotic properties of parametric estimators (62F12)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
- Applications of statistics to actuarial sciences and financial mathematics (62P05)
Related Items
- On Robustness of Principal Component Regression
- A general family of trimmed estimators for robust high-dimensional data analysis
- An \(\{\ell_{1},\ell_{2},\ell_{\infty}\}\)-regularization approach to high-dimensional errors-in-variables models
- Sparse linear models and \(l_1\)-regularized 2SLS with high-dimensional endogenous regressors and instruments
- Balanced estimation for high-dimensional measurement error models
- Inference for high dimensional linear models with error-in-variables
- Estimation and variable selection in partial linear single index models with error-prone linear covariates
- A convex optimization framework for the identification of homogeneous reaction systems
- Calibrated zero-norm regularized LS estimator for high-dimensional error-in-variables regression
- Screening Methods for Linear Errors-in-Variables Models in High Dimensions
- \(L_0\)-regularization for high-dimensional regression with corrupted data
- Sparse estimation in high-dimensional linear errors-in-variables regression via a covariate relaxation method
- Double bias correction for high-dimensional sparse additive hazards regression with covariate measurement errors
- Model selection in high-dimensional noisy data: a simulation study
- Weighted \(l_1\)-Penalized Corrected Quantile Regression for High-Dimensional Temporally Dependent Measurement Errors
- Scalable interpretable learning for multi-response error-in-variables regression
- Multi-Task Learning with High-Dimensional Noisy Images
- Concentration of measure bounds for matrix-variate data with missing values
- Low-rank matrix estimation via nonconvex optimization methods in multi-response errors-in-variables regression
- Deep learning for inverse problems with unknown operator
- \(\ell_1\)-penalized quantile regression in high-dimensional sparse models
- Learning partial differential equations for biological transport models from noisy spatio-temporal data
- Least squares after model selection in high-dimensional sparse models
- Perturbations of measurement matrices and dictionaries in compressed sensing
- On Parameter Estimation for High Dimensional Errors-in-Variables Models
- Weighted \(\ell_1\)-penalized corrected quantile regression for high dimensional measurement error models
- Robust subspace clustering
- Pivotal estimation via square-root lasso in nonparametric regression
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- Covariate Selection in High-Dimensional Generalized Linear Models With Measurement Error
- Rate optimal estimation and confidence intervals for high-dimensional regression with missing covariates
- Inference in high dimensional linear measurement error models
- The generalized equivalence of regularization and min-max robustification in linear mixed models
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Sparse regression learning by aggregation and Langevin Monte-Carlo
- The Dantzig selector and sparsity oracle inequalities
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- The restricted isometry property and its implications for compressed sensing
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Asymptotic equivalence for nonparametric regression with multivariate and random design
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Aggregation for Gaussian regression
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Sparse and stable Markowitz portfolios
- Decoding by Linear Programming
- Stable recovery of sparse overcomplete representations in the presence of noise
- On inverse problems with unknown operators
- Wavelet Deconvolution With Noisy Eigenvalues
- Stable signal recovery from incomplete and inaccurate measurements
- Adaptive estimation for inverse problems with noisy operators