Exponential screening and optimal rates of sparse estimation
From MaRDI portal
Publication: 548534
DOI: 10.1214/10-AOS854
zbMath: 1215.62043
arXiv: 1003.2654
MaRDI QID: Q548534
Philippe Rigollet, Alexandre B. Tsybakov
Publication date: 29 June 2011
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1003.2654
Keywords: aggregation, adaptation, BIC, sparsity, high-dimensional regression, sparsity oracle inequalities, lasso, minimax rates
Nonparametric regression and quantile regression (62G08) Estimation in multivariate analysis (62H12) Asymptotic properties of nonparametric inference (62G20) Linear regression; mixed models (62J05) Nonparametric estimation (62G05) Minimax procedures in statistical decision theory (62C20)
Related Items
- Combining a relaxed EM algorithm with Occam's razor for Bayesian variable selection in high-dimensional regression
- On Robustness of Principal Component Regression
- On cross-validated Lasso in high dimensions
- On the prediction loss of the Lasso in the partially labeled setting
- Aggregated hold out for sparse linear regression with a robust loss function
- Entropic optimal transport is maximum-likelihood deconvolution
- Solution of linear ill-posed problems by model selection and aggregation
- Estimation of matrices with row sparsity
- Exponential weights in multivariate regression and a low-rankness favoring prior
- Bayesian linear regression with sparse priors
- Sparse covariance matrix estimation in high-dimensional deconvolution
- Transfer Learning in Large-Scale Gaussian Graphical Models with False Discovery Rate Control
- Sharp oracle inequalities for aggregation of affine estimators
- Targeting underrepresented populations in precision medicine: a federated transfer learning approach
- Simple proof of the risk bound for denoising by exponential weights for asymmetric noise distributions
- Theory of adaptive estimation
- Empirical risk minimization is optimal for the convex aggregation problem
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- Model selection in regression under structural constraints
- Upper bounds and aggregation in bipartite ranking
- MAP model selection in Gaussian regression
- PAC-Bayesian bounds for sparse regression estimation with exponential weights
- The Lasso as an \(\ell _{1}\)-ball model selection procedure
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- Mirror averaging with sparsity priors
- Kullback-Leibler aggregation and misspecified generalized linear models
- Sparse PCA: optimal rates and adaptive estimation
- A general framework for Bayes structured linear models
- Oracle inequalities and optimal inference under group sparsity
- Aggregation of affine estimators
- Estimation and variable selection with exponential weights
- Statistical inference in compound functional models
- Optimal learning with \(Q\)-aggregation
- Oracle Inequalities for Local and Global Empirical Risk Minimizers
- A new perspective on least squares under convex constraint
- Comment on ``Hypothesis testing by convex optimization''
- Isotonic regression meets Lasso
- Optimal Kullback-Leibler aggregation in mixture density estimation by maximum likelihood
- Oracle inequalities for high dimensional vector autoregressions
- Oracle inequalities for high-dimensional prediction
- Optimal bounds for aggregation of affine estimators
- Restricted strong convexity implies weak submodularity
- Slope meets Lasso: improved oracle bounds and optimality
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- Structured, Sparse Aggregation
- Deviation optimal learning using greedy \(Q\)-aggregation
- Exponential screening and optimal rates of sparse estimation
- Estimation of high-dimensional low-rank matrices
- Prediction error bounds for linear regression with the TREX
- An \(\ell_1\)-oracle inequality for the Lasso in multivariate finite mixture of multivariate Gaussian regression models
- Prediction and estimation consistency of sparse multi-class penalized optimal scoring
- Block-based refitting in \(\ell_{12}\) sparse regularization
- High-dimensional regression with unknown variance
- Sparse estimation by exponential weighting
- Inference without compatibility: using exponential weighting for inference on a parameter of a linear model
- Adaptive density estimation on bounded domains
- On the exponentially weighted aggregate with the Laplace prior
- Model-averaged \(\ell_1\)-regularization using Markov chain Monte Carlo model composition
- Sharp oracle inequalities for low-complexity priors
- Regularization and the small-ball method II: complexity dependent error rates
- Localized Gaussian width of \(M\)-convex hulls with applications to Lasso and convex aggregation
- Structured matrix estimation and completion
- CLEAR: Covariant LEAst-Square Refitting with Applications to Image Restoration
- Adaptive estimation over anisotropic functional classes via oracle approach
- Robust Bayes estimation using the density power divergence
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Sparse regression learning by aggregation and Langevin Monte-Carlo
- Mirror averaging with sparsity priors
- Exponential screening and optimal rates of sparse estimation
- The Dantzig selector and sparsity oracle inequalities
- Generalized mirror averaging and \(D\)-convex aggregation
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- Sparsity in penalized empirical risk minimization
- The restricted isometry property and its implications for compressed sensing
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Mixing least-squares estimators when the variance is unknown
- A simple proof of the restricted isometry property for random matrices
- Minimax multiple shrinkage estimation
- Minimax risk over \(l_p\)-balls for \(l_q\)-error
- Adaptive estimation of the intensity of inhomogeneous Poisson processes via concentration inequalities
- The risk inflation criterion for multiple regression
- On the conditions used to prove oracle results for the Lasso
- PAC-Bayesian bounds for sparse regression estimation with exponential weights
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- Aggregation for Gaussian regression
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Adapting to unknown sparsity by controlling the false discovery rate
- Information Theory and Mixing Least-Squares Regressions
- Combining Minimax Shrinkage Estimators
- Universal approximation bounds for superpositions of a sigmoidal function
- Ideal spatial adaptation by wavelet shrinkage
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Learning Theory and Kernel Machines
- Aggregation by Exponential Weighting and Sharp Oracle Inequalities
- Introduction to nonparametric estimation
- The elements of statistical learning. Data mining, inference, and prediction