Sparse recovery in convex hulls via entropy penalization
From MaRDI portal
Publication:1018643
DOI: 10.1214/08-AOS621 · zbMath: 1269.62039 · arXiv: 0905.2078 · OpenAlex: W1991675104 · Wikidata: Q105584259 · Scholia: Q105584259 · MaRDI QID: Q1018643
Publication date: 20 May 2009
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/0905.2078
Mathematics Subject Classification:
- Multivariate analysis (62H99)
- Density estimation (62G07)
- Order statistics; empirical distribution functions (62G30)
- Statistical aspects of information-theoretic topics (62B10)
Related Items
- Sample average approximation with heavier tails II: localization in stochastic convex optimization and persistence results for the Lasso
- Sparsity in multiple kernel learning
- Sparse regression learning by aggregation and Langevin Monte-Carlo
- Mirror averaging with sparsity priors
- Transductive versions of the Lasso and the Dantzig selector
- General nonexact oracle inequalities for classes with a subexponential envelope
- Von Neumann entropy penalization and low-rank matrix estimation
- Generalization of constraints for high dimensional regression problems
- \(L_1\)-penalization in functional linear regression with subgaussian design
- Oracle inequalities for high-dimensional prediction
- On the exponentially weighted aggregate with the Laplace prior
Cites Work
- The Dantzig selector and sparsity oracle inequalities
- Sparsity in penalized empirical risk minimization
- From \(\varepsilon\)-entropy to KL-entropy: analysis of minimum information complexity density estimation
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- Statistical learning theory and stochastic optimization. École d'Été de Probabilités de Saint-Flour XXXI -- 2001.
- Some PAC-Bayesian theorems
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Reconstruction and subgaussian operators in asymptotic geometric analysis
- Information-theoretic upper and lower bounds for statistical estimation
- Aggregation by Exponential Weighting and Sharp Oracle Inequalities
- Sparse Density Estimation with ℓ1 Penalties
- For most large underdetermined systems of linear equations the minimal 𝓁1‐norm solution is also the sparsest solution