High-dimensional Gaussian model selection on a Gaussian design
From MaRDI portal
Publication: 985331
DOI: 10.1214/09-AIHP321
zbMath: 1191.62076
arXiv: 0808.2152
OpenAlex: W2963498088
MaRDI QID: Q985331
Publication date: 21 July 2010
Published in: Annales de l'Institut Henri Poincaré. Probabilités et Statistiques
Full work available at URL: https://arxiv.org/abs/0808.2152
model selection ⋮ linear regression ⋮ oracle inequalities ⋮ Gaussian graphical models ⋮ minimax rates of estimation
Nonparametric regression and quantile regression (62G08) ⋮ Linear regression; mixed models (62J05) ⋮ Inequalities; stochastic orderings (60E15)
Related Items (6)
- Penalized contrast estimation in functional linear models with circular data
- Adaptive estimation of linear functionals in functional linear models
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons
- Adaptive estimation of covariance matrices via Cholesky decomposition
- High-dimensional regression with unknown variance
- Adaptive functional linear regression
Uses Software
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Near-ideal model selection by \(\ell _{1}\) minimization
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- Tests for Gaussian graphical models
- Gaussian model selection with an unknown variance
- Estimating the dimension of a model
- Minimum contrast estimators on sieves: Exponential bounds and rates of convergence
- Adaptive estimation of a quadratic functional by model selection.
- Estimation of Gaussian graphs by model selection
- Model selection by resampling penalization
- Minimal penalties for Gaussian model selection
- Simultaneous analysis of Lasso and Dantzig selector
- Sparsity oracle inequalities for the Lasso
- Aggregation for Gaussian regression
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Statistical predictor identification
- Power-law correlations, related models for long-range dependence and their simulation
- A New Lower Bound for Multiple Hypothesis Testing
- Decoding by Linear Programming
- An optimal selection of regression variables
- Probabilistic Networks and Expert Systems
- Gaussian Markov Random Fields
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Learning Theory and Kernel Machines
- Some Comments on \(C_p\)
- Gaussian model selection
- A new look at the statistical model identification