Optimal classification in sparse Gaussian graphic model
From MaRDI portal
Publication:2438761
DOI: 10.1214/13-AOS1163
zbMath: 1294.62061
arXiv: 1212.5332
OpenAlex: W3100712906
MaRDI QID: Q2438761
Publication date: 6 March 2014
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1212.5332
Keywords: chromatic number; phase diagram; precision matrix; sparse graph; Fisher's LDA; Fisher's separation; rare and weak model
Related Items (17)
- Integrative genetic risk prediction using non-parametric empirical Bayes classification
- Scalable inference for high-dimensional precision matrix
- Adaptive threshold-based classification of sparse high-dimensional data
- Large-scale inference with block structure
- High-dimensional sparse MANOVA
- Estimating the amount of sparsity in two-point mixture models
- Optimal classification in sparse Gaussian graphic model
- Bayesian sparse graphical models for classification with application to protein expression data
- Halfspace depths for scatter, concentration and shape matrices
- Detecting rare and faint signals via thresholding maximum likelihood estimators
- Diagonally Dominant Principal Component Analysis
- Unnamed Item
- Dynamic linear discriminant analysis in high dimensional space
- Robust Variable and Interaction Selection for Logistic Regression and General Index Models
- Reproducible learning in large-scale graphical models
- Innovated interaction screening for high-dimensional nonlinear classification
- Higher criticism for large-scale inference, especially for rare and weak effects
Uses Software
Cites Work
- Innovated higher criticism for detecting sparse signals in correlated noise
- Sparse inverse covariance estimation with the graphical lasso
- UPS delivers optimal phase diagram in high-dimensional variable selection
- Sparse linear discriminant analysis by thresholding for high dimensional data
- Global testing under sparse alternatives: ANOVA, multiple comparisons and the higher criticism
- High-dimensional classification using features annealed independence rules
- Minimax risk over \(l_p\)-balls for \(l_q\)-error
- Some problems of hypothesis testing leading to infinitely divisible distributions
- Some theory for Fisher's linear discriminant function, `naive Bayes', and some alternatives when there are many more variables than observations
- Minimax detection of a signal for \(l^n\)-balls
- Higher criticism for detecting sparse heterogeneous mixtures.
- High-dimensional covariance estimation by minimizing \(\ell_1\)-penalized log-determinant divergence
- Optimal classification in sparse Gaussian graphic model
- Tests alternative to higher criticism for high-dimensional means under sparsity and column-wise dependence
- Goodness-of-fit tests via phi-divergences
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Estimation and confidence sets for sparse normal mixtures
- Regularized estimation of large covariance matrices
- Higher criticism thresholding: Optimal feature selection when useful features are rare and weak
- Impossibility of successful classification when useful features are rare and weak
- A Constrained \(\ell_1\) Minimization Approach to Sparse Precision Matrix Estimation
- A Direct Estimation Approach to Sparse Linear Discriminant Analysis
- Classification of sparse high-dimensional vectors
- Feature selection by higher criticism thresholding achieves the optimal phase diagram
- Theoretical Measures of Relative Performance of Classifiers for High Dimensional Data with Small Sample Sizes
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Empirical Bayes Estimates for Large-Scale Prediction Problems
- Random forests