Which bridge estimator is the best for variable selection?
Publication: 2215760
DOI: 10.1214/19-AOS1906 · zbMath: 1456.62147 · arXiv: 1705.08617 · OpenAlex: W3087219089 · MaRDI QID: Q2215760
Arian Maleki, Shuaiwen Wang, Haolei Weng
Publication date: 14 December 2020
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1705.08617
Keywords: asymptotic false discovery proportion (AFDP); asymptotic true positive proportion (ATPP); two-stage variable selection techniques (TVS); variable selection for linear models
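The paper studies bridge estimators, i.e. \(\ell_q\)-penalized least squares, and two-stage variable selection in which a bridge estimate is first computed and then thresholded to select variables. As a minimal illustration (not the paper's actual experiments), the following sketch implements the \(q = 1\) bridge (Lasso) via proximal gradient descent and a hypothetical two-stage selector; all function names and parameter values here are illustrative assumptions.

```python
import numpy as np


def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)


def lasso_ista(X, y, lam, n_iter=1000):
    """Proximal gradient (ISTA) for the q=1 bridge estimator:
    minimize 0.5 * ||y - X beta||^2 + lam * ||beta||_1.
    """
    n, p = X.shape
    beta = np.zeros(p)
    # Step size 1/L, where L = sigma_max(X)^2 bounds the gradient's Lipschitz constant.
    L = np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)
        beta = soft_threshold(beta - grad / L, lam / L)
    return beta


def two_stage_select(X, y, lam, tau):
    """Two-stage variable selection: stage 1 computes a bridge estimate,
    stage 2 keeps coordinates whose magnitude exceeds the threshold tau.
    """
    beta = lasso_ista(X, y, lam)
    return np.flatnonzero(np.abs(beta) > tau)


# Toy example: 3 strong signals among 20 predictors.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
beta_true = np.zeros(20)
beta_true[:3] = 5.0
y = X @ beta_true + 0.1 * rng.standard_normal(100)
selected = two_stage_select(X, y, lam=2.0, tau=0.5)
print(selected)
```

The paper's analysis compares such two-stage procedures across values of \(q\) through their asymptotic false discovery and true positive proportions; this sketch only shows the mechanics of the \(q = 1\) case.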
Related Items (5)
- Variable Selection With Second-Generation P-Values
- A precise high-dimensional asymptotic theory for boosting and minimum-\(\ell_1\)-norm interpolated classifiers
- Characterizing the SLOPE trade-off: a variational perspective and the Donoho-Tanner limit
- A power analysis for Model-X knockoffs with \(\ell_p\)-regularized statistics
- A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Higher criticism for large-scale inference, especially for rare and weak effects
- Sharp MSE bounds for proximal denoising
- High dimensional robust M-estimation: asymptotic variance via approximate message passing
- UPS delivers optimal phase diagram in high-dimensional variable selection
- Covariate assisted screening and estimation
- High-dimensionality effects in the Markowitz problem and other quadratic programs with linear constraints: risk underestimation
- Support recovery without incoherence: a case for nonconvex regularization
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- High-dimensional variable selection
- Controlling the false discovery rate via knockoffs
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Lasso-type recovery of sparse representations for high-dimensional data
- False discoveries occur early on the Lasso path
- High-dimensional asymptotics of prediction: ridge regression and classification
- Variable selection with Hamming loss
- Robust regression: Asymptotics, conjectures and Monte Carlo
- Asymptotics for Lasso-type estimators.
- Overcoming the limitations of phase transition by higher order analysis of regularization techniques
- The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled Chi-square
- Universality in polytope phase transitions and message passing algorithms
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- High-dimensional graphs and variable selection with the Lasso
- Optimality of Graphlet Screening in High Dimensional Variable Selection
- On robust regression with high-dimensional predictors
- Approximate Sparsity Pattern Recovery: Information-Theoretic Lower Bounds
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Does $\ell _{p}$ -Minimization Outperform $\ell _{1}$ -Minimization?
- Regularization after retention in ultrahigh dimensional linear regression models
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- High Dimensional Variable Selection via Tilting
- A Statistical View of Some Chemometrics Regression Tools
- Panning for Gold: ‘Model-X’ Knockoffs for High Dimensional Controlled Variable Selection
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Necessary and Sufficient Conditions for Sparsity Pattern Recovery
- Sequential Lasso Cum EBIC for Feature Selection With Ultra-High Dimensional Feature Space
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Optimal Variable Selection and Adaptive Noisy Compressed Sensing
- A modern maximum-likelihood theory for high-dimensional logistic regression
- The LASSO Risk for Gaussian Matrices
- The Noise-Sensitivity Phase Transition in Compressed Sensing
- Nearly Sharp Sufficient Conditions on Exact Sparsity Pattern Recovery
- The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing
- Information Theoretic Bounds for Compressed Sensing
- Information-Theoretic Limits on Sparse Signal Recovery: Dense versus Sparse Measurement Matrices
- Asymptotic Analysis of Complex LASSO via Complex Approximate Message Passing (CAMP)
- Sparse nonnegative solution of underdetermined linear equations by linear programming
- Consistent parameter estimation for Lasso and approximate message passing