Rates of convergence of the adaptive elastic net and the post-selection procedure in ultra-high dimensional sparse models
Publication: 5079021
DOI: 10.1080/03610926.2019.1628991
OpenAlex: W2952965150
MaRDI QID: Q5079021
No author found.
Publication date: 25 May 2022
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://doi.org/10.1080/03610926.2019.1628991
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- A lava attack on the recovery of sums of dense and sparse signals
- The Adaptive Lasso and Its Oracle Properties
- Nonnegative adaptive Lasso for ultra-high dimensional regression models and a two-stage method applied in financial modeling
- Variable selection for survival data with a class of adaptive elastic net techniques
- Rates of convergence of the adaptive LASSO estimators to the oracle distribution and higher order refinements by the bootstrap
- Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
- Statistics for high-dimensional data. Methods, theory and applications.
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- SLOPE-adaptive variable selection via convex optimization
- Lasso-type recovery of sparse representations for high-dimensional data
- Nonnegative-Lasso and application in index tracking
- Least angle regression. (With discussion)
- The Lasso problem and uniqueness
- Least squares after model selection in high-dimensional sparse models
- Network exploration via the adaptive LASSO and SCAD penalties
- Simultaneous analysis of Lasso and Dantzig selector
- On the adaptive elastic net with a diverging number of parameters
- Nonnegative elastic net and application in index tracking
- Adaptive robust variable selection
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Strong oracle optimality of folded concave penalized estimation
- Regularized Estimation in the Accelerated Failure Time Model with High-Dimensional Covariates
- Greed is Good: Algorithmic Results for Sparse Approximation
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using \(\ell _{1}\)-Constrained Quadratic Programming (Lasso)
- Regularization and Variable Selection Via the Elastic Net