A systematic review on model selection in high-dimensional regression
DOI: 10.1016/j.jkss.2018.10.001 · zbMATH: 1411.62204 · OpenAlex: W2900489223 · Wikidata: Q128948583 · MaRDI QID: Q1726155
Jinwoo Cho, Kyusang Yu, Eun Ryung Lee
Publication date: 19 February 2019
Published in: Journal of the Korean Statistical Society
Full work available at URL: https://doi.org/10.1016/j.jkss.2018.10.001
Keywords: model selection; quantile regression; linear regression; oracle property; Lasso; SCAD; penalized methods; model selection consistency; general convex loss; high dimensional regression models; high level conditions; quadratic margin condition
MSC classifications:
- 62G08: Nonparametric regression and quantile regression
- 62J07: Ridge regression; shrinkage estimators (Lasso)
- 62J05: Linear regression; mixed models
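To make the keyword list above concrete, the sketch below illustrates the kind of penalized variable selection (here the Lasso) that the review surveys in a sparse p >> n linear model. It is a hypothetical example, not code from the paper; the data, settings, and use of scikit-learn's LassoCV are all assumptions for illustration.

```python
# A minimal, hypothetical sketch of Lasso-based variable selection in a
# sparse p >> n linear model, in the spirit of the penalized methods this
# review surveys. Not from the paper; all settings here are illustrative.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p, s = 100, 500, 5                    # n observations, p predictors, s active
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 2.0                           # sparse "true" coefficient vector
y = X @ beta + rng.standard_normal(n)    # linear model with Gaussian noise

# Cross-validation is one common way to pick the penalty level lambda;
# the works cited below also cover (extended) BIC-type criteria for this choice.
fit = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(fit.coef_)
print("selected predictors:", selected)  # ideally recovers indices 0..4
```

Nonconvex penalties such as SCAD or MCP, for which the oracle-property results cited below apply, follow the same fit-then-threshold pattern (e.g., as implemented in the R package ncvreg), differing mainly in the penalty function and the tuning-parameter selector.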
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- Statistics for high-dimensional data. Methods, theory and applications.
- One-step sparse estimates in nonconcave penalized likelihood models
- Nonconcave penalized likelihood with a diverging number of parameters.
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Calibrating nonconvex penalized regression in ultra-high dimension
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Square root penalty: Adaptation to the margin in classification and in edge estimation
- Extended BIC for small-n-large-P sparse GLM
- Large sample properties of the smoothly clipped absolute deviation penalized maximum likelihood estimation on high dimensions
- Global optimality of nonconvex penalized estimators
- Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters
- Extended Bayesian information criteria for model selection with large model spaces
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Quantile Regression for Analyzing Heterogeneity in Ultra-High Dimension
- Model Selection via Bayesian Information Criterion for Quantile Regression Models
- Regularization Parameter Selections via Generalized Information Criterion
- Smoothly Clipped Absolute Deviation on High Dimensions
- Tuning parameter selectors for the smoothly clipped absolute deviation method