A scalable surrogate \(L_0\) sparse regression method for generalized linear models with applications to large scale data
From MaRDI portal
Publication:830734
DOI: 10.1016/j.jspi.2020.12.001
zbMath: 1465.62132
OpenAlex: W3112516299
MaRDI QID: Q830734
Xiaoling Peng, Eric Kawaguchi, Gang Li, Marc A. Suchard, Ning Li
Publication date: 7 May 2021
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://doi.org/10.1016/j.jspi.2020.12.001
Keywords: ridge regression; variable selection; generalized linear models; \(L_0\) penalty; high-dimensional massive sample size data
MSC classification:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Applications of statistics to biology and medical sciences; meta analysis (62P10)
- Generalized linear models (logistic models) (62J12)
Cites Work
- Model-Free Feature Screening for Ultrahigh-Dimensional Data
- Sure independence screening in generalized linear models with NP-dimensionality
- Exact post-selection inference, with application to the Lasso
- Robust rank correlation based screening
- Estimating the dimension of a model
- Heuristics of instability and stabilization in model selection
- Model-Free Feature Screening for Ultrahigh-Dimensional Data with Responses Missing at Random
- Adaptive conditional feature screening
- Broken adaptive ridge regression and its asymptotic properties
- Nonconcave Penalized Likelihood with a Diverging Number of Parameters
- The risk inflation criterion for multiple regression
- Quantile-adaptive model-free variable screening for high-dimensional heterogeneous data
- Efficient regularized regression with \(L_0\) penalty for variable selection and network construction
- A significance test for the lasso
- Coordinate descent algorithms for lasso penalized regression
- Strong oracle optimality of folded concave penalized estimation
- Forward Regression for Ultra-High Dimensional Variable Screening
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Additive Models
- Extended Bayesian information criteria for model selection with large model spaces
- Regularized quantile regression and robust feature screening for single index models
- The central role of the propensity score in observational studies for causal effects
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Massive Parallelization of Serial Inference Algorithms for a Complex Generalized Linear Model
- Feature Screening via Distance Correlation Learning
- Likelihood-Based Selection and Sharp Parameter Estimation
- Feature Selection for Varying Coefficient Models With Ultrahigh-Dimensional Covariates
- The Sparse MLE for Ultrahigh-Dimensional Feature Screening
- Some Comments on \(C_p\)
- A new look at the statistical model identification