Adaptive Lasso for generalized linear models with a diverging number of parameters
Publication: 4605261
DOI: 10.1080/03610926.2017.1285926
zbMath: 1414.62317
OpenAlex: W2582147205
MaRDI QID: Q4605261
Publication date: 21 February 2018
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://doi.org/10.1080/03610926.2017.1285926
Mathematics Subject Classification:
- Asymptotic properties of parametric estimators (62F12)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Generalized linear models (logistic models) (62J12)
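The record itself contains no code, but the method named in the title — the adaptive Lasso applied to a generalized linear model such as logistic regression — can be sketched in a minimal two-stage form: fit an initial unpenalized estimate, build penalty weights from its inverse magnitudes, then solve the weighted-L1 problem by proximal gradient descent. Everything below (the toy data, the tuning values `lam`, `gamma`, `step`, and the function names) is an illustrative assumption, not the paper's implementation.

```python
import math

def sigmoid(z):
    # Numerically safe logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

def neg_loglik_grad(beta, X, y):
    """Gradient of the average negative log-likelihood for logistic regression."""
    n, p = len(y), len(beta)
    g = [0.0] * p
    for xi, yi in zip(X, y):
        mu = sigmoid(sum(b * x for b, x in zip(beta, xi)))
        for j in range(p):
            g[j] += (mu - yi) * xi[j] / n
    return g

def soft_threshold(z, t):
    # Proximal operator of the absolute value: shrink z toward 0 by t.
    return math.copysign(max(abs(z) - t, 0.0), z)

def adaptive_lasso_logistic(X, y, lam=0.1, gamma=1.0, step=0.5, iters=2000):
    p = len(X[0])
    # Stage 1: initial (unpenalized) estimate by plain gradient descent.
    beta = [0.0] * p
    for _ in range(iters):
        g = neg_loglik_grad(beta, X, y)
        beta = [b - step * gi for b, gi in zip(beta, g)]
    # Adaptive weights: coefficients that start small get large penalties.
    w = [1.0 / (abs(b) ** gamma + 1e-8) for b in beta]
    # Stage 2: weighted-L1 problem solved by proximal gradient (ISTA).
    beta = [0.0] * p
    for _ in range(iters):
        g = neg_loglik_grad(beta, X, y)
        beta = [soft_threshold(b - step * gi, step * lam * wj)
                for b, gi, wj in zip(beta, g, w)]
    return beta

# Toy data: y depends on the first feature only; the second is pure noise.
X = [[x1, x2] for x1 in (-2.0, -1.0, 1.0, 2.0) for x2 in (1.0, -1.0)]
y = [1 if xi[0] > 0 else 0 for xi in X]
beta_hat = adaptive_lasso_logistic(X, y)
```

On this toy data the weighted penalty sets the noise coefficient exactly to zero while keeping the signal coefficient, which is the selection-consistency ("oracle") behavior the adaptive weighting is designed to deliver.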
Related Items (3)
- The asymptotic properties of SCAD penalized generalized linear models with adaptive designs
- Distributed adaptive lasso penalized generalized linear models for big data
- A penalized estimation for the Cox model with ordinal multinomial covariates
Cites Work
- A unified approach to model selection and sparse recovery using regularized least squares
- The Adaptive Lasso and Its Oracle Properties
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Asymptotics for Lasso-type estimators
- Nonconcave penalized likelihood with a diverging number of parameters
- Weak convergence and empirical processes. With applications to statistics
- Bridge estimation for generalized linear models with a diverging number of parameters
- Adaptive Lasso estimators for ultrahigh dimensional generalized linear models
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- High-dimensional graphs and variable selection with the Lasso
- Shrinkage Tuning Parameter Selection with a Diverging Number of Parameters
- A Selective Overview of Variable Selection in High Dimensional Feature Space (Invited Review Article)
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- L1-Regularization Path Algorithm for Generalized Linear Models
- Nonconcave Penalized Likelihood With NP-Dimensionality
- Regularization and Variable Selection Via the Elastic Net
- Tuning parameter selectors for the smoothly clipped absolute deviation method