An overview of reciprocal \(L_1\)-regularization for high dimensional regression data
From MaRDI portal
Publication:6602178
DOI: 10.1002/wics.1416
zbMATH Open: 1544.62137
MaRDI QID: Q6602178
Publication date: 11 September 2024
Published in: Wiley Interdisciplinary Reviews: Computational Statistics (WIREs Computational Statistics)
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Sure independence screening in generalized linear models with NP-dimensionality
- Nearly unbiased variable selection under minimax concave penalty
- Bayesian variable selection with shrinking and diffusing priors
- The Adaptive Lasso and Its Oracle Properties
- Cross-validation for selecting a model selection procedure
- On the computational complexity of high-dimensional Bayesian variable selection
- One-step sparse estimates in nonconcave penalized likelihood models
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Lasso-type recovery of sparse representations for high-dimensional data
- Estimating the dimension of a model
- Some connections between Bayesian and non-Bayesian methods for regression model selection
- Adaptive estimation of a quadratic functional by model selection.
- Least angle regression. (With discussion)
- Extended BIC for small-n-large-P sparse GLM
- Extended Bayesian information criteria for model selection with large model spaces
- The Bayesian Lasso
- Simulated annealing process in general state space
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Likelihood-Based Selection and Sharp Parameter Estimation
- Bayesian Model Selection in High-Dimensional Settings
- Simulated Stochastic Approximation Annealing for Global Optimization With a Square-Root Cooling Schedule
- Linear Model Selection by Cross-Validation
- Shotgun Stochastic Search for “Large p” Regression
- Stochastic Approximation in Monte Carlo Computation
- Bayesian Subset Modeling for High-Dimensional Generalized Linear Models
- High-Dimensional Variable Selection With Reciprocal L1-Regularization
- A Split-and-Merge Bayesian Variable Selection Approach for Ultrahigh Dimensional Regression
- A new look at the statistical model identification
Related Items (1)