A communication-efficient method for ℓ0 regularization linear regression models
From MaRDI portal
Publication:6074135
DOI: 10.1080/00949655.2022.2111567 · MaRDI QID: Q6074135
Yuan Luo, Unnamed Author, Lican Kang, Yan Yan Liu, Xue-rui Li
Publication date: 19 September 2023
Published in: Journal of Statistical Computation and Simulation
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- On statistics, computation and scalability
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- A unified primal dual active set algorithm for nonconvex sparse recovery
- Complexity of unconstrained \(L_2 - L_p\) minimization
- High-dimensional graphs and variable selection with the Lasso
- Divide and Conquer Kernel Ridge Regression: A Distributed Algorithm with Minimax Optimal Rates
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Sparse Approximate Solutions to Linear Systems
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Communication-Efficient Distributed Statistical Inference
- Dual Averaging for Distributed Optimization: Convergence Analysis and Network Scaling