High-dimensional linear regression with hard thresholding regularization: theory and algorithm
DOI: 10.3934/jimo.2022034
OpenAlex: W4226427382
MaRDI QID: Q2097492
Yan Yan Liu, Lican Kang, Yuan Luo, Jing Zhang, Yan-Ming Lai
Publication date: 14 November 2022
Published in: Journal of Industrial and Management Optimization
Full work available at URL: https://doi.org/10.3934/jimo.2022034
Keywords: generalized Newton method; high-dimensional; linear regression model; primal dual active set algorithm; hard thresholding regularization
MSC classifications: Estimation in multivariate analysis (62H12) · Linear regression; mixed models (62J05) · Analysis of algorithms and problem complexity (68Q25)
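The keywords name hard thresholding regularization for high-dimensional linear regression, solved in the paper via a primal dual active set algorithm with a generalized Newton method. As a rough illustration of the hard-thresholding idea only, the sketch below implements generic iterative hard thresholding for a sparse linear model; it is not the paper's algorithm, and the function name `iht` and its parameters are hypothetical stand-ins.

```python
import numpy as np

def iht(X, y, s, step=None, n_iter=200):
    """Iterative hard thresholding for sparse linear regression.

    After each gradient step on the least-squares loss
    (1/2)||y - X beta||^2, keep only the s largest-magnitude
    coefficients and set the rest to zero.
    """
    n, p = X.shape
    if step is None:
        # Conservative step size: 1 / sigma_max(X)^2.
        step = 1.0 / np.linalg.norm(X, 2) ** 2
    beta = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ beta - y)        # gradient of the LS loss
        z = beta - step * grad             # plain gradient step
        keep = np.argsort(np.abs(z))[-s:]  # indices of s largest entries
        beta = np.zeros(p)
        beta[keep] = z[keep]               # hard-threshold everything else
    return beta

# Toy usage: recover a 3-sparse coefficient vector from noisy data.
rng = np.random.default_rng(0)
n, p = 100, 500
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[[5, 50, 300]] = [2.0, -3.0, 1.5]
y = X @ beta_true + 0.1 * rng.standard_normal(n)
beta_hat = iht(X, y, s=3)
print(np.nonzero(beta_hat)[0])  # expected support: [5, 50, 300]
```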
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Nearly unbiased variable selection under minimax concave penalty
- The Adaptive Lasso and Its Oracle Properties
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Wavelet methods in statistics: some recent developments and their applications
- Heuristics of instability and stabilization in model selection
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- A unified primal dual active set algorithm for nonconvex sparse recovery
- A nonsmooth version of Newton's method
- On the adaptive elastic net with a diverging number of parameters
- Calibrating nonconvex penalized regression in ultra-high dimension
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- Lagrange Multiplier Approach to Variational Problems and Applications
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- A Primal Dual Active Set Algorithm With Continuation for Compressed Sensing
- A Statistical View of Some Chemometrics Regression Tools
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using \(\ell_1\)-Constrained Quadratic Programming (Lasso)
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Bayesian inference for high‐dimensional linear regression under mnet priors
- Convergence of a block coordinate descent method for nondifferentiable minimization
- A general theory of concave regularization for high-dimensional sparse estimation problems