A data-driven line search rule for support recovery in high-dimensional data analysis
Publication: 2157522
DOI: 10.1016/j.csda.2022.107524
OpenAlex: W3216330441
MaRDI QID: Q2157522
Lican Kang, Xiliang Lu, Peili Li, Yuling Jiao
Publication date: 22 July 2022
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://arxiv.org/abs/2111.10806
Keywords: line search; high-dimensional data analysis; \(\ell_0\) penalty; \(\ell_2\) error bound; sparsity assumption
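The keywords point to \(\ell_0\)-penalized sparse recovery combined with a line search on the step size. As an illustrative sketch only (not the authors' algorithm from the paper), iterative hard thresholding for \(\ell_0\)-constrained least squares with a standard backtracking line search might look like the following; all function names and parameters here are assumptions for illustration:

```python
import numpy as np

def hard_threshold(x, k):
    """Keep the k largest-magnitude entries of x; zero out the rest."""
    z = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    z[idx] = x[idx]
    return z

def iht_line_search(X, y, k, max_iter=200, step_init=1.0, shrink=0.5, tol=1e-8):
    """Iterative hard thresholding for min 0.5*||y - X b||^2 s.t. ||b||_0 <= k,
    with backtracking on the step size (illustrative sketch, not the paper's rule)."""
    n, p = X.shape
    b = np.zeros(p)
    for _ in range(max_iter):
        r = X @ b - y
        g = X.T @ r                      # gradient of the least-squares loss
        f_old = 0.5 * (r @ r)
        step = step_init
        while True:
            # Projected gradient step: threshold back to the k-sparse set.
            b_new = hard_threshold(b - step * g, k)
            diff = b_new - b
            r_new = X @ b_new - y
            # Standard sufficient-decrease test for projected/proximal gradient;
            # it is guaranteed to hold once step <= 1/L (L = Lipschitz constant).
            if 0.5 * (r_new @ r_new) <= f_old + g @ diff + (0.5 / step) * (diff @ diff):
                break
            if step < 1e-12:             # safeguard against endless backtracking
                break
            step *= shrink
        if np.linalg.norm(b_new - b) < tol:
            b = b_new
            break
        b = b_new
    return b
```

On a well-conditioned noiseless problem this sketch typically recovers the true support, e.g. `b_hat = iht_line_search(X, y, k)` with Gaussian `X` and a 3-sparse ground truth. The backtracking loop plays the role a line search rule plays in general: it adapts the step size from data rather than requiring the Lipschitz constant in advance.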
Cites Work
- Coordinate descent algorithms for nonconvex penalized regression, with applications to biological feature selection
- Nearly unbiased variable selection under minimax concave penalty
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Fitting sparse linear models under the sufficient and necessary condition for model identification
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- An extended Newton-type algorithm for \(\ell_2\)-regularized sparse logistic regression and its efficiency for classifying large-scale datasets
- A unified primal dual active set algorithm for nonconvex sparse recovery
- High-dimensional generalized linear models and the lasso
- Calibrating nonconvex penalized regression in ultra-high dimension
- Pathwise coordinate optimization
- Coordinate descent algorithms for lasso penalized regression
- Newton method for \(\ell_0\)-regularized optimization
- Atomic Decomposition by Basis Pursuit
- A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem
- Description of the Minimizers of Least Squares Regularized with $\ell_0$-norm. Uniqueness of the Global Minimizer
- Compressed Sensing With Nonlinear Observations and Related Nonlinear Optimization Problems
- The Group Lasso for Logistic Regression
- Elastic-net regularization: error estimates and active set methods
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- A Primal Dual Active Set Algorithm With Continuation for Compressed Sensing
- High-Dimensional Statistics
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Sequential Lasso Cum EBIC for Feature Selection With Ultra-High Dimensional Feature Space
- L1-Regularization Path Algorithm for Generalized Linear Models
- Sparse Recovery With Orthogonal Matching Pursuit Under RIP
- Greedy Sparsity-Constrained Optimization
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers