Sparse estimation via lower-order penalty optimization methods in high-dimensional linear regression
Publication: 2687439
DOI: 10.1007/s10898-022-01220-5
OpenAlex: W4294675186
MaRDI QID: Q2687439
Chong Li, Tianzi Jiang, Xin Li, Yao-Hua Hu, Xiao Qi Yang
Publication date: 2 March 2023
Published in: Journal of Global Optimization
Full work available at URL: https://arxiv.org/abs/1911.05073
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- DC approximation approaches for sparse optimization
- Sparse and robust normal and \(t\)-portfolios by penalized \(L_q\)-likelihood minimization
- High-dimensional regression with noisy and missing data: provable guarantees with nonconvexity
- Iterative thresholding for sparse approximations
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- A filled function method for finding a global minimizer of a function of several variables
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Equations and inequalities. Elementary problems and theorems in algebra and number theory. Translated from the second (1996) Czech edition by Karl Dilcher and revised by the authors
- A linear programming model for selection of sparse high-dimensional multiperiod portfolios
- Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- Atomic Decomposition by Basis Pursuit
- A Nonlinear Lagrangian Approach to Constrained Optimization Problems
- Abstract Convexity and Augmented Lagrangians
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Decoding by Linear Programming
- Stable recovery of sparse overcomplete representations in the presence of noise
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Iteratively reweighted least squares minimization for sparse recovery
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Uncertainty principles and ideal atomic decomposition
- Sparse Approximate Solutions to Linear Systems
- On Recovery of Sparse Signals Via \(\ell_1\) Minimization
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over \(\ell_q\)-Balls
- The Discrete Dantzig Selector: Estimating Sparse Linear Models via Mixed Integer Linear Optimization
- Group sparse optimization via \(\ell_{p,q}\) regularization
- For most large underdetermined systems of linear equations the minimal \(\ell_1\)-norm solution is also the sparsest solution
- Stable signal recovery from incomplete and inaccurate measurements
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Linear Statistical Inference and its Applications
- A Unified Augmented Lagrangian Approach to Duality and Exact Penalization
- Mathematical Programs with Equilibrium Constraints
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- A general theory of concave regularization for high-dimensional sparse estimation problems