Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions

From MaRDI portal
Publication: 1683689

DOI: 10.1007/s10107-017-1114-y
zbMath: 1386.90116
OpenAlex: W2587436146
Wikidata: Q47263899
Scholia: Q47263899
MaRDI QID: Q1683689

Hongcheng Liu, Yinyu Ye, Tao Yao, Run-Ze Li

Publication date: 1 December 2017

Published in: Mathematical Programming. Series A. Series B

Full work available at URL: http://europepmc.org/articles/pmc5720392



Related Items

- A second-order optimality condition with first- and second-order complementarity associated with global convergence of algorithms
- High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks
- Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions
- Some theoretical limitations of second-order algorithms for smooth constrained optimization
- Solving constrained nonsmooth group sparse optimization via group Capped-\(\ell_1\) relaxation and group smoothing proximal gradient algorithm
- Regularized sample average approximation for high-dimensional stochastic optimization under low-rankness
- Sparse estimation via lower-order penalty optimization methods in high-dimensional linear regression
- Regularized Linear Programming Discriminant Rule with Folded Concave Penalty for Ultrahigh-Dimensional Data
- Linear-step solvability of some folded concave and singly-parametric sparse optimization problems
- A cubic spline penalty for sparse approximation under tight frame balanced model
- Augmented Lagrangians with constrained subproblems and convergence to second-order stationary points
- Computation of second-order directional stationary points for group sparse optimization
- Hessian Barrier Algorithms for Linearly Constrained Optimization Problems
- Sample average approximation with sparsity-inducing penalty for high-dimensional stochastic programming
- Optimality condition and complexity analysis for linearly-constrained optimization without differentiability on the boundary



Cites Work