Efficient Sparse Semismooth Newton Methods for the Clustered Lasso Problem
Publication: 5231697
DOI: 10.1137/18M1207752
zbMath: 1427.90200
arXiv: 1808.07181
MaRDI QID: Q5231697
Authors: Meixia Lin, Defeng Sun, Yong-Jin Liu, Kim-Chuan Toh
Publication date: 27 August 2019
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1808.07181
MSC classification:
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Applications of mathematical programming (90C90)
Related Items
- A geometric proximal gradient method for sparse least squares regression with probabilistic simplex constraint
- Difference-of-Convex Algorithms for a Class of Sparse Group $\ell_0$ Regularized Optimization Problems
- A dual-based stochastic inexact algorithm for a class of stochastic nonsmooth convex composite problems
- A Proximal Point Dual Newton Algorithm for Solving Group Graphical Lasso Problems
- The Linear and Asymptotically Superlinear Convergence Rates of the Augmented Lagrangian Method with a Practical Relative Error Criterion
- An efficient Hessian based algorithm for singly linearly and box constrained least squares regression
- An Efficient Linearly Convergent Regularized Proximal Point Algorithm for Fused Multiple Graphical Lasso Problems
- A semismooth Newton-based augmented Lagrangian algorithm for density matrix least squares problems
- Efficient Sparse Hessian-Based Semismooth Newton Algorithms for Dantzig Selector
Uses Software
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An efficient inexact symmetric Gauss-Seidel based majorized ADMM for high-dimensional convex composite conic programming
- A unified primal-dual algorithm framework based on Bregman iteration
- Active set algorithms for isotonic regression; a unifying framework
- Split Bregman method for large scale fused Lasso
- On the Douglas-Rachford splitting method and the proximal point algorithm for maximal monotone operators
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- Newton and quasi-Newton methods for normal maps with polyhedral sets
- Sparse regression with exact clustering
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- On the efficient computation of a generalized Jacobian of the projector over the Birkhoff polytope
- On the R-superlinear convergence of the KKT residuals generated by the augmented Lagrangian method for convex composite conic programming
- A nonsmooth version of Newton's method
- Hankel Matrix Rank Minimization with Applications to System Identification and Realization
- A Fast Algorithm for Sparse Reconstruction Based on Shrinkage, Subspace Optimization, and Continuation
- A Newton-CG Augmented Lagrangian Method for Semidefinite Programming
- Asymptotic Convergence Analysis of the Proximal Point Algorithm
- Some continuity properties of polyhedral multifunctions
- Monotone Operators and the Proximal Point Algorithm
- Semismooth and Semiconvex Functions in Constrained Optimization
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Sparse Reconstruction by Separable Approximation
- On Efficiently Solving the Subproblems of a Level-Set Method for Fused Lasso Problems
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
- Sparsity and Smoothness Via the Fused Lasso
- Solving the OSCAR and SLOPE Models Using a Semismooth Newton-Based Augmented Lagrangian Method
- Regularization and Variable Selection Via the Elastic Net
- Model Selection and Estimation in Regression with Grouped Variables
- Simultaneous Regression Shrinkage, Variable Selection, and Supervised Clustering of Predictors with OSCAR
- Convex Analysis
- Semismooth Matrix-Valued Functions