On Efficiently Solving the Subproblems of a Level-Set Method for Fused Lasso Problems
Publication:4571884
DOI: 10.1137/17M1136390 · zbMath: 1401.90145 · arXiv: 1706.08732 · OpenAlex: W2962681231 · MaRDI QID: Q4571884
Defeng Sun, Xudong Li, Kim-Chuan Toh
Publication date: 3 July 2018
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1706.08732
MSC Classification
- Semidefinite programming (90C22)
- Convex programming (90C25)
- Large-scale problems in mathematical programming (90C06)
- Quadratic programming (90C20)
Related Items
- Solving the OSCAR and SLOPE Models Using a Semismooth Newton-Based Augmented Lagrangian Method
- An augmented Lagrangian method with constraint generation for shape-constrained convex regression problems
- Efficient projection onto the intersection of a half-space and a box-like set and its generalized Jacobian
- A dual-based stochastic inexact algorithm for a class of stochastic nonsmooth convex composite problems
- Randomized Block Proximal Damped Newton Method for Composite Self-Concordant Minimization
- A Riemannian Proximal Newton Method
- A Proximal Point Dual Newton Algorithm for Solving Group Graphical Lasso Problems
- The Linear and Asymptotically Superlinear Convergence Rates of the Augmented Lagrangian Method with a Practical Relative Error Criterion
- Double fused Lasso penalized LAD for matrix regression
- A dual based semismooth Newton-type algorithm for solving large-scale sparse Tikhonov regularization problems
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- Double fused Lasso regularized regression with both matrix and vector valued predictors
- An efficient Hessian based algorithm for singly linearly and box constrained least squares regression
- Spectral Operators of Matrices: Semismoothness and Characterizations of the Generalized Jacobian
- Efficient Sparse Semismooth Newton Methods for the Clustered Lasso Problem
- An Efficient Linearly Convergent Regularized Proximal Point Algorithm for Fused Multiple Graphical Lasso Problems
- A semismooth Newton-based augmented Lagrangian algorithm for density matrix least squares problems
- B-Subdifferentials of the Projection onto the Generalized Simplex
- Efficient Sparse Hessian-Based Semismooth Newton Algorithms for Dantzig Selector
Uses Software
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- An efficient inexact symmetric Gauss-Seidel based majorized ADMM for high-dimensional convex composite conic programming
- A unified primal-dual algorithm framework based on Bregman iteration
- A dual algorithm for the solution of nonlinear variational problems via finite element approximation
- Secant methods for semismooth equations
- Newton and quasi-Newton methods for normal maps with polyhedral sets
- On concepts of directional differentiability
- Local extremes, runs, strings and multiresolution. (With discussion)
- A globally convergent Newton method for convex \(SC^1\) minimization problems
- The convex geometry of linear inverse problems
- A nonsmooth version of Newton's method
- Pathwise coordinate optimization
- A Newton-CG Augmented Lagrangian Method for Semidefinite Programming
- Sparse Optimization with Least-Squares Constraints
- Asymptotic Convergence Analysis of the Proximal Point Algorithm
- Solution Continuity in Monotone Affine Variational Inequalities
- Probing the Pareto Frontier for Basis Pursuit Solutions
- Optimization and nonsmooth analysis
- Monotone Operators and the Proximal Point Algorithm
- Semismooth and Semiconvex Functions in Constrained Optimization
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Variational Analysis
- A Highly Efficient Semismooth Newton Augmented Lagrangian Method for Solving Lasso Problems
- Sparsity and Smoothness Via the Fused Lasso
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- Piecewise Smoothness, Local Invertibility, and Parametric Analysis of Normal Maps
- Gauge Optimization and Duality
- Regularization and Variable Selection Via the Elastic Net
- The Strong Second-Order Sufficient Condition and Constraint Nondegeneracy in Nonlinear Semidefinite Programming and Their Implications
- Convex Analysis
- Semismooth Matrix-Valued Functions