Active-set complexity of proximal gradient: how long does it take to find the sparsity pattern?
Publication: 2311100
DOI: 10.1007/s11590-018-1325-z
zbMATH: 1426.90253
arXiv: 1712.03577
OpenAlex: W3101462918
MaRDI QID: Q2311100
Julie Nutini, Mark Schmidt, Warren L. Hare
Publication date: 10 July 2019
Published in: Optimization Letters
Full work available at URL: https://arxiv.org/abs/1712.03577
Keywords: convex optimization, non-smooth optimization, proximal gradient method, active-set complexity, active-set identification
Related Items (12)
- Nonsmoothness in machine learning: specific structure, proximal identification, and applications
- An augmented Lagrangian method exploiting an active-set strategy and second-order information
- A Subspace Acceleration Method for Minimization Involving a Group Sparsity-Inducing Regularizer
- Screening for a reweighted penalized conditional gradient method
- Active-Set Identification with Complexity Guarantees of an Almost Cyclic 2-Coordinate Descent Method with Armijo Line Search
- On the strong convergence of forward-backward splitting in reconstructing jointly sparse signals
- Thresholding gradient methods in Hilbert spaces: support identification and linear convergence
- Local linear convergence of proximal coordinate descent algorithm
- Active Set Complexity of the Away-Step Frank-Wolfe Algorithm
- Distributed Learning with Sparse Communications by Identification
- First-order Methods for the Impatient: Support Identification in Finite Time with Convergent Frank-Wolfe Variants
- Avoiding bad steps in Frank-Wolfe variants
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Gradient methods for minimizing composite functions
- Introductory lectures on convex optimization. A basic course.
- Support-vector networks
- A globally convergent primal-dual active-set framework for large-scale convex quadratic optimization
- A Fast Active Set Block Coordinate Descent Algorithm for ℓ1-Regularized Least Squares
- A Feasible Active Set Method with Reoptimization for Convex Quadratic Mixed-Integer Programming
- Identifying Active Manifolds in Regularization Problems
- Accelerated Block-coordinate Relaxation for Regularized Optimization
- Activity Identification and Local Linear Convergence of Forward-Backward-type Methods
- A Fast Algorithm for Sparse Reconstruction Based on Shrinkage, Subspace Optimization, and Continuation
- Identifiable Surfaces in Constrained Optimization
- An algorithm for quadratic ℓ1-regularized optimization with a flexible active-set strategy
- On the Identification of Active Constraints
- On the Goldstein-Levitin-Polyak gradient projection method
- Signal Recovery by Proximal Forward-Backward Splitting