Proximal methods for the latent group lasso penalty
DOI: 10.1007/s10589-013-9628-6 · zbMath: 1305.90388 · arXiv: 1209.0368 · OpenAlex: W2028007849 · MaRDI QID: Q457209
Lorenzo Rosasco, Silvia Villa, Sofia Mosci, Alessandro Verri
Publication date: 26 September 2014
Published in: Computational Optimization and Applications
Full work available at URL: https://arxiv.org/abs/1209.0368
Related Items
- A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure
- Efficient inexact proximal gradient algorithms for structured sparsity-inducing norm
- The \(L_{2,1}\)-norm-based unsupervised optimal feature selection with applications to action recognition
- Hierarchical sparse modeling: a choice of two group Lasso formulations
- Convergence of stochastic proximal gradient algorithm
- Selective linearization for multi-block statistical learning
- Graphical-model based high dimensional generalized linear models
- A stochastic inertial forward-backward splitting algorithm for multivariate monotone inclusions
- Efficient proximal mapping computation for low-rank inducing norms
- Group Sparse Optimization for Images Recovery Using Capped Folded Concave Functions
Uses Software
Cites Work
- Smooth minimization of non-smooth functions
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- Smoothing proximal gradient method for general structured sparse regression
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Accelerated projected gradient method for linear inverse problems with sparsity constraints
- Theoretical foundations and numerical methods for sparse recovery. Papers based on the presentations of the summer school "Theoretical foundations and numerical methods for sparse recovery", Vienna, Austria, August 31 -- September 4, 2009
- The composite absolute penalties family for grouped and hierarchical variable selection
- An algorithm for minimizing a differentiable function subject to box constraints and errors
- Least angle regression. (With discussion)
- The approximation of fixed points of compositions of nonexpansive mappings in Hilbert space
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Efficient block-coordinate descent algorithms for the group Lasso
- Accelerated and Inexact Forward-Backward Algorithms
- Optimization with Sparsity-Inducing Penalties
- Nonparametric sparsity and regularization
- NESTA: A Fast and Accurate First-Order Method for Sparse Recovery
- The Gradient Projection Method for Nonlinear Programming. Part I. Linear Constraints
- Fixed-Point Continuation for $\ell_1$-Minimization: Methodology and Convergence
- The Group Lasso for Logistic Regression
- Monotone Operators and the Proximal Point Algorithm
- Atomic Decomposition by Basis Pursuit
- Projected Newton Methods for Optimization Problems with Simple Constraints
- \(L_1\)-Regularization Path Algorithm for Generalized Linear Models
- Dequantizing Compressed Sensing: When Oversampling and Non-Gaussian Constraints Combine
- Structured Sparsity via Alternating Direction Methods
- Model Selection and Estimation in Regression with Grouped Variables
- Signal Recovery by Proximal Forward-Backward Splitting
- Convex analysis and monotone operator theory in Hilbert spaces