Smooth Bilevel Programming for Sparse Regularization

Publication: 6369297

arXiv: 2106.01429 · MaRDI QID: Q6369297

Clarice Poon, Gabriel Peyré

Publication date: 2 June 2021

Abstract: Iteratively reweighted least squares (IRLS) is a popular approach to solving sparsity-enforcing regression problems in machine learning. State-of-the-art approaches are more efficient but typically rely on specific coordinate pruning schemes. In this work, we show how a surprisingly simple reparametrization of IRLS, coupled with a bilevel resolution (instead of an alternating scheme), achieves top performance on a wide range of sparsity-inducing regularizers (such as the Lasso, group Lasso and trace norm), regularization strengths (including hard constraints), and design matrices (ranging from correlated designs to differential operators). Like IRLS, our method only involves solving linear systems, but in sharp contrast, it corresponds to the minimization of a smooth function. Despite being non-convex, we show that this function has no spurious minima and that its saddle points are "ridable", so that there always exists a descent direction. We therefore advocate the use of a BFGS quasi-Newton solver, which makes our approach simple, robust and efficient. We numerically benchmark the convergence speed of our algorithm against state-of-the-art solvers for the Lasso, group Lasso, trace norm and linearly constrained problems. These results highlight the versatility of our approach, removing the need to use different solvers depending on the specificities of the ML problem under study.
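To make the abstract's idea concrete, below is a minimal Python sketch of one way the described approach can be realized for the Lasso, under the following assumptions: the "simple reparametrization" is taken to be the Hadamard factorization beta = u * v, which uses the identity ||beta||_1 = min over u*v=beta of (||u||^2 + ||v||^2)/2 to turn the non-smooth Lasso into a smooth problem in (u, v); the "bilevel resolution" is variable projection, where the inner problem in u is a ridge regression solved via a single linear system and the outer smooth function in v is minimized with SciPy's L-BFGS-B as a stand-in for the BFGS solver advocated above. All problem data (X, y, lam) and variable names are hypothetical; for the authors' actual implementation, see the companion repository linked below.

import numpy as np
from scipy.optimize import minimize

# Hypothetical Lasso instance: min_beta ||X beta - y||^2 / (2*lam) + ||beta||_1
rng = np.random.default_rng(0)
n, p, lam = 50, 100, 0.1
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:5] = 1.0
y = X @ beta_true + 0.01 * rng.standard_normal(n)

def outer(v):
    # Inner problem: for fixed v, minimizing over u the smooth objective
    #   ||X(u*v) - y||^2/(2*lam) + (||u||^2 + ||v||^2)/2
    # is a ridge regression with design A = X @ diag(v), solved in closed
    # form by one linear system (the same primitive IRLS uses).
    A = X * v                                          # scale column j of X by v[j]
    u = np.linalg.solve(A.T @ A + lam * np.eye(p), A.T @ y)
    r = A @ u - y
    F = r @ r / (2 * lam) + 0.5 * (u @ u + v @ v)
    # Envelope theorem: since u minimizes the inner problem, the gradient of
    # F(v) is the partial gradient of the joint objective in v at (u(v), v).
    g = u * (X.T @ r) / lam + v
    return F, g

# Quasi-Newton descent on the smooth outer function (L-BFGS-B here,
# standing in for the BFGS solver mentioned in the abstract).
res = minimize(outer, np.ones(p), jac=True, method="L-BFGS-B")
v = res.x
A = X * v
u = np.linalg.solve(A.T @ A + lam * np.eye(p), A.T @ y)
beta = u * v                                           # recovered Lasso solution

The design choice illustrated here is that the outer function F(v) is smooth even though the Lasso objective is not, so generic quasi-Newton machinery applies directly; group Lasso, trace norm and constrained variants would follow the same pattern with a different inner factorization.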

Has companion code repository: https://github.com/miclegr/sbp-sr

This page was built for publication: Smooth Bilevel Programming for Sparse Regularization
