Fast, Provably Convergent IRLS Algorithm for p-Norm Linear Regression

From MaRDI portal
Publication:6322210

arXiv: 1907.07167
MaRDI QID: Q6322210

Author name not available

Publication date: 16 July 2019

Abstract: Linear regression in the $\ell_p$-norm is a canonical optimization problem that arises in several applications, including sparse recovery, semi-supervised learning, and signal processing. Generic convex optimization algorithms for solving $\ell_p$-regression are slow in practice. Iteratively Reweighted Least Squares (IRLS) is an easy-to-implement family of algorithms for solving these problems that has been studied for over 50 years. However, these algorithms often diverge for $p > 3$, and since the work of Osborne (1985) it has been an open problem whether there is an IRLS algorithm that is guaranteed to converge rapidly for $p > 3$. We propose p-IRLS, the first IRLS algorithm that provably converges geometrically for any $p \in [2, \infty)$. Our algorithm is simple to implement and is guaranteed to find a $(1+\varepsilon)$-approximate solution in $O\bigl(p^{3.5}\, m^{\frac{p-2}{2(p-1)}} \log \frac{m}{\varepsilon}\bigr) \le O_p\bigl(\sqrt{m} \log \frac{m}{\varepsilon}\bigr)$ iterations. Our experiments demonstrate that it performs even better than our theoretical bounds, beats the standard MATLAB/CVX implementation for solving these problems by 10--50x, and is the fastest among available implementations in the high-accuracy regime.
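For orientation, the classical IRLS scheme the abstract refers to can be sketched as below: each step reweights residuals by $|r_i|^{p-2}$ and solves a weighted least-squares problem. This is a minimal illustration under standard assumptions (function name and clipping threshold are our own); it is *not* the authors' p-IRLS method, which adds the safeguards that make geometric convergence provable for $p > 3$, where this plain iteration may diverge.

```python
import numpy as np

def irls_pnorm(A, b, p=3, iters=100, eps=1e-8):
    """Classical (undamped) IRLS sketch for min_x ||Ax - b||_p, p >= 2.

    Each iteration solves the weighted least-squares problem
        A^T W A x = A^T W b,   W = diag(|r_i|^(p-2)),
    where r = Ax - b. Residuals are clipped away from zero to keep
    the weights finite. This basic scheme can diverge for large p,
    which is the failure mode p-IRLS is designed to fix.
    """
    x = np.linalg.lstsq(A, b, rcond=None)[0]  # p = 2 warm start
    for _ in range(iters):
        r = A @ x - b
        w = np.maximum(np.abs(r), eps) ** (p - 2)  # reweighting
        Aw = A * w[:, None]                        # rows scaled by w_i
        x = np.linalg.solve(A.T @ Aw, Aw.T @ b)    # weighted normal eqs.
    return x
```

Starting from the least-squares solution, the $\ell_p$ residual of the iterate should not exceed that of the warm start once the iteration settles, since the least-squares point is itself feasible for the $\ell_p$ problem.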




Has companion code repository: https://github.com/utoronto-theory/pIRLS








This page was built for publication: Fast, Provably Convergent IRLS Algorithm for p-Norm Linear Regression
