The Proxy Step-size Technique for Regularized Optimization on the Sphere Manifold
From MaRDI portal
Publication: Q6409738
arXiv: 2209.01812
MaRDI QID: Q6409738
Author name not available
Publication date: 5 September 2022
Abstract: We give an effective solution to the regularized optimization problem $g(\boldsymbol{x}) + h(\boldsymbol{x})$, where $\boldsymbol{x}$ is constrained on the unit sphere $\lVert\boldsymbol{x}\rVert_2 = 1$. Here $g(\cdot)$ is a smooth cost with Lipschitz continuous gradient within the unit ball, whereas $h(\cdot)$ is typically non-smooth but convex and absolutely homogeneous, e.g., norm regularizers and their combinations. Our solution is based on the Riemannian proximal gradient, using an idea we call the proxy step-size -- a scalar variable which we prove is monotone with respect to the actual step-size within an interval. The proxy step-size exists ubiquitously for convex and absolutely homogeneous $h(\cdot)$, and determines the actual step-size and the tangent update in closed form, hence the complete proximal gradient iteration. Based on these insights, we design a Riemannian proximal gradient method using the proxy step-size. We prove that our method converges to a critical point, guided by a line-search technique based on the cost function only. The proposed method can be implemented in a couple of lines of code. We show its usefulness by applying nuclear norm, $\ell_1$ norm, and nuclear-spectral norm regularization to three classical computer vision problems. The improvements are consistent and backed by numerical experiments.
Has companion code repository: https://bitbucket.org/fangbai/proxystepsize-pgs
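The general scheme the abstract builds on can be illustrated with a minimal sketch of one Riemannian proximal-gradient iteration on the unit sphere: project the Euclidean gradient of $g$ onto the tangent space, take a gradient step, apply the Euclidean proximal operator of $h$ (here $h = \lambda\lVert\cdot\rVert_1$), and retract back to the sphere by normalization. This is a textbook sketch under those assumptions, not the paper's closed-form proxy step-size update; the function names and the example cost are hypothetical.

```python
import numpy as np

def soft_threshold(z, tau):
    # Proximal operator of tau * ||.||_1 (elementwise shrinkage).
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def sphere_prox_grad_step(x, grad_g, lam, t):
    """One generic Riemannian proximal-gradient iteration on the unit sphere.

    Sketch only: tangent-space gradient step for g, Euclidean prox for
    h = lam * ||.||_1, then retraction by normalization. This is NOT the
    paper's proxy step-size method, which computes the step in closed form.
    """
    egrad = grad_g(x)
    rgrad = egrad - np.dot(x, egrad) * x   # project gradient onto tangent space at x
    y = x - t * rgrad                      # gradient step for the smooth part g
    y = soft_threshold(y, t * lam)         # prox step for the l1 regularizer h
    n = np.linalg.norm(y)
    if n == 0.0:                           # the prox may zero out every entry
        return x
    return y / n                           # retract back onto the unit sphere

# Hypothetical usage: minimize 0.5*||x - b||^2 + lam*||x||_1 on the sphere.
b = np.array([1.0, 0.5, -0.2, 0.0])
grad_g = lambda x: x - b
x = np.ones(4) / 2.0                       # unit-norm initial point
for _ in range(100):
    x = sphere_prox_grad_step(x, grad_g, lam=0.1, t=0.5)
```

Each iterate stays exactly on the sphere because of the final normalization; the paper's contribution is that, for convex absolutely homogeneous $h$, the step-size and tangent update need not be iterated over but follow in closed form from the proxy step-size.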