Block majorization-minimization with diminishing radius for constrained nonconvex optimization

From MaRDI portal
Publication:6355444

arXiv: 2012.03503 · MaRDI QID: Q6355444

Author name not available

Publication date: 7 December 2020

Abstract: Block coordinate descent (BCD), also known as nonlinear Gauss-Seidel, is a simple iterative algorithm for nonconvex optimization that sequentially minimizes the objective function in each block coordinate while the other coordinates are held fixed. We propose a version of BCD that, for block multi-convex and smooth objective functions under constraints, is guaranteed to converge to the stationary points with a worst-case rate of convergence of O((log n)^2 / n) for n iterations, and a bound of O(epsilon^{-1} (log epsilon^{-1})^2) on the number of iterations needed to achieve an epsilon-approximate stationary point. Furthermore, we show that these results continue to hold even when the convex sub-problems are inexactly solved, provided the optimality gaps are uniformly summable against initialization. A key idea is to restrict the parameter search within a diminishing radius to promote stability of iterates. As an application, we provide an alternating least squares algorithm with diminishing radius for nonnegative CP tensor decomposition that converges to the stationary points of the reconstruction error with the same robust worst-case convergence rate and complexity bounds. We also experimentally validate our results with both synthetic and real-world data, and demonstrate that the auxiliary search-radius restriction can in fact improve the rate of convergence.
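The key idea described in the abstract, restricting each block update to a ball of diminishing radius around the current iterate, can be sketched for nonnegative matrix factorization (a block multi-convex problem). This is a minimal illustration under stated assumptions, not the paper's algorithm: the function name `bcd_dr_nmf`, the radius schedule r_n = 1/sqrt(n), and the single projected-gradient step per block (an inexact sub-problem solve, which the abstract notes is permitted when the optimality gaps are summable) are all choices made for the demo.

```python
import numpy as np

def bcd_dr_nmf(A, rank=2, n_iter=300, seed=0):
    """Illustrative sketch (not the authors' exact algorithm): block
    coordinate descent for the block multi-convex problem
        min_{X >= 0, Y >= 0} ||A - X @ Y.T||_F^2,
    where each block update is an inexact projected-gradient step whose
    displacement is clipped to a diminishing radius r_n.  The schedule
    r_n = 1/sqrt(n) is a hypothetical choice for this demo."""
    rng = np.random.default_rng(seed)
    m, k = A.shape
    X = rng.random((m, rank))
    Y = rng.random((k, rank))

    def restricted_step(Z, grad, L, r_n):
        # Inexact sub-problem solve: one gradient step of length <= r_n,
        # followed by projection onto the nonnegative orthant.
        step = -grad / max(L, 1e-12)
        nrm = np.linalg.norm(step)
        if nrm > r_n:
            step *= r_n / nrm          # diminishing-radius restriction
        return np.maximum(Z + step, 0.0)

    for n in range(1, n_iter + 1):
        r_n = 1.0 / np.sqrt(n)         # diminishing search radius
        # X-block update (Y held fixed): convex sub-problem in X
        X = restricted_step(X, (X @ Y.T - A) @ Y,
                            np.linalg.norm(Y.T @ Y, 2), r_n)
        # Y-block update (X held fixed): convex sub-problem in Y
        Y = restricted_step(Y, (X @ Y.T - A).T @ X,
                            np.linalg.norm(X.T @ X, 2), r_n)
    return X, Y
```

Note the design choice: because early radii are small, the first iterates cannot move far from the initialization, which is exactly the stabilization mechanism the abstract credits for the worst-case rate guarantee.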

Has companion code repository: https://github.com/HanbaekLyu/BCD-DR

