Low-complexity subspace-descent over symmetric positive definite manifold


arXiv: 2305.02041 · MaRDI QID: Q6435159

Author name not available

Publication date: 3 May 2023

Abstract: This work puts forth low-complexity Riemannian subspace descent algorithms for the minimization of functions over the symmetric positive definite (SPD) manifold. Unlike existing Riemannian gradient descent variants, the proposed approach utilizes carefully chosen subspaces that allow the update to be written as the product of the Cholesky factor of the iterate and a sparse matrix. The resulting updates avoid costly matrix operations such as matrix exponentiation and dense matrix multiplication, which are required by almost all other Riemannian optimization algorithms on the SPD manifold. We further identify a broad class of functions, arising in diverse applications such as kernel matrix learning, covariance estimation of Gaussian distributions, maximum-likelihood parameter estimation of elliptically contoured distributions, and parameter estimation in Gaussian mixture models, over which the Riemannian gradients can be calculated efficiently. The proposed uni-directional and multi-directional Riemannian subspace descent variants incur per-iteration complexities of $\mathcal{O}(n)$ and $\mathcal{O}(n^2)$, respectively, compared to the $\mathcal{O}(n^3)$ or higher complexity incurred by all existing Riemannian gradient descent variants. The superior runtime and low per-iteration complexity of the proposed algorithms are also demonstrated via numerical tests on large-scale covariance estimation problems.
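
To give a flavor of the construction, the sketch below (Python, illustrative only) shows a plausible uni-directional step of the kind the abstract describes, under stated assumptions: the iterate $X = LL^T$ is stored through its Cholesky factor $L$, the descent direction is restricted to the one-dimensional subspace spanned by $L e_i e_i^T L^T$, and the manifold carries the affine-invariant metric. Under these assumptions the geodesic update collapses to rescaling a single column of $L$, with no matrix exponential or dense multiplication. The function name, step-size rule, and direction schedule here are hypothetical, not taken from the paper; the authors' actual implementation is in the companion repository linked below.

    import numpy as np

    def unidirectional_spd_step(L, egrad, i, eta=0.05):
        """Illustrative uni-directional subspace-descent step on the SPD manifold.

        The iterate X = L @ L.T is represented by its Cholesky factor L.
        Restricting descent to the direction xi = L e_i e_i^T L^T (unit norm
        under the affine-invariant metric), the geodesic update
        X <- L expm(-eta * g_i * e_i e_i^T) L^T collapses to scaling column i
        of L by exp(-eta * g_i / 2), and L remains a valid Cholesky factor.
        """
        col = L[:, i]
        # Directional derivative of f along xi: g_i = (L e_i)^T G (L e_i), with
        # G the Euclidean gradient. Formed naively here; the O(n) cost quoted
        # in the abstract presumes this component is available cheaply.
        g_i = col @ (egrad @ col)
        L_new = L.copy()
        L_new[:, i] = col * np.exp(-eta * g_i / 2.0)
        return L_new

    # Toy usage on the Gaussian maximum-likelihood covariance objective
    # f(X) = tr(S X^{-1}) + log det X, minimized at X = S. The Euclidean
    # gradient -X^{-1} S X^{-1} + X^{-1} is formed naively (O(n^3)) just to
    # exercise the step; the paper's point is avoiding such costs in the update.
    rng = np.random.default_rng(0)
    n = 5
    A = rng.standard_normal((n, n))
    S = A @ A.T + n * np.eye(n)        # a well-conditioned SPD target
    L = np.linalg.cholesky(np.eye(n))  # start from the identity
    for k in range(500):
        X = L @ L.T
        Xinv = np.linalg.inv(X)
        egrad = -Xinv @ S @ Xinv + Xinv
        L = unidirectional_spd_step(L, egrad, i=k % n)

The design point being illustrated: congruence by $L^{-1}$ maps the chosen direction to $e_i e_i^T$, so the matrix exponential in the geodesic degenerates to a scalar exponential. The $\mathcal{O}(n)$ per-iteration figure quoted in the abstract additionally requires that the gradient component $g_i$ itself be computable cheaply, which the paper establishes for the listed function classes.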




Has companion code repository: https://github.com/yogeshd-iitk/subspace_descent_over_SPD_manifold
