Denise: Deep Robust Principal Component Analysis for Positive Semidefinite Matrices

From MaRDI portal
Publication: 6339603

arXiv: 2004.13612
MaRDI QID: Q6339603

Author name not available

Publication date: 28 April 2020

Abstract: The robust PCA of covariance matrices plays an essential role in isolating key explanatory features. The currently available methods for performing such a low-rank plus sparse decomposition are matrix-specific, meaning the algorithm must be re-run for every new matrix. Since these algorithms are computationally expensive, it is preferable to learn and store a function that nearly instantaneously performs this decomposition when evaluated. Therefore, we introduce Denise, a deep learning-based algorithm for robust PCA of covariance matrices, or more generally, of symmetric positive semidefinite matrices, which learns precisely such a function. Theoretical guarantees for Denise are provided. These include a novel universal approximation theorem adapted to our geometric deep learning problem and convergence to an optimal solution of the learning problem. Our experiments show that Denise matches state-of-the-art performance in terms of decomposition quality, while being approximately 2000 times faster than the state-of-the-art method, principal component pursuit (PCP), and 200 times faster than the current speed-optimized method, fast PCP.
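For reference, the decomposition in question writes a symmetric positive semidefinite matrix M as M = L + S with L low-rank (and PSD) and S sparse; the principal component pursuit baseline mentioned in the abstract solves the standard convex surrogate

    \min_{L,S} \; \|L\|_{*} + \lambda \|S\|_{1} \quad \text{subject to} \quad L + S = M,

where \|L\|_{*} is the nuclear norm and \|S\|_{1} the entrywise l1 norm (this formulation is standard in the robust PCA literature, not quoted from the abstract). Denise instead learns a single function that maps M directly to such a decomposition, so no per-matrix optimization is needed at evaluation time.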




Has companion code repository: https://github.com/DeepRPCA/Denise
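The repository above contains the authors' implementation. As a rough, hypothetical sketch of the idea described in the abstract (training a network that maps a PSD matrix M to a factor U so that L = U Uᵀ is low-rank PSD and S = M − L is sparse), one could write something like the following; the architecture, loss, and synthetic training data here are illustrative assumptions, not the paper's actual setup:

```python
# Hypothetical sketch (not the authors' code): learn a map M -> U such that
# L = U @ U^T is PSD with rank <= k and S = M - L is encouraged to be sparse.
import torch
import torch.nn as nn

N, K = 20, 3  # illustrative matrix size and target rank


class DecompositionNet(nn.Module):
    def __init__(self, n: int, k: int):
        super().__init__()
        m = n * (n + 1) // 2  # size of the upper triangle of an n x n matrix
        self.net = nn.Sequential(nn.Linear(m, 128), nn.ReLU(), nn.Linear(128, n * k))
        self.n, self.k = n, k

    def forward(self, M: torch.Tensor) -> torch.Tensor:
        iu = torch.triu_indices(self.n, self.n)
        x = M[:, iu[0], iu[1]]                    # flatten the upper triangle
        U = self.net(x).view(-1, self.n, self.k)  # predicted factor U
        return U @ U.transpose(1, 2)              # L = U U^T: PSD, rank <= k


def random_psd_batch(batch: int, n: int, k: int) -> torch.Tensor:
    """Synthetic low-rank-plus-sparse PSD training matrices (illustrative only)."""
    U = torch.randn(batch, n, k)
    sparse = torch.diag_embed(torch.rand(batch, n))  # simple sparse PSD part
    return U @ U.transpose(1, 2) + sparse


model = DecompositionNet(N, K)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(200):
    M = random_psd_batch(64, N, K)
    L = model(M)
    loss = (M - L).abs().mean()  # L1 loss drives the residual S = M - L toward sparsity
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Once such a network is trained, a new matrix is decomposed in a single forward pass, which is the source of the speedup claimed in the abstract.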








