Manifold Filter-Combine Networks

Publication: 6443054

arXiv: 2307.04056 · MaRDI QID: Q6443054

Joyce Chew, Michael Perlmutter, Smita Krishnaswamy, Deanna Needell, Edward De Brouwer

Publication date: 8 July 2023

Abstract: We introduce a large class of manifold neural networks (MNNs) which we call Manifold Filter-Combine Networks. This class includes, as special cases, the MNNs considered in previous work by Wang, Ruiz, and Ribeiro; the manifold scattering transform (a wavelet-based model of neural networks); and other interesting examples not previously considered in the literature, such as the manifold equivalent of Kipf and Welling's graph convolutional network. We then consider a method, based on building a data-driven graph, for implementing such networks when one does not have global knowledge of the manifold but merely has access to finitely many sample points. We provide sufficient conditions for the network to provably converge to its continuum limit as the number of sample points tends to infinity. Unlike previous work (which focused on specific MNN architectures and graph constructions), our rate of convergence does not explicitly depend on the number of filters used. Moreover, it exhibits linear dependence on the depth of the network rather than the exponential dependence obtained previously.
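
The abstract describes filter-combine layers implemented through a data-driven graph built from finitely many sample points. Below is a minimal NumPy sketch of that general idea under illustrative assumptions: a dense Gaussian-kernel graph Laplacian with bandwidth `eps`, a low-degree polynomial spectral filter with coefficients `thetas`, and a random channel-combine matrix `combine_W`. None of these choices are taken from the paper or the companion repository; the sketch only shows the filter-then-combine pattern applied to points sampled from a manifold.

```python
# Minimal sketch of one "filter-combine" layer built from sampled points.
# Illustrative only: the kernel graph, polynomial filter, and random combine
# matrix are assumptions made for this example, not the authors' implementation.
import numpy as np
from scipy.spatial.distance import cdist

def build_graph_laplacian(points, eps=0.5):
    """Dense Gaussian-kernel graph Laplacian from finitely many sample points."""
    d2 = cdist(points, points, metric="sqeuclidean")
    W = np.exp(-d2 / eps)                      # kernel affinities
    np.fill_diagonal(W, 0.0)
    deg = W.sum(axis=1)
    L = np.diag(deg) - W                       # unnormalized graph Laplacian
    return L / eps                             # crude rescaling toward the continuum operator

def filter_combine_layer(F, L, thetas, combine_W):
    """Filter each input channel with a polynomial in L, then mix channels."""
    n, c_in = F.shape
    filtered = np.zeros_like(F)
    for j in range(c_in):
        x = F[:, j]
        out = thetas[0] * x
        Lx = x
        for k in range(1, len(thetas)):        # polynomial filter: sum_k theta_k L^k x
            Lx = L @ Lx
            out = out + thetas[k] * Lx
        filtered[:, j] = out
    H = filtered @ combine_W                   # "combine": mix the filtered channels
    return np.maximum(H, 0.0)                  # pointwise nonlinearity (ReLU)

# Toy usage: points sampled from a circle, a 1-dimensional manifold in R^2.
rng = np.random.default_rng(0)
angles = rng.uniform(0, 2 * np.pi, size=200)
X = np.stack([np.cos(angles), np.sin(angles)], axis=1)
L = build_graph_laplacian(X, eps=0.1)
F0 = X.copy()                                  # use the coordinates as 2 input channels
thetas = [1.0, -0.5, 0.1]                      # illustrative filter coefficients
W1 = rng.standard_normal((2, 4)) / np.sqrt(2)  # illustrative combine weights
F1 = filter_combine_layer(F0, L, thetas, W1)
print(F1.shape)                                # (200, 4)
```

As the number of sample points grows (and the bandwidth shrinks appropriately), such graph-based layers are the discrete objects whose convergence to the continuum manifold network the paper analyzes.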

Has companion code repository: https://github.com/dj408/mfcn

