Simplicial Attention Networks
MaRDI QID: Q6396975
arXiv: 2204.09455
Author name not available
Publication date: 20 April 2022
Abstract: Graph representation learning methods have mostly been limited to the modelling of node-wise interactions. Recently, there has been an increased interest in understanding how higher-order structures can be utilised to further enhance the learning abilities of graph neural networks (GNNs) in combinatorial spaces. Simplicial Neural Networks (SNNs) naturally model these interactions by performing message passing on simplicial complexes, higher-dimensional generalisations of graphs. Nonetheless, the computations performed by most existing SNNs are strictly tied to the combinatorial structure of the complex. Leveraging the success of attention mechanisms in structured domains, we propose Simplicial Attention Networks (SAT), a new type of simplicial network that dynamically weighs the interactions between neighbouring simplices and can readily adapt to novel structures. Additionally, we propose a signed attention mechanism that makes SAT orientation equivariant, a desirable property for models operating on (co)chain complexes. We demonstrate that SAT outperforms existing convolutional SNNs and GNNs on two image and trajectory classification tasks.
Has companion code repository: https://github.com/ggoh29/simplicial-neural-network-benchmark
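The abstract describes two ingredients: attention weights computed between neighbouring simplices, and a signed variant of those weights that makes the layer orientation equivariant. The following is a minimal sketch of how such a layer could look, assuming PyTorch. The class name SimplicialAttention, the choice of the lower adjacency B_p^T B_p as the neighbourhood, and the way the orientation sign enters the attention are illustrative assumptions for exposition, not the implementation in the companion repository.

```python
# Minimal sketch of attention over a simplicial complex (assumes PyTorch).
# All names and design choices here are illustrative, not the authors' code.

import torch
import torch.nn as nn
import torch.nn.functional as F


class SimplicialAttention(nn.Module):
    """GAT-style attention among p-simplices that share a (p-1)-face.

    The signed variant multiplies each attention coefficient by the relative
    orientation sign read off the boundary structure, a simple stand-in for
    the orientation-aware mechanism the abstract describes.
    """

    def __init__(self, in_dim: int, out_dim: int, signed: bool = False):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)
        self.a_src = nn.Linear(out_dim, 1, bias=False)
        self.a_dst = nn.Linear(out_dim, 1, bias=False)
        self.signed = signed

    def forward(self, x: torch.Tensor, boundary: torch.Tensor) -> torch.Tensor:
        # x:        (n_p, in_dim)      features on the p-simplices
        # boundary: (n_{p-1}, n_p)     signed incidence matrix B_p
        h = self.W(x)
        # Lower adjacency: two p-simplices are neighbours iff they share a face.
        adj = boundary.t() @ boundary                  # (n_p, n_p), signed
        mask = adj.abs() > 0
        # Pairwise logits e_ij = a_src(h_i) + a_dst(h_j), via broadcasting.
        logits = self.a_src(h) + self.a_dst(h).t()     # (n_p, n_p)
        logits = F.leaky_relu(logits, negative_slope=0.2)
        logits = logits.masked_fill(~mask, float("-inf"))
        alpha = torch.softmax(logits, dim=-1)
        alpha = torch.nan_to_num(alpha)                # rows with no neighbours
        if self.signed:
            # Weigh each neighbour's contribution by its relative orientation.
            alpha = alpha * torch.sign(adj)
        return alpha @ h


if __name__ == "__main__":
    # Toy complex: one triangle with oriented edges (0,1), (0,2), (1,2).
    # B_1 maps edges to vertices with +/-1 according to the orientation.
    B1 = torch.tensor([[-1.0, -1.0,  0.0],
                       [ 1.0,  0.0, -1.0],
                       [ 0.0,  1.0,  1.0]])
    edge_feats = torch.randn(3, 4)
    layer = SimplicialAttention(in_dim=4, out_dim=8, signed=True)
    print(layer(edge_feats, B1).shape)  # torch.Size([3, 8])
```

In this sketch the attention coefficients depend only on the learned features, so the layer can be applied to a complex with a different combinatorial structure without retraining, which is the adaptivity the abstract emphasises; the signed multiplication is one simple way to make the output flip consistently when an edge's orientation is reversed.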