Oversquashing in GNNs through the lens of information contraction and graph expansion

arXiv: 2208.03471
MaRDI QID: Q6407140

Author name not available

Publication date: 6 August 2022

Abstract: The quality of signal propagation in message-passing graph neural networks (GNNs) strongly influences their expressivity, as has been observed in recent works. In particular, for prediction tasks relying on long-range interactions, recursive aggregation of node features can lead to an undesired phenomenon called "oversquashing". We present a framework for analyzing oversquashing based on information contraction. Our analysis is guided by a model of reliable computation due to von Neumann that lends new insight into oversquashing as signal quenching in noisy computation graphs. Building on this, we propose a graph rewiring algorithm aimed at alleviating oversquashing. Our algorithm employs a random local edge flip primitive motivated by an expander graph construction. We compare the spectral expansion properties of our algorithm with those of an existing curvature-based non-local rewiring strategy. Synthetic experiments show that, although our algorithm generally expands the graph more slowly, it is computationally cheaper overall, preserves node degrees exactly, and never disconnects the graph.
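The abstract leaves "information contraction" undefined; a standard way to make the notion precise, and plausibly the one intended here, is through contraction coefficients of strong data processing inequalities (SDPIs). For a noisy channel P_{Y|X} and any Markov chain U -> X -> Y,

\[
I(U;Y) \;\le\; \eta\!\left(P_{Y\mid X}\right) I(U;X),
\qquad
\eta\!\left(P_{Y\mid X}\right) \;=\; \sup_{P_{UX}\,:\, I(U;X)>0} \frac{I(U;Y)}{I(U;X)} \;\le\; 1,
\]

with \eta < 1 for strictly noisy channels (for a binary symmetric channel with flip probability \delta, \eta = (1-2\delta)^2). Contraction coefficients are submultiplicative under channel composition, so along a depth-\ell path in a noisy computation graph the retained information decays at least as fast as \prod_{i=1}^{\ell}\eta_i: this exponential attenuation is the "signal quenching" reading of oversquashing in the abstract.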
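The random local edge flip primitive itself is not spelled out in the abstract. The sketch below is a plausible minimal implementation in Python (using networkx) of a degree- and connectivity-preserving flip in the style of Mahlmann and Schindelhauer's expander construction, consistent with the abstract's claims; the names random_local_edge_flip and rewire, the attempt cap, and the demo parameters are illustrative assumptions, not the authors' code. See the companion repository linked below for the actual implementation.

import random

import networkx as nx
import numpy as np


def random_local_edge_flip(G, rng):
    """Attempt one local edge flip; return True on success.

    Pick a random edge (u, v), a neighbour x of u and a neighbour y of v,
    then swap the edges (u, x), (v, y) for (u, y), (v, x). Every vertex
    keeps its degree, and the untouched edge (u, v) reroutes any path that
    used (u, x) or (v, y), so the graph can never be disconnected.
    """
    u, v = rng.choice(list(G.edges()))
    x_cands = [x for x in G[u] if x != v]
    y_cands = [y for y in G[v] if y != u]
    if not x_cands or not y_cands:
        return False
    x, y = rng.choice(x_cands), rng.choice(y_cands)
    # Reject flips that would create a self-loop or a parallel edge.
    if x == y or G.has_edge(u, y) or G.has_edge(v, x):
        return False
    G.remove_edges_from([(u, x), (v, y)])
    G.add_edges_from([(u, y), (v, x)])
    return True


def rewire(G, n_flips, seed=0, max_attempts=100_000):
    """Return a rewired copy of G after n_flips successful flips."""
    H = G.copy()
    rng = random.Random(seed)
    done = 0
    for _ in range(max_attempts):
        if done == n_flips:
            break
        done += random_local_edge_flip(H, rng)
    return H


# Demo: rewiring a ring-like 3-regular graph (a poor expander) should
# typically enlarge its spectral gap, i.e. the second-smallest eigenvalue
# of the normalized Laplacian, a standard proxy for expansion.
G = nx.circulant_graph(50, [1, 25])
H = rewire(G, n_flips=300, seed=1)
gap = lambda g: np.sort(nx.normalized_laplacian_spectrum(g))[1]
print(f"spectral gap: {gap(G):.3f} -> {gap(H):.3f}")
assert nx.is_connected(H)                    # never disconnects the graph
assert dict(G.degree()) == dict(H.degree())  # degrees preserved exactly

Rejected flips are simply retried, which is why the primitive is cheap: each attempt touches only the two endpoints of one edge and their neighbourhoods, with no global computation.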

Has companion code repository: https://github.com/kedar2/oversquashing
