Abstract
We present Gradient Gating (G2), a novel framework for improving the performance of Graph Neural Networks (GNNs). Our framework is based on gating the output of GNN layers with a mechanism for multi-rate flow of message passing information across nodes of the underlying graph. Local gradients are harnessed to further modulate message passing updates. Our framework flexibly allows one to use any basic GNN layer as a wrapper around which the multi-rate gradient gating mechanism is built. We rigorously prove that G2 alleviates the oversmoothing problem and allows the design of deep GNNs. Empirical results are presented to demonstrate that the proposed framework achieves state-of-the-art performance on a variety of graph learning tasks, including on large-scale heterophilic graphs.
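The abstract only sketches the mechanism, so the following minimal PyTorch sketch shows one plausible reading of a gradient-gated update: a per-node, per-channel gate tau is built from local gradients (feature differences across edges) of an auxiliary gating signal and used to form a convex combination of the old features and a base GNN layer's output. All names here (mean_conv, gradient_gated_step, W_base, W_gate, the exponent p) are illustrative assumptions, not taken from the paper or its code.

import torch

def mean_conv(X, edge_index, W):
    # Toy message-passing layer standing in for "any basic GNN layer":
    # mean-aggregate neighbour features, then apply a linear map.
    src, dst = edge_index
    agg = torch.zeros_like(X).index_add_(0, dst, X[src])
    deg = torch.zeros(X.size(0), 1).index_add_(0, dst, torch.ones(src.size(0), 1))
    return (agg / deg.clamp(min=1.0)) @ W

def gradient_gated_step(X, edge_index, W_base, W_gate, p=2.0):
    # One gradient-gated update: X' = (1 - tau) * X + tau * F(X),
    # with tau driven by local gradients of an auxiliary gating signal.
    src, dst = edge_index
    H = torch.sigmoid(mean_conv(X, edge_index, W_gate))  # gating signal per node/channel
    diff = (H[src] - H[dst]).abs().pow(p)                # local gradient along each edge
    agg = torch.zeros_like(H).index_add_(0, dst, diff)   # aggregate over neighbourhoods
    tau = torch.tanh(agg)                                # per-node, per-channel rate in [0, 1)
    return (1.0 - tau) * X + tau * mean_conv(X, edge_index, W_base)

# Usage on a small random graph (5 nodes, 8 channels, a directed cycle):
X = torch.randn(5, 8)
edge_index = torch.tensor([[0, 1, 2, 3, 4], [1, 2, 3, 4, 0]])
W_base, W_gate = torch.randn(8, 8) * 0.1, torch.randn(8, 8) * 0.1
X = gradient_gated_step(X, edge_index, W_base, W_gate)

In this reading, channels whose local gradients vanish receive tau close to 0 and stop updating, so node features need not converge to a common value as depth grows, which is consistent with the abstract's claim that the gating alleviates oversmoothing.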
Publication status
published
Journal / series
SAM Research Report
Publisher
Seminar for Applied Mathematics, ETH Zurich
Organisational unit
03851 - Mishra, Siddhartha
Funding
770880 - Computation and analysis of statistical solutions of fluid flow (EC)
ETH Bibliography
yes