Gradient Gating for Deep Multi-Rate Learning on Graphs


METADATA ONLY
Date

2022-10

Publication Type

Report

ETH Bibliography

yes

Abstract

We present Gradient Gating (G2), a novel framework for improving the performance of Graph Neural Networks (GNNs). Our framework is based on gating the output of GNN layers with a mechanism for multi-rate flow of message passing information across nodes of the underlying graph. Local gradients are harnessed to further modulate message passing updates. Our framework flexibly allows one to use any basic GNN layer as a wrapper around which the multi-rate gradient gating mechanism is built. We rigorously prove that G2 alleviates the oversmoothing problem and allows the design of deep GNNs. Empirical results are presented to demonstrate that the proposed framework achieves state-of-the-art performance on a variety of graph learning tasks, including on large-scale heterophilic graphs.
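The gated update described in the abstract can be sketched in a few lines. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the wrapped GNN layer is taken to be simple mean aggregation, the gate is modelled as tanh of p-th-power local feature differences (the paper's exact gating function may differ), and the names `W_gate` and `W_msg` are hypothetical.

```python
import numpy as np

def g2_layer(X, A, W_gate, W_msg, p=2.0):
    """One sketched Gradient Gating (G2) step.

    X      : (n, d) node features
    A      : (n, n) binary adjacency matrix
    W_gate : (d, d) weights for the gate branch (hypothetical)
    W_msg  : (d, d) weights for the message branch (hypothetical)
    p      : exponent applied to local graph-gradient magnitudes
    """
    deg = A.sum(axis=1, keepdims=True).clip(min=1.0)

    # Wrapped "basic GNN layer": mean aggregation over neighbours,
    # used once for the gate branch and once for the update branch.
    Y_hat = np.tanh((A @ X) / deg @ W_gate)
    Y_msg = np.tanh((A @ X) / deg @ W_msg)

    # Local graph gradient: per channel, sum |Y_hat_j - Y_hat_i|^p
    # over the neighbours j of each node i.
    n = X.shape[0]
    grad_p = np.zeros_like(X)
    for i in range(n):
        nbrs = np.nonzero(A[i])[0]
        grad_p[i] = (np.abs(Y_hat[nbrs] - Y_hat[i]) ** p).sum(axis=0)

    # Multi-rate gate: one learning rate per node and channel in [0, 1).
    tau = np.tanh(grad_p)

    # Gated update: channels whose local gradients vanish stop changing.
    X_next = (1.0 - tau) * X + tau * Y_msg
    return X_next, tau
```

Note the behaviour this sketch shares with the paper's oversmoothing claim: if a node's features already agree with its neighbours', the local gradient is zero, the gate closes, and the node stops updating, so features cannot keep collapsing toward a constant as depth grows.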

Publication status

published

Volume

2022-41

Publisher

Seminar for Applied Mathematics, ETH Zurich

Organisational unit

03851 - Mishra, Siddhartha / Mishra, Siddhartha

Funding

770880 - Computation and analysis of statistical solutions of fluid flow (EC)
