

Poster in Affinity Workshop: WiML Workshop 1

Two Sides of the Same Coin: Heterophily and Oversmoothing in Graph Convolutional Neural Networks

Yujun Yan · Milad Hashemi · Kevin Swersky · Yaoqing Yang · Danai Koutra


Abstract:

In node classification tasks, graph convolutional neural networks (GCNs) may perform poorly on graphs where neighboring nodes have different features or classes (heterophily) and when many layers are stacked (oversmoothing). These two seemingly unrelated problems have been studied largely independently, but recent empirical evidence suggests that solving one problem may benefit the other. In this work, going beyond empirical observations, we aim to: (1) propose a unified theoretical perspective for analyzing the heterophily and oversmoothing problems, (2) identify the common causes of the two problems based on our theoretical insights, and (3) propose simple yet effective strategies that address those common causes.
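To make the notion of heterophily concrete, a commonly used measure is the edge-homophily ratio: the fraction of edges whose endpoints share a class label, with low values indicating heterophily. The sketch below is illustrative only; the function and variable names are hypothetical and not taken from the paper.

```python
# Minimal sketch (assumed names, not from the paper): the edge-homophily
# ratio, i.e., the fraction of edges whose endpoints share a class label.
# Low values indicate a heterophilous graph.

def edge_homophily(edges, labels):
    """edges: iterable of (u, v) pairs; labels: sequence mapping node -> class."""
    same = sum(1 for u, v in edges if labels[u] == labels[v])
    return same / len(edges)

# Example: a 4-node path graph where only 1 of 3 edges joins same-class nodes.
edges = [(0, 1), (1, 2), (2, 3)]
labels = [0, 1, 1, 0]
print(edge_homophily(edges, labels))  # ~0.33 -> fairly heterophilous
```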

In our theoretical analysis, we view the changes in node representations across layers as their "movements" in the latent space, and we prove that, under certain conditions, movement of nodes towards the other class leads to a non-decreasing misclassification rate of node labels. We show that the common causes of the heterophily and oversmoothing problems are the factors that trigger nodes to move towards the other class, namely: the relative degree of a node (compared to its neighbors) and the heterophily level of a node's neighborhood. Under certain conditions, our analysis suggests that: (1) nodes with high heterophily have a higher misclassification rate; and (2) even with low heterophily, degree disparity between nodes can influence their movement dynamics and result in a pseudo-heterophily situation, which helps to explain oversmoothing. Based on our insights, we propose simple modifications to the GCN architecture---i.e., degree corrections and signed messages---which alleviate the causes of these issues.
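As a rough illustration of the two proposed modifications, the sketch below shows a GCN-style layer that (i) rescales each message by a degree-correction factor based on the receiver/sender degree ratio, and (ii) allows messages from dissimilar neighbors to enter with a negative sign. The specific parameterization (the alpha scalar, the cosine-sign rule, the dense adjacency) is an assumption for illustration, not the authors' published architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DegreeCorrectedSignedGCNLayer(nn.Module):
    """Sketch of the two ideas from the abstract: degree corrections and
    signed messages. Parameterization here is assumed, not the paper's."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        # Hypothetical scalar controlling the strength of the degree correction.
        self.alpha = nn.Parameter(torch.tensor(0.5))

    def forward(self, x, adj, deg):
        # x: [N, in_dim] node features; adj: [N, N] dense 0/1 adjacency;
        # deg: [N] node degrees.
        h = self.lin(x)
        # Degree correction: down-weight messages from much-higher-degree
        # senders via the receiver/sender degree ratio, clamped at 1.
        ratio = deg.unsqueeze(1) / deg.clamp(min=1).unsqueeze(0)  # [N, N]
        corr = torch.clamp(self.alpha * ratio, max=1.0)
        # Signed messages: sign of cosine similarity between endpoint states,
        # so dissimilar (likely other-class) neighbors repel rather than attract.
        hn = F.normalize(h, dim=1)
        sign = torch.sign(hn @ hn.t())
        weights = adj * corr * sign
        return h + weights @ h  # residual aggregation

# Toy usage on a random 5-node graph.
N, d = 5, 8
adj = (torch.rand(N, N) > 0.5).float()
adj = torch.triu(adj, 1); adj = adj + adj.t()
deg = adj.sum(1)
layer = DegreeCorrectedSignedGCNLayer(d, d)
out = layer(torch.randn(N, d), adj, deg)
print(out.shape)  # torch.Size([5, 8])
```

Intuitively, both terms act on the causes identified above: the degree correction limits how strongly high-degree neighbors can drag a low-degree node's representation, and the signed messages let heterophilous neighbors push a node away from the other class instead of towards it.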
