Poster
Adaptive Diffusion in Graph Neural Networks
Jialin Zhao · Yuxiao Dong · Ming Ding · Evgeny Kharlamov · Jie Tang

Thu Dec 09 04:30 PM -- 06:00 PM (PST)

The success of graph neural networks (GNNs) largely relies on the process of aggregating information from neighbors defined by the input graph structures. Message passing based GNNs, e.g., graph convolutional networks, leverage the immediate neighbors of each node during the aggregation process, and recently graph diffusion convolution (GDC) was proposed to expand the propagation neighborhood by leveraging generalized graph diffusion. However, the neighborhood size in GDC is manually tuned for each graph by conducting a grid search over the validation set, which limits its generalization in practice. To address this issue, we propose the adaptive diffusion convolution (ADC) strategy to automatically learn the optimal neighborhood size from the data. Furthermore, we break the conventional assumption that all GNN layers and feature channels (dimensions) should use the same neighborhood for propagation. We design strategies that enable ADC to learn a dedicated propagation neighborhood for each GNN layer and each feature channel, making the GNN architecture fully coupled with the graph structure---the unique property that differentiates GNNs from traditional neural networks. By directly plugging ADC into existing GNNs, we observe consistent and significant improvements over both GDC and the vanilla versions across various datasets, demonstrating the increased model capacity gained by automatically learning a unique neighborhood size per layer and per channel.
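Generalized graph diffusion computes a propagation matrix $S = \sum_k \theta_k T^k$ from a transition matrix $T$; one common instantiation is the heat kernel, $\theta_k = e^{-t} t^k / k!$, where a single diffusion time $t$ controls the effective neighborhood size. A minimal NumPy sketch of a truncated heat-kernel diffusion matrix is below (the function name, truncation order `K`, and row-stochastic transition matrix are illustrative assumptions, not the authors' implementation; in ADC, $t$ would be a learnable parameter optimized by gradient descent rather than grid-searched):

```python
import numpy as np

def heat_kernel_diffusion(T, t, K=10):
    """Truncated heat-kernel diffusion: S = sum_{k=0}^{K} e^{-t} t^k / k! * T^k.

    T : (n, n) transition matrix (e.g., row-normalized adjacency).
    t : diffusion time; larger t spreads weight to farther neighbors.
    K : truncation order of the series (illustrative choice).
    """
    n = T.shape[0]
    S = np.zeros_like(T)
    Tk = np.eye(n)          # running power T^k, starting at T^0 = I
    coeff = np.exp(-t)      # running coefficient e^{-t} t^k / k!
    for k in range(K + 1):
        S += coeff * Tk
        Tk = Tk @ T
        coeff *= t / (k + 1)
    return S
```

With `t = 0` the diffusion collapses to the identity (each node sees only itself); increasing `t` shifts mass toward higher powers of `T`, i.e., a larger propagation neighborhood, which is exactly the quantity ADC learns per layer and per channel instead of fixing it globally.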

Author Information

Jialin Zhao (Tsinghua University)
Yuxiao Dong (Microsoft)
Ming Ding (Tsinghua University)
Evgeny Kharlamov (University of Oslo, Norway)
Jie Tang (Tsinghua University)
