
Poster

Recurrent Temporal Revision Graph Networks

Yizhou Chen · Anxiang Zeng · Qingtao Yu · Kerui Zhang · Cao Yuanpeng · Kangle Wu · Guangda Huzhang · Han Yu · Zhiming Zhou

Great Hall & Hall B1+B2 (level 1) #625

Abstract:

Temporal graphs model many real-world scenarios more accurately than static graphs. However, neighbor aggregation, a critical building block of graph networks, is currently extended to temporal graphs in a straightforward way from its static-graph counterpart. Involving all historical neighbors in such aggregation can be computationally expensive, so in practice typically only a subset of the most recent neighbors is involved. Such subsampling, however, leads to incomplete and biased neighbor information. To address this limitation, we propose a novel framework for temporal neighbor aggregation that uses a recurrent neural network with node-wise hidden states to integrate information from all historical neighbors of each node, thereby acquiring complete neighbor information. We demonstrate the superior theoretical expressiveness of the proposed framework as well as its state-of-the-art performance in real-world applications. Notably, it achieves a significant +9.4% improvement in average precision on a real-world e-commerce dataset over existing methods with 2-layer models.
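The core idea, revising a node-wise hidden state with a recurrent unit as each neighbor event arrives so that the state summarizes all historical neighbors rather than a sampled subset, can be illustrated with a minimal sketch. The sketch below is not the authors' implementation: it assumes a PyTorch setting, a GRU cell as the recurrent unit, and hypothetical names (RecurrentNeighborAggregator, update, readout) chosen only for illustration.

```python
import torch
import torch.nn as nn


class RecurrentNeighborAggregator(nn.Module):
    """Illustrative sketch: one hidden state per node, revised by a GRU cell
    every time a new neighbor event (edge features + timestamp) arrives,
    instead of re-aggregating a sampled subset of recent neighbors."""

    def __init__(self, num_nodes: int, node_dim: int, edge_dim: int, hidden_dim: int):
        super().__init__()
        # Node-wise hidden states, updated in a streaming fashion.
        self.register_buffer("hidden", torch.zeros(num_nodes, hidden_dim))
        # Timestamp of each node's last revision, used to form a time-gap feature.
        self.register_buffer("last_update", torch.zeros(num_nodes))
        # Input to the recurrent cell: neighbor embedding + edge features + time gap.
        self.cell = nn.GRUCell(node_dim + edge_dim + 1, hidden_dim)

    @torch.no_grad()  # gradient flow through the streaming state is omitted in this sketch
    def update(self, src, nbr_emb, edge_feat, t):
        """src: LongTensor of node ids receiving an event; nbr_emb: embeddings of the
        interacting neighbors; edge_feat: edge features; t: event timestamps."""
        dt = (t - self.last_update[src]).unsqueeze(-1)        # time since last revision
        msg = torch.cat([nbr_emb, edge_feat, dt], dim=-1)     # per-event message
        self.hidden[src] = self.cell(msg, self.hidden[src])   # revise node-wise state
        self.last_update[src] = t

    def readout(self, nodes):
        """Return the integrated summary of all historical neighbors for the given nodes."""
        return self.hidden[nodes]


# Usage (hypothetical shapes): stream events and read the complete-neighbor summary.
agg = RecurrentNeighborAggregator(num_nodes=1000, node_dim=32, edge_dim=16, hidden_dim=64)
src = torch.tensor([3, 7])
agg.update(src, torch.randn(2, 32), torch.randn(2, 16), torch.tensor([5.0, 6.0]))
summary = agg.readout(src)  # shape: (2, 64)
```

Because the per-node state is updated once per event, the cost of aggregation no longer grows with the number of historical neighbors, which is what allows the complete neighborhood to be retained without subsampling.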
