Poster
Scalable Graph Neural Networks via Bidirectional Propagation
Ming Chen · Zhewei Wei · Bolin Ding · Yaliang Li · Ye Yuan · Xiaoyong Du · Ji-Rong Wen

Thu Dec 10 09:00 PM -- 11:00 PM (PST) @ Poster Session 6 #1839

Graph Neural Networks (GNNs) are an emerging approach for learning on non-Euclidean data. Recently, there has been increased interest in designing GNNs that scale to large graphs. Most existing methods use "graph sampling" or "layer-wise sampling" techniques to reduce training time; however, these methods still suffer from degraded performance and scalability problems when applied to graphs with billions of edges. In this paper, we present GBP, a scalable GNN that utilizes a localized bidirectional propagation process from both the feature vectors and the training/testing nodes. Theoretical analysis shows that GBP is the first method to achieve sub-linear time complexity for both the precomputation and the training phases. An extensive empirical study demonstrates that GBP achieves state-of-the-art performance with significantly less training/testing time. Most notably, GBP is able to deliver superior performance on a graph with over 60 million nodes and 1.8 billion edges in less than 2,000 seconds on a single machine.
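The abstract describes a decoupled precompute-then-train design: propagated node features are computed once, and a classifier is then trained on them. As a rough illustration only (not the paper's code), the sketch below computes an exact generalized-PageRank-style propagation of the kind that GBP approximates with its localized bidirectional scheme; the function name, the parameters (L, w, r), and the use of NumPy/SciPy are assumptions for illustration.

import numpy as np
import scipy.sparse as sp

def precompute_features(adj, X, L=4, w=None, r=0.5):
    """Illustrative (exact, non-localized) propagation:
    P = sum_{l=0..L} w_l * (D^{-r} A D^{-(1-r)})^l X.
    GBP approximates this sum with a localized bidirectional procedure
    instead of materializing the full matrix products shown here."""
    if w is None:
        w = np.full(L + 1, 1.0 / (L + 1))      # uniform level weights (assumed)
    deg = np.asarray(adj.sum(axis=1)).ravel().astype(float)
    deg[deg == 0] = 1.0                        # guard against isolated nodes
    D_left = sp.diags(deg ** -r)               # D^{-r}
    D_right = sp.diags(deg ** -(1.0 - r))      # D^{-(1-r)}
    T = D_left @ adj @ D_right                 # normalized adjacency matrix
    P, Z = w[0] * X, X
    for l in range(1, L + 1):
        Z = T @ Z                              # one propagation step
        P = P + w[l] * Z                       # accumulate weighted levels
    return P                                   # features fed to a downstream classifier

In such a decoupled setup, training only needs the precomputed rows for the labeled nodes, which is consistent with the abstract's claim that the training phase does not have to touch the full billion-edge graph.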

Author Information

Ming Chen (Renmin University of China)
Zhewei Wei (Renmin University of China)
Bolin Ding ("Data Analytics and Intelligence Lab, Alibaba Group")
Yaliang Li (Alibaba Group)
Ye Yuan (Beijing Institute of Technology)
Xiaoyong Du (Renmin University of China)
Ji-Rong Wen (Renmin University of China)
