Federated Link Prediction on Dynamic Graphs
Yuhang Yao · Xinyi (Cynthia) Fan · Ryan Rossi · Sungchul Kim · Handong Zhao · Tong Yu · Carlee Joe-Wong
Abstract
Link prediction on dynamic, large-scale graphs is widely used in applications such as forecasting customer visits or predicting purchases. However, graph data is often localized due to privacy and efficiency concerns. While federated learning (FL) enables collaborative training without sharing raw data, vanilla FL on full historical graphs incurs prohibitive computational costs, and training only on recent data reduces overhead but harms accuracy and introduces data imbalance across clients. We introduce FedLink, a federated graph training framework for link prediction on dynamic graphs. By continuously training on fixed-size buffers of client data, FedLink significantly reduces computational overhead compared to training on the entire historical graph, while still learning a global model across regions. Experiments demonstrate that FedLink matches the accuracy of a centrally trained model while requiring 3.41$\times$ less memory and running 28.9\% faster than full-batch federated graph training. \footnote{Code and data will be made publicly available upon acceptance.}
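The core idea of buffer-based federated training can be sketched as follows. This is a minimal illustration, not the paper's implementation: the buffer size, the toy one-parameter "model", and the client/server interfaces are all hypothetical stand-ins. Each client keeps only a fixed-size FIFO buffer of recent graph snapshots (bounding memory and per-round compute), performs a local update on the buffered data, and the server aggregates client weights by simple averaging (FedAvg-style).

```python
from collections import deque
from typing import List


class Client:
    """Hypothetical client that keeps a fixed-size buffer of recent snapshots.

    A deque with maxlen evicts the oldest snapshot on overflow, so memory and
    local training cost stay constant regardless of how long the graph stream runs.
    """

    def __init__(self, buffer_size: int = 3):
        self.buffer: deque = deque(maxlen=buffer_size)

    def observe(self, snapshot: List[float]) -> None:
        # Appending to a full deque silently drops the oldest snapshot.
        self.buffer.append(snapshot)

    def local_update(self, global_w: float, lr: float = 0.1) -> float:
        # Toy stand-in for local training: nudge the scalar weight toward
        # the mean of the buffered data (a real system would run GNN training).
        data = [x for snap in self.buffer for x in snap]
        target = sum(data) / len(data)
        return global_w + lr * (target - global_w)


def fedavg(client_weights: List[float]) -> float:
    # Server-side aggregation: unweighted average of client updates.
    return sum(client_weights) / len(client_weights)
```

A training round then consists of each client calling `local_update` on the current global weight, followed by the server calling `fedavg` on the returned weights; only model parameters cross the network, never raw graph data.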