

Poster in Workshop: Workshop on Federated Learning in the Age of Foundation Models in Conjunction with NeurIPS 2023 (FL@FM-NeurIPS'23)

FedML-HE: An Efficient Homomorphic-Encryption-Based Privacy-Preserving Federated Learning System

Weizhao Jin · Yuhang Yao · Shanshan Han · Carlee Joe-Wong · Srivatsan Ravi · Salman Avestimehr · Chaoyang He

Keywords: [ Federated Learning; Homomorphic Encryption; Privacy ]


Abstract:

Federated Learning (FL) trains machine learning models on distributed devices by aggregating local model updates instead of local data. However, privacy concerns arise because the local models aggregated on the server may reveal sensitive personal information through inversion attacks. Privacy-preserving methods such as Homomorphic Encryption (HE) thus become necessary for FL training. Despite HE's post-quantum security guarantees, its applications suffer from impractical overheads, especially for foundation models. In this paper, we present FedML-HE, the first practical federated learning system with efficient HE-based secure model aggregation. FedML-HE selectively encrypts sensitive parameters, significantly reducing both computation and communication overheads during training while providing customizable privacy preservation. Our optimized system achieves considerable overhead reductions, particularly for large foundation models (e.g., ~10x for HE-federated training of ResNet-50 and ~40x for BERT), demonstrating the potential for scalable HE-based FL deployment.
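To make the selective-encryption idea concrete, below is a minimal sketch of one aggregation round using the TenSEAL CKKS library. The mask-selection heuristic (top-magnitude), the function names, and the 10% encryption ratio are illustrative assumptions for this sketch, not the paper's actual sensitivity analysis or API.

```python
import numpy as np
import tenseal as ts  # pip install tenseal

# CKKS context. In this toy single-process demo one context (holding the
# secret key) is shared; a real deployment would give the server only a
# public copy, e.g. via ctx.make_context_public().
ctx = ts.context(ts.SCHEME_TYPE.CKKS,
                 poly_modulus_degree=8192,
                 coeff_mod_bit_sizes=[60, 40, 40, 60])
ctx.global_scale = 2 ** 40
ctx.generate_galois_keys()

def make_mask(params: np.ndarray, ratio: float) -> np.ndarray:
    """Hypothetical sensitivity mask: flag the top `ratio` fraction of
    parameters by magnitude. FedML-HE derives its mask from a privacy
    sensitivity analysis; magnitude is only a placeholder here."""
    k = max(1, int(ratio * params.size))
    idx = np.argsort(-np.abs(params))[:k]
    mask = np.zeros(params.size, dtype=bool)
    mask[idx] = True
    return mask

def client_update(params: np.ndarray, mask: np.ndarray):
    """Encrypt only the sensitive slice; the rest stays in plaintext."""
    enc = ts.ckks_vector(ctx, params[mask].tolist())
    return enc, params[~mask]

def server_aggregate(updates, weights):
    """FedAvg-style weighted sum: homomorphic on the encrypted slice,
    ordinary arithmetic on the plaintext slice."""
    enc_sum = updates[0][0] * weights[0]
    plain_sum = updates[0][1] * weights[0]
    for (enc, plain), w in zip(updates[1:], weights[1:]):
        enc_sum += enc * w
        plain_sum += plain * w
    return enc_sum, plain_sum

# Toy round: two clients, 10% of parameters encrypted.
p1, p2 = np.random.randn(1000), np.random.randn(1000)
mask = make_mask(p1, ratio=0.1)
agg_enc, agg_plain = server_aggregate(
    [client_update(p1, mask), client_update(p2, mask)], [0.5, 0.5])

# Clients decrypt the sensitive slice and reassemble the global model.
merged = np.empty(1000)
merged[mask] = np.array(agg_enc.decrypt())
merged[~mask] = agg_plain
```

Because ciphertext size and HE compute scale with the number of encrypted values, shrinking the encrypted slice to a small fraction of the model is what drives the overhead reductions the abstract reports, while the most privacy-sensitive parameters still never leave the client in plaintext.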
