Oral Poster
DapperFL: Domain Adaptive Federated Learning with Model Fusion Pruning for Edge Devices
Yongzhe Jia · Xuyun Zhang · Hongsheng Hu · Kim-Kwang Raymond Choo · Lianyong Qi · Xiaolong Xu · Amin Beheshti · Wanchun Dou
West Ballroom A-D #6307
Fri 13 Dec 3:30 p.m. PST — 4:30 p.m. PST
Federated learning (FL) has emerged as a prominent machine learning paradigm in edge computing environments, enabling edge devices to collaboratively optimize a global model without sharing their private data. However, existing FL frameworks suffer from efficacy deterioration due to the system heterogeneity inherent in edge computing, especially in the presence of domain shifts across local data. In this paper, we propose DapperFL, a heterogeneous FL framework that enhances model performance across multiple domains. In DapperFL, we introduce a dedicated Model Fusion Pruning (MFP) module that produces personalized, compact local models for clients, addressing the challenge of system heterogeneity. The MFP module prunes local models using fused knowledge obtained from both the local domain and the remaining domains, ensuring robustness to domain shifts. Additionally, we design a Domain Adaptive Regularization (DAR) module to further improve the overall performance of DapperFL. The DAR module employs regularization generated by the pruned model to learn representations that are robust across domains. Furthermore, we introduce a dedicated aggregation algorithm for combining heterogeneous local models with tailored architectures and weights. We implement DapperFL on a real-world FL platform with heterogeneous clients. Experimental results on benchmark datasets spanning multiple domains demonstrate that DapperFL outperforms several state-of-the-art FL frameworks by up to 2.28%, while achieving significant model volume reductions of 20% to 80%. Our code is available at: https://anonymous.4open.science/r/DapperFL.
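To make the aggregation of heterogeneous pruned models concrete, below is a minimal sketch, assuming a PyTorch setting in which each client's tailored architecture is represented as a binary mask over the global model's parameters. The function name `aggregate_heterogeneous`, the mask dictionaries, and the dataset-size weighting are illustrative assumptions, not DapperFL's published algorithm: each weight entry is averaged only over the clients whose pruned sub-model retains it, and entries kept by no client fall back to the previous global weights.

```python
import torch

def aggregate_heterogeneous(global_state, client_states, client_masks, client_sizes):
    """Hypothetical sketch of aggregating heterogeneous pruned local models.

    global_state:  dict of name -> tensor, previous global weights (assumed float).
    client_states: list of dicts with the same keys (pruned entries are zero).
    client_masks:  list of dicts of name -> {0,1} tensors marking retained entries.
    client_sizes:  list of local dataset sizes used as aggregation weights.
    """
    new_state = {}
    for name, g_param in global_state.items():
        acc = torch.zeros_like(g_param)
        weight = torch.zeros_like(g_param)
        for state, mask, size in zip(client_states, client_masks, client_sizes):
            m = mask[name].to(g_param.dtype)
            acc += state[name] * m * size   # contribute only retained entries
            weight += m * size              # per-entry normalizer
        # Average retained entries; keep old global values where no client kept them.
        new_state[name] = torch.where(weight > 0, acc / weight.clamp(min=1e-12), g_param)
    return new_state
```

In DapperFL itself, the pruned architectures come from the MFP module; this sketch only illustrates the mask-based weighted averaging that aggregating sub-models with differing architectures typically relies on.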