

Poster in Workshop: New Frontiers in Federated Learning: Privacy, Fairness, Robustness, Personalization and Data Ownership

Cronus: Robust and Heterogeneous Collaborative Learning with Black-Box Knowledge Transfer

Hongyan Chang · Virat Shejwalkar · Reza Shokri · Amir Houmansadr


Abstract:

The majority of existing collaborative learning algorithms exchange local model parameters between collaborating clients via a central server. Unfortunately, this approach has many known security and privacy weaknesses, primarily because of the high dimensionality of the updates involved; furthermore, it is limited to models with homogeneous architectures. The high dimensionality of the updates makes these approaches more susceptible to poisoning attacks and exposes the local models to inference attacks. Based on this intuition, we propose Cronus, which uses knowledge transfer via model outputs to exchange information between clients. We show that this significantly reduces the dimensionality of the clients' updates and, therefore, improves the robustness of the server's aggregation algorithm. Our extensive evaluations demonstrate that Cronus outperforms state-of-the-art robust federated learning algorithms. Furthermore, we show that treating local models as black boxes significantly reduces information leakage. Finally, Cronus also allows collaboration between models with heterogeneous architectures.
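The exchange described above can be pictured with a minimal sketch, not the paper's implementation: the function names `cronus_round` and `median_aggregate`, the `predict_proba` interface on client models, and the choice of coordinate-wise median as the robust aggregator are all illustrative assumptions; the paper's actual aggregation rule is not specified in this abstract.

```python
import numpy as np

def cronus_round(client_models, public_x, aggregate):
    """One hypothetical Cronus-style exchange round.

    Each client shares only its soft predictions on a shared public
    (unlabeled) dataset; the server robustly aggregates those
    low-dimensional outputs instead of model parameters.
    """
    # Clients are treated as black boxes: only their predicted
    # probabilities on the public data are revealed (assumed interface).
    client_preds = [m.predict_proba(public_x) for m in client_models]
    # The server aggregates the per-example predictions with a robust estimator.
    return aggregate(np.stack(client_preds))

def median_aggregate(preds):
    # Coordinate-wise median over clients: a simple robust aggregator used
    # here only as a stand-in for the paper's aggregation rule.
    return np.median(preds, axis=0)

# Each client would then fine-tune locally on (public_x, aggregated soft labels)
# via knowledge distillation, which is why heterogeneous architectures can
# participate: no parameter shapes ever need to match.
```

Because the exchanged quantity is a probability vector per public example rather than a full parameter vector, its dimensionality is far smaller, which is the source of the claimed robustness and privacy benefits.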
