

Invited Talk in Workshop: Federated Learning: Recent Advances and New Challenges

Scalable and Communication-Efficient Vertical Federated Learning

Stacy Patterson


Abstract:

Vertical Federated Learning (VFL) algorithms are an important class of federated learning algorithms in which parties’ local datasets share a common sample ID space but have different feature sets. This is in contrast to Horizontal Federated Learning (HFL), where parties share the same feature sets but for different sample IDs. While much work has been done to advance the efficiency and flexibility of HFL, these techniques do not directly extend to VFL due to differences in the model architecture and training paradigm. In this talk, I will present two methods for efficient and robust VFL. The first, Compressed VFL, reduces communication cost through message compression while achieving the same asymptotic convergence rate as standard VFL with no compression. The second, Flex-VFL, extends VFL to support heterogeneous parties that may use different local optimizers and may operate at different rates. I will highlight some interesting theoretical and experimental results for each method, and finally, I will present some directions and open questions for future work in VFL.
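The data-partitioning distinction above, and the idea of compressing parties' intermediate messages, can be illustrated with a minimal sketch. This is not the speaker's code; the toy dataset, the uniform quantizer, and all names here are illustrative assumptions, not the Compressed VFL algorithm itself.

```python
# Illustrative sketch (not the speaker's implementation): vertical vs.
# horizontal data partitioning, plus a toy uniform quantizer of the kind
# used to compress a party's intermediate message in VFL.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))  # 6 samples (shared IDs), 4 features in total

# Vertical split: both parties hold all sample IDs, but different features.
party_a_vfl, party_b_vfl = X[:, :2], X[:, 2:]

# Horizontal split: both parties hold all features, but different sample IDs.
party_a_hfl, party_b_hfl = X[:3, :], X[3:, :]

def quantize(msg, bits=4):
    """Uniformly quantize a message to 2**bits levels, then dequantize."""
    lo, hi = msg.min(), msg.max()
    levels = 2**bits - 1
    codes = np.round((msg - lo) / (hi - lo) * levels)  # integer codes sent
    return codes * (hi - lo) / levels + lo             # receiver's estimate

# Party A computes a local embedding and sends a compressed version of it.
embedding = party_a_vfl @ rng.normal(size=(2, 3))
compressed = quantize(embedding, bits=4)
print(party_a_vfl.shape, party_b_vfl.shape)  # → (6, 2) (6, 2)
```

With more bits the quantization error shrinks, which is the trade-off Compressed VFL analyzes: coarser messages cost less bandwidth per round while, per the abstract, preserving the asymptotic convergence rate.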
