

Poster

Federated Transformer: Scalable Vertical Federated Learning on Practical Fuzzily Linked Data

Zhaomin Wu · Junyi Hou · Yiqun Diao · Bingsheng He

East Exhibit Hall A-C #4100
Fri 13 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

Federated Learning (FL) is an evolving paradigm that enables multiple parties to collaboratively train models without sharing raw data. Vertical Federated Learning (VFL), in which multiple parties contribute distinct features of a shared group of instances, is prevalent in real-world, cross-organizational collaborations. In such setups, parties are typically linked by fuzzy identifiers, a common practical scenario termed multi-party fuzzy VFL. Existing models generally address either multi-party VFL or fuzzy VFL between two parties; extending them to practical multi-party fuzzy VFL typically results in significant performance degradation and increased costs for maintaining privacy. To overcome these limitations, we introduce the Federated Transformer (FeT), a novel framework that supports multi-party VFL with fuzzy identifiers. FeT encodes identifiers into data representations and trains a transformer architecture distributed across the parties, incorporating three new techniques to enhance performance. We also develop a scalable privacy framework that integrates differential privacy with secure multi-party computation, protecting local representations at manageable cost. Experiments show that FeT surpasses baseline models by up to 46 percentage points when scaled to 50 parties. Moreover, FeT outperforms state-of-the-art models in two-party fuzzy VFL settings while offering improved privacy.
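To make the described architecture concrete, below is a minimal, hypothetical PyTorch sketch of the core idea as the abstract presents it: each party fuses its fuzzy identifier (e.g., a low-dimensional key) into its local feature representation, clips and noises that representation before sharing (an illustrative differential-privacy step; the paper's actual framework also involves secure multi-party computation, which is omitted here), and the labeled party aggregates the received representations with cross-party attention. All class names, dimensions, and the noise parameter are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class PartyEncoder(nn.Module):
    """Hypothetical per-party module: encodes a fuzzy identifier into the
    local feature representation, positional-encoding style."""
    def __init__(self, feat_dim, key_dim, hidden_dim):
        super().__init__()
        self.feat_proj = nn.Linear(feat_dim, hidden_dim)
        self.key_proj = nn.Linear(key_dim, hidden_dim)

    def forward(self, features, key):
        # Sum the feature encoding and the identifier encoding.
        return self.feat_proj(features) + self.key_proj(key)

def add_dp_noise(rep, noise_std):
    """Illustrative DP step: clip each representation to unit L2 norm,
    then add Gaussian noise before it leaves the party."""
    rep = rep / rep.norm(dim=-1, keepdim=True).clamp(min=1.0)
    return rep + noise_std * torch.randn_like(rep)

class ActivePartyHead(nn.Module):
    """Hypothetical aggregator on the labeled party: attends over the
    noised representations received from all parties."""
    def __init__(self, hidden_dim, n_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_dim, n_heads, batch_first=True)
        self.query = nn.Parameter(torch.randn(1, 1, hidden_dim))
        self.classifier = nn.Linear(hidden_dim, 2)

    def forward(self, party_reps):
        # party_reps: (batch, n_parties, hidden_dim)
        q = self.query.expand(party_reps.size(0), -1, -1)
        fused, _ = self.attn(q, party_reps, party_reps)
        return self.classifier(fused.squeeze(1))

# Toy usage: 3 parties, batch of 8, 16 features each, 2-d fuzzy keys.
batch, n_parties, feat_dim, key_dim, hidden = 8, 3, 16, 2, 32
encoders = [PartyEncoder(feat_dim, key_dim, hidden) for _ in range(n_parties)]
reps = torch.stack(
    [add_dp_noise(enc(torch.randn(batch, feat_dim), torch.randn(batch, key_dim)),
                  noise_std=0.1)
     for enc in encoders],
    dim=1,
)
logits = ActivePartyHead(hidden)(reps)  # shape: (8, 2)
```

This sketch only illustrates the data flow implied by the abstract (identifier encoding, protected local representations, cross-party aggregation); the paper's three performance-enhancing techniques and its scalable privacy framework are not reproduced here.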
