

Poster in Affinity Workshop: WiML Workshop 1

Privacy-Preserving Federated Multi-Task Linear Regression: A One-shot Linear Mixing Approach Inspired by Graph Regularization

Harlin Lee


Abstract:

We investigate multi-task learning (MTL), where multiple learning tasks are performed jointly rather than separately to leverage their similarities and improve performance. We focus on the federated multi-task linear regression setting, where each machine possesses its own data for individual tasks and sharing the full local data between machines is prohibited. Motivated by graph regularization, we propose a novel fusion framework that only requires a one-shot communication of local estimates. Our method linearly combines the local estimates to produce an improved estimate for each task, and we show that the ideal mixing weight for fusion is a function of task similarity and task difficulty. A practical algorithm is developed and shown to significantly reduce mean squared error (MSE) on synthetic data, as well as improve performance on a real-world income prediction task.
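The one-shot fusion idea above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's algorithm: the mixing weight `alpha` is a hypothetical fixed constant, whereas the proposed method derives the ideal weight from task similarity and task difficulty. Each simulated machine fits an ordinary least-squares estimate on its own data, then the estimates are linearly combined after a single round of communication.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two related linear regression tasks held on separate machines.
# The true coefficient vectors are similar but not identical.
d = 5
beta_true = [rng.normal(size=d)]
beta_true.append(beta_true[0] + 0.1 * rng.normal(size=d))  # a nearby task

# Each machine fits an OLS estimate from its local data only;
# raw data is never shared, only the resulting estimates.
local_estimates = []
for beta in beta_true:
    n = 30
    X = rng.normal(size=(n, d))
    y = X @ beta + 0.5 * rng.normal(size=n)
    local_estimates.append(np.linalg.lstsq(X, y, rcond=None)[0])

# One-shot fusion: each task's improved estimate is a linear (here convex)
# combination of all local estimates. alpha = 0.7 is a placeholder weight,
# chosen for illustration only.
alpha = 0.7
fused = [
    alpha * local_estimates[0] + (1 - alpha) * local_estimates[1],
    alpha * local_estimates[1] + (1 - alpha) * local_estimates[0],
]

for t in range(2):
    mse_local = np.sum((local_estimates[t] - beta_true[t]) ** 2)
    mse_fused = np.sum((fused[t] - beta_true[t]) ** 2)
    print(f"task {t}: local MSE {mse_local:.3f}, fused MSE {mse_fused:.3f}")
```

When the tasks are close (small gap between the true coefficient vectors) and the local data is noisy, borrowing strength from the neighboring task's estimate tends to lower the error, which is the intuition behind making the weight depend on task similarity and difficulty.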
