Clustered Multi-Task Learning: A Convex Formulation
Laurent Jacob · Francis Bach · Jean-Philippe Vert

Wed Dec 10 03:26 PM -- 03:27 PM (PST)

In multi-task learning, several related tasks are considered simultaneously, with the hope that by an appropriate sharing of information across tasks, each task may benefit from the others. In the context of learning linear functions for supervised classification or regression, this can be achieved by including a priori information about the weight vectors associated with the tasks and how they are expected to be related to each other. In this paper, we assume that tasks are clustered into groups, which are unknown beforehand, and that tasks within a group have similar weight vectors. We design a new spectral norm that encodes this a priori assumption without prior knowledge of the partition of tasks into groups, resulting in a new convex optimization formulation for multi-task learning. We show, in simulations on synthetic examples and on the IEDB MHC-I binding dataset, that our approach outperforms well-known convex methods for multi-task learning, as well as related non-convex methods dedicated to the same problem.
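To make the clustering prior concrete, here is a minimal NumPy sketch of the assumption the abstract describes: tasks within a group have similar weight vectors, so penalizing the within-cluster variance of the weights lets related tasks share information. This sketch assumes the clusters are *known*, unlike the paper, whose contribution is a convex spectral penalty that avoids knowing the partition; the synthetic data, penalty weights, and gradient-descent solver below are illustrative choices, not the paper's benchmark or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup (illustrative): 4 tasks in 2 clusters; tasks in a
# cluster share a cluster center plus a small task-specific deviation.
d, n = 10, 8  # few samples per task, so sharing across tasks should help
clusters = [[0, 1], [2, 3]]
tasks, true_w = [], []
for members in clusters:
    center = rng.normal(size=d)
    for _ in members:
        w = center + 0.1 * rng.normal(size=d)
        X = rng.normal(size=(n, d))
        y = X @ w + 0.1 * rng.normal(size=n)
        tasks.append((X, y))
        true_w.append(w)

def fit(tasks, clusters, lam_ridge=0.1, lam_cluster=0.0, steps=3000, lr=0.05):
    """Gradient descent on the sum of per-task squared losses plus a ridge
    term and a within-cluster variance penalty
    (lam_cluster / 2) * sum_c sum_{t in c} ||w_t - mean_c||^2."""
    T = len(tasks)
    W = np.zeros((T, d))
    for _ in range(steps):
        G = np.zeros_like(W)
        for t, (X, y) in enumerate(tasks):
            G[t] = X.T @ (X @ W[t] - y) / len(y) + lam_ridge * W[t]
        for members in clusters:
            mean_c = W[members].mean(axis=0)
            # exact gradient of the variance penalty: pull toward the mean
            G[members] += lam_cluster * (W[members] - mean_c)
        W -= lr * G
    return W

W_indep = fit(tasks, clusters, lam_cluster=0.0)  # independent ridge per task
W_clust = fit(tasks, clusters, lam_cluster=1.0)  # clustered sharing
err_indep = np.mean([np.linalg.norm(W_indep[t] - true_w[t]) for t in range(4)])
err_clust = np.mean([np.linalg.norm(W_clust[t] - true_w[t]) for t in range(4)])
```

With fewer samples per task than dimensions, each independent ridge problem is underdetermined, while the cluster penalty effectively pools samples within a group, so `err_clust` comes out below `err_indep` on this toy data. The convex relaxation in the paper replaces this known-partition penalty with a spectral norm on the weight matrix, so the grouping need not be specified in advance.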

Author Information

Laurent Jacob (UC Berkeley)
Francis Bach (INRIA - Ecole Normale Superieure)
Jean-Philippe Vert (Owkin / PSL University)
