Workshop
Sat Dec 12, 05:30 AM – 03:30 PM (PST) @ 514 bc
Transfer and Multi-Task Learning: Trends and New Perspectives
Anastasia Pentina · Christoph Lampert · Sinno Jialin Pan · Mingsheng Long · Judy Hoffman · Baochen Sun · Kate Saenko

This workshop aims to bring together researchers and practitioners from machine learning, computer vision, natural language processing, and related fields to discuss and document recent advances in transfer and multi-task learning. This includes the core topics of transfer and multi-task learning, closely related variants such as domain adaptation and dataset bias, and new discoveries and directions in deep-learning-based approaches.

Transfer and multi-task learning methods aim to better exploit the available data during training and adapt previously learned knowledge to new domains or tasks. This mitigates the burden of human labeling for emerging applications and enables learning from very few labeled examples.

In recent years there has been increasing activity in these areas, driven largely by practical applications (e.g. object recognition, sentiment analysis) and by state-of-the-art deep learning frameworks (e.g. CNNs). Most recently proposed solutions, however, lack theoretical justification, especially the deep-learning-based approaches. Conversely, most of the theoretically justified approaches are rarely used in practice.

This NIPS 2015 workshop will focus on closing the gap between theory and practice by providing an opportunity for researchers and practitioners to come together, share ideas, and debate current theories and empirical results. The goal is to promote a fruitful exchange of ideas across different communities and thereby advance the field as a whole.

Tentative topics:
New perspectives or theories on transfer and multi-task learning
Dataset bias and concept drift
Domain adaptation
Multi-task learning
Zero-shot or one-shot learning
Feature-based approaches
Instance-based approaches
Deep architectures for transfer and multi-task learning
Transferability of deep representations
Transfer across different architectures, e.g. CNN to RNN
Transfer across different modalities, e.g. image to text
Transfer across different tasks, e.g. recognition and detection
Transfer from weakly labeled or noisy data, e.g. Web data
Transfer in practical settings, e.g. online, active, and large-scale learning
Innovative applications, e.g. machine translation, computational biology
Datasets, benchmarks, and open-source packages

Learning Representations for Unsupervised and Transfer Learning (Talk)
Intro and Adapting Deep Networks Across Domains, Modalities, and Tasks (Talk)
Learning Shared Representations in MDPs (Talk)
On Weight Ratio Estimation for Covariate Shift (Talk)
The Benefit of Multitask Representation Learning (Talk)
A Theory of Multiple Source Adaptation (Talk)
Domain Adaptation for Binary Classification (Talk)
Shai Ben-David (Talk)
Multitask Generalized Eigenvalue Program (Talk)
Actor-Mimic (Talk)
Sharing the "How" (and not the "What") (Talk)
Transitive Transfer Learning (Talk)