Workshop
Transfer and Multi-Task Learning: Trends and New Perspectives
Anastasia Pentina · Christoph Lampert · Sinno Jialin Pan · Mingsheng Long · Judy Hoffman · Baochen Sun · Kate Saenko
Room 514 BC
This workshop aims to bring together researchers and practitioners from machine learning, computer vision, natural language processing and related fields to discuss and document recent advances in transfer and multi-task learning. This includes the core topics of transfer and multi-task learning, related variants such as domain adaptation and dataset bias, and new discoveries and directions in deep learning-based approaches.
Transfer and multi-task learning methods aim to better exploit the available data during training and adapt previously learned knowledge to new domains or tasks. This mitigates the burden of human labeling for emerging applications and enables learning from very few labeled examples.
In recent years there has been increasing activity in these areas, driven mainly by practical applications (e.g., object recognition, sentiment analysis) and by state-of-the-art deep learning frameworks (e.g., CNNs). Most recently proposed solutions lack theoretical justification, especially the deep learning-based approaches. Conversely, most of the existing theoretically justified approaches are rarely used in practice.
This NIPS 2015 workshop will focus on closing the gap between theory and practice by providing an opportunity for researchers and practitioners to come together, share ideas, and debate current theories and empirical results. The goal is to promote a fruitful exchange of ideas across different communities, leading to global advancement of the field.
Tentative topics:
New perspectives or theories on transfer and multi-task learning
Dataset bias and concept drift
Domain adaptation
Multi-task learning
Zero-shot or one-shot learning
Feature-based approaches
Instance-based approaches
Deep architectures for transfer and multi-task learning
Transferability of deep representations
Transfer across different architectures, e.g. CNN to RNN
Transfer across different modalities, e.g. image to text
Transfer across different tasks, e.g. recognition and detection
Transfer from weakly labeled or noisy data, e.g. Web data
Transfer in practical settings, e.g. online, active, and large-scale learning
Innovative applications, e.g. machine translation, computational biology
Datasets, benchmarks, and open-source packages
Schedule
Fri 8:40 a.m. - 9:20 a.m. | Learning Representations for Unsupervised and Transfer Learning (Talk) | Yoshua Bengio
Sat 5:50 a.m. - 6:30 a.m. | Intro and Adapting Deep Networks Across Domains, Modalities, and Tasks (Talk) | Trevor Darrell
Sat 6:00 a.m. - 6:25 a.m. | Learning Shared Representations in MDPs (Talk) | Diana Borsa
Sat 6:05 a.m. - 6:20 a.m. | On Weight Ratio Estimation for Covariate Shift (Talk) | Ruth Urner
Sat 6:30 a.m. - 7:00 a.m. | The Benefit of Multitask Representation Learning (Talk) | Massimiliano Pontil
Sat 7:30 a.m. - 8:00 a.m. | A Theory of Multiple Source Adaptation (Talk) | Mehryar Mohri
Sat 11:30 a.m. - 12:00 p.m. | Domain Adaptation for Binary Classification (Talk) | Shai Ben-David
Sat 12:00 p.m. - 12:30 p.m. | Multitask Generalized Eigenvalue Program (Talk) | Boyu Wang
Sat 12:30 p.m. - 1:00 p.m. | Actor-Mimic (Talk) | Emilio Parisotto
Sat 2:00 p.m. - 2:30 p.m. | Sharing the "How" (and not the "What") (Talk) | Percy Liang
Sat 2:30 p.m. - 3:00 p.m. | Transitive Transfer Learning (Talk) | Qiang Yang