Few-Shot Out-of-Domain Transfer of Natural Language Explanations
Yordan Yordanov · Vid Kocijan · Thomas Lukasiewicz · Oana-Maria Camburu
Event URL: https://openreview.net/forum?id=g9PUonwGk2M

Recently, there has been increasing interest in models that generate natural language explanations (NLEs) for their decisions. However, training a model to explain its decisions in natural language requires acquiring task-specific NLEs, which is time- and resource-consuming. A potential solution is the out-of-domain transfer of NLEs, where explainability is transferred from a domain with rich data to a domain with scarce data via few-shot transfer learning. In this work, we introduce and compare four approaches for few-shot transfer learning for NLEs. We transfer explainability from the natural language inference domain, where a large dataset of human-written NLEs already exists, to the domains of hard cases of pronoun resolution and commonsense validation. Our results demonstrate that few-shot transfer far outperforms both zero-shot transfer and single-task training with few examples. We also investigate the scalability of the few-shot transfer of explanations, in terms of both training data and model size.

Author Information

Yordan Yordanov (University of Oxford)
Vid Kocijan (University of Oxford)
Thomas Lukasiewicz (University of Oxford)
Oana-Maria Camburu (University of Oxford)
