While neural conversation models have shown great potential for generating informative and engaging responses by introducing external knowledge, learning such a model often requires knowledge-grounded dialogues that are difficult to obtain. To overcome the data challenge and reduce the cost of building a knowledge-grounded dialogue system, we explore the problem under a zero-resource setting by assuming that no context-knowledge-response triples are needed for training. To this end, we propose representing both the knowledge that bridges a context and a response and the way that knowledge is expressed as latent variables, and we devise a variational approach that can effectively estimate a generation model from independent dialogue corpora and knowledge corpora. Evaluation results on three benchmarks of knowledge-grounded dialogue generation indicate that our model achieves performance comparable to state-of-the-art methods that rely on knowledge-grounded dialogues for training, and exhibits good generalization ability across different datasets.
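As a rough illustration of the kind of latent-variable training objective the abstract describes, the sketch below implements a generic conditional variational objective (ELBO) with a latent knowledge variable z: a prior conditioned on the context only, an approximate posterior conditioned on context and response, and a decoder that reconstructs the response from context and z. The module names, Gaussian latent, mean-pooled encoders, and bag-of-words decoder are illustrative assumptions for a minimal runnable example, not the authors' actual architecture or objective.

```python
# Minimal sketch of a latent-knowledge variational objective (negative ELBO).
# All architectural choices here are simplifying assumptions, not the paper's model.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentKnowledgeDialogueModel(nn.Module):
    def __init__(self, vocab_size=1000, hidden=128, latent=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        # Prior p(z | context): latent knowledge inferred from the context alone.
        self.prior = nn.Linear(hidden, 2 * latent)
        # Approximate posterior q(z | context, response), used only during training.
        self.posterior = nn.Linear(2 * hidden, 2 * latent)
        # Decoder p(response | context, z), here a simple bag-of-words predictor.
        self.decoder = nn.Linear(hidden + latent, vocab_size)

    def encode(self, tokens):
        # Mean-pooled embeddings as a stand-in for a real sequence encoder.
        return self.embed(tokens).mean(dim=1)

    def forward(self, context, response):
        c, r = self.encode(context), self.encode(response)
        prior_mu, prior_logvar = self.prior(c).chunk(2, dim=-1)
        post_mu, post_logvar = self.posterior(torch.cat([c, r], dim=-1)).chunk(2, dim=-1)
        # Reparameterized sample of the latent knowledge variable z.
        z = post_mu + torch.randn_like(post_mu) * (0.5 * post_logvar).exp()
        logits = self.decoder(torch.cat([c, z], dim=-1))
        # Reconstruction term: likelihood of response tokens under the decoder.
        recon = F.cross_entropy(
            logits.unsqueeze(1).expand(-1, response.size(1), -1).reshape(-1, logits.size(-1)),
            response.reshape(-1),
        )
        # KL(q(z|c,r) || p(z|c)) keeps the posterior close to the context-only prior.
        kl = 0.5 * (
            prior_logvar - post_logvar
            + (post_logvar.exp() + (post_mu - prior_mu) ** 2) / prior_logvar.exp()
            - 1.0
        ).sum(dim=-1).mean()
        return recon + kl  # negative ELBO to minimize

# Toy usage with random token ids.
model = LatentKnowledgeDialogueModel()
context = torch.randint(0, 1000, (4, 12))
response = torch.randint(0, 1000, (4, 10))
loss = model(context, response)
loss.backward()
```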
Author Information
Linxiao Li (Peking University)
Can Xu (Microsoft)
Wei Wu (Meituan-Dianping Group)
Yufan Zhao (Microsoft)
Xueliang Zhao (Peking University)
Chongyang Tao (Microsoft)
More from the Same Authors
- 2022 Poster: Moderate-fitting as a Natural Backdoor Defender for Pre-trained Language Models »
  Biru Zhu · Yujia Qin · Ganqu Cui · Yangyi Chen · Weilin Zhao · Chong Fu · Yangdong Deng · Zhiyuan Liu · Jingang Wang · Wei Wu · Maosong Sun · Ming Gu
- 2021 Poster: Neural Rule-Execution Tracking Machine For Transformer-Based Text Generation »
  Yufei Wang · Can Xu · Huang Hu · Chongyang Tao · Stephen Wan · Mark Dras · Mark Johnson · Daxin Jiang