Poster
Duplex Sequence-to-Sequence Learning for Reversible Machine Translation
Zaixiang Zheng · Hao Zhou · Shujian Huang · Jiajun Chen · Jingjing Xu · Lei Li

Wed Dec 08 12:30 AM -- 02:00 AM (PST)

Sequence-to-sequence learning naturally has two directions, raising the question of how to effectively exploit supervision signals from both. Existing approaches either require two separate models or a single multitask-trained model with inferior performance. In this paper, we propose REDER (Reversible Duplex Transformer), a parameter-efficient model, and apply it to machine translation. Either end of REDER can simultaneously serve as input and output for a distinct language, so REDER enables reversible machine translation by simply flipping the input and output ends. Experiments verify that REDER achieves the first success of reversible machine translation, outperforming its multitask-trained baselines by up to 1.3 BLEU.
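The key enabler of end-flipping is layer reversibility: if every layer's inverse is exact, the same parameters can compute both translation directions. Below is a minimal, hypothetical sketch of a RevNet-style additive-coupling layer of the kind such reversible Transformers build on; the class and sublayer names (ReversibleCoupling, f, g) are illustrative placeholders standing in for attention/feed-forward sublayers, not the authors' released code.

```python
import torch
import torch.nn as nn

class ReversibleCoupling(nn.Module):
    """RevNet-style additive coupling: the inverse is exact, so the
    same parameters run in either direction. Illustrative sketch only."""

    def __init__(self, dim):
        super().__init__()
        # f and g are placeholders for the attention / feed-forward
        # sublayers of a duplex Transformer block (hypothetical).
        self.f = nn.Linear(dim, dim)
        self.g = nn.Linear(dim, dim)

    def forward(self, x1, x2):
        # One translation direction.
        y1 = x1 + self.f(x2)
        y2 = x2 + self.g(y1)
        return y1, y2

    def reverse(self, y1, y2):
        # The opposite direction: exactly inverts forward().
        x2 = y2 - self.g(y1)
        x1 = y1 - self.f(x2)
        return x1, x2

layer = ReversibleCoupling(dim=8)
x1, x2 = torch.randn(4, 8), torch.randn(4, 8)
y1, y2 = layer(x1, x2)
r1, r2 = layer.reverse(y1, y2)
assert torch.allclose(r1, x1, atol=1e-6) and torch.allclose(r2, x2, atol=1e-6)
```

Because the inverse is computed analytically rather than learned, running the model backwards adds no parameters, which is what makes a single reversible model parameter-efficient compared with training two separate direction-specific models.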

Author Information

Zaixiang Zheng (ByteDance AI Lab)
Hao Zhou (Bytedance)
Shujian Huang (Nanjing University)
Jiajun Chen (Nanjing University)
Jingjing Xu (Bytedance)
Lei Li (Toutiao Lab)
