Efficient Multi-lingual Neural Machine Translation
Boxing Chen

To support Alibaba’s globalization, we developed a Multi-lingual Neural Machine Translation (MNMT) system that translates between 214 languages with a single model. The main challenges of MNMT include model capacity, zero-shot translation, inference speed, and energy cost. We therefore carried out several studies to make training, inference, and energy consumption more efficient while maintaining competitive translation quality. These include: 1. a new language-aware, interlingua-based MNMT architecture; 2. improving zero-shot translation via joint training with denoising autoencoding; 3. speeding up decoding with a shallow decoder, shared decoder attention weights, and shortlist prediction; 4. a new energy-efficient attention mechanism that replaces multiplication operations with binarized selection and addition operations.
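The fourth point can be illustrated with a minimal sketch. The abstract does not spell out the exact mechanism, so the PyTorch snippet below only assumes one plausible reading: attention weights are binarized by keeping the top-k scored keys per query, so the value vectors are aggregated by selection and addition rather than a multiplicative weighted average. The function name `binarized_selective_attention` and the `top_k` parameter are hypothetical, not taken from the work itself.

```python
import torch

def binarized_selective_attention(q, k, v, top_k=8):
    """Illustrative sketch of attention with binarized {0, 1} weights.

    q: (batch, tgt_len, d); k, v: (batch, src_len, d).
    Instead of a softmax-weighted (multiplicative) average over values,
    each query selects its top-k keys and simply sums the chosen values.
    """
    # Score queries against keys: (batch, tgt_len, src_len).
    scores = torch.matmul(q, k.transpose(-2, -1))

    # Binarize: keep the top-k keys per query, weight 1.0, all others 0.0.
    topk_idx = torch.topk(scores, k=min(top_k, scores.size(-1)), dim=-1).indices
    mask = torch.zeros_like(scores).scatter_(-1, topk_idx, 1.0)

    # Aggregate selected values; with {0, 1} weights this reduces to a sum
    # of the chosen vectors, normalized by the number of selections.
    out = torch.matmul(mask, v) / mask.sum(dim=-1, keepdim=True).clamp(min=1.0)
    return out


# Usage example with random tensors.
q = torch.randn(2, 5, 64)
k = torch.randn(2, 20, 64)
v = torch.randn(2, 20, 64)
print(binarized_selective_attention(q, k, v).shape)  # torch.Size([2, 5, 64])
```

In a hardware-oriented implementation, the 0/1 mask would be used to gather and add the selected value vectors directly, avoiding the floating-point multiplications that `matmul` performs here.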

Author Information

Boxing Chen (Alibaba Group)
