We introduce Generative Neural Machine Translation (GNMT), a latent variable architecture designed to model the semantics of the source and target sentences. We modify an encoder-decoder translation model by adding a latent variable as a language-agnostic representation which is encouraged to learn the meaning of the sentence. GNMT achieves competitive BLEU scores on pure translation tasks, and is superior when there are missing words in the source sentence. We augment the model to facilitate multilingual translation and semi-supervised learning without adding parameters. This framework significantly reduces overfitting when there is limited paired data available, and is effective for translating between pairs of languages not seen during training.
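To make the latent-variable formulation concrete, the following is a minimal sketch of the training objective such a model would optimize, assuming a standard variational (ELBO) treatment in which the latent variable generates both sentences; the notation here, with source sentence $x$, target sentence $y$, latent representation $z$, prior $p(z)$, and inference network $q(z \mid x)$, is an illustrative assumption rather than the paper's exact factorization:

$$\log p(x, y) \;\geq\; \mathbb{E}_{q(z \mid x)}\!\left[\log p(x \mid z) + \log p(y \mid x, z)\right] \;-\; \mathrm{KL}\!\left(q(z \mid x) \,\|\, p(z)\right)$$

Under an objective of this form, $z$ must support reconstruction of the source and generation of the target, while the KL term keeps it close to a language-independent prior, which is what encourages $z$ to encode sentence meaning rather than language-specific surface form.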
Author Information
Harshil Shah (University College London)
David Barber (University College London)
More from the Same Authors
- 2021: Adaptive Optimization with Examplewise Gradients
  Julius Kunze · James Townsend · David Barber
- 2018 Poster: Online Structured Laplace Approximations for Overcoming Catastrophic Forgetting
  Hippolyt Ritter · Aleksandar Botev · David Barber
- 2018 Poster: Modular Networks: Learning to Decompose Neural Computation
  Louis Kirsch · Julius Kunze · David Barber
- 2017 Poster: Thinking Fast and Slow with Deep Learning and Tree Search
  Thomas Anthony · Zheng Tian · David Barber
- 2017 Poster: Wider and Deeper, Cheaper and Faster: Tensorized LSTMs for Sequence Learning
  Zhen He · Shaobing Gao · Liang Xiao · Daxue Liu · Hangen He · David Barber