

Poster

Sequence Modeling with Unconstrained Generation Order

Dmitrii Emelianenko · Elena Voita · Pavel Serdyukov

East Exhibition Hall B + C #121

Keywords: [ Stochastic Optim ] [ Applications -> Natural Language Processing; Deep Learning; Deep Learning -> Attention Models; Optimization ] [ Deep Learning ] [ Generative Models ]


Abstract:

The dominant approach to sequence generation is to produce the output in some predefined order, e.g., left to right. In contrast, we propose a more general model that can generate the output sequence by inserting tokens in arbitrary order. Our model learns the decoding order as a result of its training procedure. Our experiments show that this model outperforms fixed-order models on a number of sequence generation tasks, such as Machine Translation, Image-to-LaTeX, and Image Captioning.
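The sketch below illustrates the general idea of arbitrary-order generation described in the abstract: at each step the decoder scores every (insertion slot, token) pair for the current partial output and inserts the best token at the chosen slot, so the output need not grow left to right. The function `insertion_model`, the toy vocabulary, and the greedy stopping rule are hypothetical stand-ins for illustration, not the authors' implementation.

```python
# Minimal sketch of insertion-based decoding with a learned order.
# `insertion_model` is a hypothetical stand-in for a trained network; here it
# just returns deterministic random logits so the loop is runnable.

import numpy as np

VOCAB = ["<eos>", "the", "cat", "sat"]  # toy vocabulary
EOS = 0                                 # index of the stop action


def insertion_model(src, partial):
    """Stand-in for a trained model: returns logits of shape
    (len(partial) + 1, len(VOCAB)) -- one row per insertion slot."""
    rng = np.random.default_rng(len(partial))
    return rng.normal(size=(len(partial) + 1, len(VOCAB)))


def decode(src, max_steps=10):
    """Greedy arbitrary-order decoding: pick the best (slot, token)
    pair each step and insert the token there; stop on <eos>."""
    partial = []
    for _ in range(max_steps):
        logits = insertion_model(src, partial)
        slot, tok = np.unravel_index(np.argmax(logits), logits.shape)
        if tok == EOS:
            break
        partial.insert(slot, VOCAB[tok])  # insertion point is learned, not fixed
    return partial


print(decode(src="example input"))
```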
