Latent Template Induction with Gumbel-CRFs
Yao Fu · Chuanqi Tan · Bin Bi · Mosha Chen · Yansong Feng · Alexander Rush

Tue Dec 08 09:00 AM -- 11:00 AM (PST) @ Poster Session 1 #61

Learning to control the structure of sentences is a challenging problem in text generation. Existing work either relies on simple deterministic approaches or RL-based hard structures. We explore the use of structured variational autoencoders to infer latent templates for sentence generation using a soft, continuous relaxation in order to utilize reparameterization for training. Specifically, we propose a Gumbel-CRF, a continuous relaxation of the CRF sampling algorithm using a relaxed Forward-Filtering Backward-Sampling (FFBS) approach. As a reparameterized gradient estimator, the Gumbel-CRF gives more stable gradients than score-function based estimators. As a structured inference network, we show that it learns interpretable templates during training, which allows us to control the decoder during testing. We demonstrate the effectiveness of our methods with experiments on data-to-text generation and unsupervised paraphrase generation.
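The core idea in the abstract, relaxing Forward-Filtering Backward-Sampling (FFBS) for a linear-chain CRF by replacing the hard categorical draws in the backward pass with Gumbel-softmax samples, can be illustrated with a short sketch. This is a minimal numpy illustration of the general technique, not the authors' implementation; the function names, the temperature `tau`, and the way the soft sample at step t+1 mixes the transition scores are all simplifying assumptions.

```python
import numpy as np

def logsumexp(x, axis):
    # Numerically stable log-sum-exp along the given axis.
    m = x.max(axis=axis, keepdims=True)
    return (m + np.log(np.exp(x - m).sum(axis=axis, keepdims=True))).squeeze(axis)

def gumbel_softmax(logits, tau, rng):
    # Perturb logits with Gumbel(0, 1) noise, then take a temperature-
    # controlled softmax instead of the hard argmax: as tau -> 0 the
    # output approaches a one-hot categorical sample.
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    y = (logits + g) / tau
    y = y - y.max()
    e = np.exp(y)
    return e / e.sum()

def relaxed_ffbs(emissions, transitions, tau=0.5, seed=0):
    """Relaxed FFBS sketch for a linear-chain CRF.

    emissions:   (T, K) log emission potentials
    transitions: (K, K) log transition potentials
    Returns (T, K) soft one-hot state samples (rows sum to 1).
    """
    rng = np.random.default_rng(seed)
    T, K = emissions.shape

    # Forward filtering in log space: alpha[t, k] sums over all
    # prefix paths ending in state k at step t.
    alpha = np.zeros((T, K))
    alpha[0] = emissions[0]
    for t in range(1, T):
        scores = alpha[t - 1][:, None] + transitions  # (K, K)
        alpha[t] = emissions[t] + logsumexp(scores, axis=0)

    # Backward sampling, relaxed: instead of indexing the transition
    # matrix with a hard sample of z_{t+1}, mix its columns with the
    # soft sample's weights (a common relaxation choice; an assumption
    # here, not necessarily the paper's exact formulation).
    samples = np.zeros((T, K))
    samples[-1] = gumbel_softmax(alpha[-1], tau, rng)
    for t in range(T - 2, -1, -1):
        logits = alpha[t] + transitions @ samples[t + 1]
        samples[t] = gumbel_softmax(logits, tau, rng)
    return samples
```

Because every step is a differentiable softmax rather than a discrete draw, gradients can flow through the sampled template via the reparameterization trick, which is what distinguishes this from score-function (REINFORCE-style) estimators.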

Author Information

Yao Fu (Columbia University)
Chuanqi Tan (Alibaba Group)
Bin Bi (Alibaba Group)
Mosha Chen (Alibaba Group)

Mosha is an NLP algorithm engineer focusing on medical applications.

Yansong Feng (Peking University)
Alexander Rush (Cornell University)
