

Poster

Controllable Text Generation with Neurally-Decomposed Oracle

Tao Meng · Sidi Lu · Nanyun Peng · Kai-Wei Chang

Hall J (level 1) #425

Keywords: [ controllable text generation ] [ constrained decoding ]


Abstract:

We propose a general and efficient framework to control auto-regressive generation models with a NeurAlly-Decomposed Oracle (NADO). Given a pre-trained base language model and a sequence-level Boolean oracle function, we aim to decompose the oracle function into token-level guidance that steers the base model during text generation. Specifically, the token-level guidance is provided by NADO, a neural model trained on examples sampled from the base model, requiring no additional labeled data. Based on posterior regularization, we present the closed-form optimal solution for incorporating the decomposed token-level guidance into the base model for controllable generation. We further discuss how the neural approximation affects the quality of the solution. Experiments on two different applications, (1) text generation with lexical constraints and (2) machine translation with formality control, demonstrate that our framework efficiently guides the base model toward the given oracle while maintaining high generation quality.
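The per-step reweighting described in the abstract can be illustrated with a minimal sketch: the next-token distribution of the base model is multiplied by the ratio of the NADO-estimated oracle satisfaction probability after appending each candidate token to that of the current prefix, then renormalized. The function and the toy numbers below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def nado_step_probs(base_probs, r_prefix, r_next):
    """Combine base-model token probabilities with token-level oracle
    guidance, in the spirit of the closed-form solution the abstract
    describes (hypothetical sketch).

    base_probs : p(x_i | x_<i) over the vocabulary, shape (V,)
    r_prefix   : estimated probability the oracle is satisfied given
                 the current prefix (scalar in (0, 1])
    r_next     : the same estimate after appending each candidate
                 token, shape (V,)
    """
    weights = base_probs * r_next / r_prefix
    return weights / weights.sum()  # renormalize over the vocabulary

# Toy example with a 4-token vocabulary: the guidance model judges
# token 2 far more likely to lead to an oracle-satisfying sequence,
# so its probability is boosted relative to the base model.
base = np.array([0.4, 0.3, 0.2, 0.1])
r_next = np.array([0.05, 0.05, 0.9, 0.05])
guided = nado_step_probs(base, r_prefix=0.25, r_next=r_next)
```

Decoding then proceeds token by token from the guided distribution, so the sequence-level constraint is enforced without modifying the base model's parameters.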
