

Poster in Workshop: Transfer Learning for Natural Language Processing

Classifiers are Better Experts for Controllable Text Generation

Askhat Sitdikov · Nikita Balagansky · Daniil Gavrilov · Alexander Markov


Abstract:

This paper proposes CAIF sampling, a simple method for controllable text generation based on reweighting logits with a free-form classifier. Using an arbitrary text classifier, we adjust a small part of a language model's logits and guide generation towards or away from the classifier's prediction. In experiments on toxicity avoidance and sentiment control, the proposed method significantly outperforms the recent PPLM, GeDi, and DExperts baselines on both perplexity (PPL) and task accuracy, as measured by an external classifier applied to the generated texts. In addition, it is easier to implement and tune than other approaches and imposes significantly fewer restrictions and requirements.
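To illustrate the idea described in the abstract (adjusting only the top logits of a language model with an attribute classifier), here is a minimal sketch of classifier-weighted top-k sampling. The combination formula, the `alpha` weight, the top-k restriction size, and the model names below are assumptions chosen for illustration, not the authors' CAIF implementation.

```python
# Sketch: reweight the LM's top-k next-token logits with an attribute
# classifier's probability of the desired attribute (assumed formula:
# log p_LM(x) + alpha * log p_clf(attribute | prefix + x)).
import torch
from transformers import (AutoModelForCausalLM,
                          AutoModelForSequenceClassification, AutoTokenizer)

lm_name = "gpt2"                                                  # placeholder LM
clf_name = "distilbert-base-uncased-finetuned-sst-2-english"      # placeholder sentiment classifier

lm_tok = AutoTokenizer.from_pretrained(lm_name)
lm = AutoModelForCausalLM.from_pretrained(lm_name).eval()
clf_tok = AutoTokenizer.from_pretrained(clf_name)
clf = AutoModelForSequenceClassification.from_pretrained(clf_name).eval()

@torch.no_grad()
def classifier_weighted_step(prefix: str, k: int = 20,
                             alpha: float = 5.0, target_label: int = 1) -> str:
    """Sample the next token after reweighting the top-k LM logits with the
    classifier's probability of the target attribute (label 1 = positive)."""
    ids = lm_tok(prefix, return_tensors="pt").input_ids
    logits = lm(ids).logits[0, -1]                    # next-token logits
    top_logits, top_ids = torch.topk(logits, k)       # adjust only the top-k tokens

    # Score each candidate continuation with the attribute classifier.
    candidates = [prefix + lm_tok.decode(int(t)) for t in top_ids]
    clf_inputs = clf_tok(candidates, return_tensors="pt",
                         padding=True, truncation=True)
    clf_logprobs = clf(**clf_inputs).logits.log_softmax(-1)[:, target_label]

    # Combine LM and classifier scores, then sample from the reweighted top-k.
    combined = top_logits.log_softmax(-1) + alpha * clf_logprobs
    next_id = top_ids[torch.multinomial(combined.softmax(-1), 1)]
    return prefix + lm_tok.decode(next_id)

text = "The movie was"
for _ in range(10):
    text = classifier_weighted_step(text)
print(text)
```

Guiding generation away from an attribute (e.g., toxicity avoidance) would correspond to using a negative `alpha` or the complementary class probability; the specific choice here is a guess, not a claim about the paper's exact procedure.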
