Poster in Workshop: Meaning in Context: Pragmatic Communication in Humans and Machines

Unveiling the Meaning Through Emotional Context

Tatiana Botskina


Abstract:

Generating emotional text that adapts to different life scenarios is an important step towards understanding the role of context in generative language models. While large language models with billions of parameters (e.g. GPT-3) can produce coherent text indistinguishable from human-generated text, they sometimes fail to generate contextually relevant sentences with the anticipated sentiment. The main challenge in generating text with a required emotional context is the complexity of human emotions. Because the variability of emotions makes it difficult even for humans to recognize an emotion in text without understanding its context, conditional text generation that controls both sentiment and context helps to prevent contextual confusion. In this paper we propose to explore how generative language models improve the meaning of generated text by controlling sentiment during generation and by providing broader context for the generated scenarios within a given situation. We demonstrate how existing research in sentiment analysis, style transfer, and controllable text generation can be used in future work to understand the meaning of generated language through emotional context.
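
To make the idea of sentiment- and context-conditioned generation concrete, the following is a minimal illustrative sketch, not the method described in the poster: it conditions an off-the-shelf language model (here assumed to be GPT-2 via the Hugging Face `transformers` pipeline) by prepending a target emotion and a situational context to the prompt. The prompt format and the helper function `generate_with_emotion` are assumptions introduced for this example.

```python
# Minimal sketch of sentiment-conditioned generation via prompt prefixes.
# Illustrative only; the model choice ("gpt2") and prompt format are
# assumptions, not the authors' approach.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

def generate_with_emotion(situation: str, emotion: str, max_new_tokens: int = 40) -> str:
    # Condition the model by prepending the target emotion and the
    # situational ("broader") context to the prompt.
    prompt = f"Emotion: {emotion}\nSituation: {situation}\nResponse:"
    output = generator(prompt, max_new_tokens=max_new_tokens, do_sample=True, top_p=0.9)
    # Strip the prompt prefix so only the generated continuation is returned.
    return output[0]["generated_text"][len(prompt):].strip()

# Same situation, different target emotions: the control prefix steers the tone.
print(generate_with_emotion("A friend cancels dinner plans at the last minute.", "joy"))
print(generate_with_emotion("A friend cancels dinner plans at the last minute.", "anger"))
```

In this sketch the emotion label acts as a lightweight control signal; more elaborate approaches from the controllable-generation and style-transfer literature replace the plain-text prefix with learned control codes or fine-tuned conditioning.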