Poster in Workshop: Instruction Tuning and Instruction Following

From Classification to Generation: Insights into Crosslingual Retrieval Augmented ICL

Xiaoqian Li · Ercong Nie · Sheng Liang

Keywords: [ NLP ] [ in-context learning ] [ cross-lingual ] [ retrieval-augmented prompt ] [ low-resource languages ] [ classification ] [ summarization ]


Abstract:

The remarkable ability of Large Language Models (LLMs) to understand and follow instructions is sometimes limited by their in-context learning (ICL) performance in low-resource languages. To address this, we introduce a novel approach that leverages cross-lingual retrieval-augmented in-context learning (CREA-ICL): by retrieving semantically similar prompts from high-resource languages, we aim to bolster the zero-shot performance of multilingual pretrained language models (MPLMs) across diverse tasks. While our approach yields steady improvements in classification tasks, it faces challenges in generation tasks, with Bangla serving as a key case study. Our evaluation offers insights into the performance dynamics of retrieval-augmented in-context learning across both classification and generation domains.
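The abstract describes CREA-ICL only at a high level. Below is a minimal Python sketch of the retrieval step, assuming a multilingual sentence encoder from the sentence-transformers library; the model name, the small labeled English example pool, and the prompt template are illustrative stand-ins, not the authors' exact setup.

# Minimal sketch of cross-lingual retrieval-augmented ICL.
# Assumptions (not from the paper): the encoder model, the English
# example pool, and the sentiment prompt template are all illustrative.
import numpy as np
from sentence_transformers import SentenceTransformer

# Hypothetical high-resource (English) pool of labeled classification examples.
POOL = [
    ("The movie was wonderful.", "positive"),
    ("I hated every minute of it.", "negative"),
    ("An average, forgettable film.", "negative"),
    ("Absolutely brilliant acting.", "positive"),
]

# Multilingual encoder so low-resource queries and English examples
# share one embedding space.
encoder = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

def retrieve_examples(query: str, k: int = 2):
    """Return the k pool examples most semantically similar to the query."""
    pool_texts = [text for text, _ in POOL]
    emb = encoder.encode([query] + pool_texts, normalize_embeddings=True)
    sims = emb[1:] @ emb[0]  # cosine similarity via normalized dot products
    top = np.argsort(-sims)[:k]
    return [POOL[i] for i in top]

def build_icl_prompt(query: str, k: int = 2) -> str:
    """Prepend retrieved English demonstrations to the target-language input."""
    demos = retrieve_examples(query, k)
    lines = [f"Text: {t}\nSentiment: {y}" for t, y in demos]
    lines.append(f"Text: {query}\nSentiment:")
    return "\n\n".join(lines)

if __name__ == "__main__":
    # Bangla input ("The movie was very good."); demonstrations are
    # retrieved from the English pool by semantic similarity.
    print(build_icl_prompt("সিনেমাটি খুব ভালো ছিল।"))

In practice, the resulting prompt would be fed to an MPLM for zero-shot prediction; per the abstract, this retrieval-augmented setup yields steady gains for classification but is less reliable for generation tasks such as summarization.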
