

Spotlight in Workshop: Table Representation Learning Workshop

Tabular Representation, Noisy Operators, and Impacts on Table Structure Understanding Tasks in LLMs

Ananya Singha · José Cambronero · Sumit Gulwani · Vu Le · Chris Parnin

Keywords: [ in-context learning ] [ Large language models ] [ table structure ]

[ Project Page ]
Fri 15 Dec 12:00–12:07 p.m. PST
 
presentation: Table Representation Learning Workshop
Fri 15 Dec 6:30 a.m.–3:30 p.m. PST

Abstract:

Large language models (LLMs) are increasingly applied to tabular tasks using in-context learning. The prompt representation of a table may play a role in the LLM's ability to process the table. Inspired by prior work, we generate a collection of self-supervised structural tasks (e.g., navigate to a cell and row; transpose the table) and evaluate the performance differences when using 8 formats. In contrast to past work, we introduce 8 noise operations inspired by real-world messy data and adversarial inputs, and show that such operations can impact LLM performance across formats for different structural understanding tasks.
