Poster

FACE: Evaluating Natural Language Generation with Fourier Analysis of Cross-Entropy

Zuhao Yang · Yingfang Yuan · Yang Xu · Shuo Zhan · Huajun Bai · Kefan Chen

Great Hall & Hall B1+B2 (level 1) #516
[ Project Page ] [ Paper ] [ Slides ] [ Poster ] [ OpenReview ]
Thu 14 Dec 3 p.m. PST — 5 p.m. PST

Abstract:

Measuring the distance between machine-produced and human language is a critical open problem. Inspired by empirical findings from psycholinguistics on the periodicity of entropy in language, we propose FACE, a set of metrics based on Fourier Analysis of the estimated Cross-Entropy of language, for measuring the similarity between model-generated and human-written language. Based on an open-ended generation task and experimental data from previous studies, we find that FACE effectively identifies the human-model gap, scales with model size, reflects the outcomes of different sampling methods for decoding, and correlates well with other evaluation metrics and with human judgment scores.
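The abstract implies a pipeline of this shape: score a text with an estimator language model to obtain a per-token cross-entropy sequence, take its Fourier spectrum, and compare the spectra of human-written and model-generated texts. The sketch below is a minimal illustration of that idea under stated assumptions, not the paper's exact metrics: the function names, the Spearman-correlation comparison, and the synthetic cross-entropy sequences are all placeholders introduced here for demonstration.

```python
import numpy as np
from scipy.stats import spearmanr


def spectrum(cross_entropy, n_freq):
    """One-sided magnitude spectrum of a per-token cross-entropy sequence.

    `cross_entropy` is a 1-D array of -log p(token | context) values,
    e.g. obtained by scoring a text with an estimator language model.
    """
    x = np.asarray(cross_entropy, dtype=float)
    x = x - x.mean()               # drop the DC component (mean entropy level)
    spec = np.abs(np.fft.rfft(x))  # magnitude at each frequency
    return spec[:n_freq]           # truncate so two texts share a frequency grid


def face_similarity(ce_human, ce_model, max_freq=64):
    """Hypothetical spectral-similarity score: Spearman correlation between
    the cross-entropy spectra of a human text and a model text."""
    n = min(max_freq, len(ce_human) // 2, len(ce_model) // 2)
    rho, _ = spearmanr(spectrum(ce_human, n), spectrum(ce_model, n))
    return rho


if __name__ == "__main__":
    # Synthetic cross-entropy sequences with a shared periodic component,
    # standing in for real per-token scores from a language model.
    rng = np.random.default_rng(0)
    human = 4.0 + np.sin(np.arange(200) / 7.0) + 0.3 * rng.standard_normal(200)
    model = 4.0 + np.sin(np.arange(180) / 7.0) + 0.3 * rng.standard_normal(180)
    print(f"spectral similarity: {face_similarity(human, model):.3f}")
```

In practice the cross-entropy sequence would come from scoring each token of a text with a fixed language model; the paper proposes a set of spectrum-comparison metrics, of which the Spearman correlation used here is only one convenient stand-in.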
