

Spotlight Poster

MKGL: Mastery of a Three-Word Language

Lingbing Guo · Zhongpu Bo · Zhuo Chen · Yichi Zhang · Jiaoyan Chen · Lan Yarong · Mengshu Sun · Zhiqiang Zhang · Yangyifei Luo · Qian Li · Qiang Zhang · Wen Zhang · Huajun Chen

Fri 13 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Large language models (LLMs) have significantly advanced performance across a spectrum of natural language processing (NLP) tasks. Yet their application to knowledge graphs (KGs), which describe facts as triplets and leave little room for hallucination, remains an underexplored frontier. In this paper, we investigate the integration of LLMs with KGs by introducing a specialized KG Language (KGL), in which every sentence consists of exactly three words: an entity noun, a relation verb, and another entity noun. Although KGL's vocabulary is unfamiliar to the LLM, we facilitate its learning through a tailored dictionary and illustrative sentences, and we enhance context understanding via real-time KG context retrieval and KGL token embedding augmentation. Our results reveal that LLMs can achieve fluency in KGL, drastically reducing error rates on KG completion compared with conventional KG embedding methods. Furthermore, our enhanced LLM shows exceptional competence in generating accurate three-word sentences from an initial entity and in interpreting new, unseen terms from a KG.
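To make the KGL format concrete, below is a minimal Python sketch of the three-word sentence structure the abstract describes: each sentence pairs an entity noun, a relation verb, and another entity noun, and each unfamiliar token can be glossed by a dictionary entry. All names here (Triplet, to_kgl_sentence, KGL_DICTIONARY) and the sample facts are invented for illustration and are not taken from the MKGL implementation.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Triplet:
    """A KG fact: (entity noun, relation verb, entity noun)."""
    head: str      # entity noun, e.g. "Barack_Obama"
    relation: str  # relation verb, e.g. "born_in"
    tail: str      # entity noun, e.g. "Honolulu"

def to_kgl_sentence(t: Triplet) -> str:
    """Render a KG triplet as a three-word KGL sentence."""
    return f"{t.head} {t.relation} {t.tail}"

# A tailored "dictionary" pairing each KGL token, unfamiliar to the LLM,
# with a natural-language gloss (hypothetical entries for illustration).
KGL_DICTIONARY = {
    "Barack_Obama": "entity: the 44th president of the United States",
    "born_in": "relation verb: the subject was born in the object",
    "Honolulu": "entity: the capital city of Hawaii",
}

if __name__ == "__main__":
    t = Triplet("Barack_Obama", "born_in", "Honolulu")
    print(to_kgl_sentence(t))  # -> Barack_Obama born_in Honolulu
    for token in (t.head, t.relation, t.tail):
        print(f"{token}: {KGL_DICTIONARY[token]}")
```

In this reading, KG completion amounts to generating a well-formed KGL sentence given its first word (an initial entity), which is how the abstract frames the task.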
