

Poster in Workshop: New Frontiers in Graph Learning (GLFrontiers)

GenTKG: Generative Forecasting on Temporal Knowledge Graph

Ruotong Liao · Xu Jia · Yunpu Ma · Volker Tresp

Keywords: [ Temporal Knowledge Graph Forecasting ] [ Large language models ] [ Efficient Fine-tuning ]


Abstract:

Rapid advances in large language models (LLMs) have sparked interest in the temporal knowledge graph (TKG) domain, which is still dominated by carefully designed embedding-based and rule-based models. It remains an open question whether pre-trained LLMs can understand structured temporal relational data and replace these models as the foundation for temporal relational forecasting. Two challenges stand in the way: the gap between complex graph-structured data and the linear natural-language input LLMs can handle, and the gap between the enormous data volume of TKGs and the heavy computational cost of fine-tuning LLMs. To address these challenges, we cast temporal knowledge forecasting as a generative task and propose GenTKG, a novel retrieval-augmented generation framework that combines a temporal logical rule-based retrieval strategy with lightweight few-shot parameter-efficient instruction tuning. Extensive experiments show that GenTKG is a simple yet effective, efficient, and generalizable approach that outperforms conventional methods on temporal relational forecasting with extremely limited computation. Our work opens a new frontier for the temporal knowledge graph domain.
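To make the generative setting above concrete, the following is a minimal, self-contained Python sketch of the pipeline the abstract describes: retrieve a short history of relevant past facts from the TKG, linearize them into a natural-language prompt, and leave the object entity for an instruction-tuned LLM to generate. The data structures, function names, and the toy retrieval rule (same subject, most recent first) are illustrative assumptions, not the authors' implementation; GenTKG's actual retrieval relies on learned temporal logical rules, and the generation step uses an LLM fine-tuned with a parameter-efficient method, neither of which is shown here.

    # Illustrative sketch only; names and the retrieval rule are assumptions.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Quad:
        subject: str
        relation: str
        obj: str
        time: int  # discrete timestamp

    def retrieve_history(tkg: List[Quad], query_s: str, query_t: int,
                         k: int = 5) -> List[Quad]:
        """Toy rule-based retrieval: keep past facts sharing the query subject
        (a stand-in for temporal logical rules), most recent first."""
        hits = [q for q in tkg if q.subject == query_s and q.time < query_t]
        hits.sort(key=lambda q: q.time, reverse=True)
        return hits[:k]

    def build_prompt(history: List[Quad], query_s: str, query_r: str,
                     query_t: int) -> str:
        """Linearize retrieved facts into text an LLM can read; the object of
        the final, incomplete quadruple is what the model should generate."""
        lines = [f"{q.time}: [{q.subject}, {q.relation}, {q.obj}]" for q in history]
        lines.append(f"{query_t}: [{query_s}, {query_r},")  # object left blank
        return "\n".join(lines)

    if __name__ == "__main__":
        tkg = [
            Quad("Germany", "negotiate_with", "France", 1),
            Quad("Germany", "sign_agreement_with", "France", 2),
            Quad("Germany", "host_visit_of", "Italy", 3),
        ]
        history = retrieve_history(tkg, "Germany", query_t=4)
        prompt = build_prompt(history, "Germany", "negotiate_with", query_t=4)
        print(prompt)  # this prompt would be fed to the instruction-tuned LLM

In this framing, forecasting becomes ordinary text generation: the model completes the last line of the prompt with an entity name, and the retrieval step keeps the prompt short enough that only lightweight fine-tuning is needed.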
