Poster

Prompt-Based Knowledge Graph Foundation Model for Universal In-Context Reasoning

Yuanning Cui · Zequn Sun · Wei Hu


Abstract:

Extensive knowledge graphs (KGs) have been constructed to facilitate knowledge-driven tasks in various scenarios. However, existing work usually develops separate reasoning models for different KGs and lacks the ability to generalize and transfer knowledge across diverse KGs and reasoning settings. In this paper, we propose a prompt-based KG foundation model via in-context learning, named KG-ICL, to achieve universal reasoning ability. Specifically, we introduce a prompt graph centered on a query-related example fact as context for understanding the query relation. To encode prompt graphs so that they generalize to entities and relations unseen in queries, we first propose a unified tokenizer that maps entities and relations in prompt graphs to predefined tokens. We then propose two message passing neural networks that perform prompt encoding and KG reasoning, respectively. We conduct evaluations on 43 different KGs in both transductive and inductive settings. Results show that the proposed model outperforms baselines on most datasets, demonstrating its strong generalization and universal reasoning capabilities. The source code is available in the supplemental materials.
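To make the tokenization idea concrete, below is a minimal sketch of how a unified tokenizer of this kind might work. It is not the authors' implementation: the token vocabulary, function names, and graph representation are illustrative assumptions. Entities are mapped to role/distance tokens relative to the example fact, and relations to tokens indicating whether they match the query relation, so a downstream encoder never depends on KG-specific identifiers and can be applied to unseen KGs.

```python
# Minimal sketch of a unified tokenizer for prompt graphs (an assumption,
# not the paper's code). A prompt graph is given as (head, relation, tail)
# triples around one example fact for the query relation.
from collections import deque

# Hypothetical predefined token vocabulary.
ENTITY_TOKENS = ["HEAD", "TAIL", "DIST_1", "DIST_2", "DIST_FAR"]
RELATION_TOKENS = ["QUERY_REL", "OTHER_REL"]

def tokenize_prompt_graph(triples, example_fact, query_relation):
    """Map entities to role/distance tokens and relations to match tokens."""
    head, _, tail = example_fact
    # Undirected adjacency list over the prompt graph.
    adj = {}
    for h, r, t in triples:
        adj.setdefault(h, set()).add(t)
        adj.setdefault(t, set()).add(h)
    # BFS distances from the example fact's two entities.
    dist = {head: 0, tail: 0}
    queue = deque([head, tail])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, ()):
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    def entity_token(e):
        if e == head:
            return "HEAD"
        if e == tail:
            return "TAIL"
        d = dist.get(e)
        return "DIST_1" if d == 1 else "DIST_2" if d == 2 else "DIST_FAR"
    def relation_token(r):
        return "QUERY_REL" if r == query_relation else "OTHER_REL"
    return [(entity_token(h), relation_token(r), entity_token(t))
            for h, r, t in triples]

# Usage on a toy prompt graph for the query relation "born_in".
prompt_triples = [
    ("ada", "born_in", "london"),    # example fact
    ("ada", "child_of", "byron"),
    ("london", "capital_of", "uk"),
]
print(tokenize_prompt_graph(prompt_triples, prompt_triples[0], "born_in"))
# [('HEAD', 'QUERY_REL', 'TAIL'), ('HEAD', 'OTHER_REL', 'DIST_1'),
#  ('TAIL', 'OTHER_REL', 'DIST_1')]
```

Because every prompt graph is expressed over the same small token vocabulary, the same prompt encoder can, in principle, be reused across KGs with entirely disjoint entity and relation sets, which is the property the abstract's inductive-setting evaluation targets.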
