

Spotlight in Workshop: Second Workshop on Efficient Natural Language and Speech Processing (ENLSP-II)

Collective Knowledge Graph Completion with Mutual Knowledge Distillation

Weihang Zhang · Ovidiu Serban · Jiahao Sun · Yike Guo

Keywords: [ ENLSP-Main ] [ Efficient Graphs for NLP ]


Abstract:

Knowledge graph completion (KGC), the task of predicting missing information from the relational data already present in a knowledge graph (KG), has drawn significant attention in recent years. However, the predictive power of KGC methods is often limited by the completeness of the existing knowledge graphs. In both monolingual and multilingual settings, KGs from different sources and languages are potentially complementary to each other. In this paper, we study the problem of multi-KG completion, where we focus on maximizing the collective knowledge from different KGs to alleviate the incompleteness of individual KGs. Specifically, we propose a novel method called CKGC-MKD that uses augmented CompGCN-based encoder models on both the individual KGs and a large connected KG in which seed alignments between KGs are treated as edges for message propagation. Mutual knowledge distillation is additionally employed to maximize knowledge transfer between the "global" connected KG and the "local" individual KGs. Experimental results on multilingual datasets show that our method outperforms all state-of-the-art models.
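Purely as an illustration of the mutual knowledge distillation step described in the abstract (the paper's actual objective and implementation are not reproduced here), a minimal PyTorch-style sketch could couple the "local" single-KG encoder and the "global" connected-KG encoder through a symmetric KL term over their candidate-entity scores. The function name, tensor shapes, and temperature below are assumptions for illustration, not the authors' code.

```python
import torch
import torch.nn.functional as F

def mutual_distillation_loss(local_scores: torch.Tensor,
                             global_scores: torch.Tensor,
                             temperature: float = 2.0) -> torch.Tensor:
    """Hypothetical symmetric distillation term between a 'local' (single-KG)
    model and a 'global' (connected-KG) model.

    local_scores, global_scores: [batch, num_candidates] raw scores that each
    encoder assigns to the same candidate entities for the same queries.
    """
    # Soften both score distributions with a shared temperature.
    local_log_p = F.log_softmax(local_scores / temperature, dim=-1)
    global_log_p = F.log_softmax(global_scores / temperature, dim=-1)

    # Each model is pushed toward the other's (detached) predictions,
    # so knowledge flows in both directions.
    kd_local = F.kl_div(local_log_p, global_log_p.exp().detach(),
                        reduction="batchmean")
    kd_global = F.kl_div(global_log_p, local_log_p.exp().detach(),
                         reduction="batchmean")
    return kd_local + kd_global
```

In practice such a term would be added to each model's ordinary KGC training loss; the weighting and scoring details would follow the paper rather than this sketch.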
