

Poster

DRUM: End-To-End Differentiable Rule Mining On Knowledge Graphs

Ali Sadeghian · Mohammadreza Armandpour · Patrick Ding · Daisy Zhe Wang

East Exhibition Hall B + C #246

Keywords: [ Algorithms -> Relational Learning ] [ Representation Learning ] [ Deep Learning ]


Abstract:

In this paper, we study the problem of learning probabilistic logical rules for inductive and interpretable link prediction. Despite the importance of inductive link prediction, most previous works focused on transductive link prediction and cannot handle previously unseen entities. Moreover, they are black-box models that are not easily explainable to humans. We propose DRUM, a scalable and differentiable approach for mining first-order logical rules from knowledge graphs that resolves these problems. We motivate our method by making a connection between learning confidence scores for each rule and low-rank tensor approximation. DRUM uses bidirectional RNNs to share useful information across the tasks of learning rules for different relations. We also empirically demonstrate the efficiency of DRUM over existing rule mining methods for inductive link prediction on a variety of benchmark datasets.
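To make the abstract's description concrete, the sketch below shows one way a DRUM-style differentiable rule scorer can be structured: rule application as a soft walk over relation adjacency matrices, with per-step relation coefficients produced by a bidirectional RNN and summed over a small rank to approximate the rule-confidence tensor. This is a minimal illustration under simplifying assumptions (dense adjacencies, a single target relation, the class name `DrumScorer` and all hyperparameters are hypothetical), not the authors' implementation.

```python
# Hypothetical minimal sketch of a DRUM-style differentiable rule scorer.
import torch
import torch.nn as nn


class DrumScorer(nn.Module):
    def __init__(self, num_relations, num_steps, rank, hidden_size=64):
        super().__init__()
        self.num_steps = num_steps
        self.rank = rank
        # A bidirectional RNN produces, for each rule step, logits over the
        # relations (plus an identity "no-op" relation), so information can be
        # shared across steps and, in the full model, across target relations.
        self.step_embed = nn.Embedding(num_steps, hidden_size)
        self.rnn = nn.LSTM(hidden_size, hidden_size,
                           bidirectional=True, batch_first=True)
        self.to_logits = nn.Linear(2 * hidden_size, num_relations + 1)

    def forward(self, adjacencies, head_onehot):
        # adjacencies: (num_relations, N, N) dense relation adjacency matrices
        # head_onehot: (batch, N) one-hot encoding of the query head entities
        num_rel, n, _ = adjacencies.shape
        # Append the identity so a rule of length < num_steps can skip a step.
        adj = torch.cat([adjacencies, torch.eye(n).unsqueeze(0)], dim=0)

        steps = self.step_embed(torch.arange(self.num_steps))
        out, _ = self.rnn(steps.unsqueeze(0).expand(self.rank, -1, -1))
        # coeffs[l, t, r]: soft confidence that the rank-l rule uses relation r
        # at step t (a rank-L approximation of the full confidence tensor).
        coeffs = torch.softmax(self.to_logits(out), dim=-1)

        scores = torch.zeros_like(head_onehot)
        for l in range(self.rank):
            x = head_onehot                          # (batch, N)
            for t in range(self.num_steps):
                # One soft walk step: x <- x @ sum_r coeffs[l, t, r] * A_r
                mixed = torch.einsum("r,rij->ij", coeffs[l, t], adj)
                x = x @ mixed
            scores = scores + x                      # accumulate rank-l scores
        return scores                                # (batch, N) tail scores


# Illustrative usage on a tiny random graph (shapes only, no real KG data).
adj = (torch.rand(3, 10, 10) < 0.1).float()          # 3 relations, 10 entities
head = torch.zeros(2, 10)
head[0, 1] = head[1, 4] = 1.0
tail_scores = DrumScorer(num_relations=3, num_steps=2, rank=3)(adj, head)
```

The sum over `rank` rank-one components is what the abstract's low-rank tensor approximation refers to in this reading: each component corresponds to one learned rule pattern, and differentiability of the whole scorer lets the confidences be trained end to end with gradient descent.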
