Interstellar: Searching Recurrent Architecture for Knowledge Graph Embedding
Yongqi Zhang, Quanming Yao, Lei Chen
Spotlight presentation: Orals & Spotlights Track 33: Health/AutoML/(Soft|Hard)ware
on 2020-12-10T19:10:00-08:00 - 2020-12-10T19:20:00-08:00
Abstract: Knowledge graph (KG) embedding is a widely used approach to learning representations of KGs. Many models have been proposed to capture the interactions between entities and relations within individual triplets. However, long-term information spanning multiple triplets is also important for KGs. In this work, based on relational paths, which are composed of sequences of triplets, we define Interstellar as a recurrent neural architecture search problem that captures both short-term and long-term information along the paths. First, we analyze the difficulty of using a single unified model as the Interstellar. Then, we propose to search for a recurrent architecture as the Interstellar for different KG tasks. A case study on synthetic data illustrates the importance of the defined search problem. Experiments on real datasets demonstrate the effectiveness of the searched models and the efficiency of the proposed hybrid-search algorithm.
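For readers unfamiliar with the setup, the sketch below illustrates, purely as an assumed example and not the paper's searched architecture, how a recurrent cell can propagate information along a relational path of triplets. All names and the specific combination of linear maps are hypothetical; the paper's search space is over how such a step composes short-term (current triplet) and long-term (carried hidden state) signals.

```python
import torch
import torch.nn as nn

# Hypothetical illustration only: a generic recurrent step over a relational
# path, i.e. a sequence of (relation, entity) embedding pairs. The choice of
# tanh and the three linear maps is an assumption for exposition, not the
# architecture found by the paper's search.

dim = 64
W_h = nn.Linear(dim, dim)   # transforms the carried hidden state (long-term)
W_r = nn.Linear(dim, dim)   # transforms the current relation embedding
W_e = nn.Linear(dim, dim)   # transforms the current entity embedding

def recurrent_step(hidden, rel_emb, ent_emb):
    """One step along a relational path: fuse the previous hidden state
    with the current triplet's relation and entity embeddings."""
    return torch.tanh(W_h(hidden) + W_r(rel_emb) + W_e(ent_emb))

# Walk a toy path of length 3 starting from a zero hidden state.
hidden = torch.zeros(dim)
path = [(torch.randn(dim), torch.randn(dim)) for _ in range(3)]
for rel_emb, ent_emb in path:
    hidden = recurrent_step(hidden, rel_emb, ent_emb)
print(hidden.shape)  # torch.Size([64])
```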