

Poster

Efficient Graph Generation with Graph Recurrent Attention Networks

Renjie Liao · Yujia Li · Yang Song · Shenlong Wang · Will Hamilton · David Duvenaud · Raquel Urtasun · Richard Zemel

East Exhibition Hall B + C #124

Keywords: [ Recurrent Networks ] [ Algorithms -> Relational Learning ] [ Applications -> Network Analysis ] [ Deep Learning -> Attention Models ] [ Deep Learning ] [ Generative Models ]


Abstract:

We propose a new family of efficient and expressive deep generative models of graphs, called Graph Recurrent Attention Networks (GRANs). Our model generates graphs one block of nodes and associated edges at a time. The block size and sampling stride allow us to trade off sample quality for efficiency. Compared to previous RNN-based graph generative models, our framework better captures the auto-regressive conditioning between the already-generated and to-be-generated parts of the graph using Graph Neural Networks (GNNs) with attention. This not only reduces the dependency on node ordering but also bypasses the long-term bottleneck caused by the sequential nature of RNNs. Moreover, we parameterize the output distribution per block using a mixture of Bernoulli, which captures the correlations among generated edges within the block. Finally, we propose to handle node orderings in generation by marginalizing over a family of canonical orderings. On standard benchmarks, we achieve state-of-the-art time efficiency and sample quality compared to previous models. Additionally, we show our model is capable of generating large graphs of up to 5K nodes with good quality. Our code is released at: https://github.com/lrjconan/GRAN
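To make the per-block output distribution concrete, the sketch below shows one way to compute a mixture-of-Bernoulli log-likelihood over the candidate edges of a block in PyTorch: each mixture component assigns an independent Bernoulli probability to every candidate edge, and components are marginalized with a log-sum-exp. This is a minimal illustrative sketch under stated assumptions, not the released GRAN implementation; the class name, the pooling used to produce mixture weights, and parameters such as hidden_dim and num_mix are hypothetical.

import torch
import torch.nn as nn

class MixtureBernoulliEdges(nn.Module):
    """Illustrative mixture-of-Bernoulli likelihood for one block of candidate edges."""

    def __init__(self, hidden_dim, num_mix=20):
        super().__init__()
        # Per-component Bernoulli logits, one set per candidate edge.
        self.edge_logits = nn.Linear(hidden_dim, num_mix)
        # Logits for the mixture weights shared by all edges in the block.
        self.mix_logits = nn.Linear(hidden_dim, num_mix)

    def forward(self, edge_feat, adj_target):
        # edge_feat: (E, hidden_dim) features of the E candidate edges in the block
        # adj_target: (E,) binary labels (1 = edge present) for those candidates
        theta = torch.sigmoid(self.edge_logits(edge_feat))            # (E, K)
        # One mixture-weight vector per block, pooled over its edges (an assumption).
        alpha = torch.log_softmax(self.mix_logits(edge_feat).mean(dim=0), dim=0)  # (K,)
        # Bernoulli log-likelihood of each edge under each component.
        t = adj_target.unsqueeze(-1)                                   # (E, 1)
        log_p = t * torch.log(theta + 1e-8) + (1 - t) * torch.log(1 - theta + 1e-8)
        # Edges in the block share one component, which correlates them:
        # sum log-likelihoods over edges, then marginalize the component.
        block_log_p = log_p.sum(dim=0) + alpha                         # (K,)
        return torch.logsumexp(block_log_p, dim=0)                     # scalar

The key point the sketch captures is that, unlike a fully factorized Bernoulli, summing over edges before the log-sum-exp ties all edges in a block to the same latent component, which is how the mixture induces correlations among the generated edges.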
