
Break the Ceiling: Stronger Multi-scale Deep Graph Convolutional Networks
Sitao Luan · Mingde Zhao · Xiao-Wen Chang · Doina Precup

Wed Dec 11 10:45 AM -- 12:45 PM (PST) @ East Exhibition Hall B + C #26

Recently, neural network based approaches have achieved significant progress on large, complex, graph-structured problems. Nevertheless, the advantages of multi-scale information and deep architectures have not been sufficiently exploited. In this paper, we first analyze key factors constraining the expressive power of existing Graph Convolutional Networks (GCNs), including the activation function and shallow learning mechanisms. Then, we generalize spectral graph convolution and deep GCNs in block Krylov subspace forms, upon which we devise two architectures, both scalable in depth but making use of multi-scale information differently. On several node classification tasks, the proposed architectures achieve state-of-the-art performance.
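The core idea of the block Krylov formulation can be illustrated with a small sketch. This is not the authors' implementation; it is a minimal NumPy example, assuming the symmetric normalization with self-loops standard in GCNs, that builds the truncated block Krylov basis [X, AX, A²X, …] whose concatenated columns carry multi-scale neighborhood information.

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalized adjacency with self-loops:
    D^{-1/2} (A + I) D^{-1/2}, the propagation operator of standard GCNs."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def block_krylov_features(A_norm, X, m):
    """Concatenate [X, A X, A^2 X, ..., A^{m-1} X] column-wise:
    a truncated block Krylov basis mixing information from
    neighborhoods of radius 0 up to m-1."""
    blocks, cur = [X], X
    for _ in range(m - 1):
        cur = A_norm @ cur       # propagate one more hop
        blocks.append(cur)
    return np.concatenate(blocks, axis=1)

# Toy example: a 4-node path graph with 2 input features per node.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = np.arange(8, dtype=float).reshape(4, 2)
K = block_krylov_features(normalized_adjacency(A), X, m=3)
print(K.shape)  # (4, 6): each node carries features at 3 scales
```

A learnable layer would then apply a weight matrix (and a pointwise nonlinearity) to `K`, so that each layer sees all scales at once rather than a single hop, which is what allows depth without the usual over-smoothing bottleneck.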

Author Information

Sitao Luan (McGill University, Mila)

I’m a second-year Ph.D. student working with Professor Doina Precup and Professor Xiao-Wen Chang at the intersection of reinforcement learning and matrix computations. I’m interested in approximate dynamic programming and Krylov subspace methods, and I'm currently working on constructing basis functions for value function approximation in model-based reinforcement learning.

Mingde Zhao (Mila & McGill University)
Xiao-Wen Chang (McGill University)
Doina Precup (McGill University / Mila / DeepMind Montreal)
