A theoretical performance analysis of the graph neural network (GNN) is presented. For classification tasks, the neural network approach has the advantage of flexibility: it can be employed in a data-driven manner, whereas Bayesian inference requires the assumption of a specific model. A fundamental question is then whether the GNN achieves high accuracy in addition to this flexibility. Moreover, whether the achieved performance is predominantly a result of backpropagation or of the architecture itself is a matter of considerable interest. To gain better insight into these questions, a mean-field theory of a minimal GNN architecture is developed for the graph partitioning problem, and it shows good agreement with numerical experiments.
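To illustrate the question of architecture versus backpropagation, the sketch below shows one way a minimal, untrained GNN could be used for graph partitioning: random node states are propagated through a few graph-convolution-like layers with fixed random weights, and the resulting node embeddings are clustered. The specific choices here (tanh nonlinearity, random weight matrix, k-means on the final states, layer depth and width) are illustrative assumptions, not the authors' exact model.

```python
# Minimal sketch (assumed details, not the paper's exact architecture):
# propagate random node features through an untrained GNN, then cluster.
import numpy as np
from sklearn.cluster import KMeans

def minimal_gnn_partition(adj, n_groups=2, dim=16, depth=10, seed=0):
    """Partition a graph by propagating random node states through `depth`
    graph-convolution-like layers with a fixed random weight matrix,
    then clustering the resulting node embeddings with k-means."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    x = rng.standard_normal((n, dim))                     # random initial node states
    w = rng.standard_normal((dim, dim)) / np.sqrt(dim)    # untrained (random) weights
    for _ in range(depth):
        x = np.tanh(adj @ x @ w)                          # neighborhood aggregation + nonlinearity
    return KMeans(n_clusters=n_groups, n_init=10).fit_predict(x)

# Usage example: a small graph with two planted groups, dense within and sparse between.
rng = np.random.default_rng(1)
probs = np.kron(np.array([[0.5, 0.05], [0.05, 0.5]]), np.ones((50, 50)))
A = (rng.random((100, 100)) < probs).astype(float)
A = np.triu(A, 1)
A = A + A.T                                               # symmetric adjacency, no self-loops
print(minimal_gnn_partition(A, n_groups=2))
```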
Author Information
Tatsuro Kawamoto (National Institute of Advanced Industrial Science and Technology)
Masashi Tsubaki (National Institute of Advanced Industrial Science and Technology (AIST))
Tomoyuki Obuchi (Tokyo Institute of Technology)