In this paper, from a theoretical perspective, we study how powerful graph neural networks (GNNs) can be for learning approximation algorithms for combinatorial problems. To this end, we first establish a new class of GNNs that can solve a strictly wider variety of problems than existing GNNs. Then, we bridge the gap between GNN theory and the theory of distributed local algorithms. With the aid of the theory of distributed local algorithms, we theoretically demonstrate that the most powerful GNN can learn approximation algorithms for the minimum dominating set problem and the minimum vertex cover problem with certain approximation ratios. We also show that most existing GNNs, such as GIN, GAT, GCN, and GraphSAGE, cannot achieve better approximation ratios than these. This paper is the first to elucidate approximation ratios of GNNs for combinatorial problems. Furthermore, we prove that adding a coloring or weak coloring to each node's features improves these approximation ratios. This indicates that preprocessing and feature engineering theoretically strengthen model capabilities.
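To illustrate the feature-engineering idea mentioned at the end of the abstract, below is a minimal sketch (not the authors' implementation) of augmenting node features with a one-hot encoding of a graph coloring before running a GNN. It assumes networkx and numpy are available and uses networkx's greedy coloring as a stand-in for the (weak) coloring discussed in the paper; the function name is illustrative.

```python
import networkx as nx
import numpy as np

def add_coloring_features(G: nx.Graph, X: np.ndarray) -> np.ndarray:
    """Concatenate a one-hot node coloring to the existing node features X (n x d)."""
    # Greedy coloring as an illustrative stand-in for the coloring/weak-coloring features.
    coloring = nx.coloring.greedy_color(G, strategy="largest_first")
    num_colors = max(coloring.values()) + 1
    onehot = np.zeros((G.number_of_nodes(), num_colors))
    for node, color in coloring.items():
        onehot[node, color] = 1.0
    return np.concatenate([X, onehot], axis=1)

# Usage: a small random graph with constant (all-ones) initial node features.
G = nx.gnp_random_graph(10, 0.3, seed=0)
X = np.ones((10, 1))
X_aug = add_coloring_features(G, X)
print(X_aug.shape)  # (10, 1 + number_of_colors)
```

The augmented features X_aug would then be fed to any GNN in place of the original node features.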
Author Information
Ryoma Sato (Kyoto University)
I am a first-year master's student at the Kashima-Yamada Lab, Kyoto University. Interests: Machine Learning on Graphs, Discrete Algorithms
Makoto Yamada (Kyoto University/RIKEN AIP)
Hisashi Kashima (Kyoto University/RIKEN Center for AIP)
More from the Same Authors
- 2021 Poster: Adversarial Regression with Doubly Non-negative Weighting Matrices »
  Tam Le · Truyen Nguyen · Makoto Yamada · Jose Blanchet · Viet Anh Nguyen
- 2021 Poster: Dynamic Sasvi: Strong Safe Screening for Norm-Regularized Least Squares »
  Hiroaki Yamada · Makoto Yamada
- 2020 Poster: Fast Unbalanced Optimal Transport on a Tree »
  Ryoma Sato · Makoto Yamada · Hisashi Kashima
- 2020 Poster: Neural Methods for Point-wise Dependency Estimation »
  Yao-Hung Hubert Tsai · Han Zhao · Makoto Yamada · Louis-Philippe Morency · Russ Salakhutdinov
- 2020 Spotlight: Neural Methods for Point-wise Dependency Estimation »
  Yao-Hung Hubert Tsai · Han Zhao · Makoto Yamada · Louis-Philippe Morency · Russ Salakhutdinov
- 2019 Poster: Fast Sparse Group Lasso »
  Yasutoshi Ida · Yasuhiro Fujiwara · Hisashi Kashima
- 2019 Poster: Theoretical evidence for adversarial robustness through randomization »
  Rafael Pinot · Laurent Meunier · Alexandre Araujo · Hisashi Kashima · Florian Yger · Cedric Gouy-Pailler · Jamal Atif
- 2019 Poster: Kernel Stein Tests for Multiple Model Comparison »
  Jen Ning Lim · Makoto Yamada · Bernhard Schölkopf · Wittawat Jitkrittum
- 2019 Poster: Tree-Sliced Variants of Wasserstein Distances »
  Tam Le · Makoto Yamada · Kenji Fukumizu · Marco Cuturi
- 2018 Poster: Persistence Fisher Kernel: A Riemannian Manifold Kernel for Persistence Diagrams »
  Tam Le · Makoto Yamada