Poster

Sequential Signal Mixing Aggregation for Message Passing Graph Neural Networks

Mitchell Keren Taraday · Almog David · Chaim Baskin

East Exhibit Hall A-C #2810
Thu 12 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

Message Passing Graph Neural Networks (MPGNNs) have emerged as the preferred method for modeling complex interactions across diverse graph entities. While the theory of such models is well understood, their aggregation module has not received sufficient attention. Sum-based aggregators have solid theoretical foundations regarding their separation capabilities; in practice, however, practitioners often prefer more complex aggregations and mixtures of diverse aggregations. In this work, we unveil a possible explanation for this gap: we claim that sum-based aggregators fail to "mix" features belonging to distinct neighbors, preventing them from succeeding at downstream tasks. To this end, we introduce Sequential Signal Mixing Aggregation (SSMA), a novel plug-and-play aggregation for MPGNNs. SSMA treats the neighbor features as 2D discrete signals and sequentially convolves them, inherently enhancing the ability to mix features attributed to distinct neighbors. Through extensive experiments, we show that combining SSMA with well-established MPGNN architectures yields substantial performance gains across various benchmarks, achieving new state-of-the-art results in many settings. We published our code at \githubrepo
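The core idea of treating neighbor features as 2D discrete signals and sequentially convolving them can be illustrated with a minimal NumPy sketch. Everything here is an assumption for illustration only, not the paper's exact construction: the grid size, the zero-padding embedding of feature vectors into the grid, and the use of circular convolution implemented as a pointwise product of 2D DFTs. Because the product of DFTs is commutative, the sketch is permutation invariant over neighbors, as an aggregator must be.

```python
import numpy as np

def ssma_sketch(neighbor_feats, m=8, n=8):
    """Illustrative sketch (not the paper's implementation): embed each
    neighbor's feature vector into an m x n grid and sequentially
    circularly convolve the resulting 2D signals. Sequential circular
    convolution equals a pointwise product in the 2D DFT domain, so every
    output entry mixes features from all neighbors."""
    signals = []
    for x in neighbor_feats:
        # Hypothetical embedding: zero-pad the d-dim vector into the grid.
        s = np.zeros(m * n)
        s[: len(x)] = x
        signals.append(s.reshape(m, n))
    # Accumulate the product of 2D DFTs (identity for convolution is
    # all-ones in the frequency domain).
    F = np.ones((m, n), dtype=complex)
    for s in signals:
        F *= np.fft.fft2(s)
    # Invert to obtain the sequential circular convolution of all signals.
    mixed = np.real(np.fft.ifft2(F))
    return mixed.reshape(-1)
```

Since multiplication in the frequency domain is commutative and associative, reordering the neighbors leaves the aggregated output unchanged.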
