Poster
Implicit Graph Neural Networks
Fangda Gu · Heng Chang · Wenwu Zhu · Somayeh Sojoudi · Laurent El Ghaoui

Thu Dec 10 09:00 PM -- 11:00 PM (PST) @ Poster Session 6 #1842

Graph Neural Networks (GNNs) are widely used deep learning models that learn meaningful representations from graph-structured data. Due to the finite nature of the underlying recurrent structure, current GNN methods may struggle to capture long-range dependencies in underlying graphs. To overcome this difficulty, we propose a graph learning framework, called Implicit Graph Neural Networks (IGNN), where predictions are based on the solution of a fixed-point equilibrium equation involving implicitly defined "state" vectors. We use the Perron-Frobenius theory to derive sufficient conditions that ensure well-posedness of the framework. Leveraging implicit differentiation, we derive a tractable projected gradient descent method to train the framework. Experiments on a comprehensive range of tasks show that IGNNs consistently capture long-range dependencies and outperform state-of-the-art GNN models.
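
The listing itself contains no code, but as a rough illustration of the mechanism the abstract describes, the sketch below shows a fixed-point iteration for an equilibrium equation of the form X = φ(W X A + b(U)), together with a norm-based rescaling of W motivated by the Perron-Frobenius well-posedness condition. The function names, the choice of φ = ReLU, and the exact sufficient condition used here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def ignn_fixed_point(W, A, bias, max_iter=300, tol=1e-6):
    """Solve the equilibrium equation X = phi(W @ X @ A + bias) by
    simple fixed-point iteration, with phi = ReLU as an example choice.

    W    : (d, d) trainable weight matrix
    A    : (n, n) (normalized) adjacency matrix of the graph
    bias : (d, n) input-dependent affine term, standing in for b(U)
    """
    X = np.zeros_like(bias)
    for _ in range(max_iter):
        X_new = np.maximum(W @ X @ A + bias, 0.0)  # phi = ReLU
        if np.linalg.norm(X_new - X) < tol:
            break
        X = X_new
    return X

def project_weights(W, A, kappa=0.98):
    """Rescale W so that ||W||_inf * rho(A) <= kappa < 1, where rho(A) is
    the spectral (Perron-Frobenius) radius of the nonnegative matrix A.
    This is one sufficient condition under which the fixed-point
    iteration above is a contraction and hence well posed.
    """
    rho = np.max(np.abs(np.linalg.eigvals(A)))    # spectral radius of A
    norm_inf = np.max(np.sum(np.abs(W), axis=1))  # infinity norm of W
    scale = kappa / max(norm_inf * rho, kappa)    # shrink only if needed
    return W * scale

# Small usage example on a random 5-node graph with 4-dimensional states.
rng = np.random.default_rng(0)
A = rng.random((5, 5)); A = A / A.sum(axis=0)    # column-normalized adjacency
W = project_weights(rng.standard_normal((4, 4)), A)
bias = rng.standard_normal((4, 5))
X = ignn_fixed_point(W, A, bias)
```

In a full training loop, gradients with respect to W and the input mapping would be obtained via implicit differentiation through the equilibrium, and a projection step like `project_weights` would keep the iterates inside the well-posed region; the projected-gradient training procedure itself is not reproduced here.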

Author Information

Fangda Gu (UC Berkeley)

Fangda Gu is a second-year PhD student in Electrical Engineering and Computer Sciences at the University of California, Berkeley, advised by Professor Laurent El Ghaoui. His research interests include machine learning, optimization, control, and power systems. Before coming to Berkeley, he completed an undergraduate degree in Physics at Tsinghua University.

Heng Chang (Tsinghua University)
Wenwu Zhu (Tsinghua University)
Somayeh Sojoudi (University of California, Berkeley)
Laurent El Ghaoui (UC Berkeley)