
Workshop: New Frontiers in Graph Learning

A Simple Hypergraph Kernel Convolution based on Discounted Markov Diffusion Process

Fuyang Li · Jiying Zhang · Xi Xiao · bin zhang · Dijun Luo

Keywords: [ Hypergraph ] [ Graph Convolutional Networks ] [ Diffusion Kernel ] [ Node Classification ]


Kernels on discrete structures evaluate pairwise similarities between objects, capturing both semantics and inherent topological information. Existing kernels on discrete structures are built only from topology information (such as the adjacency matrix of a graph), without considering the original attributes of the objects. This paper proposes a two-phase paradigm that aggregates comprehensive information on discrete structures, leading to a Discounted Markov Diffusion Learnable Kernel (DMDLK). Specifically, based on the underlying projection of the DMDLK, we design a Simple Hypergraph Kernel Convolution (SHKC) for hidden representations of vertices. SHKC adjusts the number of diffusion steps rather than stacking convolution layers to aggregate information from long-range neighborhoods, which avoids the over-smoothing issue of existing hypergraph convolutions. Moreover, we use the uniform stability bound theorem of transductive learning to analyze, from a theoretical perspective, the critical factors behind the effectiveness and generalization ability of SHKC. Experimental results on several benchmark datasets for node classification verify the superior performance of SHKC over state-of-the-art methods.
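To make the core idea concrete, the following is a minimal sketch (not the authors' implementation) of a discounted Markov diffusion over a hypergraph: a random-walk transition matrix is built from the vertex-hyperedge incidence matrix, and vertex features are aggregated over a geometrically discounted sum of diffusion steps, so that the reach of aggregation is controlled by the number of steps rather than by stacking layers. The normalization scheme, the discount factor `gamma`, and the function names are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def hypergraph_transition(H):
    """Random-walk transition matrix of a hypergraph, built from its
    vertex-by-hyperedge incidence matrix H (n_vertices x n_edges).
    A step goes from a vertex to a hyperedge it belongs to, then to a
    vertex in that hyperedge (standard two-hop normalization; this is
    an assumed scheme, not necessarily the paper's)."""
    Dv = H.sum(axis=1)                      # vertex degrees
    De = H.sum(axis=0)                      # hyperedge degrees
    P = (H / Dv[:, None]) @ (H / De).T      # row-stochastic (n x n)
    return P

def discounted_diffusion(H, X, gamma=0.5, steps=3):
    """Aggregate vertex features X over `steps` diffusion steps,
    with each step discounted geometrically by `gamma` (illustrative
    sketch of the discounted-diffusion idea)."""
    P = hypergraph_transition(H)
    out = X.copy()
    Pk = np.eye(H.shape[0])
    for k in range(1, steps + 1):
        Pk = Pk @ P                         # k-step transition matrix
        out = out + (gamma ** k) * (Pk @ X)
    # Normalize so the discounted weights sum to 1.
    return out / (1.0 + sum(gamma ** k for k in range(1, steps + 1)))
```

Increasing `steps` widens the neighborhood each vertex aggregates from without adding trainable layers, which is the mechanism the abstract credits for avoiding over-smoothing.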
