A Generative Framework for Exchangeable Graphs with Global and Local Latent Structure
Abstract
We introduce a generative framework for exchangeable graphs that combines a Set Transformer-based encoder-decoder architecture with a hierarchical latent space composed of a global variable and node-specific variables. The global latent is modeled via a diffusion process and serves as contextual input for node-level Gaussian mixtures. The decoder uses self-attention layers with global context injection to predict edge probabilities, ensuring high expressivity while preserving full permutation invariance. The architecture can operate without node features, relying only on the information in the adjacency matrix, which enables broad applicability beyond feature-rich domains. Through extensive experiments on synthetic benchmarks, including SBM, MMSBM, and the realistic LFR benchmark, we show that our approach accurately reproduces key graph statistics, especially for community-structured networks. The global and local latent variables provide meaningful graph- and node-level context, while the architecture remains scalable to medium-sized dense graphs (e.g., 500-1000 nodes). Overall, our framework balances expressiveness, interpretability, and structural fidelity, offering a versatile tool for modeling complex graph data.
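To make the hierarchical latent structure concrete, the following is a minimal, illustrative sketch of the kind of decoder described above (node-specific latents conditioned on a graph-level latent, mapped to edge probabilities via self-attention with global context injection), assuming a PyTorch-style implementation. All class, function, and hyperparameter names (e.g. EdgeDecoder, d_latent) are hypothetical and not taken from the paper.

```python
# Illustrative sketch, not the authors' implementation.
import torch
import torch.nn as nn

class EdgeDecoder(nn.Module):
    """Self-attention decoder with global context injection: node latents z_i,
    conditioned on a graph-level latent g, are mapped to edge probabilities."""
    def __init__(self, d_latent: int, d_model: int = 128, n_heads: int = 4, n_layers: int = 2):
        super().__init__()
        self.inp = nn.Linear(d_latent, d_model)
        self.ctx = nn.Linear(d_latent, d_model)   # projects the global latent g
        self.attn = nn.ModuleList(
            nn.MultiheadAttention(d_model, n_heads, batch_first=True) for _ in range(n_layers)
        )
        self.out = nn.Linear(d_model, d_model)

    def forward(self, z_nodes: torch.Tensor, g: torch.Tensor) -> torch.Tensor:
        # z_nodes: (B, N, d_latent) node-specific latents; g: (B, d_latent) global latent.
        h = self.inp(z_nodes) + self.ctx(g).unsqueeze(1)   # global context injection
        for layer in self.attn:
            a, _ = layer(h, h, h)                          # permutation-equivariant self-attention
            h = h + a
        h = self.out(h)
        logits = h @ h.transpose(1, 2)                     # symmetric pairwise scores (B, N, N)
        return torch.sigmoid(logits)                       # edge probabilities

# Usage: 8 graphs of 50 nodes each, 16-dimensional latents.
probs = EdgeDecoder(d_latent=16)(torch.randn(8, 50, 16), torch.randn(8, 16))
print(probs.shape)  # torch.Size([8, 50, 50])
```

Because the pairwise scores are computed symmetrically from permutation-equivariant node embeddings, relabeling the nodes permutes the rows and columns of the output identically, which is the property the abstract refers to as permutation invariance of the generative model.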