

Oral Poster

Graph Diffusion Transformers for Multi-Conditional Molecular Generation

Gang Liu · Jiaxin Xu · Tengfei Luo · Meng Jiang

East Exhibit Hall A-C #2610
Fri 13 Dec 4:30 p.m. PST — 7:30 p.m. PST
 
Oral presentation: Oral Session 6A: Machine Learning and Science, Safety
Fri 13 Dec 3:30 p.m. PST — 4:30 p.m. PST

Abstract:

Inverse molecular design with diffusion models holds great potential for advances in materials and drug discovery. Despite their success in unconditional molecule generation, integrating multiple properties, such as synthetic score and gas permeability, as conditioning constraints into diffusion models remains unexplored. We present the Graph Diffusion Transformer (Graph DiT) for multi-conditional molecular generation. Graph DiT uses a condition encoder to learn representations of numerical and categorical properties, and a Transformer-based graph denoiser to denoise molecular graphs under these conditions. Unlike previous graph diffusion models, which add noise to atoms and bonds separately in the forward diffusion process, we propose a graph-dependent noise model for training Graph DiT, designed to accurately estimate graph-related noise in molecules. We extensively validate Graph DiT on multi-conditional polymer and small-molecule generation. The results demonstrate the superiority of our method across metrics ranging from distribution learning to condition control over molecular properties. A polymer inverse-design task for gas separation, with feedback from domain experts, further demonstrates its practical utility. The code is available at https://github.com/liugangcode/Graph-DiT.
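The forward corruption process mentioned in the abstract can be illustrated with a minimal sketch of discrete graph diffusion: atom and bond categories are corrupted toward their data marginals under a shared noise level. All names, shapes, and the marginal-based transition matrix below are illustrative assumptions in the style of discrete graph diffusion, not the paper's actual graph-dependent noise model, which couples atom and bond noise more tightly.

```python
import numpy as np

def transition_matrix(alpha_t, marginal):
    """Q_t = alpha_t * I + (1 - alpha_t) * 1 m^T: with prob. alpha_t keep the
    category, otherwise resample from the marginal distribution m."""
    K = marginal.shape[0]
    return alpha_t * np.eye(K) + (1.0 - alpha_t) * np.ones((K, 1)) @ marginal[None, :]

def noise_graph(X, E, alpha_t, m_atoms, m_bonds, rng):
    """One forward-noising step on a molecular graph.

    X: (n, Ka) one-hot atom types; E: (n, n, Kb) one-hot bond types.
    Returns a noised (Xt, Et) with Et kept symmetric.
    """
    Qa = transition_matrix(alpha_t, m_atoms)
    Qb = transition_matrix(alpha_t, m_bonds)
    probs_X = X @ Qa  # per-atom categorical distributions
    probs_E = E @ Qb  # per-bond categorical distributions
    Xt = np.array([rng.multinomial(1, p / p.sum()) for p in probs_X])
    Et = np.zeros_like(E)
    n = E.shape[0]
    for i in range(n):
        for j in range(i + 1, n):  # sample upper triangle, mirror it
            p = probs_E[i, j] / probs_E[i, j].sum()
            e = rng.multinomial(1, p)
            Et[i, j] = Et[j, i] = e
    return Xt, Et
```

At `alpha_t = 1` the graph is unchanged; as `alpha_t` decreases toward 0, atoms and bonds are increasingly resampled from their marginals, which is the prior the denoiser learns to invert.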
