

Poster in Workshop: Machine Learning in Structural Biology Workshop

Enhancing Antibody Language Models with Structural Information

Justin Barton · Jacob Galson · Jinwoo Leem


Abstract:

A central tenet of molecular biology is that a protein’s amino acid sequence determines its three-dimensional structure, and thus its function. However, proteins with similar sequences do not always fold into the same shape, and, conversely, dissimilar sequences can adopt similar folds. In this work, we explore antibodies, a class of proteins in the immune system whose local shapes are highly unpredictable, even under small variations in sequence. Inspired by the CLIP method [1], we propose a multimodal contrastive learning approach, contrastive sequence-structure pre-training (CSSP), which amalgamates the representations of antibody sequences and structures in a mutual latent space. Integrating structural information leads both antibody and protein language models to show better correspondence with structural similarity, and improves accuracy and data efficiency in downstream binding prediction tasks. We provide an optimised CSSP-trained model, AbFormer-CSSP, for non-commercial use at [HuggingFace link redacted for anonymity].
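The CLIP-style objective referenced above pulls each sequence embedding toward the embedding of its own structure while pushing it away from the structures of other antibodies in the batch. The sketch below shows a minimal symmetric contrastive (InfoNCE) loss of this kind in NumPy; the function name, temperature value, and embedding shapes are illustrative assumptions, not details from the paper.

```python
import numpy as np

def _log_softmax(x, axis):
    # Numerically stable log-softmax along the given axis
    x = x - x.max(axis=axis, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=axis, keepdims=True))

def cssp_contrastive_loss(seq_emb, struct_emb, temperature=0.07):
    """CLIP-style symmetric contrastive loss between a batch of sequence
    embeddings and the corresponding structure embeddings.

    seq_emb, struct_emb: arrays of shape (batch, dim); row i of each array
    comes from the same antibody. All names here are illustrative.
    """
    # L2-normalise so the dot product is cosine similarity
    seq = seq_emb / np.linalg.norm(seq_emb, axis=1, keepdims=True)
    struct = struct_emb / np.linalg.norm(struct_emb, axis=1, keepdims=True)

    # Pairwise similarity logits; matched pairs lie on the diagonal
    logits = seq @ struct.T / temperature

    # Cross-entropy in both directions (sequence -> structure and back),
    # averaged, as in CLIP's symmetric objective
    loss_seq = -np.mean(np.diag(_log_softmax(logits, axis=1)))
    loss_struct = -np.mean(np.diag(_log_softmax(logits, axis=0)))
    return (loss_seq + loss_struct) / 2
```

With perfectly aligned, mutually dissimilar pairs the loss approaches zero, while random pairings yield a loss near log(batch size); minimising it drives the two modalities into the mutual latent space described in the abstract.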
