

Poster
in
Workshop: AI for Science: Progress and Promises

A Pareto-optimal compositional energy-based model for sampling and optimization of protein sequences

Nataša Tagasovska · Nathan Frey · Andreas Loukas · Isidro Hotzel · Julien Lafrance-Vanasse · Ryan Kelly · Yan Wu · Arvind Rajpal · Richard Bonneau · Kyunghyun Cho · Stephen Ra · Vladimir Gligorijevic


Abstract:

Deep generative models have emerged as a popular machine learning-based approach for inverse design problems in the life sciences. However, these problems often require sampling new designs that satisfy multiple properties of interest in addition to learning the data distribution. This multi-objective optimization becomes more challenging when properties are independent or orthogonal to each other. In this work, we propose a Pareto-compositional energy-based model (pcEBM), a framework that uses multiple gradient descent for sampling new designs that adhere to various constraints in optimizing distinct properties. We demonstrate its ability to learn non-convex Pareto fronts and generate sequences that simultaneously satisfy multiple desired properties across a series of real-world antibody design tasks.
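The abstract names the two key ingredients: a composition of property energies and multiple gradient descent for sampling. Below is a minimal sketch of that idea, not the authors' implementation: it assumes a continuous relaxation of the sequence (e.g. logits over amino acids), hypothetical differentiable energy functions, and a simplified two-objective min-norm weighting in place of a full MGDA solver, combined with a Langevin-style update.

```python
import torch


def min_norm_weight(g1, g2):
    """Closed-form min-norm convex combination of two gradients.

    Two-objective special case of MGDA-style weighting (the general case
    needs an iterative solver). Returns alpha such that
    alpha * g1 + (1 - alpha) * g2 has minimal norm, clipped to [0, 1].
    """
    diff = g1 - g2
    denom = diff.pow(2).sum().clamp_min(1e-12)
    alpha = ((g2 - g1) * g2).sum() / denom
    return alpha.clamp(0.0, 1.0)


def pareto_langevin_step(x, energy_fns, step_size=0.01, noise_scale=0.01):
    """One Langevin-style update along a Pareto-combined descent direction.

    `x` is a continuous relaxation of a sequence; `energy_fns` are
    differentiable energies (e.g. learned property or likelihood models).
    All names here are illustrative, not from the paper.
    """
    x = x.detach().requires_grad_(True)
    grads = [torch.autograd.grad(e(x).sum(), x)[0] for e in energy_fns]
    if len(grads) == 2:
        a = min_norm_weight(grads[0], grads[1])
        direction = a * grads[0] + (1.0 - a) * grads[1]
    else:
        # Fallback for >2 energies: uniform weights (a real MGDA solver
        # would optimize the simplex weights instead).
        direction = torch.stack(grads).mean(0)
    noise = noise_scale * torch.randn_like(x)
    return (x - step_size * direction + noise).detach()


# Toy usage: two quadratic "energies" standing in for learned models.
x = torch.randn(8, 20)  # 8 positions x 20 amino-acid logits (toy shape)
e1 = lambda z: (z - 1.0).pow(2).sum(-1)
e2 = lambda z: (z + 1.0).pow(2).sum(-1)
for _ in range(100):
    x = pareto_langevin_step(x, [e1, e2])
```

The min-norm weighting steers each update toward a direction that does not increase either energy, which is what lets the sampler trace Pareto-optimal trade-offs rather than optimizing a single fixed weighted sum.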
