Poster

Language models enable zero-shot prediction of the effects of mutations on protein function

Joshua Meier · Roshan Rao · Robert Verkuil · Jason Liu · Tom Sercu · Alex Rives

Keywords: [ Self-Supervised Learning ]


Abstract:

Modeling the effect of sequence variation on function is a fundamental problem for understanding and designing proteins. Since evolution encodes information about function into patterns in protein sequences, unsupervised models of variant effects can be learned from sequence data. The standard approach to date has been to fit a model to a family of related sequences. This setting is limited, since a new model must be trained for each prediction task. We show that, using only zero-shot inference, without any supervision from experimental data or additional training, protein language models capture the functional effects of sequence variation, performing at the state of the art.
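To make the zero-shot idea concrete, the sketch below scores a single amino-acid substitution as the difference in log-probability between the mutant and wild-type residue at the mutated position, with that position masked (a masked-marginal style score). It assumes the open-source `fair-esm` package is installed (`pip install fair-esm`); the specific model checkpoint, helper names, and example sequence are illustrative assumptions, not necessarily the exact configuration reported in the paper.

```python
# Minimal sketch of zero-shot variant-effect scoring with a protein language model.
# Assumes the `fair-esm` package; loading the model downloads pretrained weights.
import torch
import esm

model, alphabet = esm.pretrained.esm1v_t33_650M_UR90S_1()
model.eval()
batch_converter = alphabet.get_batch_converter()


def score_mutation(sequence: str, pos: int, wt: str, mt: str) -> float:
    """Zero-shot score: log p(mutant) - log p(wild type) at the masked position."""
    assert sequence[pos] == wt, "wild-type residue does not match the sequence"
    _, _, tokens = batch_converter([("protein", sequence)])
    # The converter prepends a beginning-of-sequence token, so residue i is token i + 1.
    tokens[0, pos + 1] = alphabet.mask_idx
    with torch.no_grad():
        logits = model(tokens)["logits"]
    log_probs = torch.log_softmax(logits[0, pos + 1], dim=-1)
    return (log_probs[alphabet.get_idx(mt)] - log_probs[alphabet.get_idx(wt)]).item()


# Hypothetical usage: score substituting glycine for alanine at position 6 (0-indexed).
# print(score_mutation("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", 6, "A", "G"))
```

A positive score suggests the model assigns the mutant residue higher likelihood than the wild type in its sequence context; scores of this kind are what the paper correlates with experimentally measured variant effects, without any task-specific training.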
