

Poster
in
Workshop: AI for Accelerated Materials Design (AI4Mat-2023)

MTENCODER: A Multi-task Pretrained Transformer Encoder for Materials Representation Learning

Thorben Prein · Elton Pan · Tom Doerr · Elsa Olivetti · Jennifer Rupp

Keywords: [ Inorganic materials ] [ Materials Informatics ] [ Representation Learning ]


Abstract:

Given the vast spectrum of material properties characterizing each compound, learning representations for inorganic materials is intricate. The prevailing trend within the materials informatics community leans towards designing specialized models that predict single properties. We introduce a multi-task learning framework, wherein a transformer-based encoder is co-trained across diverse materials properties and a denoising objective, resulting in robust and generalizable materials representations. Our method not only amplifies the performance observed in single-dataset pretraining, but also showcases scalability and adaptability toward multi-dataset pretraining. Experiments demonstrate that the trained encoder MTENCODER captures chemically meaningful representations, surpassing the performance of contemporary structure-agnostic materials encoders. This approach paves the way to improvements in a multitude of deep materials informatics tasks, prominently including materials property prediction and generation of synthesis routes for novel materials discovery.
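The co-training idea in the abstract, a shared encoder optimized jointly on several property-prediction losses plus a denoising reconstruction loss, can be sketched minimally. The following is an illustrative NumPy toy, not the paper's implementation: the dimensions, task names, and the single dense layer standing in for the transformer encoder are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes, purely illustrative (not from the paper).
n, d_in, d_hid = 8, 16, 32
tasks = ["band_gap", "formation_energy"]  # example property tasks

# Shared "encoder": one dense layer standing in for the transformer.
W_enc = rng.normal(scale=0.1, size=(d_in, d_hid))

# One regression head per property task, plus a decoder head that
# reconstructs the clean input from a corrupted one (denoising objective).
heads = {t: rng.normal(scale=0.1, size=(d_hid, 1)) for t in tasks}
W_dec = rng.normal(scale=0.1, size=(d_hid, d_in))

x = rng.normal(size=(n, d_in))                       # material features
x_noisy = x + rng.normal(scale=0.1, size=x.shape)    # corrupted input
y = {t: rng.normal(size=(n, 1)) for t in tasks}      # fake property labels

h = np.tanh(x_noisy @ W_enc)  # shared representation for all tasks

# Multi-task objective: sum of per-property MSEs plus the denoising MSE.
loss = sum(np.mean((h @ heads[t] - y[t]) ** 2) for t in tasks)
loss += np.mean((h @ W_dec - x) ** 2)
print(float(loss))
```

In training, gradients from every task-specific loss would flow into the shared encoder weights, which is what encourages a representation general enough to serve all properties at once.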
