

Spotlight in Workshop: AI for Accelerated Materials Design (AI4Mat-2023)

ExPT: Synthetic Pretraining for Few-Shot Experimental Design

Tung Nguyen · Sudhanshu Agrawal · Aditya Grover

Keywords: [ experimental design ] [ synthetic pretraining ] [ black-box optimization ] [ foundation model ] [ transformers ]

Fri 15 Dec 8:40 a.m. PST — 8:50 a.m. PST

Abstract:

Experimental design for optimizing black-box functions is a fundamental problem in many science and engineering fields. In this problem, sample efficiency is crucial due to the time, money, and safety costs of real-world design evaluations. Existing approaches either rely on active data collection or access to large, labeled datasets of past experiments, making them impractical in many real-world scenarios. In this work, we address the more challenging yet realistic setting of few-shot experimental design, where only a few labeled data points of input designs and their corresponding values are available. We introduce Experiment Pretrained Transformers (ExPT), a foundation model for few-shot experimental design that combines unsupervised learning and in-context pretraining. In ExPT, we only assume knowledge of a finite collection of unlabeled data points from the input domain and pretrain a transformer neural network to optimize diverse synthetic functions defined over this domain. Unsupervised pretraining allows ExPT to adapt to any design task at test time in an in-context fashion by conditioning on a few labeled data points from the target task and generating the candidate optima. We evaluate ExPT on few-shot experimental design in challenging domains and demonstrate its superior generality and performance compared to existing methods. The source code is available at https://github.com/tung-nd/ExPT.git.
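The pretraining recipe described above — labeling points from an unlabeled input domain with diverse synthetic functions to build in-context episodes — can be sketched as follows. This is a minimal, illustrative data-pipeline sketch, not the paper's implementation: the use of random Fourier features as the synthetic function family, and all function and parameter names (`sample_synthetic_function`, `make_episode`, `n_context`), are assumptions for illustration.

```python
import numpy as np

def sample_synthetic_function(rng, dim, n_features=32, length_scale=1.0):
    # Random Fourier features approximate a draw from a Gaussian process with
    # an RBF kernel; one plausible stand-in for the "diverse synthetic
    # functions" the abstract mentions (illustrative choice, not the paper's).
    w = rng.normal(scale=1.0 / length_scale, size=(n_features, dim))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    a = rng.normal(size=n_features)

    def f(x):
        return np.cos(x @ w.T + b) @ a / np.sqrt(n_features)

    return f

def make_episode(rng, unlabeled_x, n_context):
    # Label the unlabeled domain points with a freshly sampled synthetic
    # function, then split them into a small labeled context and a query set.
    # A transformer would be pretrained to predict query values (or promising
    # designs) conditioned on the context pairs.
    f = sample_synthetic_function(rng, unlabeled_x.shape[1])
    y = f(unlabeled_x)
    idx = rng.permutation(len(unlabeled_x))
    ctx, qry = idx[:n_context], idx[n_context:]
    return (unlabeled_x[ctx], y[ctx]), (unlabeled_x[qry], y[qry])

rng = np.random.default_rng(0)
domain = rng.uniform(-1.0, 1.0, size=(100, 4))   # unlabeled design points
(ctx_x, ctx_y), (qry_x, qry_y) = make_episode(rng, domain, n_context=10)
print(ctx_x.shape, ctx_y.shape, qry_x.shape, qry_y.shape)
```

At test time, the analogous step is to condition on the few real labeled pairs from the target task instead of synthetic ones, which is what lets a single pretrained model adapt in-context without gradient updates.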
