Poster
RoMA: Robust Model Adaptation for Offline Model-based Optimization
Sihyun Yu · Sungsoo Ahn · Le Song · Jinwoo Shin

Wed Dec 08 04:30 PM -- 06:00 PM (PST)

We consider the problem of searching for an input that maximizes a black-box objective function, given a static dataset of input-output queries. A popular approach to this problem is to maintain a proxy model, e.g., a deep neural network (DNN), that approximates the true objective function. Here, the main challenge is avoiding adversarially optimized inputs during the search, i.e., inputs where the DNN severely overestimates the true objective function. To handle this issue, we propose a new framework, coined robust model adaptation (RoMA), based on gradient-based optimization of inputs over the DNN. Specifically, it consists of two steps: (a) a pre-training strategy to robustly train the proxy model and (b) a novel adaptation procedure that makes the proxy model's estimates robust for a specific set of candidate solutions. At a high level, our scheme utilizes a local smoothness prior to overcome the brittleness of the DNN. Experiments on various tasks show the effectiveness of RoMA compared with previous methods, obtaining state-of-the-art results: RoMA outperforms all baselines on 4 out of 6 tasks and achieves runner-up results on the remaining tasks.
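To make the search procedure described above concrete, the following is a minimal illustrative sketch, not the paper's implementation: gradient ascent on the input through a fixed, differentiable proxy. The toy quadratic proxy and its optimum `c` are hypothetical stand-ins for a trained DNN.

```python
import numpy as np

# Hypothetical optimum of the toy proxy; a real proxy would be a DNN
# fit to the static dataset of input-output queries.
c = np.array([1.0, -2.0, 0.5])

def proxy(x):
    """Toy differentiable proxy f(x) = -||x - c||^2."""
    return -np.sum((x - c) ** 2)

def proxy_grad(x):
    """Analytic gradient of the toy proxy w.r.t. the input."""
    return -2.0 * (x - c)

x = np.zeros(3)        # candidate input; in practice initialized from the dataset
lr = 0.1
for _ in range(100):   # gradient ascent on the input, proxy weights held fixed
    x = x + lr * proxy_grad(x)
```

With this well-behaved toy proxy, `x` simply converges toward `c`. With a brittle DNN proxy, the same search can instead drift to adversarial inputs where the proxy overestimates the true objective, which is the failure mode RoMA's robust pre-training and adaptation steps are designed to prevent.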

Author Information

Sihyun Yu (Korea Advanced Institute of Science and Technology)
Sungsoo Ahn (MBZUAI)
Le Song (Georgia Institute of Technology)
Jinwoo Shin (KAIST)
