

Poster

AdaFlow

Xixi Hu · Qiang Liu · Xingchao Liu · Bo Liu

West Ballroom A-D #7309
Wed 11 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

Diffusion-based imitation learning improves Behavioral Cloning (BC) on multi-modal decision-making, but comes at the cost of significantly slower inference due to the recursion in the diffusion process. This motivates the design of efficient policy generators that retain the ability to produce diverse actions. To address this challenge, we propose AdaFlow, an imitation learning framework based on flow-based generative modeling. AdaFlow represents the policy with state-conditioned ordinary differential equations (ODEs), known as probability flows. We reveal an intriguing connection between the conditional variance of the training loss and the discretization error of the ODEs. With this insight, we propose a variance-adaptive ODE solver that adjusts its step size at inference time, making AdaFlow an adaptive decision-maker that offers rapid inference without sacrificing diversity. Interestingly, it automatically reduces to a one-step generator when the action distribution is uni-modal. Our comprehensive empirical evaluation shows that AdaFlow achieves high performance with fast inference speed.
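To make the variance-adaptive idea concrete, the sketch below shows one way such a solver might look: a state-conditioned probability-flow ODE integrated with Euler steps whose size shrinks when an estimated conditional variance is large and grows toward a single step when it is near zero. This is only an illustration inferred from the abstract, not the authors' implementation; the networks `velocity_net` and `variance_net`, the tolerance `eta`, and the specific step-size rule are all assumptions.

```python
# Hypothetical sketch of a variance-adaptive ODE sampler for a flow-based policy.
# Assumed interfaces: velocity_net(x, t, state) -> flow velocity (same shape as x),
# variance_net(x, t, state) -> per-sample conditional-variance estimate.
import torch
import torch.nn as nn


def adaptive_flow_sample(
    velocity_net: nn.Module,
    variance_net: nn.Module,
    state: torch.Tensor,        # conditioning state, shape (batch, state_dim)
    action_dim: int,
    eta: float = 0.1,           # tolerance: larger -> coarser, fewer steps (assumed rule)
    max_steps: int = 20,
) -> torch.Tensor:
    """Integrate dx/dt = v(x, t | s) from t = 0 to t = 1 with variance-adaptive Euler steps."""
    batch = state.shape[0]
    x = torch.randn(batch, action_dim)   # start from the base (noise) distribution
    t = torch.zeros(batch, 1)

    for _ in range(max_steps):
        v = velocity_net(x, t, state)
        var = variance_net(x, t, state).clamp(min=0.0)   # estimated conditional variance
        # Low variance (near-deterministic, uni-modal flow) permits a large step;
        # in the limit var -> 0 the solver jumps straight to t = 1 (one-step generation).
        dt = torch.clamp(eta / (var + eta), max=1.0)
        dt = torch.minimum(dt, 1.0 - t)                  # never overshoot t = 1
        x = x + dt * v                                   # Euler update
        t = t + dt
        if bool((t >= 1.0 - 1e-6).all()):
            break
    return x


if __name__ == "__main__":
    # Toy demo with untrained networks, only to show the expected call signature.
    class MLP(nn.Module):
        def __init__(self, in_dim: int, out_dim: int):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(in_dim, 64), nn.ReLU(), nn.Linear(64, out_dim))

        def forward(self, x, t, s):
            return self.net(torch.cat([x, t, s], dim=-1))

    state_dim, action_dim = 8, 2
    v_net = MLP(action_dim + 1 + state_dim, action_dim)
    var_net = MLP(action_dim + 1 + state_dim, 1)
    actions = adaptive_flow_sample(v_net, var_net, torch.randn(4, state_dim), action_dim)
    print(actions.shape)  # torch.Size([4, 2])
```

Under this assumed rule, the number of integration steps is decided per sample at inference time, which is what lets a uni-modal action distribution collapse to a single generator step while multi-modal states still receive a finer discretization.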
