

Poster in Workshop: NeurIPS 2023 Workshop on Diffusion Models

Sharp analysis of learning a flow-based generative model from limited sample complexity

Hugo Cui · Eric Vanden-Eijnden · Florent Krzakala · Lenka Zdeborová


Abstract: We study the problem of training a flow-based generative model, parametrized by a two-layer autoencoder, to sample from a high-dimensional Gaussian mixture. We provide a sharp end-to-end analysis of the problem. First, we provide a tight closed-form characterization of the learnt generative flow, when parametrized by a shallow denoising auto-encoder trained on a finite number n of samples from the target distribution. Building on this analysis, we provide closed-form formulae for the distance between the mean of the generated mixture and the mean of the target mixture, which we show decays as Θ_n(1/n). Finally, this rate is shown to be in fact Bayes-optimal.
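As a concrete, unofficial illustration of the setting, the sketch below trains a small time-conditioned network with a skip connection as the velocity field of a flow along the linear interpolant x_t = (1−t)z + t·x₁, using samples from a two-mode Gaussian mixture, then integrates the flow ODE from Gaussian noise and measures the error in the estimated mixture mean. The architecture (ShallowVelocity), objective, schedule, and all hyperparameters are illustrative assumptions, not the paper's exact construction.

```python
import torch

torch.manual_seed(0)
d, n = 50, 4000                        # data dimension, training-set size

# Target: symmetric two-mode Gaussian mixture with means +/- mu
# (an illustrative stand-in for the paper's high-dimensional mixture).
mu = torch.randn(d)
mu = mu / mu.norm()
labels = torch.randint(0, 2, (n, 1)) * 2.0 - 1.0
x1 = labels * mu + 0.3 * torch.randn(n, d)

# Shallow time-conditioned network with a skip connection, used as the
# flow's velocity field (hypothetical parametrization; the paper's exact
# two-layer autoencoder may differ).
class ShallowVelocity(torch.nn.Module):
    def __init__(self, d, width=16):
        super().__init__()
        self.skip = torch.nn.Linear(1, 1)      # time-dependent skip gain
        self.enc = torch.nn.Linear(d + 1, width)
        self.dec = torch.nn.Linear(width, d)

    def forward(self, x, t):
        h = torch.tanh(self.enc(torch.cat([x, t], dim=1)))
        return self.skip(t) * x + self.dec(h)

model = ShallowVelocity(d)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Flow-matching-style objective: along x_t = (1-t) z + t x1 with
# z ~ N(0, I), regress the interpolant's velocity x1 - z.
for step in range(3000):
    idx = torch.randint(0, n, (256,))
    z = torch.randn(256, d)
    t = torch.rand(256, 1)
    xt = (1 - t) * z + t * x1[idx]
    loss = ((model(xt, t) - (x1[idx] - z)) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()

# Sample: integrate dx/dt = b(x, t) from t=0 (noise) to t=1 with Euler
# steps, then compare each mode's empirical mean against +/- mu.
with torch.no_grad():
    x = torch.randn(1000, d)
    steps = 200
    for k in range(steps):
        t = torch.full((1000, 1), k / steps)
        x = x + model(x, t) / steps
    side = (x @ mu).sign().unsqueeze(1)        # assign samples to a mode
    err = ((side * x).mean(0) - mu).norm()
    print(f"mean estimation error: {err:.3f}")
```

Under the rate stated in the abstract, one would expect the measured mean error to shrink roughly as 1/n as the training-set size n grows, although this toy sketch makes no claim to reproduce the paper's exact constants or regime.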
