Poster
Divergence Frontiers for Generative Models: Sample Complexity, Quantization Effects, and Frontier Integrals
Lang Liu · Krishna Pillutla · Sean Welleck · Sewoong Oh · Yejin Choi · Zaid Harchaoui

Thu Dec 09 04:30 PM -- 06:00 PM (PST)

The spectacular success of deep generative models calls for quantitative tools to measure their statistical performance. Divergence frontiers have recently been proposed as an evaluation framework for generative models, due to their ability to measure the quality-diversity trade-off inherent to deep generative modeling. We establish non-asymptotic bounds on the sample complexity of divergence frontiers. We also introduce frontier integrals which provide summary statistics of divergence frontiers. We show how smoothed estimators such as Good-Turing or Krichevsky-Trofimov can overcome the missing mass problem and lead to faster rates of convergence. We illustrate the theoretical results with numerical examples from natural language processing and computer vision.
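As an informal illustration of the quantities mentioned in the abstract, the sketch below builds an add-1/2 (Krichevsky-Trofimov) smoothed estimate of a discrete distribution from sample counts and traces a divergence-frontier-style curve between two estimated distributions. The helper names and the specific frontier parameterization (KL divergences to the mixture λP + (1−λ)Q) are assumptions made for illustration, not the paper's exact definitions or code.

    import numpy as np

    def kt_smoothed_estimate(counts, support_size):
        # Krichevsky-Trofimov (add-1/2) smoothing: assigns nonzero mass to
        # unseen symbols, mitigating the missing mass problem.
        # (Illustrative helper; not the authors' implementation.)
        counts = np.asarray(counts, dtype=float)
        return (counts + 0.5) / (counts.sum() + 0.5 * support_size)

    def kl(p, q):
        # KL divergence between discrete distributions on a common support;
        # terms with p[i] == 0 contribute zero.
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def divergence_frontier(p, q, num_points=50):
        # Trace pairs (KL(P || R_lam), KL(Q || R_lam)) for the mixture
        # R_lam = lam * P + (1 - lam) * Q, lam in (0, 1).
        # This parameterization is an assumption for illustration.
        lambdas = np.linspace(0.01, 0.99, num_points)
        return [(kl(p, lam * p + (1 - lam) * q),
                 kl(q, lam * p + (1 - lam) * q)) for lam in lambdas]

    # Toy usage: estimate two distributions from samples, then trace the curve.
    rng = np.random.default_rng(0)
    k = 10  # support size
    p_counts = np.bincount(rng.integers(0, k, size=200), minlength=k)
    weights = np.linspace(1.0, 2.0, k)
    q_counts = np.bincount(rng.choice(k, size=200, p=weights / weights.sum()),
                           minlength=k)
    p_hat = kt_smoothed_estimate(p_counts, k)
    q_hat = kt_smoothed_estimate(q_counts, k)
    frontier = divergence_frontier(p_hat, q_hat)

Because the smoothed estimates place positive mass on every symbol, the KL terms in the sketch are always finite, which is the practical role smoothing plays in estimating such frontiers from finite samples.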

Author Information

Lang Liu (University of Washington)
Krishna Pillutla (University of Washington)
Sean Welleck (University of Washington)
Sewoong Oh (University of Washington)
Yejin Choi (University of Washington)
Zaid Harchaoui (University of Washington)
