
Bayesian GANs
Yunus Saatci · Andrew Wilson

Wed Dec 06 11:40 AM -- 11:45 AM (PST) @ Hall C

Generative adversarial networks (GANs) can implicitly learn rich distributions over images, audio, and other data that are hard to model with an explicit likelihood. We present a practical Bayesian formulation for unsupervised and semi-supervised learning with GANs. We use stochastic gradient Hamiltonian Monte Carlo to marginalize the weights of the generator and discriminator networks. The resulting approach is straightforward and obtains good performance without any standard interventions such as feature matching or mini-batch discrimination. By exploring an expressive posterior over the parameters of the generator, the Bayesian GAN avoids mode collapse, produces interpretable candidate samples with notable variability, and in particular provides state-of-the-art quantitative results for semi-supervised learning on benchmarks including SVHN, CelebA, and CIFAR-10, outperforming DCGAN, Wasserstein GANs, and DCGAN ensembles.
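The key mechanism in the abstract is marginalizing network weights with stochastic gradient Hamiltonian Monte Carlo (SGHMC). As a minimal sketch, the SGHMC update below samples from a toy 1-D Gaussian posterior rather than GAN weights; the step size, friction term, and target distribution are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def grad_log_post(theta, mu=2.0, sigma=1.0):
    """Gradient of a toy log-posterior, log N(theta; mu, sigma^2)."""
    return -(theta - mu) / sigma**2

def sghmc(n_steps=5000, eta=1e-2, alpha=0.1, seed=0):
    """SGHMC sampler: momentum update with friction (alpha) and
    injected noise whose scale matches the friction, plus a
    gradient step of size eta."""
    rng = np.random.default_rng(seed)
    theta, v, samples = 0.0, 0.0, []
    for _ in range(n_steps):
        noise = rng.normal(0.0, np.sqrt(2 * alpha * eta))
        v = (1 - alpha) * v + eta * grad_log_post(theta) + noise
        theta = theta + v
        samples.append(theta)
    # Discard the first half as burn-in.
    return np.array(samples[n_steps // 2:])

samples = sghmc()
print(samples.mean())  # close to the posterior mean of 2.0
```

In the paper's setting, `theta` would be the full weight vector of the generator (or discriminator) and the gradient would come from a mini-batch estimate of the GAN posterior, yielding an ensemble of sampled networks instead of a single point estimate.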

Author Information

Yunus Saatci (Uber AI Labs)
Andrew Wilson (Cornell University)

Related Events (a corresponding poster, oral, or spotlight)

  • 2017 Poster: Bayesian GAN
    Thu. Dec 7th 02:30 -- 06:30 AM Room Pacific Ballroom #112
