
Dualing GANs
Yujia Li · Alex Schwing · Kuan-Chieh Wang · Richard Zemel

Wed Dec 06 11:50 AM -- 11:55 AM (PST) @ Hall C

Generative adversarial nets (GANs) are a promising technique for modeling a distribution from samples. It is, however, well known that GAN training suffers from instability due to the nature of its saddle-point formulation. In this paper, we explore ways to tackle the instability problem by dualizing the discriminator. We start from linear discriminators, in which case conjugate duality provides a mechanism to reformulate the saddle-point objective into a maximization problem, such that both the generator and the discriminator of this ‘dualing GAN’ act in concert. We then demonstrate how to extend this intuition to nonlinear formulations. For GANs with linear discriminators our approach removes the instability in training, while for GANs with nonlinear discriminators it provides an alternative to the commonly used GAN training algorithm.
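A minimal sketch of the core idea, under simplifying assumptions not taken from the paper itself: for a linear discriminator f(x) = wᵀx with a norm constraint ‖w‖₂ ≤ 1, the inner maximization over w of a feature-matching objective, max_{‖w‖≤1} wᵀ(μ_data − μ_gen), has the closed-form dual value ‖μ_data − μ_gen‖₂. The generator can then minimize this norm directly, replacing the unstable min–max alternation with a plain minimization. The numbers and shapes below are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(0)
real = rng.normal(loc=1.0, size=(1000, 4))  # samples from the data distribution
fake = rng.normal(loc=0.0, size=(1000, 4))  # samples from the generator

# Difference of empirical feature means.
diff = real.mean(axis=0) - fake.mean(axis=0)

# Primal: the best linear discriminator under the constraint ||w|| <= 1
# points along diff, so the inner maximum is achieved in closed form.
w_star = diff / np.linalg.norm(diff)
primal_value = w_star @ diff

# Dual: the same objective value, obtained without optimizing over w at all.
dual_value = np.linalg.norm(diff)

print(primal_value, dual_value)  # the two values agree
```

Because the dual value needs no inner optimization loop, the generator's training signal is a deterministic function of the current samples, which is the sense in which generator and discriminator "act in concert."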

Author Information

Yujia Li (University of Toronto)
Alex Schwing (University of Illinois at Urbana-Champaign)
Kuan-Chieh Wang (University of Toronto)
Richard Zemel (Vector Institute/University of Toronto)

Related Events (a corresponding poster, oral, or spotlight)

  • 2017 Poster: Dualing GANs
    Thu. Dec 7th 02:30 -- 06:30 AM Room Pacific Ballroom #103
