
UniGAN: Reducing Mode Collapse in GANs using a Uniform Generator
Ziqi Pan · Li Niu · Liqing Zhang

Wed Nov 30 02:00 PM -- 04:00 PM (PST) @ Hall J #222
Despite the significant progress that has been made in the training of Generative Adversarial Networks (GANs), the mode collapse problem remains a major challenge, which refers to a lack of diversity in generated samples. In this paper, we propose a new type of generative diversity named uniform diversity, which relates to a newly proposed type of mode collapse named $u$-mode collapse, where the generated samples are distributed nonuniformly over the data manifold. From a geometric perspective, we show that uniform diversity is closely related to the generator uniformity property, and that the maximum uniform diversity is achieved if the generator is uniform. To learn a uniform generator, we propose UniGAN, a generative framework with a Normalizing Flow based generator and a simple yet sample-efficient generator uniformity regularization, which can be easily adapted to any other generative framework. A new type of diversity metric named udiv is also proposed to estimate the uniform diversity given a set of generated samples in practice. Experimental results verify the effectiveness of our UniGAN in learning a uniform generator and improving uniform diversity.
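The abstract's geometric intuition can be illustrated with a small sketch. This is an assumption-laden toy, not the paper's actual loss: for a normalizing-flow generator $g$, the per-sample $\log|\det J_g(z)|$ measures local volume change, and a generator that spreads a latent distribution uniformly over the data manifold has constant volume change everywhere. One simple (hypothetical) uniformity penalty is therefore the variance of the log-det-Jacobian across latent samples, which vanishes exactly when the volume change is constant. The function names below are illustrative, not from the paper.

```python
import numpy as np

def logdet_jacobian(weight):
    """log|det J| of a linear flow x = W z (constant in z for linear maps)."""
    _, logdet = np.linalg.slogdet(weight)
    return logdet

def uniformity_regularizer(logdets):
    """Variance of per-sample log-det terms; zero iff volume change is constant.

    A sketch of a generator-uniformity penalty: nonuniform generators
    compress some latent regions and stretch others, so their per-sample
    log-det values vary, yielding a positive penalty.
    """
    logdets = np.asarray(logdets, dtype=float)
    return float(np.var(logdets))

# Toy check: a single linear flow has the same log-det at every latent
# sample, so the uniformity penalty is exactly zero.
W = np.array([[2.0, 0.0], [0.0, 0.5]])
uniform_penalty = uniformity_regularizer([logdet_jacobian(W)] * 8)

# A "generator" whose volume change differs across samples (simulated here
# with two different linear maps) incurs a positive penalty.
W2 = np.array([[3.0, 0.0], [0.0, 1.0]])
mixed_penalty = uniformity_regularizer(
    [logdet_jacobian(W)] * 4 + [logdet_jacobian(W2)] * 4
)
```

In the paper's actual framework the regularizer is applied to a trainable Normalizing Flow generator inside the GAN objective; the linear maps above only stand in for the flow's Jacobian to make the constant-vs-varying volume-change distinction concrete.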

Author Information

Ziqi Pan (Shanghai Jiao Tong University)
Li Niu (Shanghai Jiao Tong University)
Liqing Zhang (Shanghai Jiao Tong University)
