Poster
in
Workshop: INTERPOLATE — First Workshop on Interpolation Regularizers and Beyond

LSGANs with Gradient Regularizers are Smooth High-dimensional Interpolators

Siddarth Asokan · Chandra Seelamantula

Keywords: [ higher-order gradient penalty ] [ generative adversarial networks ] [ variational calculus ] [ radial basis function ] [ Least-squares GAN ] [ polyharmonic function ]


Abstract:

We consider the problem of discriminator optimization in least-squares generative adversarial networks (LSGANs) subject to higher-order gradient regularization enforced on the convex hull of all possible interpolation points between the target (real) and generated (fake) data. We analyze the proposed LSGAN cost within a variational framework and show that the optimal discriminator solves a regularized least-squares problem and can be represented by a polyharmonic radial basis function (RBF) interpolator. The optimal RBF discriminator can be implemented in closed form, with the weights computed by solving a linear system of equations. We validate the proposed approach on synthetic Gaussian and standard image datasets. While the optimal LSGAN discriminator leads to superior convergence on Gaussian data, the inherently low-dimensional manifold structure of images makes the implementation of the optimal discriminator ill-posed. Nevertheless, on 2-D Gaussian data, replacing the trainable discriminator network with the closed-form RBF interpolator yields superior convergence while overcoming common pitfalls in GAN training, namely mode dropping and mode collapse.
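The abstract states that the optimal discriminator is a polyharmonic RBF interpolator whose weights come from solving a linear system. The sketch below is not the authors' exact formulation, but a minimal, self-contained illustration of that class of interpolator: a 2-D thin-plate spline (polyharmonic kernel φ(r) = r² log r) with an affine polynomial tail, fit by solving the standard saddle-point linear system. The function name `fit_polyharmonic_rbf` and the tiny ridge term `eps` are our own choices for illustration.

```python
import numpy as np

def fit_polyharmonic_rbf(x, y, eps=1e-8):
    """Fit a 2-D polyharmonic (thin-plate spline) RBF interpolator.

    x : (n, 2) array of sample locations (e.g. real/fake data points)
    y : (n,)   array of target values (e.g. discriminator labels)
    eps : tiny ridge term for numerical stability (an assumption,
          not part of the paper's closed-form solution)
    Returns a callable that evaluates the interpolator at query points.
    """
    n = x.shape[0]
    # Pairwise distances between sample points.
    r = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    # Thin-plate kernel phi(r) = r^2 log r, with phi(0) = 0 by continuity.
    with np.errstate(divide="ignore", invalid="ignore"):
        K = np.where(r > 0, r**2 * np.log(r), 0.0)
    # Affine polynomial tail [1, x1, x2] makes the problem well-posed.
    P = np.hstack([np.ones((n, 1)), x])
    # Saddle-point system: [K P; P^T 0] [w; c] = [y; 0].
    A = np.block([[K + eps * np.eye(n), P],
                  [P.T, np.zeros((3, 3))]])
    b = np.concatenate([y, np.zeros(3)])
    sol = np.linalg.solve(A, b)
    w, c = sol[:n], sol[n:]

    def interpolate(q):
        """Evaluate the fitted interpolator at query points q of shape (m, 2)."""
        rq = np.linalg.norm(q[:, None, :] - x[None, :, :], axis=-1)
        with np.errstate(divide="ignore", invalid="ignore"):
            Kq = np.where(rq > 0, rq**2 * np.log(rq), 0.0)
        Pq = np.hstack([np.ones((q.shape[0], 1)), q])
        return Kq @ w + Pq @ c

    return interpolate
```

In a GAN-style use, `x` would stack real and generated samples (and points on the hull between them) with `y` holding the least-squares targets; the interpolator then plays the role of the closed-form discriminator, so no discriminator network is trained.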
