

Poster

Minimax Estimation of Neural Net Distance

Kaiyi Ji · Yingbin Liang

Room 517 AB #104

Keywords: [ Learning Theory ] [ Information Theory ]


Abstract:

An important class of distance metrics proposed for training generative adversarial networks (GANs) is the integral probability metric (IPM), of which the neural net distance is the instance that captures practical GAN training via two neural networks. This paper investigates the minimax estimation problem of the neural net distance based on samples drawn from the two distributions. We develop the first known minimax lower bound on the estimation error of the neural net distance, as well as an upper bound on the estimation error of the empirical neural net distance that is tighter than an existing bound. Our lower and upper bounds match not only in their order in the sample size but also in their dependence on the norms of the parameter matrices of the neural networks, which justifies the empirical neural net distance as a good approximation of the true neural net distance for training GANs in practice.
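For context, the quantities discussed above can be written out explicitly. The following is a standard formulation (notation such as $\mathcal{F}_{\mathrm{nn}}$, $\mu$, $\nu$, and the sample sizes $m, n$ is generic, not taken from the paper): an IPM indexed by a function class $\mathcal{F}$, its specialization to a neural network class, and the empirical estimate obtained by replacing expectations with sample means.

$$
d_{\mathcal{F}}(\mu, \nu) = \sup_{f \in \mathcal{F}} \left( \mathbb{E}_{x \sim \mu}[f(x)] - \mathbb{E}_{y \sim \nu}[f(y)] \right),
$$

where taking $\mathcal{F} = \mathcal{F}_{\mathrm{nn}}$, a class of neural networks (e.g., with norm-constrained parameter matrices), yields the neural net distance. Given i.i.d. samples $x_1, \dots, x_m \sim \mu$ and $y_1, \dots, y_n \sim \nu$, the empirical neural net distance is

$$
\hat{d}_{\mathcal{F}_{\mathrm{nn}}}(\hat{\mu}_m, \hat{\nu}_n) = \sup_{f \in \mathcal{F}_{\mathrm{nn}}} \left( \frac{1}{m} \sum_{i=1}^{m} f(x_i) - \frac{1}{n} \sum_{j=1}^{n} f(y_j) \right).
$$

The paper's bounds concern how fast this empirical quantity converges to the true distance $d_{\mathcal{F}_{\mathrm{nn}}}(\mu, \nu)$ as $m$ and $n$ grow.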
