

Poster

Practical and Consistent Estimation of f-Divergences

Paul Rubenstein · Olivier Bousquet · Josip Djolonga · Carlos Riquelme · Ilya Tolstikhin

East Exhibition Hall B, C #51

Keywords: [ Frequentist Statistics ] [ Theory ] [ Deep Learning -> Deep Autoencoders ] [ Probabilistic Methods -> Latent Variable Models ] [ Learning Theory ]


Abstract:

Estimating an f-divergence between two probability distributions from samples is a fundamental problem in statistics and machine learning. Most prior work studies this problem under very weak assumptions, under which it is provably hard. We instead consider stronger structural assumptions that are commonly satisfied in modern machine learning, including representation learning and generative modelling with autoencoder architectures. Under these assumptions we propose and study an estimator that is easy to implement, works well in high dimensions, and enjoys faster rates of convergence. We verify its behavior empirically in both synthetic and real-data experiments, and discuss its direct implications for the estimation of total correlation, entropy, and mutual information.
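For context only: an f-divergence is defined as D_f(P || Q) = E_{x~Q}[ f(p(x)/q(x)) ] for a convex generator f with f(1) = 0; KL divergence corresponds to f(t) = t log t. The sketch below is a plain Monte Carlo plug-in estimate under the (strong) assumption that both densities can be evaluated. It is not the estimator proposed in the paper, and the Gaussian distributions are hypothetical placeholders chosen purely for illustration.

```python
import numpy as np
from scipy import stats

# Background sketch (NOT the paper's estimator): Monte Carlo estimate of
# D_f(P || Q) = E_{x~Q}[ f(p(x)/q(x)) ], assuming both densities are known.

def f_kl(t):
    # Generator f(t) = t * log(t) recovers the KL divergence KL(P || Q).
    return t * np.log(t)

def f_divergence_mc(f, log_p, log_q, samples_q):
    # Empirical average of f(p(x)/q(x)) over samples x ~ Q.
    ratio = np.exp(log_p(samples_q) - log_q(samples_q))
    return np.mean(f(ratio))

# Hypothetical example distributions (placeholders, not from the paper).
rng = np.random.default_rng(0)
p = stats.norm(loc=1.0, scale=1.0)   # P = N(1, 1)
q = stats.norm(loc=0.0, scale=1.0)   # Q = N(0, 1)
x = q.rvs(size=100_000, random_state=rng)

est = f_divergence_mc(f_kl, p.logpdf, q.logpdf, x)
print(f"Monte Carlo estimate of KL(P || Q): {est:.4f}")  # closed form: 0.5
```

The paper's setting differs in that such plug-in access to both densities is generally unavailable; the structural assumptions it studies (e.g. autoencoder-style models) are what make a practical, consistent estimator possible.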
