Poster
Estimators for Multivariate Information Measures in General Probability Spaces
Arman Rahimzamani · Himanshu Asnani · Pramod Viswanath · Sreeram Kannan
Information theoretic quantities play an important role in various settings in machine learning, including causality testing, structure inference in graphical models, time-series problems, and feature selection, as well as in providing privacy guarantees. A key quantity of interest is the mutual information and generalizations thereof, including conditional mutual information, multivariate mutual information, total correlation and directed information. While the aforementioned information quantities are well defined in arbitrary probability spaces, existing estimators employ a $\Sigma H$ method, which works only in the purely discrete or the purely continuous case, since entropy (respectively, differential entropy) is well defined only in those regimes.
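To make the $\Sigma H$ route concrete, here is a minimal sketch of it for the purely discrete case, where mutual information is assembled from plug-in Shannon entropies as $I(X;Y)=H(X)+H(Y)-H(X,Y)$. The function names are ours, not from the paper; the point is that this decomposition has no analogue once a coordinate mixes a point mass with a density, which is exactly the failure mode the abstract describes.

```python
import math
from collections import Counter

def entropy(samples):
    # Plug-in Shannon entropy (in nats) of a discrete sample.
    n = len(samples)
    return -sum(c / n * math.log(c / n) for c in Counter(samples).values())

def mi_sigma_h(xs, ys):
    # Sigma-H estimate: I(X;Y) = H(X) + H(Y) - H(X,Y).
    # Only meaningful when both variables are purely discrete.
    return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))
```

For perfectly correlated bits, `mi_sigma_h([0, 0, 1, 1], [0, 0, 1, 1])` recovers $\log 2$; for independent bits it returns 0. Replacing `entropy` with a differential-entropy estimator breaks down as soon as a variable has an atom, since its differential entropy is not defined.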
In this paper, we define a general graph divergence measure ($\mathbb{GDM}$), generalizing the aforementioned information measures, and we construct a novel estimator via a coupling trick that directly estimates these multivariate information measures using the Radon-Nikodym derivative. These estimators are proven to be consistent in a general setting that includes several cases where the existing estimators fail, thus providing the only known estimators for the following settings: (1) the data has some discrete and some continuous valued components; (2) some (or all) of the components are themselves discrete-continuous \textit{mixtures}; (3) the data is real-valued but does not have a joint density on the entire space, being supported instead on a low-dimensional manifold. We show that our proposed estimators significantly outperform known estimators on synthetic and real datasets.
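The flavor of direct (non-$\Sigma H$) estimation can be illustrated with a small sketch in the spirit of $k$-nearest-neighbour estimators for discrete-continuous mixtures (cf. the authors' 2017 paper "Estimating Mutual Information for Discrete-Continuous Mixtures", listed below). This is our own illustrative implementation under those conventions, not the paper's $\mathbb{GDM}$ estimator: ties at zero distance signal a discrete mass and are handled by re-counting neighbours rather than by computing an entropy.

```python
import numpy as np

def _digamma(n):
    # psi(n) for a positive integer n: -gamma + H_{n-1}.
    return -0.5772156649015329 + sum(1.0 / j for j in range(1, n))

def mixed_mi(x, y, k=5):
    """k-NN mutual information estimate for scalar samples that tolerates
    discrete-continuous mixtures (illustrative sketch, O(n^2) distances)."""
    x = np.asarray(x, dtype=float).reshape(-1, 1)
    y = np.asarray(y, dtype=float).reshape(-1, 1)
    n = len(x)
    dx = np.abs(x - x.T)           # pairwise |x_i - x_j|
    dy = np.abs(y - y.T)
    dz = np.maximum(dx, dy)        # l_inf distance in the joint space
    total = 0.0
    for i in range(n):
        rho = np.sort(dz[i])[k]    # k-th neighbour distance (index 0 is self)
        if rho == 0.0:
            # Discrete tie: count every sample sitting exactly on this point.
            k_tilde = int(np.sum(dz[i] == 0.0)) - 1
        else:
            k_tilde = k
        nx = int(np.sum(dx[i] <= rho)) - 1   # marginal neighbour counts,
        ny = int(np.sum(dy[i] <= rho)) - 1   # self excluded
        total += _digamma(k_tilde) + np.log(n) - np.log(nx + 1) - np.log(ny + 1)
    return total / n
```

On data where each coordinate is a mixture of a point mass at 0 and a uniform density, this estimate stays finite and separates dependent from independent pairs, whereas a differential-entropy plug-in has no well-defined target to converge to.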
Author Information
Arman Rahimzamani (University of Washington)
Himanshu Asnani (University of Washington, Seattle)
Pramod Viswanath (UIUC)
Sreeram Kannan (University of Washington)
More from the Same Authors
- 2019 Poster: Turbo Autoencoder: Deep learning based channel codes for point-to-point communication channels
  Yihan Jiang · Hyeji Kim · Himanshu Asnani · Sreeram Kannan · Sewoong Oh · Pramod Viswanath
- 2018 Poster: Deepcode: Feedback Codes via Deep Learning
  Hyeji Kim · Yihan Jiang · Sreeram Kannan · Sewoong Oh · Pramod Viswanath
- 2017 Poster: Deanonymization in the Bitcoin P2P Network
  Giulia Fanti · Pramod Viswanath
- 2017 Poster: Estimating Mutual Information for Discrete-Continuous Mixtures
  Weihao Gao · Sreeram Kannan · Sewoong Oh · Pramod Viswanath
- 2017 Spotlight: Estimating Mutual Information for Discrete-Continuous Mixtures
  Weihao Gao · Sreeram Kannan · Sewoong Oh · Pramod Viswanath
- 2017 Poster: Discovering Potential Correlations via Hypercontractivity
  Hyeji Kim · Weihao Gao · Sreeram Kannan · Sewoong Oh · Pramod Viswanath
- 2016 Poster: Breaking the Bandwidth Barrier: Geometrical Adaptive Entropy Estimation
  Weihao Gao · Sewoong Oh · Pramod Viswanath
- 2015 Poster: Secure Multi-party Differential Privacy
  Peter Kairouz · Sewoong Oh · Pramod Viswanath
- 2014 Poster: Extremal Mechanisms for Local Differential Privacy
  Peter Kairouz · Sewoong Oh · Pramod Viswanath