Poster in Workshop: Information-Theoretic Principles in Cognitive Systems (InfoCog)

A Work in Progress: Tighter Bounds on the Information Bottleneck for Deep Learning

Nir Weingarten · Moshe Butman · Ran Gilad-Bachrach

Fri 15 Dec 12:40 p.m. PST — 1:30 p.m. PST

Abstract:

The field of Deep Neural Nets (DNNs) is still evolving, and new architectures are emerging to better extract information from available data. The Information Bottleneck (IB) offers an optimal information-theoretic framework for data modeling; however, the IB is intractable in most settings. In recent years, attempts have been made to combine deep learning with the IB, both for optimization and to explain the inner workings of DNNs. VAE-inspired variational approximations such as the Variational Information Bottleneck (VIB) have become a popular way to bound the required mutual information terms. This work continues in this direction by introducing a new tractable variational upper bound on the IB functional that is empirically tighter than previous bounds. When used as an objective function, it improves on previous IB-inspired DNNs in test accuracy and in robustness to adversarial attacks across several challenging tasks. Furthermore, information-theoretic tools allow us to analyze the experiments and confirm theoretical predictions on real-world problems.
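For context, the VIB-style objectives the abstract refers to pair a cross-entropy bound on I(Z;Y) with a KL bound on I(X;Z). The sketch below shows the standard VIB objective of Alemi et al. in PyTorch; it is not the tighter bound introduced in this work, and the layer sizes and the beta value are illustrative assumptions.

    # Minimal sketch of the standard VIB objective (Alemi et al.), the kind of
    # variational IB bound the abstract builds on -- not this paper's new bound.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class VIB(nn.Module):
        def __init__(self, in_dim=784, z_dim=256, n_classes=10):  # sizes are assumptions
            super().__init__()
            self.encoder = nn.Sequential(nn.Linear(in_dim, 1024), nn.ReLU(),
                                         nn.Linear(1024, 2 * z_dim))  # outputs mu and log-variance
            self.decoder = nn.Linear(z_dim, n_classes)

        def forward(self, x):
            mu, logvar = self.encoder(x).chunk(2, dim=-1)
            # Reparameterization trick: z ~ N(mu, sigma^2)
            z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
            return self.decoder(z), mu, logvar

    def vib_loss(logits, y, mu, logvar, beta=1e-3):
        # Variational lower bound on I(Z;Y): decoder cross-entropy.
        ce = F.cross_entropy(logits, y)
        # Variational upper bound on I(X;Z): KL(q(z|x) || N(0, I)), averaged over the batch.
        kl = 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1).sum(dim=-1).mean()
        return ce + beta * kl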
