Poster in Workshop: Causal Inference Challenges in Sequential Decision Making: Bridging Theory and Practice

A Variational Information Bottleneck Principle for Recurrent Neural Networks

Adrian Tovar · Varun Jog


Abstract:

We propose an information bottleneck principle for causal time-series prediction. We develop variational bounds on the information bottleneck objective function that can be optimized using recurrent neural networks. We implement our algorithm on simulated data as well as real-world weather-prediction and stock-market-prediction datasets and show that these problems can be successfully solved using the new information bottleneck principle.
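The abstract does not specify the architecture or objective in detail. The sketch below is only an illustration of the general recipe the abstract describes: a recurrent encoder produces a stochastic latent at each step, a decoder makes a causal one-step-ahead prediction, and the loss combines the prediction error with a beta-weighted KL term, as in the standard variational information bottleneck. The GRU encoder, Gaussian prior, MSE prediction loss, and beta value are all assumptions made for this example, not the authors' method.

```python
# Hypothetical sketch of a variational information-bottleneck objective for
# causal sequence prediction with an RNN encoder. Architecture, prior, and
# hyperparameters are illustrative assumptions, not the paper's implementation.
import torch
import torch.nn as nn


class RecurrentVIB(nn.Module):
    def __init__(self, input_dim: int, latent_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.encoder = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.to_mu = nn.Linear(hidden_dim, latent_dim)      # mean of q(z_t | x_{1:t})
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)  # log-variance of q(z_t | x_{1:t})
        self.decoder = nn.Linear(latent_dim, input_dim)     # predicts x_{t+1} from z_t

    def forward(self, x):
        h, _ = self.encoder(x)                # causal: h_t depends only on x_{1:t}
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return self.decoder(z), mu, logvar


def vib_loss(model, x, beta: float = 1e-3):
    """Prediction error plus beta-weighted KL(q(z_t | x_{1:t}) || N(0, I))."""
    pred, mu, logvar = model(x[:, :-1, :])    # predict x_{t+1} from the past only
    mse = ((pred - x[:, 1:, :]) ** 2).mean()
    kl = 0.5 * (mu.pow(2) + logvar.exp() - logvar - 1.0).mean()
    return mse + beta * kl


if __name__ == "__main__":
    model = RecurrentVIB(input_dim=3, latent_dim=8)
    x = torch.randn(16, 50, 3)                # batch of 16 sequences of length 50
    loss = vib_loss(model, x)
    loss.backward()
    print(float(loss))
```

Varying beta trades off prediction accuracy against the amount of information the latent retains about the past, which is the bottleneck behavior the abstract refers to.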