We propose an information bottleneck principle for causal time-series prediction. We develop variational bounds on the information bottleneck objective that can be optimized using recurrent neural networks. We evaluate our algorithm on simulated data as well as on real-world weather-prediction and stock-market-prediction datasets, and show that these problems can be successfully solved using the new information bottleneck principle.
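To make the setup concrete, the following is a minimal sketch (not the paper's exact formulation) of a variational information bottleneck loss for sequence prediction: an RNN encoder maps the observed past to a stochastic code, and a decoder predicts the next value, trading off predictive accuracy against a KL compression term. All names (SeqVIB, beta, latent_dim, and so on) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SeqVIB(nn.Module):
    """Sketch of a sequential variational information bottleneck model."""
    def __init__(self, input_dim, latent_dim, hidden_dim, output_dim):
        super().__init__()
        self.rnn = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.to_mu = nn.Linear(hidden_dim, latent_dim)      # mean of q(z_t | x_{1:t})
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)   # log-variance of q(z_t | x_{1:t})
        self.decoder = nn.Linear(latent_dim, output_dim)     # predicts the next observation from z_t

    def forward(self, x):
        h, _ = self.rnn(x)                                   # (batch, time, hidden_dim)
        mu, logvar = self.to_mu(h), self.to_logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterized sample of z_t
        return self.decoder(z), mu, logvar

def vib_loss(y_pred, y_true, mu, logvar, beta=1e-3):
    # Prediction term: Gaussian log-likelihood up to a constant (squared error).
    pred = ((y_pred - y_true) ** 2).mean()
    # Compression term: KL(q(z|x) || N(0, I)), averaged over batch and time steps.
    kl = 0.5 * (mu.pow(2) + logvar.exp() - 1.0 - logvar).sum(-1).mean()
    return pred + beta * kl

# Toy usage: predict x_{t+1} from x_{1:t} on random data.
model = SeqVIB(input_dim=1, latent_dim=8, hidden_dim=32, output_dim=1)
x = torch.randn(16, 50, 1)
y_pred, mu, logvar = model(x[:, :-1])
loss = vib_loss(y_pred, x[:, 1:], mu, logvar)
loss.backward()
```

Here beta plays the role of the bottleneck trade-off parameter, weighting compression of the past against prediction of the future; the paper's actual bounds and architecture may differ.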