Poster
Scalable Structure Learning of Continuous-Time Bayesian Networks from Incomplete Data
Dominik Linzner · Michael Schmidt · Heinz Koeppl

Tue Dec 10th 05:30 -- 07:30 PM @ East Exhibition Hall B + C #169

Continuous-time Bayesian networks (CTBNs) represent a compact yet powerful framework for understanding multivariate time-series data. Given complete data, parameters and structure can be estimated efficiently in closed form. If the data are incomplete, however, the latent states of the CTBN must be estimated by laboriously simulating the intractable dynamics of the assumed CTBN. This is especially problematic for structure learning, where it must be done for each element of a super-exponentially growing set of candidate structures. To circumvent this notorious bottleneck, we develop a novel gradient-based approach to structure learning. Instead of sampling and scoring every possible structure individually, we assume the generator of the CTBN to be composed as a mixture of generators stemming from different structures. In this framework, structure learning can be performed via gradient-based optimization of the mixture weights. We combine this approach with a new variational method that allows a closed-form calculation of the mixture marginal likelihood. We demonstrate the scalability of our method by learning structures of previously inaccessible sizes from synthetic and real-world data.
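The mixture-of-generators idea can be illustrated on a toy problem. The sketch below is not the paper's method (which handles incomplete data via a variational marginal likelihood); it is a hypothetical complete-data analogue: two candidate rate matrices `Q1` and `Q2` stand in for the generators induced by two candidate structures, the mixture weights are parameterized by a softmax over `theta`, and the weights are fitted by gradient ascent on the continuous-time Markov chain log-likelihood of observed sufficient statistics. All names and numbers here are illustrative assumptions.

```python
import numpy as np

# Toy illustration (not the paper's method): mix two candidate generators
# of a 2-state continuous-time Markov chain and fit the mixture weights
# by gradient ascent on the complete-data log-likelihood.

# Candidate generators (rows sum to zero); these stand in for generators
# induced by two different candidate structures.
Q1 = np.array([[-1.0, 1.0], [2.0, -2.0]])
Q2 = np.array([[-0.1, 0.1], [0.1, -0.1]])
candidates = [Q1, Q2]

# Sufficient statistics of an (assumed) observed trajectory:
# T[i] = total dwell time in state i, M[i, j] = number of i -> j jumps.
# These statistics are chosen to match Q1 exactly (M[i, j] / T[i] = Q1[i, j]).
T = np.array([3.0, 1.5])
M = np.array([[0.0, 3.0], [3.0, 0.0]])

def mixture(theta):
    """Softmax-weighted mixture of the candidate generators."""
    w = np.exp(theta - theta.max())
    w /= w.sum()
    return w, sum(wk * Qk for wk, Qk in zip(w, candidates))

def log_lik(Q):
    """Complete-data CTMC log-likelihood from sufficient statistics."""
    ll = 0.0
    n = Q.shape[0]
    for i in range(n):
        for j in range(n):
            if i != j:
                ll += M[i, j] * np.log(Q[i, j]) - T[i] * Q[i, j]
    return ll

# Gradient ascent on the mixture weights via central finite differences.
theta = np.zeros(len(candidates))
eps, lr = 1e-5, 0.5
for _ in range(200):
    grad = np.zeros_like(theta)
    for k in range(len(theta)):
        up, dn = theta.copy(), theta.copy()
        up[k] += eps
        dn[k] -= eps
        grad[k] = (log_lik(mixture(up)[1]) - log_lik(mixture(dn)[1])) / (2 * eps)
    theta += lr * grad

w, _ = mixture(theta)
# The weights concentrate on Q1, whose rates match the statistics.
print("mixture weights:", w)
```

Because the off-diagonal rates of both candidates are positive and the softmax keeps the weights on the simplex, the mixture generator remains a valid rate matrix throughout the optimization, so the search over structures reduces to a smooth, unconstrained problem in `theta`.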

Author Information

Dominik Linzner (Technische Universität Darmstadt)
Michael Schmidt (Technische Universität Darmstadt)
Heinz Koeppl (Technische Universität Darmstadt)
