Probabilistic graphical models use local factors to represent dependence among sets of variables. For many problem domains, such as climatology and epidemiology, in addition to local dependencies we may also wish to model heavy-tailed statistics, in which extreme deviations should not be treated as outliers. Specifying such distributions using graphical models for probability density functions (PDFs) generally leads to intractable inference and learning. Cumulative distribution networks (CDNs) provide a means to tractably specify multivariate heavy-tailed models as a product of cumulative distribution functions (CDFs). Currently, algorithms for inference and learning, which correspond to computing mixed derivatives, are exact only for tree-structured graphs. For graphs of arbitrary topology, an efficient algorithm is needed that takes advantage of the sparse structure of the model, which symbolic differentiation programs such as Mathematica and D* do not. We present an algorithm for recursively decomposing the computation of derivatives for CDNs of arbitrary topology, where the decomposition is naturally described using junction trees. We compare the performance of the resulting algorithm to Mathematica and D*, and we apply our method to learning models for rainfall and H1N1 data, where we show that CDNs with cycles provide significantly better fits to the data than tree-structured and unstructured CDNs and other heavy-tailed multivariate distributions such as the multivariate copula and logistic models.
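To make the core operation concrete, here is a minimal sketch of the idea the abstract describes: a CDN specifies the joint CDF as a product of local CDF factors, and the joint PDF is recovered by taking the mixed derivative with respect to all variables. The tiny two-variable model below (a product of Gumbel CDF factors) is a hypothetical example of our own for illustration, not the paper's algorithm; it uses generic symbolic differentiation via SymPy, which is exactly the approach the paper improves upon for large, sparse graphs.

```python
import sympy as sp

x1, x2 = sp.symbols('x1 x2', real=True)

# A Gumbel CDF used as a local factor (hypothetical choice for illustration;
# any function whose mixed derivatives are non-negative can serve as a factor)
G = lambda t: sp.exp(-sp.exp(-t))

# Toy CDN: the joint CDF is a product of CDF factors over subsets of variables
F = G(x1) * G(x2) * G((x1 + x2) / 2)

# Inference/learning in a CDN requires the mixed derivative of the CDF
# with respect to all variables, which yields the joint PDF
pdf = sp.diff(F, x1, x2)

# Evaluate at a point; a valid density must be non-negative
val = float(pdf.subs({x1: 0.0, x2: 0.0}))
```

For a graph this small, brute-force symbolic differentiation is trivial; the difficulty the paper addresses is that the number of terms in the mixed derivative grows quickly with the number of variables and factors, so an algorithm that decomposes the computation along a junction tree is needed for loopy graphs of realistic size.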
Author Information
Jim C Huang (Microsoft Research)
Nebojsa Jojic (Microsoft Research)
Christopher Meek (Microsoft Research)
Related Events (a corresponding poster, oral, or spotlight)
-
2010 Poster: Exact inference and learning for cumulative distribution functions on loopy graphs »
Wed. Dec 8th 08:00 -- 08:00 AM
More from the Same Authors
-
2021 : Few-Shot Learning Evaluation in Natural Language Understanding »
Subhabrata Mukherjee · Xiaodong Liu · Guoqing Zheng · Saghar Hosseini · Hao Cheng · Ge Yang · Christopher Meek · Ahmed Awadallah · Jianfeng Gao -
2022 Poster: Diffusion Models as Plug-and-Play Priors »
Alexandros Graikos · Nikolay Malkin · Nebojsa Jojic · Dimitris Samaras -
2020 Workshop: Causal Discovery and Causality-Inspired Machine Learning »
Biwei Huang · Sara Magliacane · Kun Zhang · Danielle Belgrave · Elias Bareinboim · Daniel Malinsky · Thomas Richardson · Christopher Meek · Peter Spirtes · Bernhard Schölkopf -
2016 Poster: Iterative Refinement of the Approximate Posterior for Directed Belief Networks »
R Devon Hjelm · Russ Salakhutdinov · Kyunghyun Cho · Nebojsa Jojic · Vince Calhoun · Junyoung Chung -
2014 Poster: Recursive Inversion Models for Permutations »
Christopher Meek · Marina Meila -
2013 Poster: Documents as multiple overlapping windows into grids of counts »
Alessandro Perina · Nebojsa Jojic · Manuele Bicego · Andrzej Truski -
2013 Poster: A Comparative Framework for Preconditioned Lasso Algorithms »
Fabian L Wauthier · Nebojsa Jojic · Michael Jordan -
2013 Demonstration: Making Smooth Topical Connections on Touch Devices »
Nebojsa Jojic · Alessandro Perina · Andrzej Truski -
2011 Poster: A Model for Temporal Dependencies in Event Streams »
Asela Gunawardana · Christopher Meek · Puyang Xu -
2010 Poster: Structural epitome: a way to summarize one’s visual experience »
Nebojsa Jojic · Alessandro Perina · Vittorio Murino -
2009 Poster: Free energy score space »
Alessandro Perina · Marco Cristani · Umberto Castellani · Vittorio Murino · Nebojsa Jojic -
2008 Poster: MAS: a multiplicative approximation scheme for probabilistic inference »
Ydo Wexler · Christopher Meek -
2008 Oral: MAS: a multiplicative approximation scheme for probabilistic inference »
Ydo Wexler · Christopher Meek -
2008 Poster: Structured ranking learning using cumulative distribution networks »
Jim C Huang · Brendan J Frey