Bayesian nonparametric methods based on the Dirichlet process (DP), gamma process, and beta process have proven effective in capturing aspects of various datasets arising in machine learning. However, it is now recognized that such processes are limited in their ability to capture power law behavior. As a result, there is considerable interest in models based on the stable process (SP), generalized gamma process (GGP), and stable-beta process (SBP). These models present new challenges for practical statistical implementation. In analogy with tractable processes such as the finite-dimensional Dirichlet process, we describe a class of random processes, which we call iid finite-dimensional BFRY processes, that enables one to develop efficient posterior inference algorithms, such as variational Bayes, that readily scale to massive datasets. For illustrative purposes, we describe a simple variational Bayes algorithm for normalized SP mixture models, and demonstrate its usefulness with experiments on synthetic and real-world datasets.
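As a minimal sketch of the building block behind such priors, the code below draws iid BFRY(alpha) random variables via the standard representation of a BFRY variable as the ratio of a Gamma(1 - alpha, 1) variable to an independent Beta(alpha, 1) variable, then normalizes them into a random probability vector. The names (alpha, K) and the normalization step are illustrative assumptions for a finite-dimensional approximation, not details taken from the abstract.

```python
import numpy as np

def sample_bfry(alpha, size, rng=None):
    """Draw iid BFRY(alpha) random variables, alpha in (0, 1).

    Uses the representation X = G / B with G ~ Gamma(1 - alpha, 1)
    and B ~ Beta(alpha, 1) independent, whose ratio has the BFRY
    density alpha / Gamma(1 - alpha) * x**(-1 - alpha) * (1 - exp(-x)).
    """
    rng = np.random.default_rng() if rng is None else rng
    g = rng.gamma(shape=1.0 - alpha, scale=1.0, size=size)
    b = rng.beta(alpha, 1.0, size=size)
    return g / b

# Illustrative finite-dimensional approximation (assumed setup):
# K iid BFRY weights, normalized to a heavy-tailed probability vector.
alpha, K = 0.5, 1000
w = sample_bfry(alpha, K)
p = w / w.sum()
```

The ratio representation keeps sampling cheap (one gamma and one beta draw per weight), which is the kind of tractability that makes scalable inference schemes such as variational Bayes feasible.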
Author Information
Juho Lee (POSTECH)
Lancelot F James (HKUST)
Seungjin Choi (POSTECH)
More from the Same Authors
- 2015 Poster: Tree-Guided MCMC Inference for Normalized Random Measure Mixture Models
  Juho Lee · Seungjin Choi
- 2009 Poster: Clustering sequence sets for motif discovery
  Jong Kyoung Kim · Seungjin Choi