Multiple modalities often co-occur when describing natural phenomena. Learning a joint representation of these modalities should yield deeper and more useful representations. Previous generative approaches to multi-modal input either do not learn a joint distribution or require additional computation to handle missing data. Here, we introduce a multimodal variational autoencoder (MVAE) that uses a product-of-experts inference network and a sub-sampled training paradigm to solve the multi-modal inference problem. Notably, our model shares parameters to efficiently learn under any combination of missing modalities. We apply the MVAE to four datasets and match state-of-the-art performance using many fewer parameters. In addition, we show that the MVAE is directly applicable to weakly-supervised learning and is robust to incomplete supervision. We then consider two case studies: one of learning image transformations---edge detection, colorization, segmentation---as a set of modalities, followed by one of machine translation between two languages. We find appealing results across this range of tasks.
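As a rough illustration of the product-of-experts step mentioned in the abstract, the sketch below (PyTorch; the function name, tensor layout, and the inclusion of a unit-Gaussian "prior expert" are assumptions for this example, not the authors' released code) combines the Gaussian posteriors produced by each observed modality's inference network into a single joint Gaussian. The product of Gaussians is again Gaussian, with precision equal to the sum of the experts' precisions and a precision-weighted mean.

```python
import torch

def product_of_experts(mu, logvar, eps=1e-8):
    """Combine Gaussian experts into one Gaussian via a product of experts.

    mu, logvar: tensors of shape (num_experts, batch, latent_dim), holding the
    per-modality Gaussian parameters for the modalities actually observed,
    plus a standard-normal prior expert.
    """
    precision = 1.0 / (torch.exp(logvar) + eps)          # each expert's precision
    joint_var = 1.0 / precision.sum(dim=0)               # product precision = sum of precisions
    joint_mu = (mu * precision).sum(dim=0) * joint_var   # precision-weighted mean
    return joint_mu, torch.log(joint_var + eps)

# Example: two observed modalities plus the prior expert N(0, I).
batch, latent_dim = 4, 8
mu_prior = torch.zeros(1, batch, latent_dim)
logvar_prior = torch.zeros(1, batch, latent_dim)
mu_img, logvar_img = torch.randn(1, batch, latent_dim), torch.randn(1, batch, latent_dim)
mu_txt, logvar_txt = torch.randn(1, batch, latent_dim), torch.randn(1, batch, latent_dim)
joint_mu, joint_logvar = product_of_experts(
    torch.cat([mu_prior, mu_img, mu_txt], dim=0),
    torch.cat([logvar_prior, logvar_img, logvar_txt], dim=0),
)
```

Because every expert enters only through its precision and precision-weighted mean, omitting a modality just means leaving its expert out of the concatenation, which is how a single set of parameters can serve any combination of observed modalities in this kind of setup.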
Author Information
Mike Wu (Stanford University)
Noah Goodman (Stanford University)
More from the Same Authors
- 2021 : DABS: a Domain-Agnostic Benchmark for Self-Supervised Learning
  Alex Tamkin · Vincent Liu · Rongfei Lu · Daniel Fein · Colin Schultz · Noah Goodman
- 2021 : Learning to solve complex tasks by growing knowledge culturally across generations
  Michael Tessler · Jason Madeano · Pedro Tsividis · Noah Goodman · Josh Tenenbaum
- 2021 : Spotlight Talk: Learning to solve complex tasks by growing knowledge culturally across generations
  Noah Goodman · Josh Tenenbaum · Michael Tessler · Jason Madeano
- 2021 : Multi-party referential communication in complex strategic games
  Jessica Mankewitz · Veronica Boyce · Brandon Waldon · Georgia Loukatou · Dhara Yu · Jesse Mu · Noah Goodman · Michael C Frank
- 2021 Workshop: Meaning in Context: Pragmatic Communication in Humans and Machines
  Jennifer Hu · Noga Zaslavsky · Aida Nematzadeh · Michael Franke · Roger Levy · Noah Goodman
- 2021 : Opening remarks
  Jennifer Hu · Noga Zaslavsky · Aida Nematzadeh · Michael Franke · Roger Levy · Noah Goodman
- 2021 Poster: Emergent Communication of Generalizations
  Jesse Mu · Noah Goodman
- 2021 Poster: Contrastive Reinforcement Learning of Symbolic Reasoning Domains
  Gabriel Poesia · WenXin Dong · Noah Goodman
- 2021 Poster: Improving Compositionality of Neural Networks by Decoding Representations to Inputs
  Mike Wu · Noah Goodman · Stefano Ermon
- 2021 Panel: The Consequences of Massive Scaling in Machine Learning
  Noah Goodman · Melanie Mitchell · Joelle Pineau · Oriol Vinyals · Jared Kaplan
- 2020 Poster: Language Through a Prism: A Spectral Approach for Multiscale Language Representations
  Alex Tamkin · Dan Jurafsky · Noah Goodman
- 2019 Poster: Variational Bayesian Optimal Experimental Design
  Adam Foster · Martin Jankowiak · Elias Bingham · Paul Horsfall · Yee Whye Teh · Thomas Rainforth · Noah Goodman
- 2019 Spotlight: Variational Bayesian Optimal Experimental Design
  Adam Foster · Martin Jankowiak · Elias Bingham · Paul Horsfall · Yee Whye Teh · Thomas Rainforth · Noah Goodman
- 2018 Poster: Bias and Generalization in Deep Generative Models: An Empirical Study
  Shengjia Zhao · Hongyu Ren · Arianna Yuan · Jiaming Song · Noah Goodman · Stefano Ermon
- 2018 Spotlight: Bias and Generalization in Deep Generative Models: An Empirical Study
  Shengjia Zhao · Hongyu Ren · Arianna Yuan · Jiaming Song · Noah Goodman · Stefano Ermon
- 2017 : Morning panel discussion
  Jürgen Schmidhuber · Noah Goodman · Anca Dragan · Pushmeet Kohli · Dhruv Batra
- 2017 : "Language in context"
  Noah Goodman
- 2017 : Contributed talk: Beyond Sparsity: Tree-based Regularization of Deep Models for Interpretability
  Mike Wu · Sonali Parbhoo · Finale Doshi-Velez
- 2017 Poster: Learning Disentangled Representations with Semi-Supervised Deep Generative Models
  Siddharth Narayanaswamy · Brooks Paige · Jan-Willem van de Meent · Alban Desmaison · Noah Goodman · Pushmeet Kohli · Frank Wood · Philip Torr
- 2016 Poster: Neurally-Guided Procedural Models: Amortized Inference for Procedural Graphics Programs using Neural Networks
  Daniel Ritchie · Anna Thomas · Pat Hanrahan · Noah Goodman
- 2015 Workshop: Bounded Optimality and Rational Metareasoning
  Samuel J Gershman · Falk Lieder · Tom Griffiths · Noah Goodman
- 2013 Poster: Learning and using language via recursive pragmatic reasoning about other agents
  Nathaniel J Smith · Noah Goodman · Michael C Frank
- 2013 Poster: Learning Stochastic Inverses
  Andreas Stuhlmüller · Jacob Taylor · Noah Goodman
- 2012 Workshop: Probabilistic Programming: Foundations and Applications (2 day)
  Vikash Mansinghka · Daniel Roy · Noah Goodman
- 2012 Poster: Burn-in, bias, and the rationality of anchoring
  Falk Lieder · Tom Griffiths · Noah Goodman
- 2011 Poster: Nonstandard Interpretations of Probabilistic Programs for Efficient Inference
  David Wingate · Noah Goodman · Andreas Stuhlmueller · Jeffrey Siskind