Multiple-output regression models require estimating multiple functions, one for each output. To improve parameter estimation in such models, methods based on structural regularization of the model parameters are usually needed. In this paper, we present a multiple-output regression model that leverages the covariance structure of the functions (i.e., how the multiple functions are related to each other) as well as the conditional covariance structure of the outputs. This is in contrast to existing methods, which usually take into account only one of these structures. More importantly, unlike most other existing methods, neither of these structures needs to be known a priori in our model; both are learned from the data. Several previously proposed multiple-output regression models based on structural regularization turn out to be special cases of our model. Moreover, in addition to being a rich model for multiple-output regression, our model can also be used to estimate the graphical model structure of a set of variables (multivariate outputs) conditioned on another set of variables (inputs). Experimental results on both synthetic and real datasets demonstrate the effectiveness of our method.
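To make the idea of structural regularization concrete, here is a minimal sketch of multiple-output linear regression in which an output-relatedness matrix `Sigma` is learned from the data rather than fixed a priori. This is an illustrative simplification, not the paper's full model: it alternates between solving for the weight matrix `W` under the trace-norm-style penalty `tr(W Sigma^{-1} W^T)` (a Sylvester equation) and re-estimating `Sigma` from the current `W`; all variable names and the synthetic data are assumptions for the example.

```python
import numpy as np
from scipy.linalg import solve_sylvester

rng = np.random.default_rng(0)
n, d, k = 200, 10, 4          # samples, input dimension, number of outputs
X = rng.standard_normal((n, d))
W_true = rng.standard_normal((d, k))
Y = X @ W_true + 0.1 * rng.standard_normal((n, k))

lam, eps = 1.0, 1e-3          # regularization strength, ridge for stability
Sigma = np.eye(k)             # output-relatedness matrix, learned below
for _ in range(10):
    # W-step: minimizing ||Y - XW||_F^2 + lam * tr(W Sigma^{-1} W^T)
    # sets the gradient to zero: X^T X W + lam * W Sigma^{-1} = X^T Y,
    # which is a Sylvester equation A W + W B = C.
    W = solve_sylvester(X.T @ X, lam * np.linalg.inv(Sigma), X.T @ Y)
    # Sigma-step: re-estimate how the per-output functions relate,
    # normalized to unit trace so the penalty scale stays fixed.
    M = W.T @ W + eps * np.eye(k)
    Sigma = M / np.trace(M)
```

After the loop, outputs whose weight vectors are correlated receive a larger entry in `Sigma`, so their estimates borrow statistical strength from one another; with `Sigma = I` held fixed, the procedure reduces to independent ridge regressions, one per output.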
Author Information
Piyush Rai (Duke University)
Abhishek Kumar (Google Brain)
Hal Daumé III (University of Maryland - College Park)
More from the Same Authors
- 2021 : Poster: The Many Roles that Causal Reasoning Plays in Reasoning about Fairness in Machine Learning »
  Irene Y Chen · Hal Daumé III · Solon Barocas
- 2021 : The Many Roles that Causal Reasoning Plays in Reasoning about Fairness in Machine Learning »
  Irene Y Chen · Hal Daumé III · Solon Barocas
- 2018 Workshop: Wordplay: Reinforcement and Language Learning in Text-based Games »
  Adam Trischler · Angeliki Lazaridou · Yonatan Bisk · Wendy Tay · Nate Kushman · Marc-Alexandre Côté · Alessandro Sordoni · Daniel Ricks · Tom Zahavy · Hal Daumé III
- 2016 Poster: A Credit Assignment Compiler for Joint Prediction »
  Kai-Wei Chang · He He · Stephane Ross · Hal Daumé III · John Langford
- 2014 Workshop: Representation and Learning Methods for Complex Outputs »
  Richard Zemel · Dale Schuurmans · Kilian Q Weinberger · Yuhong Guo · Jia Deng · Francesco Dinuzzo · Hal Daumé III · Honglak Lee · Noah A Smith · Richard Sutton · Jiaqian YU · Vitaly Kuznetsov · Luke Vilnis · Hanchen Xiong · Calvin Murdock · Thomas Unterthiner · Jean-Francis Roy · Martin Renqiang Min · Hichem SAHBI · Fabio Massimo Zanzotto
- 2012 Poster: Imitation Learning by Coaching »
  He He · Hal Daumé III · Jason Eisner
- 2012 Poster: Learned Prioritization for Trading Off Accuracy and Speed »
  Jiarong Jiang · Adam Teichert · Hal Daumé III · Jason Eisner
- 2011 Poster: Message-Passing for Approximate MAP Inference with Latent Variables »
  Jiarong Jiang · Piyush Rai · Hal Daumé III
- 2011 Poster: Co-regularized Multi-view Spectral Clustering »
  Abhishek Kumar · Piyush Rai · Hal Daumé III
- 2010 Poster: Learning Multiple Tasks using Manifold Regularization »
  Arvind Agarwal · Hal Daumé III · Samuel Gerber
- 2010 Poster: Co-regularization Based Semi-supervised Domain Adaptation »
  Hal Daumé III · Abhishek Kumar · Avishek Saha
- 2009 Poster: Multi-Label Prediction via Sparse Infinite CCA »
  Piyush Rai · Hal Daumé III
- 2008 Poster: Nonparametric Bayesian Sparse Hierarchical Factor Modeling and Regression »
  Piyush Rai · Hal Daumé III
- 2007 Poster: Bayesian Agglomerative Clustering with Coalescents »
  Yee Whye Teh · Hal Daumé III · Daniel Roy
- 2007 Oral: Bayesian Agglomerative Clustering with Coalescents »
  Yee Whye Teh · Hal Daumé III · Daniel Roy