An agent that interacts with a wide population of other agents needs to be aware that their understandings of the world may vary. Furthermore, the machinery they use to perceive may be inherently different, as is the case between humans and machines. In this work, we present an image reference game between a speaker and a population of listeners, in which success requires reasoning about the concepts other agents can comprehend, together with a model formulation that has this capability. We focus on reasoning about the conceptual understanding of others, adapting to novel gameplay partners, and dealing with differences in perceptual machinery. Our experiments on three benchmark image/attribute datasets suggest that our learner indeed encodes information directly pertaining to the understanding of other agents, and that leveraging this information is crucial for maximizing gameplay performance.
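The game setup described above can be made concrete with a small, purely illustrative simulation; this is not the authors' model, and all attribute names, noise parameters, and function names below are hypothetical. It sketches a speaker that names an attribute distinguishing a target image from a distractor, and a population of listeners whose noisy attribute perception stands in for differences in perceptual machinery.

```python
import random

# Hypothetical illustration of an image reference game (not the paper's model).
# Each "image" is a set of binary attributes; the speaker names an attribute
# that distinguishes the target from a distractor, and each listener -- whose
# noisy perception stands in for differing perceptual machinery -- guesses
# which image was meant.

random.seed(0)
ATTRIBUTES = ["striped", "red", "round", "furry", "metallic"]

def random_image():
    return {a: random.random() < 0.5 for a in ATTRIBUTES}

def speaker_message(target, distractor):
    """Name an attribute the target has but the distractor lacks, if any."""
    candidates = [a for a in ATTRIBUTES if target[a] and not distractor[a]]
    return random.choice(candidates) if candidates else random.choice(ATTRIBUTES)

def listener_guess(message, images, noise):
    """Pick the image that appears to have the named attribute under noisy perception."""
    def perceived(img, attr):
        value = img[attr]
        return (not value) if random.random() < noise else value
    scores = [perceived(img, message) for img in images]
    return scores.index(max(scores))

def play_round(listener_noise):
    target, distractor = random_image(), random_image()
    order = [target, distractor]
    random.shuffle(order)
    msg = speaker_message(target, distractor)
    guess = listener_guess(msg, order, listener_noise)
    return order[guess] is target

# A population of listeners with different (hypothetical) perceptual noise levels.
for noise in (0.0, 0.2, 0.4):
    wins = sum(play_round(noise) for _ in range(2000))
    print(f"listener noise {noise:.1f}: success rate {wins / 2000:.2f}")
```

In this toy setting, listeners with noisier perception succeed less often; this kind of partner-to-partner variation is what the paper's learner is meant to reason about and adapt to.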
Author Information
Rodolfo Corona Rodriguez (UC Berkeley)
Stephan Alaniz (Max Planck Institute for Informatics)
Zeynep Akata (University of Amsterdam)
Related Events (a corresponding poster, oral, or spotlight)
- 2019 Poster: Modeling Conceptual Understanding in Image Reference Games
  Thu. Dec 12th 06:45 -- 08:45 PM, Room East Exhibition Hall B + C #79
More from the Same Authors
- 2019 Poster: Combining Generative and Discriminative Models for Hybrid Inference
  Victor Garcia Satorras · Zeynep Akata · Max Welling
- 2019 Spotlight: Combining Generative and Discriminative Models for Hybrid Inference
  Victor Garcia Satorras · Max Welling · Zeynep Akata