Communication enables agents to cooperate to achieve their goals. Learning when to communicate, i.e., sparse (in time) communication, and whom to message is particularly important when bandwidth is limited. Recent work in learning sparse individualized communication, however, suffers from high variance during training, where decreasing communication comes at the cost of decreased reward, particularly in cooperative tasks. We use the information bottleneck to reframe sparsity as a representation learning problem, which we show naturally enables lossless sparse communication at lower budgets than prior art. In this paper, we propose a method for true lossless sparsity in communication via Information Maximizing Gated Sparse Multi-Agent Communication (IMGS-MAC). Our model uses two individualized regularization objectives, an information-maximizing autoencoder and a sparse communication loss, to create informative and sparse communication. We evaluate the learned communication 'language' through direct causal analysis of messages in non-sparse runs to determine the range of lossless sparse budgets, which allow zero-shot sparsity, and the range of sparse budgets that will incur a reward loss, which is minimized by our learned gating function with few-shot sparsity. To demonstrate the efficacy of our results, we experiment in cooperative multi-agent tasks where communication is essential for success. We evaluate our model with both continuous and discrete messages. We focus our analysis on a variety of ablations to show the effect of message representations, including their properties, and the lossless performance of our model.
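To make the two regularizers concrete, the following is a minimal PyTorch sketch of the general recipe the abstract describes: an autoencoding term that keeps messages informative about the sender's observation, plus a penalty on a learned send gate that keeps communication sparse. All module names, dimensions, and loss weights (`beta`, `lam`) are illustrative assumptions, not the authors' IMGS-MAC implementation.

```python
# Hypothetical sketch of the two regularization objectives described in the
# abstract; not the authors' code. An information-maximizing autoencoder on
# messages plus a sparsity penalty on a learned gate.
import torch
import torch.nn as nn

class GatedCommAgent(nn.Module):
    def __init__(self, obs_dim=16, msg_dim=8):
        super().__init__()
        # Observation -> message encoder.
        self.encoder = nn.Sequential(nn.Linear(obs_dim, 32), nn.ReLU(),
                                     nn.Linear(32, msg_dim))
        # Message -> reconstructed observation (autoencoding head).
        self.decoder = nn.Sequential(nn.Linear(msg_dim, 32), nn.ReLU(),
                                     nn.Linear(32, obs_dim))
        # Learned gate in (0, 1): values near 0 mean "stay silent".
        self.gate = nn.Sequential(nn.Linear(obs_dim, 1), nn.Sigmoid())

    def forward(self, obs):
        msg = self.encoder(obs)
        g = self.gate(obs)
        return g * msg, msg, g  # gated message broadcast to teammates

def comm_regularizers(agent, obs, beta=1.0, lam=0.1):
    """Autoencoding term keeps messages informative; the gate term keeps
    communication sparse. Both would be added to the usual RL loss."""
    _, msg, g = agent(obs)
    recon = agent.decoder(msg)
    info_loss = nn.functional.mse_loss(recon, obs)  # information-maximizing AE
    sparsity_loss = g.mean()                        # penalize sending at all
    return beta * info_loss + lam * sparsity_loss

# Usage: one gradient step on a batch of random observations.
agent = GatedCommAgent()
opt = torch.optim.Adam(agent.parameters(), lr=1e-3)
loss = comm_regularizers(agent, torch.randn(4, 16))
loss.backward()
opt.step()
```

In this framing, the weight on the gate penalty plays the role of a communication budget: raising `lam` trades message frequency against reward, and the paper's claim is that within a certain budget range this trade is lossless.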
Author Information
Seth Karten (Carnegie Mellon University)
Mycal Tucker (Massachusetts Institute of Technology)
Siva Kailas (Carnegie Mellon University)
Katia Sycara (Carnegie Mellon University)
More from the Same Authors
- 2022 : Trading off Utility, Informativeness, and Complexity in Emergent Communication
  Mycal Tucker · Julie A Shah · Roger Levy · Noga Zaslavsky
- 2022 : Generalization and Translatability in Emergent Communication via Informational Constraints
  Mycal Tucker · Roger Levy · Julie A Shah · Noga Zaslavsky
- 2023 Poster: Human-Guided Complexity-Controlled Abstractions
  Andi Peng · Mycal Tucker · Eoin Kenny · Noga Zaslavsky · Pulkit Agrawal · Julie A Shah
- 2023 Poster: Characterizing Out-of-Distribution Error via Optimal Transport
  Yuzhe Lu · Yilong Qin · Runtian Zhai · Andrew Shen · Ketong Chen · Zhenlin Wang · Soheil Kolouri · Simon Stepputtis · Joseph Campbell · Katia Sycara
- 2022 Workshop: Information-Theoretic Principles in Cognitive Systems
  Noga Zaslavsky · Mycal Tucker · Sarah Marzen · Irina Higgins · Stephanie Palmer · Samuel J Gershman
- 2022 Poster: Trading off Utility, Informativeness, and Complexity in Emergent Communication
  Mycal Tucker · Roger Levy · Julie Shah · Noga Zaslavsky
- 2021 Poster: Emergent Discrete Communication in Semantic Spaces
  Mycal Tucker · Huao Li · Siddharth Agrawal · Dana Hughes · Katia Sycara · Michael Lewis · Julie A Shah