Poster

Compositional Generalization from First Principles

Thaddäus Wiedemer · Prasanna Mayilvahanan · Matthias Bethge · Wieland Brendel

Great Hall & Hall B1+B2 (level 1) #2018

Abstract:

Leveraging the compositional nature of our world to expedite learning and facilitate generalization is a hallmark of human perception. In machine learning, on the other hand, achieving compositional generalization has proven to be an elusive goal, even for models with explicit compositional priors. To get a better handle on compositional generalization, we here approach it from the bottom up: Inspired by identifiable representation learning, we investigate compositionality as a property of the data-generating process rather than the data itself. This reformulation enables us to derive mild conditions on only the support of the training distribution and the model architecture, which are sufficient for compositional generalization. We further demonstrate how our theoretical framework applies to real-world scenarios and validate our findings empirically. Our results set the stage for a principled theoretical study of compositional generalization.
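To make the abstract's central idea concrete, here is a minimal, illustrative sketch (not the paper's actual construction) of a compositional data-generating process whose training support covers each latent factor's full range but only a restricted set of factor combinations, while the test set asks for novel combinations of familiar factor values. All function names, the additive/concatenative composition, and the L-shaped support are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-component renderers: each maps one latent factor
# to its own slot of the observation.
def render_component_1(z1):
    return np.stack([np.sin(z1), np.cos(z1)], axis=-1)

def render_component_2(z2):
    return np.stack([z2, z2 ** 2], axis=-1)

def compose(z1, z2):
    # Compositional generator: the observation is a fixed composition
    # (here simply a concatenation) of the per-component renderings.
    return np.concatenate([render_component_1(z1),
                           render_component_2(z2)], axis=-1)

# Training support (illustrative "L-shape"): each factor is seen across
# its full range, but only together with small values of the other
# factor, so most factor *combinations* never appear in training.
n = 1000
z1_train = np.where(rng.random(n) < 0.5,
                    rng.uniform(0.0, 1.0, n),   # z1 varies freely ...
                    rng.uniform(0.0, 0.2, n))   # ... or stays near its base value
z2_train = np.where(z1_train > 0.2,
                    rng.uniform(0.0, 0.2, n),   # z2 near base when z1 is large
                    rng.uniform(0.0, 1.0, n))   # z2 varies freely otherwise
x_train = compose(z1_train, z2_train)

# Test support: unseen combinations of individually familiar factor values.
z1_test = rng.uniform(0.5, 1.0, 200)
z2_test = rng.uniform(0.5, 1.0, 200)
x_test = compose(z1_test, z2_test)

print(x_train.shape, x_test.shape)  # (1000, 4) (200, 4)
```

A model that succeeds at compositional generalization in this toy setting would fit the training observations and still reconstruct or predict correctly on `x_test`, i.e. on combinations of factor values that never co-occurred during training.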
