Poster in Workshop: Information-Theoretic Principles in Cognitive Systems (InfoCog)

Natural Language Systematicity from a Constraint on Excess Entropy

Richard Futrell


Abstract:

Natural language is systematic: utterances are composed of individually meaningful parts which are typically concatenated together. We argue that natural-language-like systematicity arises in codes when they are constrained by excess entropy, the mutual information between the past and the future of a process. In three examples, we show that codes with natural-language-like systematicity have lower excess entropy than matched alternatives.
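To make the claim concrete, here is a minimal toy sketch (not the paper's actual experiments; the codes, meaning distributions, and probabilities below are illustrative assumptions). Two independent meaning components are encoded into a two-symbol utterance either systematically (each symbol encodes one component) or holistically (an arbitrary bijection that mixes the components). The mutual information between the two symbols, which contributes to the excess entropy of a stream of such utterances, is exactly zero for the systematic code and positive for the holistic one:

```python
from math import log2
from itertools import product

def mutual_information(joint):
    """I(X1; X2) in bits from a dict {(x1, x2): probability}."""
    p1, p2 = {}, {}
    for (a, b), p in joint.items():
        p1[a] = p1.get(a, 0.0) + p
        p2[b] = p2.get(b, 0.0) + p
    return sum(p * log2(p / (p1[a] * p2[b]))
               for (a, b), p in joint.items() if p > 0)

# Two independent, nonuniform meaning components (illustrative values).
pm1 = {0: 0.8, 1: 0.2}
pm2 = {0: 0.7, 1: 0.3}
meanings = {(a, b): pm1[a] * pm2[b] for a, b in product(pm1, pm2)}

# Systematic code: symbol i encodes meaning component i.
systematic = {m: m for m in meanings}
# Holistic code: an arbitrary bijection that mixes the components.
holistic = {(0, 0): (0, 0), (0, 1): (1, 1), (1, 0): (1, 0), (1, 1): (0, 1)}

def symbol_joint(code):
    """Joint distribution over the two symbols of an utterance."""
    joint = {}
    for m, p in meanings.items():
        x = code[m]
        joint[x] = joint.get(x, 0.0) + p
    return joint

i_sys = mutual_information(symbol_joint(systematic))
i_hol = mutual_information(symbol_joint(holistic))
```

Because the meaning components are independent, the systematic code's symbols are independent too (`i_sys == 0`), while the holistic code creates statistical dependence between adjacent symbols (`i_hol > 0`), i.e. greater past-future mutual information.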
