## Exponential Separations in Symmetric Neural Networks

### Aaron Zweig · Joan Bruna

##### Hall J #815

Keywords: [ deepsets ] [ separation ] [ relational network ] [ symmetric function ] [ self-attention ] [ set-based ]

Thu 1 Dec 2 p.m. PST — 4 p.m. PST

Abstract: In this work we demonstrate a novel separation between symmetric neural network architectures. Specifically, we consider the Relational Network (Santoro et al., 2017) architecture as a natural generalization of the DeepSets (Zaheer et al., 2017) architecture, and study their representational gap. Under the restriction to analytic activation functions, we construct a symmetric function acting on sets of size $N$ with elements in dimension $D$ which can be efficiently approximated by the former architecture, but provably requires width exponential in $N$ and $D$ for the latter.
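The two architectures contrasted in the abstract can be sketched as follows. This is a minimal illustration, not the paper's construction: the toy maps `phi`, `phi_pair`, and `rho` stand in for the learned networks, and all names are illustrative. DeepSets aggregates per-element embeddings, while the Relational Network aggregates embeddings of all element pairs; both are permutation invariant by construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical fixed feature maps standing in for learned MLPs.
def phi(x):
    # Per-element embedding (DeepSets inner network).
    return np.tanh(x)

def phi_pair(x, y):
    # Per-pair embedding (Relational Network inner network).
    return np.tanh(np.concatenate([x, y]))

def rho(z):
    # Outer readout applied to the aggregated embedding.
    return float(z.sum())

def deepsets(X):
    # f(X) = rho( sum_i phi(x_i) ): one term per element.
    return rho(sum(phi(x) for x in X))

def relational_network(X):
    # f(X) = rho( sum_{i,j} phi(x_i, x_j) ): one term per ordered pair.
    return rho(sum(phi_pair(x, y) for x in X for y in X))

X = rng.normal(size=(5, 3))        # a set of N=5 elements in dimension D=3
perm = rng.permutation(len(X))

# Both outputs are unchanged under permutation of the set elements.
assert np.isclose(deepsets(X), deepsets(X[perm]))
assert np.isclose(relational_network(X), relational_network(X[perm]))
```

The paper's result says that with analytic activations, some symmetric functions computable efficiently by the pairwise form above require width exponential in $N$ and $D$ in the element-wise form.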
