
Spotlight Poster
Learning Probabilistic Symmetrization for Architecture Agnostic Equivariance
Jinwoo Kim · Dat Nguyen · Ayhan Suleymanzade · Hyeokjun An · Seunghoon Hong

Thu Dec 14 08:45 AM -- 10:45 AM (PST) @ Great Hall & Hall B1+B2 #708
Event URL: https://github.com/jw9730/lps

We present a novel framework to overcome the limitations of equivariant architectures in learning functions with group symmetries. In contrast to equivariant architectures, the framework uses an arbitrary backbone (such as an MLP or a transformer) and symmetrizes it to be equivariant to a given group by employing a small equivariant network that parameterizes the probability distribution underlying the symmetrization. The distribution is trained end-to-end with the backbone, which can maximize performance while reducing the sample complexity of symmetrization. We show that this approach guarantees not only equivariance to the given group but also universal approximation in expectation. We implement our method on a simple patch-based transformer backbone initialized from a pretrained vision transformer, and test it on a wide range of symmetry groups, including permutation and Euclidean groups and their combinations. Empirical tests show results competitive with tailored equivariant architectures, suggesting the potential of learning equivariant functions for diverse groups using a non-equivariant universal backbone. We further show evidence of enhanced learning in symmetric modalities, such as graphs, when pretrained on non-symmetric modalities, such as vision.
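To make the symmetrization idea concrete, here is a minimal NumPy sketch of Monte Carlo symmetrization over the permutation group acting on the rows of an input. This is an illustration of the general recipe E_{g~p(.|x)}[g . f(g^{-1} . x)], not the paper's implementation: the `sample_perm` argument stands in for the learned equivariant distribution described in the abstract, and a uniform sampler is used as a placeholder in the usage example.

```python
import numpy as np

def symmetrize(f, x, sample_perm, n_samples=8, rng=None):
    """Monte Carlo estimate of E_{g ~ p(.|x)} [ g . f(g^{-1} . x) ]
    for the permutation group S_n acting on the rows of x.

    f           -- backbone mapping (n, d) arrays to (n, d') arrays, row-aligned
    sample_perm -- samples a permutation g given x (placeholder for the
                   learned equivariant distribution in the paper)
    """
    rng = np.random.default_rng(rng)
    out = 0.0
    for _ in range(n_samples):
        perm = sample_perm(x, rng)   # g, as an index array over rows
        inv = np.argsort(perm)       # g^{-1}
        y = f(x[inv])                # f(g^{-1} . x)
        out = out + y[perm]          # g . f(g^{-1} . x)
    return out / n_samples

# Usage with a uniform sampler (the Reynolds-operator special case).
# If f is already permutation-equivariant, the estimate reduces to f(x)
# regardless of which permutations are sampled.
x = np.arange(12, dtype=float).reshape(4, 3)
uniform = lambda x_, rng: rng.permutation(x_.shape[0])
out = symmetrize(lambda z: 2 * z, x, uniform, n_samples=4, rng=0)
```

Training the sampler jointly with the backbone, as the paper proposes, lets the distribution concentrate on useful group elements and thereby reduces the number of samples needed at this estimation step.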

Author Information

Jinwoo Kim (KAIST)
Dat Nguyen (Korea Advanced Institute of Science & Technology)
Ayhan Suleymanzade (Korea Advanced Institute of Science & Technology)
Hyeokjun An (KAIST)
Seunghoon Hong (Korea Advanced Institute of Science and Technology)
