

Poster

SEA: State-Exchange Attention for High-Fidelity Physics-Based Transformers

Parsa Esmati · Amirhossein Dadashzadeh · Vahid Ardakani · Nicolas Larrosa · Nicolò Grilli

East Exhibit Hall A-C #4003
Wed 11 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

Current approaches using sequential networks have shown promise in estimating field variables of dynamical systems, but most are limited by high rollout error accumulation. This unresolved issue results in unreliable estimations as the network predicts further into the future: each step's error compounds, leading to growing inaccuracy. Here, we introduce the State-Exchange Attention (SEA) module, a novel transformer-based module enabling information exchange between encoded fields through multi-head cross-attention. The cross-field, multidirectional information-exchange design enables all state variables in the system to exchange information with one another, capturing physical relationships and symmetries between fields. This enhances the model's ability to represent complex interactions between the field variables, reducing rollout error accumulation. Our results show that a Transformer model integrated with the State-Exchange Attention (SEA) module outperforms competitive baseline models, including the GMR-GMUS Transformer and PbGMR-GMUS Transformer-RealNVP, by 66% and 72%, respectively, achieving state-of-the-art error rates. Furthermore, we demonstrate that the SEA module alone can reduce errors by 97% for state variables that are highly dependent on other states of the system.
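The core idea of cross-field information exchange via multi-head cross-attention can be sketched as follows. This is a minimal single-head illustration (not the authors' implementation): each encoded field queries every other field, and the attended output is added as a residual update. The field names (`velocity`, `pressure`), the per-pair projection weights, and the summed-residual aggregation are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_attend(q_tokens, kv_tokens, Wq, Wk, Wv):
    """Single-head cross-attention: q_tokens query kv_tokens."""
    Q, K, V = q_tokens @ Wq, kv_tokens @ Wk, kv_tokens @ Wv
    scores = softmax(Q @ K.T / np.sqrt(Q.shape[-1]))
    return scores @ V

def state_exchange(fields, weights):
    """Each field attends to every other field; the attended values
    are added as residual updates (illustrative aggregation choice)."""
    updated = {}
    for name, tokens in fields.items():
        upd = tokens.copy()
        for other, kv in fields.items():
            if other == name:
                continue
            Wq, Wk, Wv = weights[(name, other)]
            upd = upd + cross_attend(tokens, kv, Wq, Wk, Wv)
        updated[name] = upd
    return updated

# Toy example: two encoded fields, 8 tokens each, model width 16.
d, n = 16, 8
fields = {"velocity": rng.normal(size=(n, d)),
          "pressure": rng.normal(size=(n, d))}
weights = {pair: tuple(rng.normal(size=(d, d)) * 0.1 for _ in range(3))
           for pair in [("velocity", "pressure"), ("pressure", "velocity")]}
out = state_exchange(fields, weights)
print(out["velocity"].shape)  # (8, 16)
```

In a full multi-head version, the projections would be split into several heads and each field pair could learn separate exchange weights; the sketch above keeps one head per direction to show the bidirectional exchange pattern.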
