

Poster, NeurIPS 2023 Workshop: Machine Learning and the Physical Sciences

Transformers for Scattering Amplitudes

Garrett Merz · Francois Charton · Tianji Cai · Kyle Cranmer · Lance Dixon · Niklas Nolte · Matthias Wilhelm


Abstract:

We apply Transformers to extend state-of-the-art results in theoretical particle physics. Specifically, we use Transformers to predict the integer coefficients of large mathematical expressions that represent scattering amplitudes in planar N=4 Yang-Mills theory, a quantum field theory closely related to the theory describing Higgs boson production at the Large Hadron Collider. We first formulate the physics problem in a language-based representation amenable to Transformer architectures and standard training objectives. We then show that the model achieves high accuracy (>98%) on two tasks.
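The abstract does not specify the encoding, but a common way to make integer-coefficient prediction "language-based" for a seq2seq Transformer is to serialize each signed integer as a sign token followed by digit tokens. The sketch below is a hypothetical illustration of that idea; the function name, token vocabulary, and base are assumptions, not the paper's actual scheme.

```python
def encode_integer(n, base=10):
    """Hypothetical encoding: a signed integer becomes a token sequence
    consisting of a sign token followed by its base-`base` digits,
    most significant digit first."""
    sign = "+" if n >= 0 else "-"
    n = abs(n)
    digits = []
    if n == 0:
        digits = ["0"]
    while n > 0:
        digits.append(str(n % base))
        n //= base
    # Digits were collected least-significant first; reverse them.
    return [sign] + digits[::-1]
```

A model trained with a standard cross-entropy objective then predicts such token sequences autoregressively, exactly as in machine translation; exact-match accuracy over the full token sequence gives a natural evaluation metric.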
