Reciprocal Space Attention for Learning Long-Range Interactions
Abstract
Machine learning interatomic potentials (MLIPs) have revolutionized the modeling of materials and molecules by directly fitting to ab-initio data. However, while these models excel at capturing local and semi-local interactions, they often prove insufficient when an explicit and efficient treatment of long-range interactions is required. To address this limitation, we introduce Reciprocal-Space Attention (RSA), a framework designed to capture long-range interactions in the Fourier domain. RSA can be seamlessly integrated with any existing local or semi-local MLIP. Our key contribution is mapping the linear-scaling attention mechanism into Fourier space, which allows the model to capture long-range effects such as electrostatics and dispersion without requiring predefined charges or other explicit empirical assumptions. We demonstrate the effectiveness of our method on a diverse set of benchmarks, including dimer binding curves, dispersion-driven exfoliation of layered phosphorene, and molecular dynamics simulations of water. Our results show that RSA consistently captures long-range interactions across these diverse chemical environments.
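To make the central idea concrete, the following is a minimal, illustrative sketch — not the paper's implementation; the array sizes, random weights, structure-factor token construction, and the elu+1 kernel feature map are all assumptions — of linear-scaling attention applied to reciprocal-space features of an atomic configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system: N atoms in a cubic box, each with a local feature vector
# (e.g. produced by any short-range MLIP backbone).
N, d, L = 8, 4, 10.0                       # hypothetical sizes: atoms, feature dim, box length
pos = rng.uniform(0.0, L, size=(N, 3))     # atomic positions
feat = rng.normal(size=(N, d))             # per-atom local features

# Reciprocal-space grid: k = 2*pi*n/L for small nonzero integer triples n
# (the cutoff of |n| < 3 per axis is an arbitrary choice for this sketch).
ns = [np.array(n) for n in np.ndindex(3, 3, 3) if any(n)]
kvecs = 2.0 * np.pi / L * np.stack(ns)     # (M, 3) wavevectors

# Structure-factor-like tokens: S(k) = sum_j exp(i k . r_j) f_j,
# giving one complex token per wavevector instead of per atom pair.
phase = np.exp(1j * pos @ kvecs.T)         # (N, M)
S = phase.T @ feat                         # (M, d) complex tokens

# Real-valued token matrix: concatenate real and imaginary parts.
X = np.concatenate([S.real, S.imag], axis=1)          # (M, 2d)
Wq, Wk, Wv = (rng.normal(size=(2 * d, 2 * d)) / np.sqrt(2 * d) for _ in range(3))
Q, K, V = X @ Wq, X @ Wk, X @ Wv

def phi(x):
    # Positive kernel feature map (elu(x) + 1), as used in linear attention.
    return np.where(x > 0, x + 1.0, np.exp(x))

# Linear attention: O = phi(Q) (phi(K)^T V) / (phi(Q) sum_k phi(K)^T),
# which costs O(M) in the number of tokens rather than O(M^2).
num = phi(Q) @ (phi(K).T @ V)                          # (M, 2d)
den = phi(Q) @ phi(K).sum(axis=0, keepdims=True).T     # (M, 1)
out = num / den

# Pool over wavevectors to get a per-structure long-range descriptor.
long_range_feature = out.mean(axis=0)      # shape (2d,)
print(long_range_feature.shape)
```

Because the keys and values are contracted before meeting the queries, the attention cost grows linearly with the number of reciprocal-space tokens, which is what makes combining attention with a Fourier-domain representation tractable.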