

Poster in Workshop: Machine Learning and the Physical Sciences

Using neural networks to reduce communication in numerical solution of partial differential equations

Laurent White · Ganesh Dasika


Abstract:

High-performance computing (HPC) applications are frequently communication-bound and therefore unable to exploit the full compute resources available on a node. Examples abound in scientific computing, where large-scale partial differential equations (PDEs) are solved on hundreds to thousands of nodes. The vast majority of these problems rely on mesh-based discretization techniques and on the calculation of fluxes across element boundaries. The mathematical expression for those fluxes depends both on data available in local memory and on neighboring data transferred from another compute node. That data transfer can account for a significant share of the simulation time and energy consumption. We present algorithmic approaches for replacing data transfers with local computations, potentially reducing simulation cost and opening avenues for kernel acceleration that would otherwise not be worthwhile. The communication cost can be reduced by up to 50%, with limited impact on physical simulation accuracy.
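To make the idea concrete, below is a minimal sketch of replacing a halo exchange with a locally computed prediction, using a 1D heat equation split into two subdomains within a single process. The poster does not specify the model or discretization: the linear least-squares predictor here is a hypothetical stand-in for the neural network, and the stencil width `k`, the explicit update scheme, and the training setup are illustrative assumptions, not the authors' method.

```python
import numpy as np

# Sketch (not the authors' code): solve u_t = nu * u_xx on a periodic
# 1D domain. Subdomain A owns the left half; the ghost cell it would
# normally receive from its right neighbor via a halo exchange is
# instead predicted from k locally available cells.

nu, n, steps = 0.1, 64, 400
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)

def full_step(u):
    """One explicit finite-difference step on the full periodic domain."""
    return u + nu * (np.roll(u, 1) - 2.0 * u + np.roll(u, -1))

# --- Offline phase: collect (local stencil -> neighbor value) pairs ---
k = 4                                    # cells of local data fed to the predictor
u = np.sin(x) + 0.5 * np.cos(3.0 * x)    # illustrative initial condition
X, y = [], []
for _ in range(steps):
    u = full_step(u)
    X.append(u[n // 2 - k : n // 2].copy())  # cells A holds near the interface
    y.append(u[n // 2])                      # value a halo exchange would send
# Linear least-squares fit stands in for training a small neural net.
w, *_ = np.linalg.lstsq(np.asarray(X), np.asarray(y), rcond=None)

# --- Online phase: run subdomain A without the right-hand transfer ---
u = np.sin(x) + 0.5 * np.cos(3.0 * x)    # reference, as if fully communicated
ua = u[: n // 2].copy()                  # subdomain A's local data only
for _ in range(steps):
    ghost_left = u[-1]                   # left ghost: still "communicated"
    ghost_right = ua[-k:] @ w            # right ghost: computed locally instead
    lap = np.concatenate(([ghost_left], ua, [ghost_right]))
    ua = ua + nu * (lap[:-2] - 2.0 * ua + lap[2:])
    u = full_step(u)                     # advance the reference in lockstep

print("max deviation from the communicating solver:",
      float(np.abs(ua - u[: n // 2]).max()))
```

The printed deviation gauges how far the communication-free subdomain drifts from a solver that receives its ghost value over the network. In an actual multi-node run, the locally computed prediction would replace a per-timestep transfer at each subdomain interface, which is where the communication savings described in the abstract would arise.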
