Poster
in
Workshop: NeurIPS 2023 Workshop: Machine Learning and the Physical Sciences

Physics-consistency of infinite neural networks

Sascha Ranftl


Abstract:

Recent work has shown how physics prior knowledge can be integrated into neural networks through the choice of activation function and, via the infinite-width correspondence (valid whenever the Central Limit Theorem holds), into Gaussian processes. Together with the construction of physics-consistent Gaussian process kernels, this correspondence raises the question of physics-consistent infinite neural networks. Regression models constructed in this way find specialized applications in inverse problems, uncertainty quantification, and optimization, particularly in data-scarce settings: such 'surrogate' models can learn efficiently from limited data while remaining physically consistent.
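To make the infinite-width correspondence concrete, here is a minimal sketch of Gaussian process regression under an infinite-width network kernel. As an illustrative stand-in (the abstract does not specify a kernel), it uses the order-1 arc-cosine kernel of Cho & Saul, which is the exact GP kernel of an infinitely wide single-hidden-layer ReLU network; the physics-consistent kernels the poster refers to would be constructed from physics-informed activation functions instead.

```python
import numpy as np

def arccos_kernel(X, Z):
    """Order-1 arc-cosine kernel (Cho & Saul, 2009): the GP kernel of an
    infinitely wide one-hidden-layer ReLU network. Illustrative choice only;
    a physics-consistent kernel would replace this function."""
    nx = np.linalg.norm(X, axis=1)[:, None]
    nz = np.linalg.norm(Z, axis=1)[None, :]
    cos = np.clip(X @ Z.T / (nx * nz), -1.0, 1.0)
    theta = np.arccos(cos)
    return (nx * nz / np.pi) * (np.sin(theta) + (np.pi - theta) * np.cos(theta))

def gp_posterior_mean(X_train, y_train, X_test, noise=1e-6):
    """Standard GP regression posterior mean under the chosen kernel."""
    K = arccos_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    Ks = arccos_kernel(X_test, X_train)
    return Ks @ np.linalg.solve(K, y_train)

# Tiny data-scarce demo: recover f(x) = |x| from five observations.
X = np.array([[-2.0], [-1.0], [0.5], [1.0], [2.0]])
y = np.abs(X).ravel()
mean = gp_posterior_mean(X, y, np.array([[1.5]]))
```

In 1D this kernel reduces to k(x, z) = xz for same-sign inputs and 0 otherwise, so the posterior mean at x = 1.5 interpolates the positive-branch data linearly and returns approximately 1.5, illustrating how such surrogates extrapolate from very few observations.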