Poster

Amortized Fourier Neural Operators

Zipeng Xiao · Siqi Kou · Hao Zhongkai · Bokai Lin · Zhijie Deng

East Exhibit Hall A-C #3805
Thu 12 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

Fourier Neural Operators (FNOs) have shown promise for solving partial differential equations (PDEs). Typically, FNOs employ separate parameters for different frequency modes to specify tunable kernel integrals in Fourier space, which results in an undesirably large number of parameters when solving high-dimensional PDEs. A common workaround is to discard frequency modes above a predefined threshold, but this limits the FNOs' ability to represent high-frequency details and poses non-trivial challenges for hyper-parameter specification. To address these issues, we propose the AMortized Fourier Neural Operator (AM-FNO), in which an amortized neural parameterization of the kernel function accommodates arbitrarily many frequency modes using a fixed number of parameters. We introduce two implementations of AM-FNO, based on the recently proposed Kolmogorov–Arnold Network (KAN) and on Multi-Layer Perceptrons (MLPs) equipped with orthogonal embedding functions, respectively. We extensively evaluate our method on diverse datasets from various domains and observe up to 31% average improvement over competing neural operator baselines.
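The abstract does not spell out implementation details, but the core idea of amortizing the spectral kernel can be illustrated with a minimal PyTorch sketch. Everything below (the class name AmortizedSpectralConv1d, the hypernetwork kernel_net, and all shapes and hyper-parameters) is an illustrative assumption, not the authors' actual AM-FNO code: a small shared network maps each frequency coordinate to the channel-mixing weights, so the parameter count no longer grows with the number of retained modes.

```python
import torch
import torch.nn as nn


class AmortizedSpectralConv1d(nn.Module):
    """Sketch of an amortized spectral convolution (assumed design, not the
    paper's code): instead of storing a separate complex weight matrix per
    retained frequency mode as in a standard FNO layer, a small MLP maps each
    frequency coordinate to the kernel entries, so the number of parameters
    is fixed regardless of how many modes are used."""

    def __init__(self, channels: int, hidden: int = 64):
        super().__init__()
        self.channels = channels
        # Hypernetwork: frequency coordinate -> flattened complex mixing
        # matrix (real and imaginary parts), shared across all modes.
        self.kernel_net = nn.Sequential(
            nn.Linear(1, hidden),
            nn.GELU(),
            nn.Linear(hidden, 2 * channels * channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, n_points)
        batch, c, n = x.shape
        x_ft = torch.fft.rfft(x, dim=-1)          # (batch, c, n//2 + 1)
        n_modes = x_ft.shape[-1]

        # Normalized frequency coordinates in [0, 1]; all modes are kept,
        # there is no truncation threshold.
        freqs = torch.linspace(0.0, 1.0, n_modes, device=x.device).unsqueeze(-1)
        w = self.kernel_net(freqs)                 # (n_modes, 2*c*c)
        w = w.view(n_modes, 2, c, c)
        weight = torch.complex(w[:, 0], w[:, 1])   # (n_modes, c_out, c_in)

        # Per-mode channel mixing in Fourier space, then inverse transform.
        out_ft = torch.einsum("bim,moi->bom", x_ft, weight)
        return torch.fft.irfft(out_ft, n=n, dim=-1)


if __name__ == "__main__":
    layer = AmortizedSpectralConv1d(channels=32)
    u = torch.randn(8, 32, 256)
    print(layer(u).shape)  # torch.Size([8, 32, 256])
```

In this sketch the hypernetwork plays the role the abstract assigns to the KAN or MLP-with-orthogonal-embeddings parameterization; swapping in either of those modules for kernel_net would follow the same pattern.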