On universal approximation and error bounds for Fourier Neural Operators


METADATA ONLY

Date

2021-07

Publication Type

Report

ETH Bibliography

yes

Abstract

Fourier neural operators (FNOs) have recently been proposed as an effective framework for learning operators that map between infinite-dimensional spaces. We prove that FNOs are universal, in the sense that they can approximate any continuous operator to any desired accuracy. Moreover, we suggest a mechanism by which FNOs can approximate operators associated with PDEs efficiently. Explicit error bounds are derived to show that the size of the FNO, approximating operators associated with a Darcy-type elliptic PDE and with the incompressible Navier-Stokes equations of fluid dynamics, only increases sub-(log)-linearly in terms of the reciprocal of the error. Thus, FNOs are shown to efficiently approximate operators arising in a large class of PDEs.
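The core building block behind the operators studied in the abstract is the Fourier layer: the input function is transformed to Fourier space, a learned linear map is applied to a truncated set of low-frequency modes, and the result is transformed back and combined with a pointwise linear term and a nonlinearity. A minimal NumPy sketch (all names, shapes, and the choice of ReLU are illustrative assumptions, not the authors' implementation; trained FNOs use complex-valued learned spectral weights and typically GELU activations):

```python
import numpy as np

def fourier_layer(v, R, W, n_modes):
    """One FNO Fourier layer (hypothetical minimal sketch).

    v: input function sampled on a 1D grid, shape (n_grid, d_v)
    R: learned spectral weights for the lowest n_modes Fourier
       modes, shape (n_modes, d_v, d_v)
    W: pointwise linear map, shape (d_v, d_v)
    """
    v_hat = np.fft.rfft(v, axis=0)            # to Fourier space
    out_hat = np.zeros_like(v_hat)
    # multiply only the retained low modes by the learned weights R
    out_hat[:n_modes] = np.einsum("kij,kj->ki", R, v_hat[:n_modes])
    spectral = np.fft.irfft(out_hat, n=v.shape[0], axis=0)
    # residual pointwise term, then a nonlinearity (ReLU here)
    return np.maximum(spectral + v @ W, 0.0)

# usage: a random "function" on 64 grid points, channel width 8
rng = np.random.default_rng(0)
n, d_v, n_modes = 64, 8, 12
v = rng.standard_normal((n, d_v))
R = rng.standard_normal((n_modes, d_v, d_v)) / d_v
W = rng.standard_normal((d_v, d_v)) / d_v
out = fourier_layer(v, R, W, n_modes)
print(out.shape)  # (64, 8)
```

Truncating to `n_modes` frequencies is what makes the layer resolution-independent: the same weights `R` act on the low modes regardless of the grid size on which the input is sampled.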

Publication status

published

Volume

2021-23

Publisher

Seminar for Applied Mathematics, ETH Zurich

Organisational unit

03851 - Mishra, Siddhartha
