Expression Rates of Neural Operators for Linear Elliptic PDEs in Polytopes



Date

2024-08

Publication Type

Report

ETH Bibliography

yes

Abstract

We study the approximation rates of a class of deep neural network approximations of operators which arise as data-to-solution maps G† of linear elliptic partial differential equations (PDEs) and act between pairs X, Y of suitable infinite-dimensional spaces. We prove expression rate bounds for approximate neural operators G with the structure G = R ◦ A ◦ E, with linear encoders E and decoders R. The constructive proofs proceed via a recurrent NN structure obtained by unrolling exponentially convergent, self-consistent ("Richardson") iterations. We bound the operator approximation error with respect to the linear Kolmogorov N-widths of the data and solution sets and in terms of the size of the approximation network. We derive expression rates for full neural operators for elliptic PDEs set in d-dimensional polytopes, with d ∈ {2, 3}, and with Dirichlet, Neumann, or mixed boundary conditions. Exploiting the regularity of the solution sets of elliptic PDEs in polytopes, we show algebraic expression rates for problems with data of finite regularity, and exponential operator expression rates for analytic data.
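The unrolling of Richardson iterations mentioned in the abstract can be illustrated with a minimal sketch: each iteration u ← u + ω(f − Au) is an affine map of the previous iterate, so a fixed number of iterations forms a weight-tied recurrent network. The matrix A, right-hand side f, and damping parameter ω below are hypothetical toy choices for a discretized problem, not taken from the report.

```python
import numpy as np

def richardson_unrolled(A, f, omega, n_layers):
    """Unrolled Richardson iteration u_{k+1} = u_k + omega*(f - A @ u_k).

    Each layer applies the same affine map, i.e. the n_layers iterations
    form a recurrent (weight-tied) network. The iteration converges
    linearly when the spectral radius of (I - omega*A) is below 1.
    """
    u = np.zeros_like(f)
    for _ in range(n_layers):
        u = u + omega * (f - A @ u)
    return u

# Hypothetical toy problem: a symmetric positive definite A.
# Choosing omega = 1/lambda_max places all eigenvalues of (I - omega*A)
# in [0, 1), so the unrolled network converges to A^{-1} f.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
f = np.array([1.0, 2.0])
omega = 1.0 / np.max(np.linalg.eigvalsh(A))
u = richardson_unrolled(A, f, omega, n_layers=200)
```

Because convergence is exponential in the number of iterations, the depth of the unrolled network grows only logarithmically in the target accuracy, which is the mechanism behind the expression rate bounds.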

Publication status

published

Volume

2024-23

Publisher

Seminar for Applied Mathematics, ETH Zurich

Subject

Neural Operators; Approximation Theory; Deep Neural Networks; Elliptic PDEs; Kolmogorov N-widths

Organisational unit

03435 - Schwab, Christoph
