Metadata only
Date
2024-08
Type
Report
ETH Bibliography
yes
Abstract
We study the approximation rates of a class of deep neural network approximations of operators which arise as data-to-solution maps G† of linear elliptic partial differential equations (PDEs) and act between pairs X, Y of suitable infinite-dimensional spaces. We prove expression rate bounds for approximate neural operators G with the structure G = R ◦ A ◦ E, with linear encoders E and decoders R. The constructive proofs are via a recurrent NN structure obtained by unrolling exponentially convergent, self-consistent ("Richardson") iterations. We bound the operator approximation error with respect to the linear Kolmogorov N-widths of the data and solution sets and in terms of the size of the approximation network. We derive expression rates for full neural operators for elliptic PDEs set in d-dimensional polytopes, with d ∈ {2, 3}, and with Dirichlet, Neumann, or mixed boundary conditions. Exploiting the regularity of the solution sets of elliptic PDEs in polytopes, we show algebraic rates of expression for problems with data of finite regularity, and exponential operator expression rates for analytic data.
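A minimal sketch of the construction the abstract describes, assuming a well-conditioned (e.g., preconditioned) discrete system: unrolling the Richardson iteration u_{k+1} = u_k + omega * (f - A_h u_k) yields one network layer per step, so the error contracts by a fixed factor per layer and decays exponentially in the unrolled depth. The matrix A_h, the damping parameter omega, the depth, and the identity encoder/decoder E, R below are illustrative assumptions, not taken from the report.

import numpy as np

def richardson_net(A_h, f, omega, depth):
    # One "layer" per unrolled Richardson step:
    #   u_{k+1} = u_k + omega * (f - A_h @ u_k).
    # The error contracts by ||I - omega*A_h|| < 1 per layer,
    # i.e. it decays exponentially in the unrolled depth.
    u = np.zeros_like(f)
    for _ in range(depth):
        u = u + omega * (f - A_h @ u)
    return u

# Toy well-conditioned SPD matrix standing in for a (preconditioned)
# discretized elliptic operator; sizes and names are illustrative.
rng = np.random.default_rng(0)
n = 64
B = rng.standard_normal((n, n)) / np.sqrt(n)
A_h = np.eye(n) + 0.25 * (B @ B.T)

lam = np.linalg.eigvalsh(A_h)
omega = 2.0 / (lam[0] + lam[-1])   # classical optimal damping parameter

# Hypothetical linear encoder E and decoder R (identities in this toy;
# in the report they map between the infinite-dimensional data/solution
# spaces and finite coefficient vectors).
E = np.eye(n)
R = np.eye(n)

f = rng.standard_normal(n)
u = R @ richardson_net(A_h, E @ f, omega, depth=30)
print(np.linalg.norm(A_h @ u - f))  # tiny residual: exponential decay in depth

Here the depth of the unrolled network plays the role of the iteration count, so accuracy improves exponentially with network size; this contraction mechanism is what underlies the expression rate bounds stated in the abstract.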
Publication status
published
Journal / series
SAM Research Report
Volume
Publisher
Seminar for Applied Mathematics, ETH Zurich
Subject
Neural Operators; Approximation Theory; Deep Neural Networks; Elliptic PDEs; Kolmogorov N-widths
Organisational unit
03435 - Schwab, Christoph / Schwab, Christoph