Error estimates for DeepOnets: A deep learning framework in infinite dimensions
Metadata only
Date
2021-02
Type
Report
ETH Bibliography
yes
Abstract
DeepOnets have recently been proposed as a framework for learning nonlinear operators mapping between infinite-dimensional Banach spaces. We analyze DeepOnets and prove estimates on the resulting approximation and generalization errors. In particular, we extend the universal approximation property of DeepOnets to include measurable mappings in non-compact spaces. By decomposing the error into encoding, approximation, and reconstruction errors, we prove both lower and upper bounds on the total error, relating it to the spectral decay properties of the covariance operators associated with the underlying measures. We derive almost optimal error bounds with very general affine reconstructors and with random sensor locations, as well as bounds on the generalization error using covering number arguments. We illustrate our general framework with four prototypical examples of nonlinear operators, namely those arising in a nonlinear forced ODE, an elliptic PDE with variable coefficients, and nonlinear parabolic and hyperbolic PDEs. In all these examples, we prove that DeepOnets break the curse of dimensionality, thus demonstrating the efficient approximation of infinite-dimensional operators with this machine learning framework.
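The abstract's basic object can be made concrete with a minimal sketch of a DeepOnet forward pass: the input function u is encoded by its values at m sensor locations, a branch net maps those values to p coefficients, a trunk net maps a query point y to p basis values, and their inner product approximates G(u)(y). The sketch below uses untrained random-weight tanh MLPs and hypothetical layer sizes purely for illustration; it is not the trained model from the report.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(sizes):
    """Random-weight MLP parameters (hypothetical stand-in for trained nets)."""
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    """Apply the MLP: tanh on hidden layers, linear output layer."""
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

m, p = 20, 10                            # number of sensors, latent dimension
sensors = np.linspace(0.0, 1.0, m)       # fixed sensor locations x_1, ..., x_m
branch = mlp([m, 32, p])                 # branch net: encodes u(sensors)
trunk = mlp([1, 32, p])                  # trunk net: encodes the query point y

def deeponet(u_vals, y):
    """G(u)(y) ~ sum_k branch_k(u(sensors)) * trunk_k(y)."""
    b = forward(branch, u_vals)          # shape (p,)
    t = forward(trunk, np.atleast_2d(y).T)  # shape (len(y), p)
    return t @ b                         # shape (len(y),)

u_vals = np.sin(2 * np.pi * sensors)     # one input function sampled at sensors
y = np.array([0.25, 0.5, 0.75])          # query points
out = deeponet(u_vals, y)
```

The three stages of this sketch line up with the error decomposition in the abstract: the sensor sampling is the encoder, the branch/trunk pair is the approximator, and the final inner product is an affine reconstructor.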
Publication status
published
Journal / series
SAM Research Report
Publisher
Seminar for Applied Mathematics, ETH Zurich
Subject
Deep learning; Operator learning
Organisational unit
03851 - Mishra, Siddhartha
Related publications and datasets
Is previous version of: https://doi.org/10.3929/ethz-b-000558811