Error estimates for DeepONets: A deep learning framework in infinite dimensions
Date
2021-02
Publication Type
Report
ETH Bibliography
yes
Abstract
DeepONets have recently been proposed as a framework for learning nonlinear operators mapping between infinite-dimensional Banach spaces. We analyze DeepONets and prove estimates on the resulting approximation and generalization errors. In particular, we extend the universal approximation property of DeepONets to include measurable mappings in non-compact spaces. By a decomposition of the error into encoding, approximation and reconstruction errors, we prove both lower and upper bounds on the total error, relating it to the spectral decay properties of the covariance operators associated with the underlying measures. We derive almost optimal error bounds with very general affine reconstructors and with random sensor locations, as well as bounds on the generalization error, using covering number arguments. We illustrate our general framework with four prototypical examples of nonlinear operators, namely those arising in a nonlinear forced ODE, an elliptic PDE with variable coefficients, and nonlinear parabolic and hyperbolic PDEs. In all these examples, we prove that DeepONets break the curse of dimensionality, thus demonstrating the efficient approximation of infinite-dimensional operators with this machine learning framework.
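To make the abstract's setup concrete, the following is a minimal, untrained sketch of the DeepONet architecture it analyzes: a branch network consumes an input function sampled at m sensor locations, a trunk network consumes a query point y, and the operator output G(u)(y) is the inner product of the two. All layer sizes, sensor counts, and the toy input function here are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def mlp(params, x):
    """Forward pass through a small fully connected net with tanh activations."""
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

def init_mlp(sizes, rng):
    """Random (untrained) weights for layer widths like [m, 40, p]."""
    return [(rng.standard_normal((m, n)) / np.sqrt(m), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def deeponet(branch_params, trunk_params, u_sensors, y):
    """G(u)(y) ~ sum_k branch_k(u(x_1), ..., u(x_m)) * trunk_k(y)."""
    b = mlp(branch_params, u_sensors)   # branch coefficients, shape (p,)
    t = mlp(trunk_params, y)            # trunk basis values, shape (n_points, p)
    return t @ b                        # approximate G(u) at each query point

rng = np.random.default_rng(0)
m, p = 50, 20                           # m sensor points, p "basis" functions
branch = init_mlp([m, 40, p], rng)
trunk = init_mlp([1, 40, p], rng)

xs = np.linspace(0, 1, m)               # sensor locations encoding the input
u = np.sin(2 * np.pi * xs)              # one sampled input function u
ys = np.linspace(0, 1, 100)[:, None]    # query points for the output function
out = deeponet(branch, trunk, u, ys)    # G(u) evaluated at 100 points
```

The sensor evaluation u(x_1), ..., u(x_m) plays the role of the encoder in the paper's encoding/approximation/reconstruction decomposition, and the trunk network provides the (here affine-in-branch-coefficients) reconstruction.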
Publication status
published
Volume
2021-07
Publisher
Seminar for Applied Mathematics, ETH Zurich
Subject
Deep learning; Operator learning
Organisational unit
03851 - Mishra, Siddhartha / Mishra, Siddhartha