Search Results
On generalization error estimates of physics informed neural networks for approximating dispersive PDEs
(2021) SAM Research Report. Physics informed neural networks (PINNs) have recently been widely used for robust and accurate approximation of PDEs. We provide rigorous upper bounds on the generalization error of PINNs approximating solutions of the forward problem for several dispersive PDEs.
A Multi-level procedure for enhancing accuracy of machine learning algorithms
(2019) SAM Research Report. We propose a multi-level method to increase the accuracy of machine learning algorithms for approximating observables in scientific computing, particularly those that arise in systems modeled by differential equations. The algorithm relies on judiciously combining a large number of computationally cheap training data on coarse resolutions with a few expensive training samples on fine grid resolutions. Theoretical arguments for lowering ...
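The multi-level principle described in this abstract, combining many cheap coarse-level samples with a few expensive fine-level ones, can be illustrated with a multilevel-Monte-Carlo-style sketch. This is not the paper's algorithm; the level hierarchy (truncated Taylor surrogates for sin) and the observable are invented purely for illustration:

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def f_level(x, l):
    # Hypothetical level-l surrogate for f(x) = sin(x): a Taylor series
    # truncated after l + 2 terms. Higher levels are more accurate and,
    # in a real PDE solver, more expensive to evaluate.
    return sum((-1)**k * x**(2*k + 1) / math.factorial(2*k + 1)
               for k in range(l + 2))

def multilevel_estimate(levels, samples_per_level):
    # Telescoping identity: E[f_L] = E[f_0] + sum_{l>=1} E[f_l - f_{l-1}].
    # The corrections f_l - f_{l-1} shrink with l, so they need far
    # fewer samples than the coarse base level.
    est = 0.0
    for l, n in zip(levels, samples_per_level):
        x = rng.uniform(0.0, 1.0, n)
        if l == 0:
            est += np.mean(f_level(x, 0))
        else:
            est += np.mean(f_level(x, l) - f_level(x, l - 1))
    return est

# Many cheap coarse samples, few expensive fine ones.
est = multilevel_estimate(levels=[0, 1, 2],
                          samples_per_level=[100_000, 5_000, 500])
exact = 1.0 - math.cos(1.0)  # E[sin(X)] for X ~ Uniform(0, 1)
```

The sample counts decay across levels exactly as the abstract suggests: accuracy is set by the fine levels, while the cost is dominated by the cheap coarse level.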
Nonlinear Reconstruction for Operator Learning of PDEs with Discontinuities
(2022) SAM Research Report. A large class of hyperbolic and advection-dominated PDEs can have solutions with discontinuities. This paper investigates, both theoretically and empirically, the operator learning of PDEs with discontinuous solutions. We rigorously prove, in terms of lower approximation bounds, that methods which entail a linear reconstruction step (e.g. DeepONet or PCA-Net) fail to efficiently approximate the solution operator of such PDEs. In contrast, ...
Estimates on the generalization error of Physics Informed Neural Networks (PINNs) for approximating PDEs II: A class of inverse problems
(2020) SAM Research Report. Physics informed neural networks (PINNs) have recently been very successfully applied for efficiently approximating inverse problems for PDEs. We focus on a particular class of inverse problems, the so-called data assimilation or unique continuation problems, and prove rigorous estimates on the generalization error of PINNs approximating them. An abstract framework is presented and conditional stability estimates for the underlying inverse ...
Estimates on the generalization error of Physics Informed Neural Networks (PINNs) for approximating PDEs
(2020) SAM Research Report. Physics informed neural networks (PINNs) have recently been widely used for robust and accurate approximation of PDEs. We provide rigorous upper bounds on the generalization error of PINNs approximating solutions of the forward problem for PDEs. An abstract formalism is introduced and stability properties of the underlying PDE are leveraged to derive an estimate for the generalization error in terms of the training error and number of ...
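The PINN training principle these abstracts refer to, minimizing the PDE residual at collocation points plus the mismatch in boundary or initial data, can be sketched on a toy ODE. The trial functions and the finite-difference derivative below are illustrative stand-ins for a neural network and automatic differentiation; this is not the paper's implementation:

```python
import numpy as np

# PINN-style loss for the ODE u'(x) = u(x), u(0) = 1 on [0, 1]:
# mean squared PDE residual at interior collocation points plus the
# squared initial-condition mismatch. A real PINN would differentiate
# a neural network via autodiff and minimize this loss over its weights.
def pinn_loss(u, xs, h=1e-5):
    du = (u(xs + h) - u(xs - h)) / (2 * h)  # central-difference u'(x)
    residual = du - u(xs)                    # interior PDE residual
    ic = u(0.0) - 1.0                        # initial-condition mismatch
    return np.mean(residual**2) + ic**2

xs = np.linspace(0.05, 0.95, 50)             # collocation points
loss_exact = pinn_loss(np.exp, xs)           # exact solution: near-zero loss
loss_wrong = pinn_loss(lambda x: 1 + x, xs)  # poor trial function: large loss
```

The generalization-error bounds in these reports control how small this loss evaluated on finitely many collocation points must be, and how many points are needed, for the trained network to be provably close to the true solution.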
Physics Informed Neural Networks for Simulating Radiative Transfer
(2020) SAM Research Report. We propose a novel machine learning algorithm for simulating radiative transfer. Our algorithm is based on physics informed neural networks (PINNs), which are trained by minimizing the residual of the underlying radiative transfer equations. We present extensive experiments and theoretical error estimates to demonstrate that PINNs provide a very easy to implement, fast, robust and accurate method for simulating radiative transfer. We also ...
Neural Inverse Operators for Solving PDE Inverse Problems
(2023) SAM Research Report. A large class of inverse problems for PDEs are only well-defined as mappings from operators to functions. Existing operator learning frameworks map functions to functions and need to be modified to learn inverse maps from data. We propose a novel architecture termed Neural Inverse Operators (NIOs) to solve these PDE inverse problems. Motivated by the underlying mathematical structure, NIO is based on a suitable composition of DeepONets ...
Convolutional Neural Operators
(2023) SAM Research Report. Although very successfully used in machine learning, convolution-based neural network architectures, believed to be inconsistent in function space, have been largely ignored in the context of learning solution operators of PDEs. Here, we adapt convolutional neural networks to demonstrate that they are indeed able to process functions as inputs and outputs. The resulting architecture, termed as convolutional neural operators (CNOs), ...
Are Neural Operators Really Neural Operators? Frame Theory Meets Operator Learning
(2023) SAM Research Report. Recently, there has been significant interest in operator learning, i.e. learning mappings between infinite-dimensional function spaces. This has been particularly relevant in the context of learning partial differential equations from data. However, it has been observed that proposed models may not behave as operators when implemented on a computer, questioning the very essence of what operator learning should be. We contend that in ...
Weak physics informed neural networks for approximating entropy solutions of hyperbolic conservation laws
(2022) SAM Research Report. Physics informed neural networks (PINNs) require regularity of solutions of the underlying PDE to guarantee accurate approximation. Consequently, they may fail at approximating discontinuous solutions of PDEs such as nonlinear hyperbolic equations. To ameliorate this, we propose a novel variant of PINNs, termed as weak PINNs (wPINNs) for accurate approximation of entropy solutions of scalar conservation laws. wPINNs are based on approximating ...