Search Results
An operator preconditioning perspective on training in physics-informed machine learning
(2023) SAM Research Report. In this paper, we investigate the behavior of gradient descent algorithms in physics-informed machine learning methods like PINNs, which minimize residuals connected to partial differential equations (PDEs). Our key result is that the difficulty in training these models is closely related to the conditioning of a specific differential operator. This operator, in turn, is associated with the Hermitian square of the differential operator ...
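As a minimal sketch of the claim summarized above, assuming a linear differential operator $\mathcal{A}$ with data $f$ and an idealized function-space gradient flow (the notation is introduced here for illustration and is not quoted from the report):

```latex
% Minimal sketch, assuming a linear operator \mathcal{A} and data f;
% illustrative only, not the report's precise statement.
\[
  L(u) \;=\; \tfrac{1}{2}\,\big\|\mathcal{A}u - f\big\|_{L^2}^{2},
  \qquad
  \nabla_{u} L \;=\; \mathcal{A}^{*}\big(\mathcal{A}u - f\big)
             \;=\; \mathcal{A}^{*}\mathcal{A}\,u - \mathcal{A}^{*}f,
\]
\[
  \text{so gradient descent on } L \text{ converges at a rate governed by }
  \kappa\big(\mathcal{A}^{*}\mathcal{A}\big)
  \;=\; \frac{\lambda_{\max}\big(\mathcal{A}^{*}\mathcal{A}\big)}
             {\lambda_{\min}\big(\mathcal{A}^{*}\mathcal{A}\big)}.
\]
```

In this reading, operator preconditioning would amount to lowering $\kappa(\mathcal{A}^{*}\mathcal{A})$ before descending on the residual.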
Convolutional Neural Operators
(2023) SAM Research Report. Although very successfully used in machine learning, convolution based neural network architectures -- believed to be inconsistent in function space -- have been largely ignored in the context of learning solution operators of PDEs. Here, we adapt convolutional neural networks to demonstrate that they are indeed able to process functions as inputs and outputs. The resulting architecture, termed as convolutional neural operators (CNOs), ...
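To make the "functions as inputs and outputs" point concrete, here is a hypothetical NumPy sketch (not the authors' CNO architecture): a convolution whose kernel is a fixed continuous function, sampled on whatever grid the input lives on, so that different resolutions approximate the same underlying operator on functions.

```python
# Hypothetical illustration, not the CNO design: a convolution defined on
# functions and evaluated on grids of different resolutions. The kernel is a
# fixed continuous function sampled on the grid and weighted by the grid
# spacing, so the discrete sum is a Riemann-sum approximation of the same
# continuous convolution at every resolution.
import numpy as np

def continuous_conv(u: np.ndarray, grid: np.ndarray, kernel) -> np.ndarray:
    """Approximate (k * u)(x) = \int k(x - y) u(y) dy on a uniform grid."""
    dx = grid[1] - grid[0]
    out = np.empty_like(u)
    for i, x in enumerate(grid):
        out[i] = np.sum(kernel(x - grid) * u) * dx
    return out

kernel = lambda r: np.exp(-(r / 0.1) ** 2)   # smooth, resolution-independent kernel
f = lambda x: np.sin(2 * np.pi * x)          # input "function"

for n in (64, 256):                          # two discretizations of the same function
    xs = np.linspace(0.0, 1.0, n, endpoint=False)
    v = continuous_conv(f(xs), xs, kernel)
    # values at x = 0 and x = 0.5 agree across resolutions up to discretization error
    print(n, v[0], v[n // 2])
```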
Are Neural Operators Really Neural Operators? Frame Theory Meets Operator Learning
(2023) SAM Research Report. Recently, there has been significant interest in operator learning, i.e. learning mappings between infinite-dimensional function spaces. This has been particularly relevant in the context of learning partial differential equations from data. However, it has been observed that proposed models may not behave as operators when implemented on a computer, questioning the very essence of what operator learning should be. We contend that in ...
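One hypothetical way to probe the question raised above, unrelated to the report's frame-theoretic analysis, is a simple resolution-consistency check: evaluate a model on a fine grid and restrict the output, then compare against evaluating directly on the coarse grid.

```python
# Hypothetical diagnostic, not the paper's criterion: does a model that maps
# sampled functions to sampled functions commute (approximately) with
# restriction to a coarser grid?
import numpy as np

def consistency_gap(model, f, n_fine: int = 256, n_coarse: int = 64) -> float:
    xs_fine = np.linspace(0.0, 1.0, n_fine, endpoint=False)
    xs_coarse = np.linspace(0.0, 1.0, n_coarse, endpoint=False)
    stride = n_fine // n_coarse
    fine_then_restrict = model(f(xs_fine))[::stride]   # evaluate fine, then restrict
    restrict_then_eval = model(f(xs_coarse))           # restrict, then evaluate coarse
    return float(np.max(np.abs(fine_then_restrict - restrict_then_eval)))

# Example: a pointwise nonlinearity is resolution-consistent by construction.
print(consistency_gap(np.tanh, lambda x: np.sin(2 * np.pi * x)))  # ~0.0
```

A model that genuinely acts on functions should keep this gap at the level of discretization error; a purely grid-bound model need not.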
Convolutional Neural Operators for robust and accurate learning of PDEs
(2023) SAM Research Report. Although very successfully used in conventional machine learning, convolution based neural network architectures -- believed to be inconsistent in function space -- have been largely ignored in the context of learning solution operators of PDEs. Here, we present novel adaptations for convolutional neural networks to demonstrate that they are indeed able to process functions as inputs and outputs. The resulting architecture, termed as ...