Search Results
The Language of Hyperelastic Materials
(2024) SAM Research Report. The automated discovery of constitutive laws forms an emerging area that focuses on automatically obtaining symbolic expressions describing the constitutive behavior of solid materials from experimental data. Existing symbolic/sparse regression methods rely on the availability of libraries of material models, which are typically hand-designed by a human expert relying on known models as reference, or deploy generative algorithms with exponential ...
Efficient Computation of Large-Scale Statistical Solutions to Incompressible Fluid Flows
(2024) SAM Research Report. This work presents the development, performance analysis and subsequent optimization of a GPU-based spectral hyperviscosity solver for turbulent flows described by the three-dimensional incompressible Navier-Stokes equations. The method solves for the fluid velocity fields directly in Fourier space, eliminating the need to solve a large-scale linear system of equations to find the pressure field. Special focus is put on the ...
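The abstract's claim that working in Fourier space removes the pressure solve can be illustrated with a minimal sketch (this is an assumption-laden toy, not the report's GPU solver): for incompressible flow, the pressure gradient acts only to keep the velocity divergence-free, and in Fourier space this becomes a pointwise Leray projection of each velocity mode onto the plane orthogonal to the wavevector, so no linear system is needed.

```python
import numpy as np

def leray_project(u_hat, v_hat, kx, ky):
    """Project a 2D Fourier-space velocity field onto its
    divergence-free part: u_hat <- u_hat - k (k . u_hat) / |k|^2."""
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0  # avoid division by zero at the mean (k = 0) mode
    div = kx * u_hat + ky * v_hat  # Fourier-space divergence (up to a factor i)
    return u_hat - kx * div / k2, v_hat - ky * div / k2

# Toy usage on a small periodic grid (2D for brevity; the report treats 3D).
n = 16
k = np.fft.fftfreq(n, d=1.0 / n)          # integer wavenumbers 0..7, -8..-1
kx, ky = np.meshgrid(k, k, indexing="ij")
rng = np.random.default_rng(0)
u_hat = np.fft.fft2(rng.standard_normal((n, n)))
v_hat = np.fft.fft2(rng.standard_normal((n, n)))
u_hat, v_hat = leray_project(u_hat, v_hat, kx, ky)
# The projected field is divergence-free to machine precision.
print(np.abs(kx * u_hat + ky * v_hat).max())
```

Because the projection is diagonal per Fourier mode, it maps well onto GPUs: each mode is handled independently, which is one reason spectral formulations of this kind parallelize cheaply.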
Numerical analysis of physics-informed neural networks and related models in physics-informed machine learning
(2024) SAM Research Report. Physics-informed neural networks (PINNs) and their variants have been very popular in recent years as algorithms for the numerical simulation of both forward and inverse problems for partial differential equations. This article aims to provide a comprehensive review of currently available results on the numerical analysis of PINNs and related models that constitute the backbone of physics-informed machine learning. We provide a unified ...
An operator preconditioning perspective on training in physics-informed machine learning
(2023) SAM Research Report. In this paper, we investigate the behavior of gradient descent algorithms in physics-informed machine learning methods like PINNs, which minimize residuals connected to partial differential equations (PDEs). Our key result is that the difficulty in training these models is closely related to the conditioning of a specific differential operator. This operator, in turn, is associated with the Hermitian square of the differential operator ...
How does over-squashing affect the power of GNNs?
(2023) SAM Research Report. Graph Neural Networks (GNNs) are the state-of-the-art model for machine learning on graph-structured data. The most popular class of GNNs operates by exchanging information between adjacent nodes; these models are known as Message Passing Neural Networks (MPNNs). Given their widespread use, understanding the expressive power of MPNNs is a key question. However, existing results typically consider settings with uninformative node features. In this ...
Multilevel domain decomposition-based architectures for physics-informed neural networks
(2023) SAM Research Report. Physics-informed neural networks (PINNs) are a popular and powerful approach for solving problems involving differential equations, yet they often struggle to solve problems with high-frequency and/or multi-scale solutions. Finite basis physics-informed neural networks (FBPINNs) improve the performance of PINNs in this regime by combining them with an overlapping domain decomposition approach. In this paper, the FBPINN approach is extended ...
Vandermonde Neural Operators
(2023) SAM Research Report. Fourier Neural Operators (FNOs) have emerged as very popular machine learning architectures for learning operators, particularly those arising in PDEs. However, as FNOs rely on the fast Fourier transform for computational efficiency, the architecture can be limited to input data on equispaced Cartesian grids. Here, we generalize FNOs to handle input data on non-equispaced point distributions. Our proposed model, termed Vandermonde ...
Convolutional Neural Operators for robust and accurate learning of PDEs
(2023) SAM Research Report. Although very successfully used in conventional machine learning, convolution-based neural network architectures -- believed to be inconsistent in function space -- have been largely ignored in the context of learning solution operators of PDEs. Here, we present novel adaptations of convolutional neural networks to demonstrate that they are indeed able to process functions as inputs and outputs. The resulting architecture, termed ...
Neural Oscillators are Universal
(2023) SAM Research Report. Coupled oscillators are being increasingly used as the basis of machine learning (ML) architectures, for instance in sequence modeling, graph representation learning and in physical neural networks that are used in analog ML devices. We introduce an abstract class of neural oscillators that encompasses these architectures and prove that neural oscillators are universal, i.e., they can approximate any continuous and causal operator mapping ...
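To make "coupled oscillators as an ML architecture" concrete, here is a hypothetical minimal sketch of one oscillator-based recurrent unit: a damped second-order system driven through a nonlinearity, discretized with an explicit Euler step. The specific form (a coRNN-style update) and all parameter names are illustrative assumptions, not the abstract class defined in the report.

```python
import numpy as np

def oscillator_step(y, v, u, W, Wv, V, b, gamma=1.0, eps=0.1, dt=0.01):
    """One Euler step of a coupled-oscillator unit (illustrative form):
        y'' = tanh(W y + Wv y' + V u + b) - gamma * y - eps * y'
    where y is the hidden state, v = y' its velocity, u the input."""
    accel = np.tanh(W @ y + Wv @ v + V @ u + b) - gamma * y - eps * v
    v_new = v + dt * accel          # update velocity first (semi-implicit flavor)
    y_new = y + dt * v_new          # then position, using the new velocity
    return y_new, v_new

# Toy usage: drive a 4-oscillator unit with a constant 3-dimensional input.
rng = np.random.default_rng(0)
d, m = 4, 3
W, Wv = rng.standard_normal((d, d)), rng.standard_normal((d, d))
V, b = rng.standard_normal((d, m)), np.zeros(d)
u = np.array([1.0, -0.5, 0.25])
y, v = np.zeros(d), np.zeros(d)
for _ in range(100):
    y, v = oscillator_step(y, v, u, W, Wv, V, b)
print(np.linalg.norm(y))
```

The bounded forcing (tanh) together with the damping terms gamma and eps keeps the dynamics stable, which is part of why such oscillator networks are attractive for long sequences and analog hardware.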
Are Neural Operators Really Neural Operators? Frame Theory Meets Operator Learning
(2023) SAM Research Report. Recently, there has been significant interest in operator learning, i.e., learning mappings between infinite-dimensional function spaces. This has been particularly relevant in the context of learning partial differential equations from data. However, it has been observed that proposed models may not behave as operators when implemented on a computer, questioning the very essence of what operator learning should be. We contend that in ...