Multilevel domain decomposition-based architectures for physics-informed neural networks
Metadata only
Date
2023-06
Type
Report
ETH Bibliography
yes
Abstract
Physics-informed neural networks (PINNs) are a popular and powerful approach for solving problems involving differential equations, yet they often struggle to solve problems with high-frequency and/or multi-scale solutions. Finite basis physics-informed neural networks (FBPINNs) improve the performance of PINNs in this regime by combining them with an overlapping domain decomposition approach. In this paper, the FBPINN approach is extended by adding multiple levels of domain decompositions to its solution ansatz, inspired by classical multilevel Schwarz domain decomposition methods (DDMs). Furthermore, analogous to typical tests for classical DDMs, strong and weak scaling studies are carried out, designed to measure how the accuracy of PINNs and FBPINNs behaves with respect to computational effort and solution complexity. Our numerical results show that the proposed multilevel FBPINNs consistently and significantly outperform PINNs across a range of problems with high-frequency and multi-scale solutions. Furthermore, as expected for classical DDMs, we show that multilevel FBPINNs improve the scalability of FBPINNs to large numbers of subdomains by aiding global communication between subdomains.
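For readers unfamiliar with the solution ansatz referred to in the abstract, a minimal sketch of the single-level FBPINN ansatz and its multilevel extension is given below. It follows the standard FBPINN formulation; the notation, normalisation, and treatment of boundary conditions are assumptions made here for illustration and are not taken from this record.

u(x) \approx \sum_{j=1}^{J} \omega_j(x)\, u_j(x; \theta_j), \qquad \sum_{j=1}^{J} \omega_j(x) = 1,

where each subdomain network u_j is only active on an overlapping subdomain \Omega_j and the smooth window functions \omega_j form a partition of unity over the global domain. The multilevel extension adds coarser decompositions on top of this:

u(x) \approx \sum_{l=1}^{L} \sum_{j=1}^{J^{(l)}} \omega_j^{(l)}(x)\, u_j^{(l)}(x; \theta_j^{(l)}),

where level l = 1 typically contains a single (global) subdomain and finer levels contain progressively more subdomains; the coarse levels provide the global communication between subdomains mentioned in the abstract.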
Publication status
published
Journal / series
SAM Research Report
Publisher
Seminar for Applied Mathematics, ETH Zurich
Subject
Physics-informed neural networks; Overlapping domain decomposition methods; Multilevel methods; Multi-scale modeling; Spectral bias; Forward modeling; Differential equations
Organisational unit
03851 - Mishra, Siddhartha / Mishra, Siddhartha
02219 - ETH AI Center / ETH AI Center
Related publications and datasets
Is supplemented by: https://github.com/benmoseley/FBPINNs
Is identical to: https://doi.org/10.48550/arXiv.2306.05486