Abstract
Although used with great success in conventional machine learning, convolution-based neural network architectures -- believed to be inconsistent in function space -- have been largely ignored in the context of learning solution operators of PDEs. Here, we present novel adaptations of convolutional neural networks to demonstrate that they are indeed able to process functions as inputs and outputs. The resulting architecture, termed convolutional neural operators (CNOs), is designed specifically to preserve its underlying continuous nature, even when implemented in a discretized form on a computer. We prove a universality theorem showing that CNOs can approximate operators arising in PDEs to any desired accuracy. CNOs are tested on a novel suite of benchmarks, encompassing a diverse set of PDEs with possibly multi-scale solutions, and are observed to significantly outperform baselines, paving the way for an alternative framework for robust and accurate operator learning.
Publication status
published
External links
Journal / Series
SAM Research Report
Volume
Publisher
Seminar for Applied Mathematics, ETH Zurich
Organisational unit
03851 - Mishra, Siddhartha
09603 - Alaifari, Rima
02219 - ETH AI Center
Related publications and data
References: http://hdl.handle.net/20.500.11850/682560
ETH Bibliography
yes