Poseidon: Efficient Foundation Models for PDEs
Date
2024-12
Publication Type
Report
ETH Bibliography
yes
Abstract
We introduce Poseidon, a foundation model for learning the solution operators of PDEs. It is based on a multiscale operator transformer with time-conditioned layer norms that enable continuous-in-time evaluations. We also propose a novel training strategy that leverages the semi-group property of time-dependent PDEs to significantly scale up the training data. Poseidon is pretrained on a diverse, large-scale dataset for the governing equations of fluid dynamics. It is then evaluated on a suite of 15 challenging downstream tasks that cover a wide variety of PDE types and operators. We show that Poseidon exhibits excellent performance across the board, significantly outperforming baselines in both sample efficiency and accuracy. Poseidon also generalizes very well to new physics not seen during pretraining. Moreover, Poseidon scales with respect to model and data size, both for pretraining and for downstream tasks. Taken together, our results showcase the surprising ability of Poseidon to learn effective representations from a very small set of PDEs during pretraining and to generalize well to unseen and unrelated PDEs downstream, demonstrating its potential as an effective, general-purpose PDE foundation model. Finally, the Poseidon model as well as the underlying pretraining and downstream datasets are open-sourced, with code available at https://github.com/camlab-ethz/poseidon and pretrained models and datasets at https://huggingface.co/camlab-ethz.
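The semi-group property mentioned in the abstract, S(t + s) = S(t) ∘ S(s) for the solution operator S, implies that from a single trajectory of K + 1 snapshots, every ordered pair (u(t_i), u(t_j)) with i < j is a valid (input, lead time, target) training example. A minimal sketch of this all-to-all pair construction (function and variable names here are illustrative, not from the paper's code):

```python
# Hypothetical sketch: the semi-group property S(t + s) = S(t) ∘ S(s)
# lets every ordered snapshot pair in a trajectory serve as a training
# example, growing K + 1 snapshots into K(K + 1)/2 samples.
from itertools import combinations

def all2all_pairs(trajectory, times):
    """trajectory: snapshots u(t_0), ..., u(t_K);
    times: matching t_0 < ... < t_K.
    Returns (input, lead_time, target) training triples."""
    samples = []
    for i, j in combinations(range(len(trajectory)), 2):
        samples.append((trajectory[i], times[j] - times[i], trajectory[j]))
    return samples

# A trajectory with K + 1 = 4 snapshots yields K(K + 1)/2 = 6 pairs.
snaps = ["u0", "u1", "u2", "u3"]
ts = [0.0, 0.25, 0.5, 1.0]
print(len(all2all_pairs(snaps, ts)))  # → 6
```

The quadratic growth in samples per trajectory is what enables the "significant scaling-up of the training data" claimed in the abstract; the lead time in each triple is what the time-conditioned layer norms would consume.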
Publication status
published
Volume
2024-38
Publisher
Seminar for Applied Mathematics, ETH Zurich
Subject
PDEs; operators; foundation models; transformers; sample efficiency; scientific machine learning
Organisational unit
03851 - Mishra, Siddhartha