Error analysis for deep neural network approximations of parametric hyperbolic conservation laws
METADATA ONLY
Date
2022-07
Publication Type
Report
ETH Bibliography
yes
Abstract
We derive rigorous bounds on the error resulting from the approximation of the solution of parametric hyperbolic scalar conservation laws with ReLU neural networks. We show that the approximation error can be made as small as desired with ReLU neural networks that overcome the curse of dimensionality. In addition, we provide an explicit upper bound on the generalization error in terms of the training error, the number of training samples, and the neural network size. The theoretical results are illustrated by numerical experiments.
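The setting of the abstract can be illustrated with a minimal sketch (not the paper's construction): a one-hidden-layer ReLU network is trained to approximate the parametric solution map (x, a) ↦ u(x, T; a). Here we use the linear advection equation u_t + a·u_x = 0 with u(x, 0) = sin(2πx), whose exact solution is u(x, T; a) = sin(2π(x − aT)); the network width, learning rate, and sample count are illustrative choices, not values from the report.

```python
import numpy as np

# Illustrative sketch, assuming a linear advection model problem:
# fit a one-hidden-layer ReLU network to the parametric solution map
# (x, a) -> u(x, T; a) = sin(2*pi*(x - a*T)).
rng = np.random.default_rng(0)
T = 0.5
n_train, width = 512, 64

X = rng.uniform(0.0, 1.0, size=(n_train, 2))     # columns: space x, parameter a
y = np.sin(2 * np.pi * (X[:, 0] - X[:, 1] * T))  # exact parametric solution

# network parameters: input 2 -> hidden `width` (ReLU) -> output 1
W1 = rng.normal(0.0, 1.0, (2, width))
b1 = np.zeros(width)
W2 = rng.normal(0.0, 1.0 / np.sqrt(width), (width, 1))
b2 = np.zeros(1)

def forward(X):
    # hidden ReLU layer and scalar output
    H = np.maximum(X @ W1 + b1, 0.0)
    return H, (H @ W2 + b2).ravel()

# plain full-batch gradient descent on the mean-squared training error
lr = 1e-2
_, pred = forward(X)
loss0 = np.mean((pred - y) ** 2)
for _ in range(2000):
    H, pred = forward(X)
    r = (pred - y)[:, None] * (2.0 / n_train)  # dLoss/dpred, averaged
    gW2 = H.T @ r
    gb2 = r.sum(0)
    gH = (r @ W2.T) * (H > 0)                  # backprop through ReLU
    gW1 = X.T @ gH
    gb1 = gH.sum(0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, pred = forward(X)
loss = np.mean((pred - y) ** 2)
print(f"train MSE: {loss0:.3f} -> {loss:.4f}")
```

The quantity `loss` plays the role of the training error in the abstract's generalization bound; evaluating the trained network on fresh samples of (x, a) would give an empirical proxy for the generalization error.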
Publication status
published
Volume
2022-34
Publisher
Seminar for Applied Mathematics, ETH Zurich
Organisational unit
03851 - Mishra, Siddhartha