Error analysis for deep neural network approximations of parametric hyperbolic conservation laws
Metadata only
Date
2022-07
Type
Report
ETH Bibliography
yes
Abstract
We derive rigorous bounds on the error resulting from the approximation of the solution of parametric hyperbolic scalar conservation laws with ReLU neural networks. We show that the approximation error can be made as small as desired with ReLU neural networks that overcome the curse of dimensionality. In addition, we provide an explicit upper bound on the generalization error in terms of the training error, the number of training samples, and the neural network size. The theoretical results are illustrated by numerical experiments.
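The abstract's central object, a ReLU network approximating a (piecewise smooth) solution map, can be illustrated with a minimal sketch. The example below is not the construction from the report: it is a standard textbook fact that a one-hidden-layer ReLU network represents a piecewise-linear "hat" function exactly, which is the kind of building block ReLU approximation theory rests on. All weights here are chosen by hand for illustration.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def relu_net(x, weights, biases):
    """Forward pass of a fully connected ReLU network (linear output layer)."""
    a = x
    for W, b in zip(weights[:-1], biases[:-1]):
        a = relu(W @ a + b)
    W, b = weights[-1], biases[-1]
    return W @ a + b

# Illustrative example (not from the report): the hat function
#   h(x) = max(0, 1 - 2|x - 1/2|)  on [0, 1]
# is represented *exactly* by one hidden layer of three ReLU units:
#   h(x) = 2*relu(x) - 4*relu(x - 1/2) + 2*relu(x - 1)
W1 = np.array([[1.0], [1.0], [1.0]])   # hidden layer: three shifted copies of x
b1 = np.array([0.0, -0.5, -1.0])
W2 = np.array([[2.0, -4.0, 2.0]])      # output layer: fixed linear combination
b2 = np.array([0.0])

xs = np.linspace(0.0, 1.0, 5)
ys = [relu_net(np.array([x]), [W1, W2], [b1, b2])[0] for x in xs]
# ys equals the hat function values [0, 0.5, 1, 0.5, 0] at xs
```

Sums of such hat functions reproduce any continuous piecewise-linear interpolant, which is why ReLU networks are a natural tool for the piecewise smooth entropy solutions of scalar conservation laws.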
Publication status
published
Journal / series
SAM Research Report
Publisher
Seminar for Applied Mathematics, ETH Zurich
Organisational unit
03851 - Mishra, Siddhartha / Mishra, Siddhartha
Related publications and datasets
Is previous version of: http://hdl.handle.net/20.500.11850/649793