Error analysis for deep neural network approximations of parametric hyperbolic conservation laws
Metadata only
Date
2022-07
Type
Report
ETH Bibliography
yes
Abstract
We derive rigorous bounds on the error resulting from the approximation of the solution of parametric hyperbolic scalar conservation laws with ReLU neural networks. We show that the approximation error can be made as small as desired with ReLU neural networks that overcome the curse of dimensionality. In addition, we provide an explicit upper bound on the generalization error in terms of the training error, the number of training samples, and the neural network size. The theoretical results are illustrated by numerical experiments.
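As a purely illustrative sketch of the setting described in the abstract (not the report's construction or its error estimates), the following PyTorch snippet fits a ReLU network to the solution map of a parametric linear advection equation, a simple scalar conservation law with a known exact solution. All architecture, sampling, and training choices below are arbitrary assumptions for illustration.

```python
# Illustrative sketch: train a ReLU network to approximate the parametric
# solution map (x, t, a) -> u(x, t; a) of the linear advection equation
# u_t + a u_x = 0 with u(x, 0) = sin(2*pi*x), whose exact solution
# u(x, t; a) = sin(2*pi*(x - a*t)) is used to generate synthetic data.
import torch
import torch.nn as nn

torch.manual_seed(0)

def exact_solution(x, t, a):
    # Exact solution of the parametric linear advection equation, used here
    # only to generate training and test samples.
    return torch.sin(2 * torch.pi * (x - a * t))

# Draw training samples (x, t, a) uniformly from [0,1] x [0,1] x [0.5, 1.5].
n_train = 4096
x = torch.rand(n_train, 1)
t = torch.rand(n_train, 1)
a = 0.5 + torch.rand(n_train, 1)
inputs = torch.cat([x, t, a], dim=1)
targets = exact_solution(x, t, a)

# A plain ReLU network mapping (x, t, a) to an approximation of u(x, t; a);
# depth and width are arbitrary choices for this sketch.
model = nn.Sequential(
    nn.Linear(3, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(inputs), targets)   # empirical (training) error
    loss.backward()
    optimizer.step()

# Compare the training error with the error on fresh samples as a crude
# empirical proxy for the generalization gap.
x_te, t_te = torch.rand(1024, 1), torch.rand(1024, 1)
a_te = 0.5 + torch.rand(1024, 1)
test_inputs = torch.cat([x_te, t_te, a_te], dim=1)
with torch.no_grad():
    test_loss = loss_fn(model(test_inputs), exact_solution(x_te, t_te, a_te))
print(f"training error: {loss.item():.3e}, test error: {test_loss.item():.3e}")
```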
Publication status
published
External links
Journal / Series
SAM Research Report
Volume
Publisher
Seminar for Applied Mathematics, ETH Zurich
Organisational unit
03851 - Mishra, Siddhartha / Mishra, Siddhartha
Related publications and data
Is previous version of: http://hdl.handle.net/20.500.11850/649793