Error analysis for deep neural network approximations of parametric hyperbolic conservation laws


METADATA ONLY

Date

2022-07

Publication Type

Report

ETH Bibliography

yes

Abstract

We derive rigorous bounds on the error resulting from the approximation of the solution of parametric hyperbolic scalar conservation laws with ReLU neural networks. We show that the approximation error can be made as small as desired with ReLU neural networks that overcome the curse of dimensionality. In addition, we provide an explicit upper bound on the generalization error in terms of the training error, the number of training samples, and the neural network size. The theoretical results are illustrated by numerical experiments.
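The setting of the abstract can be illustrated with a toy sketch (this is not the paper's construction): a one-hidden-layer ReLU network approximates the parametric solution u(x; a) = u0(x − a) of the linear advection equation u_t + a u_x = 0 at t = 1, and the gap between training and test error serves as an empirical proxy for the generalization error. To keep the example deterministic, the hidden ReLU layer uses fixed random weights and only the linear output layer is fitted by least squares; all sizes and scales below are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def u_exact(x, a):
    # Parametric solution of u_t + a u_x = 0 at t = 1 with smooth
    # initial data u0(x) = sin(2*pi*x): u(x; a) = u0(x - a).
    return np.sin(2 * np.pi * (x - a))

# Samples of (x, a) drawn uniformly from [0, 1]^2 (illustrative sizes).
n_train, n_test, n_feat = 2000, 500, 300
Xtr = rng.uniform(0.0, 1.0, size=(n_train, 2))
Xte = rng.uniform(0.0, 1.0, size=(n_test, 2))
ytr = u_exact(Xtr[:, 0], Xtr[:, 1])
yte = u_exact(Xte[:, 0], Xte[:, 1])

# One hidden ReLU layer with fixed random weights; only the output
# layer is fitted, so the fit reduces to a linear least-squares problem.
W = rng.normal(0.0, 4.0, size=(2, n_feat))
b = rng.uniform(-4.0, 4.0, size=n_feat)

def relu_features(X):
    return np.maximum(X @ W + b, 0.0)

coef, *_ = np.linalg.lstsq(relu_features(Xtr), ytr, rcond=None)

train_rmse = np.sqrt(np.mean((relu_features(Xtr) @ coef - ytr) ** 2))
test_rmse = np.sqrt(np.mean((relu_features(Xte) @ coef - yte) ** 2))
print(f"train RMSE = {train_rmse:.4f}, test RMSE = {test_rmse:.4f}")
```

The difference between the two printed errors is a crude sample estimate of the generalization gap that the paper bounds in terms of training error, sample count, and network size.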

Publication status

published

Volume

2022-34

Publisher

Seminar for Applied Mathematics, ETH Zurich

Organisational unit

03851 - Mishra, Siddhartha
