Open access
Author
Date
2020-01-31
Type
- Master Thesis
ETH Bibliography
yes
Abstract
Deep neural networks and the ENO procedure are both efficient frameworks for approximating rough functions. We prove that at any order, the stencil shifts of the ENO and ENO-SR interpolation procedures can be exactly obtained using a deep ReLU neural network. In addition, we construct and provide error bounds for ReLU neural networks that directly approximate the output of the ENO and ENO-SR interpolation procedures. This surprising fact enables the transfer of several desirable properties of the ENO procedure to deep neural networks, including its high-order accuracy at approximating Lipschitz functions. Numerical tests for the resulting neural networks show excellent performance for interpolating rough functions, data compression and approximating solutions of nonlinear conservation laws.
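The stencil shifts mentioned in the abstract come from the classical ENO selection rule: starting from a two-point stencil, the stencil is repeatedly extended toward the side with the smaller divided difference, so that it avoids discontinuities. A minimal sketch of this rule on a uniform grid is given below; the function and variable names are illustrative choices of ours, not taken from the thesis, and the tie-break (extend right unless the left difference is strictly smaller) is one common convention.

```python
import numpy as np

def eno_stencil_shift(f, i, p):
    """Return the ENO stencil shift r for degree-p interpolation on the
    cell [x_i, x_{i+1}], given point values f on a uniform grid.
    The selected stencil is {x_{i-r}, ..., x_{i-r+p}} (p + 1 points)."""
    left = i  # leftmost index of the current stencil
    for k in range(2, p + 1):
        # current stencil: the k points left, ..., left + k - 1;
        # compare k-th order undivided differences (proportional to
        # divided differences on a uniform grid) of the two extensions
        dd_left = abs(np.diff(f[left - 1:left + k], n=k)[0])
        dd_right = abs(np.diff(f[left:left + k + 1], n=k)[0])
        if dd_left < dd_right:
            left -= 1  # extend toward the smoother (left) side
    return i - left  # shift r in {0, ..., p - 1}
```

For a step function the selected stencil stays on one side of the jump: with `f = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]` and `p = 3`, the cell just right of the jump (`i = 5`) gets shift 0 (stencil entirely to the right), while the cell `i = 3` gets shift 2 (stencil pushed left, away from the jump). The thesis result is that this piecewise-constant input-to-shift map can be represented exactly by a deep ReLU network.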
Persistent Link
https://doi.org/10.3929/ethz-b-000397533
Publication Status
published
Contributors
Examiner: Mishra, Siddhartha
Publisher
ETH Zurich
Subject
Numerical analysis; Interpolation; Deep learning; ENO reconstruction; ReLU
Organisational Unit
03851 - Mishra, Siddhartha / Mishra, Siddhartha
02501 - Seminar für Angewandte Mathematik / Seminar for Applied Mathematics