dc.contributor.author: Herrmann, Lukas
dc.contributor.author: Opschoor, Joost A.A.
dc.contributor.author: Schwab, Christoph
dc.date.accessioned: 2021-06-24T09:30:37Z
dc.date.available: 2021-06-24T08:12:52Z
dc.date.available: 2021-06-24T09:30:37Z
dc.date.issued: 2021-01
dc.identifier.uri: http://hdl.handle.net/20.500.11850/491120
dc.description.abstract (en_US): We propose an efficient, deterministic algorithm for constructing exponentially convergent deep neural network (DNN) approximations of multivariate, analytic maps f : [−1,1]^K → R. We address in particular networks with the rectified linear unit (ReLU) activation function; similar results and proofs apply to many other popular activation functions. The algorithm is based on collocating f in deterministic families of grid points with small Lebesgue constants, and on a priori (i.e., "offline") emulation of a spectral basis with DNNs to prescribed fidelity. Assuming availability of N function values of a possibly corrupted numerical approximation f̃ of f in [−1,1]^K, a bound on ‖f−f̃‖_{L^∞([−1,1]^K)}, and a given target accuracy ε > 0, we provide an explicit, computational construction of a deep ReLU NN which attains accuracy ε uniformly with respect to the inputs. For analytic maps f : [−1,1]^K → R, we prove exponential convergence of the expression and generalization errors of the constructed ReLU DNNs with respect to ε, in the norm W^{1,∞}([−1,1]^K; R), up to an additive data-corruption bound ‖f−f̃‖_{L^∞([−1,1]^K)} multiplied by a factor growing slowly with 1/ε. The algorithmic construction of the ReLU DNNs which realize the approximations is explicit and deterministic in terms of the function values of f̃ in tensorized Clenshaw–Curtis grids in [−1,1]^K. We illustrate the proposed methodology by a constructive algorithm for (offline) computation of posterior expectations in Bayesian PDE inversion.
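The collocation step described in the abstract — sampling f on grids with small Lebesgue constants — can be illustrated in one dimension (K = 1). The sketch below is ours, not the report's construction (all function names and the use of NumPy are assumptions): it builds the Clenshaw–Curtis grid (Chebyshev extrema) and evaluates the resulting polynomial interpolant via the barycentric formula; the exponentially fast error decay for an analytic f is the property the DNN construction then emulates.

```python
import numpy as np

def clenshaw_curtis_points(n):
    # Chebyshev extrema x_j = cos(pi*j/n), j = 0..n: the Clenshaw-Curtis grid on [-1, 1]
    return np.cos(np.pi * np.arange(n + 1) / n)

def barycentric_weights_cc(n):
    # Closed-form barycentric weights for Chebyshev extrema:
    # w_j = (-1)^j, with the endpoint weights halved (Berrut & Trefethen)
    w = np.ones(n + 1)
    w[0] = w[-1] = 0.5
    w *= (-1.0) ** np.arange(n + 1)
    return w

def interpolate(f_vals, nodes, weights, x):
    # Evaluate the barycentric Lagrange interpolant of (nodes, f_vals) at points x
    x = np.atleast_1d(np.asarray(x, dtype=float))
    diff = x[:, None] - nodes[None, :]
    exact = np.isclose(diff, 0.0, atol=1e-14)   # query points that hit a node
    diff = np.where(exact, 1.0, diff)           # avoid division by zero
    ratios = weights / diff
    p = (ratios * f_vals).sum(axis=1) / ratios.sum(axis=1)
    hit = exact.any(axis=1)
    p[hit] = f_vals[exact.argmax(axis=1)[hit]]  # exact hits: return the nodal value
    return p

# Demo: interpolating the analytic map f(x) = exp(x) on 17 Clenshaw-Curtis points
n = 16
nodes = clenshaw_curtis_points(n)
w = barycentric_weights_cc(n)
f_vals = np.exp(nodes)
xq = np.linspace(-1.0, 1.0, 101)
err = np.max(np.abs(interpolate(f_vals, nodes, w, xq) - np.exp(xq)))
```

For analytic f the uniform error decays exponentially in the polynomial degree n, which is why already n = 16 drives `err` near machine precision; the tensorized (multivariate) grids in the report take Cartesian products of these one-dimensional point sets.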
dc.language.iso (en_US): en
dc.publisher (en_US): Seminar for Applied Mathematics, ETH Zurich
dc.subject (en_US): Deep ReLU neural networks
dc.subject (en_US): Exponential convergence
dc.subject (en_US): Neural network construction
dc.title (en_US): Constructive Deep ReLU Neural Network Approximation
dc.type: Report
ethz.journal.title: SAM Research Report
ethz.journal.volume (en_US): 2021-04
ethz.size (en_US): 33 p.
ethz.publication.place (en_US): Zurich
ethz.publication.status (en_US): published
ethz.leitzahl (en_US): ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02000 - Dep. Mathematik / Dep. of Mathematics::02501 - Seminar für Angewandte Mathematik / Seminar for Applied Mathematics::03435 - Schwab, Christoph / Schwab, Christoph
ethz.leitzahl.certified (en_US): ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02000 - Dep. Mathematik / Dep. of Mathematics::02501 - Seminar für Angewandte Mathematik / Seminar for Applied Mathematics::03435 - Schwab, Christoph / Schwab, Christoph
ethz.identifier.url: https://math.ethz.ch/sam/research/reports.html?id=946
ethz.date.deposited: 2021-06-24T08:12:57Z
ethz.source: FORM
ethz.eth (en_US): yes
ethz.identifier.internal (en_US): https://math.ethz.ch/sam/research/reports.html?id=946
ethz.availability (en_US): Metadata only
ethz.rosetta.installDate: 2021-06-24T09:30:45Z
ethz.rosetta.lastUpdated: 2021-06-24T09:30:45Z
ethz.rosetta.versionExported: true
ethz.COinS: ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.atitle=Constructive%20Deep%20ReLU%20Neural%20Network%20Approximation&rft.jtitle=SAM%20Research%20Report&rft.date=2021-01&rft.volume=2021-04&rft.au=Herrmann,%20Lukas&Opschoor,%20Joost%20A.A.&Schwab,%20Christoph&rft.genre=report&