Metadata only
Date
2021-01
Type
Report
ETH Bibliography
yes
Abstract
We propose an efficient, deterministic algorithm for constructing exponentially convergent deep neural network (DNN) approximations of multivariate, analytic maps f : [-1,1]^K → R. We address in particular networks with the rectified linear unit (ReLU) activation function; similar results and proofs apply to many other popular activation functions. The algorithm is based on collocating f at deterministic families of grid points with small Lebesgue constants, and on a-priori (i.e., "offline") emulation of a spectral basis with DNNs to prescribed fidelity. Assuming availability of N function values of a possibly corrupted, numerical approximation f̃ of f in [-1,1]^K, a bound on ‖f − f̃‖_{L^∞([-1,1]^K)}, and a given target accuracy ε > 0, we provide an explicit, computational construction of a deep ReLU NN which attains accuracy ε uniformly with respect to the inputs. For analytic maps f : [-1,1]^K → R, we prove exponential convergence of the expression and generalization errors of the constructed ReLU DNNs with respect to ε, in the norm W^{1,∞}([-1,1]^K; R), up to an additive data-corruption bound ‖f − f̃‖_{L^∞([-1,1]^K)} multiplied by a factor growing slowly with 1/ε. The algorithmic construction of the ReLU DNNs which realize the approximations is explicit and deterministic in terms of the function values of f̃ on tensorized Clenshaw-Curtis grids in [-1,1]^K. We illustrate the proposed methodology by a constructive algorithm for (offline) computation of posterior expectations in Bayesian PDE inversion.
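For a concrete picture of the collocation ingredient mentioned in the abstract, the following minimal Python sketch (not part of the report; the names clenshaw_curtis_nodes, chebyshev_coeffs and cheb_eval are hypothetical) illustrates a one-dimensional Clenshaw-Curtis grid and the FFT-based computation of the Chebyshev coefficients of the corresponding polynomial interpolant. The report's actual construction tensorizes such grids over [-1,1]^K and additionally emulates the spectral basis with ReLU DNNs to prescribed fidelity; neither of those steps is shown here.

```python
import numpy as np

def clenshaw_curtis_nodes(n):
    """Clenshaw-Curtis (Chebyshev extreme) points x_j = cos(j*pi/n), j = 0,...,n."""
    return np.cos(np.pi * np.arange(n + 1) / n)

def chebyshev_coeffs(fvals):
    """Chebyshev coefficients of the degree-n polynomial interpolating fvals
    at the Clenshaw-Curtis nodes, computed with an FFT of the mirrored samples."""
    n = len(fvals) - 1
    mirrored = np.concatenate([fvals, fvals[-2:0:-1]])   # even extension of length 2n
    c = np.real(np.fft.rfft(mirrored)) / n
    c[0] /= 2.0   # first and last coefficients carry a factor 1/(2n) instead of 1/n
    c[n] /= 2.0
    return c[: n + 1]

def cheb_eval(coeffs, x):
    """Evaluate the Chebyshev expansion sum_k c_k T_k(x)."""
    return np.polynomial.chebyshev.chebval(x, coeffs)

if __name__ == "__main__":
    def f(x):                      # analytic test function on [-1, 1]
        return np.exp(x) * np.sin(3.0 * x)

    n = 16
    nodes = clenshaw_curtis_nodes(n)
    coeffs = chebyshev_coeffs(f(nodes))
    xs = np.linspace(-1.0, 1.0, 1001)
    print("max interpolation error:", np.max(np.abs(f(xs) - cheb_eval(coeffs, xs))))
```

For an analytic f as in this example, the printed interpolation error decays exponentially in the number of grid points, mirroring the exponential expression-rate bounds stated in the abstract.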
Publication status
published
Journal / Series
SAM Research Report
Publisher
Seminar for Applied Mathematics, ETH Zurich
Subject
Deep ReLU neural networks; Exponential convergence; Neural network construction
Organisational unit
03435 - Schwab, Christoph / Schwab, Christoph