Exponential ReLU DNN Expression of Holomorphic Maps in High Dimension
OPEN ACCESS
Date
2022
Publication Type
Journal Article
ETH Bibliography
yes
Abstract
For a parameter dimension $d\in\mathbb{N}$, we consider the approximation of many-parametric maps $u:[-1,1]^d\to\mathbb{R}$ by deep ReLU neural networks. The input dimension $d$ may be large, and we assume quantitative control of the domain of holomorphy of $u$: i.e., $u$ admits a holomorphic extension to a Bernstein polyellipse $\mathcal{E}_{\rho_1}\times\cdots\times\mathcal{E}_{\rho_d}\subset\mathbb{C}^d$ of semiaxis sums $\rho_i>1$ containing $[-1,1]^d$. We establish the exponential rate $O(\exp(-bN^{1/(d+1)}))$ of expressive power, in terms of the total NN size $N$ and of the input dimension $d$ of the ReLU NN, in the norm of $W^{1,\infty}([-1,1]^d)$. The constant $b>0$ depends on $(\rho_j)_{j=1}^d$, which characterize the coordinate-wise sizes of the Bernstein ellipses for $u$. We also prove exponential convergence in stronger norms for the approximation by DNNs with more regular, so-called "rectified power unit" activations. Finally, we extend the DNN expression rate bounds to two classes of non-holomorphic functions: in particular, to $d$-variate, Gevrey-regular functions and, by composition, to certain multivariate probability distribution functions with Lipschitz marginals.
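To make the rate's dependence on $d$ concrete, the following is a minimal sketch (in Python; the constant $b$ and the prefactor $C$ are illustrative placeholders, not values from the paper, where $b$ depends on the polyradii $(\rho_j)_{j=1}^d$ of the Bernstein polyellipse). It inverts the bound $C\exp(-bN^{1/(d+1)})\le\varepsilon$ to estimate how large the network size $N$ must be to reach a target accuracy $\varepsilon$.

```python
import math

def required_nn_size(eps: float, d: int, b: float = 1.0, C: float = 1.0) -> int:
    """Smallest N with C * exp(-b * N**(1 / (d + 1))) <= eps,
    i.e. N >= (log(C / eps) / b)**(d + 1).

    b and C are illustrative placeholders; in the paper, b depends on
    the polyradii (rho_j) of the Bernstein polyellipse for u.
    """
    if eps >= C:
        return 1  # a trivial network already meets the target
    return math.ceil((math.log(C / eps) / b) ** (d + 1))

# Example: target accuracy 1e-3 for several input dimensions d,
# showing the (d + 1)-th power growth of the required size N.
for d in (1, 2, 4, 8):
    print(f"d = {d}: N >= {required_nn_size(1e-3, d)}")
```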
Publication status
published
Journal / series
Constructive Approximation
Volume
55 (1)
Pages / Article No.
537 - 582
Publisher
Springer
Subject
Deep ReLU neural networks; Approximation rates; Exponential convergence; Gevrey regularity
Organisational unit
03435 - Schwab, Christoph