Deep learning in high dimension: ReLU network Expression Rates for Bayesian PDE inversion
(2020) SAM Research Report. We establish dimension-independent expression rates by deep ReLU networks for so-called \((b,\epsilon,X)\)-holomorphic functions. These are mappings from \([-1,1]^{\mathbb{N}}\) to \(X\), with \(X\) being a Banach space, that admit analytic extensions to certain polyellipses in each of the input variables. The significance of this function class has been established in previous works, where it was shown that functions of this type occur widely in uncertainty quantification ...
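For orientation, the polyellipses referred to here are usually products of Bernstein ellipses; a standard definition (notation assumed here, not quoted from the report) is

\[
\mathcal{E}_\rho \;=\; \left\{ \tfrac12\left(z + z^{-1}\right) \,:\, z\in\mathbb{C},\ 1 \le |z| \le \rho \right\}, \qquad \rho > 1,
\]

and \((b,\epsilon,X)\)-holomorphy requires, for suitable sequences \(\rho=(\rho_j)_{j\ge 1}\), a holomorphic extension of the map to the polyellipse \(\prod_{j\ge 1}\mathcal{E}_{\rho_j}\) with values in \(X\).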
Deep ReLU networks and high-order finite element methods II: Chebyshev emulation
(2023) SAM Research Report. Expression rates and stability in Sobolev norms of deep ReLU neural networks (NNs), in terms of the number of parameters defining the NN, are addressed for continuous, piecewise polynomial functions on arbitrary, finite partitions \(\mathcal{T}\) of a bounded interval \((a,b)\). Novel constructions of ReLU NN surrogates encoding the approximated functions in terms of Chebyshev polynomial expansion coefficients are developed. Chebyshev ...
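For reference, the Chebyshev polynomials \(T_k\) encoded by such surrogates satisfy the classical three-term recurrence (a standard fact, not specific to the report):

\[
T_0(x)=1,\qquad T_1(x)=x,\qquad T_{k+1}(x) = 2x\,T_k(x) - T_{k-1}(x),
\]

with \(\max_{x\in[-1,1]}|T_k(x)| = 1\), a bound often invoked for the stability of coefficient-based encodings.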
Constructive Deep ReLU Neural Network Approximation
(2021) SAM Research Report. We propose an efficient, deterministic algorithm for constructing exponentially convergent deep neural network (DNN) approximations of multivariate, analytic maps \(f:[-1,1]^K\to\mathbb{R}\). We address in particular networks with the rectified linear unit (ReLU) activation function. Similar results and proofs apply to many other popular activation functions. The algorithm is based on collocating \(f\) in deterministic families of grid points with small ...
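As a toy illustration of the collocation step (one-dimensional Chebyshev points and NumPy fitting; hypothetical choices, not the report's grids or its ReLU emulation):

```python
import numpy as np

def cheb_collocate(f, n):
    """Interpolate f at the n + 1 Chebyshev-Gauss-Lobatto points on [-1, 1].

    Only the collocation step is sketched here; the report's algorithm
    additionally emulates the resulting polynomial by a ReLU network.
    """
    nodes = np.cos(np.pi * np.arange(n + 1) / n)   # deterministic grid
    return np.polynomial.chebyshev.chebfit(nodes, f(nodes), n)

# exp is analytic on [-1, 1], so the interpolation error decays
# exponentially in the polynomial degree n.
coeffs = cheb_collocate(np.exp, 16)
x = np.linspace(-1.0, 1.0, 101)
print(np.max(np.abs(np.polynomial.chebyshev.chebval(x, coeffs) - np.exp(x))))
```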
Exponential ReLU DNN expression of holomorphic maps in high dimension
(2019) SAM Research Report.
Exponential ReLU Neural Network Approximation Rates for Point and Edge Singularities
(2020) SAM Research Report. We prove exponential expressivity with stable ReLU Neural Networks (ReLU NNs) in \(H^1(\Omega)\) for weighted analytic function classes in certain polytopal domains \(\Omega\), in space dimension \(d=2,3\). Functions in these classes are locally analytic on open subdomains \(D\subset\Omega\), but may exhibit isolated point singularities in the interior of \(\Omega\) or corner and edge singularities at the boundary \(\partial\Omega\). The exponential expression rate bounds proved here imply uniform ...
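Many such ReLU expression-rate proofs build on Yarotsky's "sawtooth" approximation of \(x^2\); a minimal self-contained sketch of that standard construction (not the report's specific emulation of singular functions):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Tent map g on [0, 1], realized exactly by a 3-neuron ReLU layer:
    # g(x) = 2x on [0, 1/2] and g(x) = 2(1 - x) on [1/2, 1].
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

def relu_square(x, depth):
    # Yarotsky-type approximation of x**2 on [0, 1]:
    # x**2 = x - sum_{s >= 1} g^{(s)}(x) / 4**s, truncated after `depth` terms,
    # where g^{(s)} is the s-fold composition of the tent map above.
    # The truncation error decays like 4**(-depth): exponential in depth.
    approx = np.asarray(x, dtype=float).copy()
    saw = np.asarray(x, dtype=float)
    for s in range(1, depth + 1):
        saw = hat(saw)
        approx -= saw / 4.0**s
    return approx

x = np.linspace(0.0, 1.0, 1001)
for depth in (2, 4, 8):
    print(depth, np.max(np.abs(relu_square(x, depth) - x**2)))
```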
Neural Networks for Singular Perturbations
(2024) SAM Research Report. We prove deep neural network (DNN for short) expressivity rate bounds for solution sets of a model class of singularly perturbed, elliptic two-point boundary value problems, in Sobolev norms, on the bounded interval \((-1,1)\). We assume that the given source term and reaction coefficient are analytic in \([-1,1]\). We establish expression rate bounds in Sobolev norms in terms of the NN size which are uniform with respect to the singular perturbation ...
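A representative member of this model class (standard reaction-diffusion form; the report's precise assumptions may differ) is

\[
-\varepsilon^2 u''(x) + c(x)\,u(x) = f(x) \quad \text{in } (-1,1), \qquad u(-1) = u(1) = 0,
\]

with \(0 < \varepsilon \le 1\) and \(c \ge c_0 > 0\); for small \(\varepsilon\), solutions exhibit boundary layers of width \(O(\varepsilon)\) at \(x = \pm 1\), which is what makes \(\varepsilon\)-uniform expression rate bounds delicate.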
Exponential expressivity of ${\rm ReLU}^k$ neural networks on Gevrey classes with point singularities
(2024) SAM Research Report. We analyze deep Neural Network emulation rates of smooth functions with point singularities in bounded, polytopal domains \(\mathrm{D} \subset \mathbb{R}^d\), \(d=2,3\). We prove exponential emulation rates in Sobolev spaces in terms of the number of neurons and in terms of the number of nonzero coefficients for Gevrey-regular solution classes defined in terms of weighted Sobolev scales in \(\mathrm{D}\), comprising the countably-normed ...
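For context, Gevrey regularity of index \(\delta \ge 1\) is commonly quantified (in its unweighted form; the report works with weighted Sobolev analogues) by derivative bounds of the type

\[
\|\partial^\alpha u\|_{L^2(\mathrm{D})} \;\le\; C\, A^{|\alpha|}\, \left(|\alpha|!\right)^{\delta} \qquad \text{for all } \alpha \in \mathbb{N}_0^d,
\]

with constants \(C, A > 0\); the case \(\delta = 1\) recovers analytic regularity.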