- Exponential Expressivity of ReLU^k Neural Networks on Gevrey Classes with Point Singularities
(2024) SAM Research Report. We analyze deep neural network emulation rates of smooth functions with point singularities in bounded, polytopal domains \(\mathrm{D} \subset \mathbb{R}^d\), \(d=2,3\). We prove exponential emulation rates in Sobolev spaces in terms of the number of neurons and in terms of the number of nonzero coefficients for Gevrey-regular solution classes defined in terms of weighted Sobolev scales in \(\mathrm{D}\), comprising the countably-normed ...
- Wavelet compressed, modified Hilbert transform in the space-time discretization of the heat equation
(2024) SAM Research Report. On a finite time interval \((0,T)\), we consider the multiresolution Galerkin discretization of a modified Hilbert transform \(H_T\), which arises in the space-time Galerkin discretization of the linear diffusion equation. To this end, we design spline-wavelet systems in \((0,T)\) consisting of piecewise polynomials of degree \(\geq 1\) with sufficiently many vanishing moments, which constitute Riesz bases in the Sobolev spaces ...
- Neural Networks for Singular Perturbations
(2024) SAM Research Report. We prove deep neural network (DNN) expressivity rate bounds for solution sets of a model class of singularly perturbed, elliptic two-point boundary value problems, in Sobolev norms, on the bounded interval \((-1,1)\). We assume that the given source term and reaction coefficient are analytic in \([-1,1]\). We establish expression rate bounds in Sobolev norms in terms of the NN size which are uniform with respect to the singular perturbation ...
- Deep ReLU networks and high-order finite element methods II: Chebyshev emulation
(2023) SAM Research Report. We address expression rates and stability in Sobolev norms of deep ReLU neural networks (NNs), in terms of the number of parameters defining the NN, for continuous, piecewise polynomial functions on arbitrary, finite partitions \(\mathcal{T}\) of a bounded interval \((a,b)\). Novel constructions of ReLU NN surrogates encoding the approximated functions in terms of Chebyshev polynomial expansion coefficients are developed. Chebyshev ...
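As a point of reference for the entry above, the Chebyshev expansion that such ReLU surrogates encode can be sketched as follows. This is only an illustration of the expansion itself, not of the NN construction in the report; the target function (`np.cos`) and the degree are arbitrary choices.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

f = np.cos  # any smooth (here: analytic) target function on [-1, 1]

def cheb_coeffs(f, n):
    """Coefficients of the degree-n Chebyshev interpolant at Chebyshev nodes."""
    x = np.cos(np.pi * (np.arange(n + 1) + 0.5) / (n + 1))  # Chebyshev points
    return C.chebfit(x, f(x), n)

c = cheb_coeffs(f, 20)
x = np.linspace(-1.0, 1.0, 1001)
err = np.max(np.abs(C.chebval(x, c) - f(x)))
# for analytic f, the interpolation error decays exponentially in the degree
assert err < 1e-12
```

The rapid coefficient decay for analytic functions is what makes encoding a function through its Chebyshev coefficients attractive.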
- The Gevrey class implicit mapping theorem with application to UQ of semilinear elliptic PDEs
(2023) SAM Research Report. This article is concerned with a regularity analysis of parametric operator equations with a perspective on uncertainty quantification. We study the regularity of mappings between Banach spaces near branches of isolated solutions that are implicitly defined by a residual equation. Under \(s\)-Gevrey assumptions on the residual equation, we establish \(s\)-Gevrey bounds on the Fréchet derivatives of the local data-to-solution mapping. ...
- Weighted analytic regularity for the integral fractional Laplacian in polyhedra
(2023) SAM Research Report. On polytopal domains in 3D, we prove weighted analytic regularity of solutions to the Dirichlet problem for the integral fractional Laplacian with analytic right-hand side. Employing the Caffarelli-Silvestre extension allows us to localize the problem and to decompose the regularity estimates into results on vertex, edge, face, vertex-edge, vertex-face, edge-face and vertex-edge-face neighborhoods of the boundary. Using tangential differentiability ...
- Deep Operator Network Approximation Rates for Lipschitz Operators
(2023) SAM Research Report. We establish universality and expression rate bounds for a class of neural Deep Operator Networks (DON) emulating Lipschitz (or Hölder) continuous maps \(\mathcal G:\mathcal X\to\mathcal Y\) between (subsets of) separable Hilbert spaces \(\mathcal X\), \(\mathcal Y\). The DON architecture considered uses linear encoders \(\mathcal E\) and decoders \(\mathcal D\) via (biorthogonal) Riesz bases of \(\mathcal X\), \(\mathcal Y\), and an ...
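The encode/decode structure named in this abstract can be sketched in a toy form: a linear encoder maps a function to its first few coefficients in an orthonormal basis, and a linear decoder maps coefficients back. Everything below (uniform grid, discrete cosine basis, the absence of the finite-dimensional network that would act between the two maps) is an illustrative assumption, not the report's architecture.

```python
import numpy as np

n, m = 256, 16                      # grid size, number of retained modes
x = (np.arange(n) + 0.5) / n        # uniform grid on (0, 1)
# orthonormal basis vectors phi_k (DCT-II style cosines)
phi = np.array([np.ones(n) / np.sqrt(n)] +
               [np.sqrt(2.0 / n) * np.cos(np.pi * k * x) for k in range(1, m)])

def encode(u):   # E: function samples -> m coefficients
    return phi @ u

def decode(c):   # D: m coefficients -> function samples
    return phi.T @ c

u = np.sin(2 * np.pi * x)           # a smooth input function
u_rec = decode(encode(u))           # orthogonal projection onto the m modes
```

In a DON, a finite-dimensional neural network would be composed between `encode` and `decode`; the approximation rates in the report quantify how the encoder/decoder truncation and the network size interact.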
- A-posteriori QMC-FEM error estimation for Bayesian inversion and optimal control with entropic risk measure
(2023) SAM Research Report. We propose a novel a-posteriori error estimation technique where the target quantities of interest are ratios of high-dimensional integrals, as occur e.g. in PDE-constrained Bayesian inversion and PDE-constrained optimal control subject to an entropic risk measure. We consider in particular parametric, elliptic PDEs with affine-parametric diffusion coefficient, on high-dimensional parameter spaces. We combine our recent a-posteriori ...
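The ratio-of-integrals structure mentioned in this abstract can be illustrated with a toy QMC estimator: in Bayesian inversion, the posterior expectation of a quantity of interest \(G\) with misfit \(\Phi\) takes the form \(Q = \int G(y)\,e^{-\Phi(y)}\,dy \,/\, \int e^{-\Phi(y)}\,dy\). The integrands `G` and `Phi` below are assumed for illustration only and are unrelated to the report's PDE setting.

```python
import numpy as np
from scipy.stats import qmc

d = 4                                        # (low) parameter dimension for the toy
G = lambda y: y.sum(axis=1)                  # quantity of interest
Phi = lambda y: 0.5 * (y ** 2).sum(axis=1)   # misfit / negative log-likelihood

# one scrambled Sobol' point set of size 2^12 on [0,1]^d
y = qmc.Sobol(d, scramble=True, seed=0).random_base2(m=12)
w = np.exp(-Phi(y))
Q = (G(y) * w).mean() / w.mean()             # ratio estimator for Q
```

Because numerator and denominator share the same point set, errors in the two integrals are correlated, which is one reason error estimation for such ratios needs care.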
- Multilevel Domain Uncertainty Quantification in Computational Electromagnetics
(2022) SAM Research Report. We continue our study [Domain Uncertainty Quantification in Computational Electromagnetics, JUQ (2020), 8:301--341] of the numerical approximation of time-harmonic electromagnetic fields for the Maxwell lossy cavity problem for uncertain geometries. We adopt the same affine-parametric shape parametrization framework, mapping the physical domains to a nominal polygonal domain with piecewise smooth maps. The regularity of the pullback ...
- Multiresolution Kernel Matrix Algebra
(2022) SAM Research Report. We propose a sparse arithmetic for kernel matrices, enabling efficient scattered data analysis. The compression of kernel matrices by means of samplets yields sparse matrices such that assembly, addition, and multiplication of these matrices can be performed with essentially linear cost. Since the inverse of a kernel matrix is compressible, too, we also have fast access to the inverse kernel matrix by employing exact sparse selected ...
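A toy analogue of such wavelet-domain kernel compression, using a Haar basis on sorted 1D points rather than samplets, and purely illustrative: transforming a kernel matrix into an orthonormal wavelet basis makes many entries negligible (vanishing moments annihilate the smooth off-diagonal part of the kernel), so thresholding yields a sparse matrix with controlled error.

```python
import numpy as np

n = 64                                              # number of points, a power of 2
pts = np.sort(np.random.default_rng(0).uniform(size=n))
K = np.exp(-np.abs(pts[:, None] - pts[None, :]))    # exponential kernel matrix

def haar(n):
    """Orthonormal Haar wavelet transform matrix of size n = 2^L."""
    H = np.array([[1.0]])
    while H.shape[0] < n:
        m = H.shape[0]
        H = np.vstack([np.kron(H, [1.0, 1.0]),
                       np.kron(np.eye(m), [1.0, -1.0])]) / np.sqrt(2.0)
    return H

T = haar(n)
KT = T @ K @ T.T                                    # kernel matrix in wavelet coordinates
KS = np.where(np.abs(KT) > 1e-4, KT, 0.0)           # drop negligible entries
sparsity = np.mean(KS != 0.0)                       # fraction of retained entries
err = np.linalg.norm(T.T @ KS @ T - K, 2) / np.linalg.norm(K, 2)
```

The samplet construction in the report generalizes this idea to scattered data in arbitrary dimension and, crucially, keeps the sparsity pattern closed under matrix addition, multiplication, and (selected) inversion.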