Search Results
- Analytic regularity for the Navier-Stokes equations in polygons with mixed boundary conditions
  (2021) SAM Research Report. We prove weighted analytic regularity of Leray-Hopf variational solutions for the stationary, incompressible Navier-Stokes equations (NSE) in plane polygonal domains, subject to analytic body forces. We admit mixed boundary conditions which may change type at each vertex, under the assumption that homogeneous Dirichlet ("no-slip") boundary conditions are prescribed on at least one side at each vertex of the domain. The weighted analytic ... (A schematic form of such corner-weighted estimates is recalled after this list.)
- Weighted analytic regularity for the integral fractional Laplacian in polyhedra
  (2023) SAM Research Report. On polytopal domains in 3D, we prove weighted analytic regularity of solutions to the Dirichlet problem for the integral fractional Laplacian with analytic right-hand side. Employing the Caffarelli-Silvestre extension allows us to localize the problem and to decompose the regularity estimates into results on vertex, edge, face, vertex-edge, vertex-face, edge-face and vertex-edge-face neighborhoods of the boundary. Using tangential differentiability ...
- Deep Operator Network Approximation Rates for Lipschitz Operators
  (2023) SAM Research Report. We establish universality and expression rate bounds for a class of neural Deep Operator Networks (DON) emulating Lipschitz (or Hölder) continuous maps \(\mathcal G:\mathcal X\to\mathcal Y\) between (subsets of) separable Hilbert spaces \(\mathcal X\), \(\mathcal Y\). The DON architecture considered uses linear encoders \(\mathcal E\) and decoders \(\mathcal D\) via (biorthogonal) Riesz bases of \(\mathcal X\), \(\mathcal Y\), and an ... (A schematic of this encoder-decoder factorization is sketched after this list.)
- Exponential Convergence of Deep Operator Networks for Elliptic Partial Differential Equations
  (2021) SAM Research Report.
- Deep learning in high dimension: ReLU network Expression Rates for Bayesian PDE inversion
  (2020) SAM Research Report. We establish dimension-independent expression rates by deep ReLU networks for so-called (b,ε,X)-holomorphic functions. These are mappings from \([-1,1]^{\mathbb{N}}\to X\), with X being a Banach space, that admit analytic extensions to certain polyellipses in each of the input variables. The significance of this function class has been established in previous works, where it was shown that functions of this type occur widely in uncertainty quantification ... (The polyellipses in question are recalled schematically after this list.)
- Analyticity and sparsity in uncertainty quantification for PDEs with Gaussian random field inputs
  (2022) SAM Research Report. We establish summability results for coefficient sequences of Wiener-Hermite polynomial chaos expansions for countably-parametric solutions of linear elliptic and parabolic divergence-form partial differential equations with Gaussian random field inputs. The novel proof technique developed here is based on analytic continuation of parametric solutions into the complex domain. It differs from previous works that used bootstrap arguments ... (The general form of such expansions is recalled after this list.)
- Deep ReLU neural network expression for elliptic multiscale problems
  (2020) SAM Research Report.
- hp-FEM for reaction-diffusion equations. II: Robust exponential convergence for multiple length scales in corner domains
  (2020) SAM Research Report.
- ReLU Neural Network Galerkin BEM
  (2022) SAM Research Report. We introduce Neural Network (NN for short) approximation architectures for the numerical solution of Boundary Integral Equations (BIEs for short). We exemplify the proposed NN approach for the boundary reduction of the potential problem in two spatial dimensions. We adopt a Galerkin formulation-based approach in polygonal domains with a finite number of straight sides. Trial spaces used in the Galerkin discretization of the BIEs are built ...
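The Navier-Stokes and fractional Laplacian entries above both concern weighted analytic regularity. As a rough orientation only, in generic corner-weighted notation that need not match the reports' precise function classes, such regularity is typically expressed through estimates of the schematic form

\[
  \bigl\| r^{\,\beta+|\alpha|-2}\,\partial^{\alpha} u \bigr\|_{L^{2}(\Omega)}
  \;\le\; C\,A^{|\alpha|}\,|\alpha|!
  \qquad \text{for all multi-indices } \alpha \text{ with } |\alpha|\ge 2,
\]

where \(r\) denotes the distance to the corners (respectively to corners and edges), \(\beta\) is a weight exponent, and \(C, A>0\) are independent of \(|\alpha|\); the factorial growth in \(|\alpha|\) is what encodes analyticity away from the singular set.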
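The Deep Operator Network entry describes a linear encoder/decoder construction. Schematically, assuming the generic operator-network structure (the report's exact definitions may differ), the emulated map is factored as

\[
  \mathcal G \;\approx\; \mathcal D \circ \mathcal A \circ \mathcal E,
  \qquad
  \mathcal E(x) \;=\; \bigl(\langle x,\tilde\varphi_j\rangle_{\mathcal X}\bigr)_{j=1}^{m},
  \qquad
  \mathcal D(y_1,\dots,y_n) \;=\; \sum_{k=1}^{n} y_k\,\psi_k,
\]

where \(\{\varphi_j,\tilde\varphi_j\}\) is a biorthogonal Riesz basis pair of \(\mathcal X\), \(\{\psi_k\}\) is a Riesz basis of \(\mathcal Y\), and \(\mathcal A:\mathbb{R}^{m}\to\mathbb{R}^{n}\) is a deep neural network acting between the truncated coefficient spaces.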
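The polyellipses mentioned in the Bayesian PDE inversion entry are, in the usual convention (recalled here for orientation; the report may use a different parametrization), products of Bernstein ellipses

\[
  \mathcal E_{\rho} \;:=\; \Bigl\{ \tfrac12\bigl(z+z^{-1}\bigr) \;:\; 1 \le |z| \le \rho \Bigr\} \subset \mathbb{C},
  \qquad
  \mathcal E_{\boldsymbol\rho} \;:=\; \prod_{j\ge 1} \mathcal E_{\rho_j},
\]

i.e. closed ellipses with foci \(\pm 1\) and semi-axis sum \(\rho_j > 1\) in the \(j\)-th variable; (b,ε,X)-holomorphy requires uniformly bounded holomorphic extensions to such polyellipses for admissible sequences \(\boldsymbol\rho\), with admissibility governed by the sequence b and the parameter ε.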
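The Wiener-Hermite polynomial chaos expansions in the Gaussian random field entry take, in standard notation (used here for orientation only), the form

\[
  u(\boldsymbol y) \;=\; \sum_{\nu\in\mathcal F} u_\nu\, H_\nu(\boldsymbol y),
  \qquad
  H_\nu(\boldsymbol y) \;=\; \prod_{j\ge 1} H_{\nu_j}(y_j),
  \qquad
  u_\nu \;=\; \mathbb{E}\bigl[\,u\,H_\nu\,\bigr],
\]

where \(\mathcal F\) is the set of finitely supported multi-indices, \(H_n\) is the \(n\)-th Hermite polynomial normalized in \(L^2\) with respect to the standard Gaussian measure, and the parameters \(y_j\) are i.i.d. standard normal; the summability results referred to concern \(\ell^p\)-type bounds on the coefficient norms \(\bigl(\|u_\nu\|\bigr)_{\nu\in\mathcal F}\).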