Weak error analysis for stochastic gradient descent optimization algorithms
(2020) SAM Research Report. Stochastic gradient descent (SGD) type optimization schemes are fundamental ingredients in a large number of machine learning based algorithms. In particular, SGD type optimization schemes are frequently employed in applications involving natural language processing, object and face recognition, fraud detection, computational advertisement, and numerical approximations of partial differential equations. In mathematical convergence results ...
Space-time deep neural network approximations for high-dimensional partial differential equations
(2020) SAM Research Report. It is one of the most challenging issues in applied mathematics to approximately solve high-dimensional partial differential equations (PDEs), and most of the numerical approximation methods for PDEs in the scientific literature suffer from the so-called curse of dimensionality in the sense that the number of computational operations employed in the corresponding approximation scheme to obtain an approximation precision ε > 0 grows exponentially ...
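The curse of dimensionality referred to in this abstract can be made concrete with a minimal sketch. Assuming an illustrative tensor-product grid method of convergence order α (the grid construction and the order α are assumptions for illustration, not taken from the report itself), the cost of reaching precision ε grows exponentially in the dimension d:

```latex
% Illustrative cost estimate for a tensor-product grid scheme
% (order \alpha and the grid construction are assumptions):
%   N points per coordinate direction, error \varepsilon \approx N^{-\alpha},
%   total work W \approx N^d, hence
\[
  W(\varepsilon, d) \;\approx\; N^{d} \;\approx\; \varepsilon^{-d/\alpha},
  \qquad \varepsilon \approx N^{-\alpha},
\]
% so for every fixed precision \varepsilon the number of operations
% grows exponentially in the dimension d.
```

Deep-neural-network and multilevel Picard approximations, as studied in these reports, aim to replace this exponential dependence on d by a polynomial one.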
On nonlinear Feynman-Kac formulas for viscosity solutions of semilinear parabolic partial differential equations
(2020) SAM Research Report.
Overcoming the curse of dimensionality in the numerical approximation of high-dimensional semilinear elliptic partial differential equations
(2020) SAM Research Report. Recently, so-called full-history recursive multilevel Picard (MLP) approximation schemes have been introduced and shown to overcome the curse of dimensionality in the numerical approximation of semilinear parabolic partial differential equations (PDEs) with Lipschitz nonlinearities. The key contribution of this article is to introduce and analyze a new variant of MLP approximation schemes for certain semilinear elliptic PDEs with Lipschitz ...
Counterexamples to local Lipschitz and local Hölder continuity with respect to the initial values for additive noise driven stochastic differential equations with smooth drift coefficient functions with at most polynomially growing derivatives
(2020) SAM Research Report. In the recent article [A. Jentzen, B. Kuckuck, T. Müller-Gronbach, and L. Yaroslavtseva, arXiv:1904.05963 (2019)] it has been proved that the solutions to every additive noise driven stochastic differential equation (SDE) which has a drift coefficient function with at most polynomially growing first order partial derivatives and which admits a Lyapunov-type condition (ensuring the existence of a unique solution to the SDE) depend in ...
Efficient approximation of high-dimensional functions with deep neural networks
(2019) SAM Research Report.
Generalised multilevel Picard approximations
(2019) SAM Research Report. It is one of the most challenging problems in applied mathematics to approximately solve high-dimensional partial differential equations (PDEs). In particular, most of the numerical approximation schemes studied in the scientific literature suffer from the curse of dimensionality in the sense that the number of computational operations needed to compute an approximation with an error of size at most $\epsilon > 0$ grows at least ...
Solving high-dimensional optimal stopping problems using deep learning
(2019) SAM Research Report. Nowadays, many financial derivatives which are traded on stock and futures exchanges, such as American or Bermudan options, are of early exercise type. Often the pricing of early exercise options gives rise to high-dimensional optimal stopping problems, since the dimension corresponds to the number of underlyings in the associated hedging portfolio. High-dimensional optimal stopping problems are, however, notoriously difficult to solve due ...
Strong convergence rates on the whole probability space for space-time discrete numerical approximation schemes for stochastic Burgers equations
(2019) SAM Research Report.