Weak error analysis for stochastic gradient descent optimization algorithms
(2020) SAM Research Report
Stochastic gradient descent (SGD) type optimization schemes are fundamental ingredients in a large number of machine learning based algorithms. In particular, SGD type optimization schemes are frequently employed in applications involving natural language processing, object and face recognition, fraud detection, computational advertisement, and numerical approximations of partial differential equations. In mathematical convergence results ...
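As a minimal illustration of the SGD type optimization schemes described in this abstract (a generic sketch of the method itself, not the report's specific analysis; the variable names and hyperparameters below are purely illustrative):

```python
import random

# Minimal SGD sketch: fit a 1-D linear model y ≈ w * x to noiseless
# synthetic data with true slope 3.0, using one sample per gradient step.
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0]]

w = 0.0    # initial parameter guess
lr = 0.05  # step size (learning rate), chosen small enough for stability
for epoch in range(200):
    random.shuffle(data)            # visit samples in random order
    for x, y in data:
        grad = 2.0 * (w * x - y) * x  # gradient of (w*x - y)**2 w.r.t. w
        w -= lr * grad                # stochastic gradient step
```

Because the single-sample gradients here are unbiased estimates of the full gradient, the iterate `w` approaches the true slope 3.0; the weak error analysis in the report concerns quantitative versions of exactly this kind of convergence statement.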
Space-time deep neural network approximations for high-dimensional partial differential equations
(2020) SAM Research Report
It is one of the most challenging issues in applied mathematics to approximately solve high-dimensional partial differential equations (PDEs), and most of the numerical approximation methods for PDEs in the scientific literature suffer from the so-called curse of dimensionality in the sense that the number of computational operations employed in the corresponding approximation scheme to obtain an approximation precision ε > 0 grows exponentially ...
Strong convergence for explicit space-time discrete numerical approximation methods for stochastic Burgers equations
(2017) SAM Research Report
Strong convergence of full-discrete nonlinearity-truncated accelerated exponential Euler-type approximations for stochastic Kuramoto-Sivashinsky equations
(2017) SAM Research Report
This article introduces and analyzes a new explicit, easily implementable, fully discrete accelerated exponential Euler-type approximation scheme for additive space-time white noise driven stochastic partial differential equations (SPDEs) with possibly non-globally monotone nonlinearities, such as stochastic Kuramoto-Sivashinsky equations. The main result of this article proves that the proposed approximation scheme converges strongly ...