Deep ReLU Neural Network Approximation for stochastic differential equations with jumps
Metadata only
Date: 2021-02
Type: Report
ETH Bibliography: yes
Abstract
Deep neural networks (DNNs) with ReLU activation function are proved to be able to express viscosity solutions of linear partial integro-differential equations (PIDEs) on state spaces of possibly high dimension d. Admissible PIDEs comprise Kolmogorov equations for high-dimensional diffusion, advection, and for pure jump Lévy processes. We prove for such PIDEs arising from a class of jump-diffusions on R^d that, for any compact K ⊂ R^d, there exist constants C, p, q > 0 such that for every ε ∈ (0,1] and for every d ∈ N the normalized (over K) DNN L^2-expression error of viscosity solutions of the PIDE is of size ε with DNN size bounded by C d^p ε^{−q}. In particular, the constant C > 0 is independent of d ∈ N and of ε ∈ (0,1] and depends only on the coefficients in the PIDE and the measure used to quantify the error. This establishes that ReLU DNNs can break the curse of dimensionality (CoD for short) for viscosity solutions of linear, possibly degenerate PIDEs corresponding to Markovian jump-diffusion processes. As a consequence of the employed techniques, we also obtain that expectations of a large class of path-dependent functionals of the underlying jump-diffusion processes can be expressed without the CoD.
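The size-versus-accuracy claim above can be restated schematically as follows; the symbols u_d (the viscosity solution on R^d), μ (the measure quantifying the error), size(Φ) (the number of DNN parameters), and the network Φ_{d,ε} are introduced here only for illustration and are not fixed by the abstract:

\[
  \left( \frac{1}{\mu(K)} \int_{K} \bigl| u_d(x) - \Phi_{d,\varepsilon}(x) \bigr|^{2} \, \mu(\mathrm{d}x) \right)^{1/2} \le \varepsilon,
  \qquad
  \operatorname{size}\bigl(\Phi_{d,\varepsilon}\bigr) \le C\, d^{p}\, \varepsilon^{-q},
\]

for some ReLU DNN Φ_{d,ε}, with C, p, q > 0 independent of d ∈ N and ε ∈ (0,1].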
Publication status: published
Journal / series: SAM Research Report
Publisher: Seminar for Applied Mathematics, ETH Zurich
Organisational unit: 03435 - Schwab, Christoph / Schwab, Christoph