Deep neural network approximations for solutions of PDEs based on Monte Carlo algorithms
dc.contributor.author
Grohs, Philipp
dc.contributor.author
Jentzen, Arnulf
dc.contributor.author
Salimova, Diyora
dc.date.accessioned
2022-08-02T15:16:40Z
dc.date.available
2022-07-10T03:31:04Z
dc.date.available
2022-08-01T10:43:31Z
dc.date.available
2022-08-02T15:12:40Z
dc.date.available
2022-08-02T15:16:40Z
dc.date.issued
2022-08
dc.identifier.issn
2662-2971
dc.identifier.issn
2662-2963
dc.identifier.other
10.1007/s42985-021-00100-z
en_US
dc.identifier.uri
http://hdl.handle.net/20.500.11850/557167
dc.description.abstract
In the past few years deep artificial neural networks (DNNs) have been successfully employed in a large number of computational problems including, e.g., language processing, image recognition, fraud detection, and computational advertisement. Recently, it has also been proposed in the scientific literature to reformulate high-dimensional partial differential equations (PDEs) as stochastic learning problems and to employ DNNs together with stochastic gradient descent methods to approximate the solutions of such high-dimensional PDEs. There are also a few mathematical convergence results in the scientific literature which show that DNNs can approximate solutions of certain PDEs without the curse of dimensionality in the sense that the number of real parameters employed to describe the DNN grows at most polynomially both in the PDE dimension d ∈ ℕ and the reciprocal of the prescribed approximation accuracy ε > 0. One key argument in most of these results is, first, to employ a Monte Carlo approximation scheme which can approximate the solution of the PDE under consideration at a fixed space-time point without the curse of dimensionality and, thereafter, to prove that DNNs are flexible enough to mimic the behaviour of the employed approximation scheme. Having this in mind, one could aim for a general abstract result which shows under suitable assumptions that if a certain function can be approximated by any kind of (Monte Carlo) approximation scheme without the curse of dimensionality, then the function can also be approximated with DNNs without the curse of dimensionality. It is a subject of this article to take a first step in this direction. In particular, the main result of this paper, roughly speaking, shows that if a function can be approximated by means of some suitable discrete approximation scheme without the curse of dimensionality and if there exist DNNs which satisfy certain regularity properties and which approximate this discrete approximation scheme without the curse of dimensionality, then the function itself can also be approximated with DNNs without the curse of dimensionality. Moreover, for the number of real parameters used to describe such approximating DNNs we provide an explicit upper bound for the optimal exponent of the dimension d ∈ ℕ of the function under consideration as well as an explicit lower bound for the optimal exponent of the prescribed approximation accuracy ε > 0. As an application of this result we derive that solutions of suitable Kolmogorov PDEs can be approximated with DNNs without the curse of dimensionality.
en_US
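The abstract's notion of approximation "without the curse of dimensionality" can be made concrete by the following hedged LaTeX sketch; the symbols u_d, \Phi_{d,\varepsilon}, \mathcal{P}, C, p, q are illustrative assumptions and not the paper's exact notation or statement.

% Illustrative sketch only; the notation (u_d, \Phi_{d,\varepsilon}, \mathcal{P}, C, p, q)
% is assumed for exposition and is not taken from the paper.
% "No curse of dimensionality" in the sense of the abstract: for every dimension d
% and every prescribed accuracy \varepsilon there exists a DNN \Phi_{d,\varepsilon}
% approximating the target function u_d to accuracy \varepsilon whose number of
% real parameters \mathcal{P}(\Phi_{d,\varepsilon}) grows at most polynomially
% in d and in 1/\varepsilon:
\[
  \mathcal{P}(\Phi_{d,\varepsilon}) \;\le\; C \, d^{p} \, \varepsilon^{-q}
  \qquad \text{for all } d \in \mathbb{N},\ \varepsilon \in (0,1],
\]
% with constants C, p, q > 0 independent of d and \varepsilon.
% Read this way, the paper's main result passes such a polynomial bound from a
% suitable discrete (Monte Carlo) approximation scheme to the approximated function
% itself, together with explicit bounds on the optimal exponents p and q.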
dc.language.iso
en
en_US
dc.publisher
Springer
en_US
dc.title
Deep neural network approximations for solutions of PDEs based on Monte Carlo algorithms
en_US
dc.type
Journal Article
dc.date.published
2022-06-08
ethz.journal.title
Partial Differential Equations and Applications
ethz.journal.volume
3
en_US
ethz.journal.issue
4
en_US
ethz.journal.abbreviated
Partial Differ. Equ. Appl.
ethz.pages.start
45
en_US
ethz.size
41 p.
en_US
ethz.grant
Higher order numerical approximation methods for stochastic partial differential equations
en_US
ethz.identifier.scopus
ethz.publication.place
Cham
en_US
ethz.publication.status
published
en_US
ethz.grant.agreementno
175699
ethz.grant.fundername
SNF
ethz.grant.funderDoi
10.13039/501100001711
ethz.grant.program
Projekte MINT
ethz.date.deposited
2022-07-10T03:31:09Z
ethz.source
SCOPUS
ethz.eth
yes
en_US
ethz.availability
Metadata only
en_US
ethz.rosetta.installDate
2022-08-01T10:43:54Z
ethz.rosetta.lastUpdated
2023-02-07T04:59:31Z
ethz.rosetta.exportRequired
true
ethz.rosetta.versionExported
true
ethz.COinS
ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.atitle=Deep%20neural%20network%20approximations%20for%20solutions%20of%20PDEs%20based%20on%20Monte%20Carlo%20algorithms&rft.jtitle=Partial%20Differential%20Equations%20and%20Applications&rft.date=2022-08&rft.volume=3&rft.issue=4&rft.spage=45&rft.issn=2662-2971&2662-2963&rft.au=Grohs,%20Philipp&Jentzen,%20Arnulf&Salimova,%20Diyora&rft.genre=article&rft_id=info:doi/10.1007/s42985-021-00100-z&