Metadata only
Date
2021-04
Type
Report
ETH Bibliography
yes
Abstract
We derive bounds on the error, in high-order Sobolev norms, incurred in the approximation of Sobolev-regular as well as analytic functions by neural networks with the hyperbolic tangent activation function. These bounds provide explicit estimates on the approximation error with respect to the size of the neural networks. We show that tanh neural networks with only two hidden layers suffice to approximate functions at comparable or better rates than much deeper ReLU neural networks.
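The abstract refers to networks with two hidden layers and tanh activations. As a purely illustrative sketch of that architecture class (assuming PyTorch; the width, target function, and training loop below are not taken from the report), such a network can be written as:

import torch
import torch.nn as nn

class TwoHiddenLayerTanh(nn.Module):
    # Fully connected network with two hidden layers and tanh activations.
    def __init__(self, in_dim: int, width: int, out_dim: int = 1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, width),   # first hidden layer
            nn.Tanh(),
            nn.Linear(width, width),    # second hidden layer
            nn.Tanh(),
            nn.Linear(width, out_dim),  # linear output layer
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Example use: fit a smooth (Sobolev-regular) target on [0, 1].
if __name__ == "__main__":
    model = TwoHiddenLayerTanh(in_dim=1, width=32)
    x = torch.rand(256, 1)
    y = torch.sin(2 * torch.pi * x)  # illustrative target, not from the report
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for _ in range(1000):
        opt.zero_grad()
        loss = torch.mean((model(x) - y) ** 2)
        loss.backward()
        opt.step()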
Publication status
published
Journal / series
SAM Research Report
Publisher
Seminar for Applied Mathematics, ETH Zurich
Subject
Deep learning; Neural networks; Tanh; Function approximation
Organisational unit
03851 - Mishra, Siddhartha / Mishra, Siddhartha