On the approximation of functions by tanh neural networks
Date
2021-04
Publication Type
Report
ETH Bibliography
yes
Abstract
We derive bounds on the error, in high-order Sobolev norms, incurred in the approximation of Sobolev-regular as well as analytic functions by neural networks with the hyperbolic tangent activation function. These bounds provide explicit estimates on the approximation error with respect to the size of the neural networks. We show that tanh neural networks with only two hidden layers suffice to approximate functions at comparable or better rates than much deeper ReLU neural networks.
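The architecture discussed in the abstract — a tanh network with only two hidden layers — can be sketched numerically. The snippet below is an illustrative NumPy example, not the paper's method: the widths, the target function sin(πx), and the plain gradient-descent training are all assumptions made for demonstration (the report derives explicit approximation-error bounds rather than training networks). It fits a two-hidden-layer tanh network to sin(πx) on [-1, 1] and measures the sup-norm (L∞) error before and after training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target function sampled on a grid over [-1, 1] (illustrative choice).
x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
t = np.sin(np.pi * x)

# Two-hidden-layer tanh network; width H is an arbitrary illustration choice.
H = 20
W1 = rng.normal(0, 1.0, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 1.0 / np.sqrt(H), (H, H)); b2 = np.zeros(H)
W3 = rng.normal(0, 1.0 / np.sqrt(H), (H, 1)); b3 = np.zeros(1)

def forward(x):
    h1 = np.tanh(x @ W1 + b1)       # first hidden layer
    h2 = np.tanh(h1 @ W2 + b2)      # second hidden layer
    return h1, h2, h2 @ W3 + b3     # linear output layer

_, _, y0 = forward(x)
err_init = np.max(np.abs(y0 - t))   # sup-norm error at initialization

# Full-batch gradient descent on the mean-squared error.
lr = 0.05
for _ in range(20000):
    h1, h2, y = forward(x)
    dy = 2.0 * (y - t) / len(x)             # d(MSE)/dy
    dW3 = h2.T @ dy; db3 = dy.sum(0)
    dz2 = (dy @ W3.T) * (1.0 - h2 ** 2)     # tanh'(z) = 1 - tanh(z)^2
    dW2 = h1.T @ dz2; db2 = dz2.sum(0)
    dz1 = (dz2 @ W2.T) * (1.0 - h1 ** 2)
    dW1 = x.T @ dz1; db1 = dz1.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
    W3 -= lr * dW3; b3 -= lr * db3

_, _, y = forward(x)
err_final = np.max(np.abs(y - t))   # sup-norm error after training
print(err_init, err_final)
```

Note that the abstract's error bounds concern the L∞ and higher-order Sobolev norms of the approximation error as a function of network size; the sketch above only illustrates the two-hidden-layer tanh architecture and an empirical sup-norm error, not the constructive estimates of the report.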
Publication status
published
Volume
2021-14
Publisher
Seminar for Applied Mathematics, ETH Zurich
Subject
Deep learning; Neural networks; Tanh; Function approximation
Organisational unit
03851 - Mishra, Siddhartha