dc.contributor.author: Rusch, T. Konstantin
dc.contributor.author: Mishra, Siddhartha
dc.date.accessioned: 2021-06-24T11:15:09Z
dc.date.available: 2021-06-24T10:37:44Z
dc.date.available: 2021-06-24T11:15:09Z
dc.date.issued: 2021-03
dc.identifier.uri: http://hdl.handle.net/20.500.11850/491165
dc.description.abstract: The design of recurrent neural networks (RNNs) to accurately process sequential inputs with long-time dependencies is very challenging on account of the exploding and vanishing gradient problem. To overcome this, we propose a novel RNN architecture based on a structure-preserving discretization of a Hamiltonian system of second-order ordinary differential equations that models networks of oscillators. The resulting RNN is fast, invertible (in time), and memory-efficient, and we derive rigorous bounds on the hidden state gradients to prove the mitigation of the exploding and vanishing gradient problem. A suite of experiments is presented to demonstrate that the proposed RNN provides state-of-the-art performance on a variety of learning tasks with (very) long-time dependencies. (en_US)
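
As a companion to the abstract above, here is a minimal sketch (in NumPy, not the authors' reference implementation) of the mechanism it describes: an RNN cell obtained from a symplectic, i.e. structure-preserving, Euler discretization of a driven second-order oscillator network y'' = -sigma(w*y + V*u + b) - alpha*y. The names w, V, b, alpha, dt, the diagonal hidden coupling, and the tanh nonlinearity are illustrative assumptions; the published UnICORNN cell includes further components (such as a trained, gated time step and layer stacking) that are omitted here.

    # Sketch of an oscillator-network RNN cell (assumed form, not the paper's reference code).
    # Rewrite y'' = -sigma(w*y + V u + b) - alpha*y as the first-order system
    #   y' = z,   z' = -sigma(w*y + V u + b) - alpha*y,
    # and apply a symplectic Euler step: update z with the old y, then y with the new z.
    import numpy as np

    def init_params(input_dim, hidden_dim, rng):
        return {
            "w": rng.uniform(-1.0, 1.0, hidden_dim),                                   # diagonal hidden coupling
            "V": rng.normal(0.0, 1.0 / np.sqrt(input_dim), (hidden_dim, input_dim)),   # input weights
            "b": np.zeros(hidden_dim),                                                 # bias
        }

    def cell_step(params, y, z, u, dt=0.1, alpha=1.0):
        """One symplectic Euler step: velocity z first (using old y), then position y (using new z)."""
        force = np.tanh(params["w"] * y + params["V"] @ u + params["b"]) + alpha * y
        z_new = z - dt * force
        y_new = y + dt * z_new
        return y_new, z_new

    def run_sequence(params, inputs, hidden_dim, dt=0.1, alpha=1.0):
        """Unroll the cell over an input sequence; returns the hidden states y_1, ..., y_N."""
        y = np.zeros(hidden_dim)
        z = np.zeros(hidden_dim)
        states = []
        for u in inputs:
            y, z = cell_step(params, y, z, u, dt=dt, alpha=alpha)
            states.append(y)
        return np.stack(states)

    # Example: 200 steps of a 2-feature input sequence, 16 hidden oscillators.
    rng = np.random.default_rng(0)
    params = init_params(input_dim=2, hidden_dim=16, rng=rng)
    states = run_sequence(params, rng.normal(size=(200, 2)), hidden_dim=16)
    print(states.shape)  # (200, 16)

Because each step in this sketch is explicit in the old position and the new velocity, it can be inverted exactly: given (y_n, z_n) and the input u_n, one recovers y_{n-1} = y_n - dt * z_n and then z_{n-1} = z_n + dt * (tanh(w * y_{n-1} + V u_n + b) + alpha * y_{n-1}). This is the sense in which such a recurrence is invertible in time, which is what enables memory-efficient training: hidden states can be recomputed during backpropagation rather than stored.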
dc.language.iso: en (en_US)
dc.publisher: Seminar for Applied Mathematics, ETH Zurich (en_US)
dc.subject: Recurrent neural network (en_US)
dc.subject: Hamiltonian systems (en_US)
dc.subject: Gradient stability (en_US)
dc.subject: Long-term dependencies (en_US)
dc.title: UnICORNN: A recurrent model for learning very long time dependencies (en_US)
dc.type: Report
ethz.journal.title: SAM Research Report
ethz.journal.volume: 2021-10 (en_US)
ethz.size: 30 p. (en_US)
ethz.grant: Computation and analysis of statistical solutions of fluid flow (en_US)
ethz.publication.place: Zurich (en_US)
ethz.publication.status: published (en_US)
ethz.leitzahl: ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02000 - Dep. Mathematik / Dep. of Mathematics::02501 - Seminar für Angewandte Mathematik / Seminar for Applied Mathematics::03851 - Mishra, Siddhartha / Mishra, Siddhartha (en_US)
ethz.leitzahl.certified: ETH Zürich::00002 - ETH Zürich::00012 - Lehre und Forschung::00007 - Departemente::02000 - Dep. Mathematik / Dep. of Mathematics::02501 - Seminar für Angewandte Mathematik / Seminar for Applied Mathematics::03851 - Mishra, Siddhartha / Mishra, Siddhartha (en_US)
ethz.identifier.url: https://math.ethz.ch/sam/research/reports.html?id=952
ethz.grant.agreementno: 770880
ethz.grant.fundername: EC
ethz.grant.funderDoi: 10.13039/501100000780
ethz.grant.program: H2020
ethz.date.deposited: 2021-06-24T10:42:02Z
ethz.source: FORM
ethz.eth: yes (en_US)
ethz.identifier.internal: https://math.ethz.ch/sam/research/reports.html?id=952 (en_US)
ethz.availability: Metadata only (en_US)
ethz.rosetta.installDate: 2021-06-24T11:15:17Z
ethz.rosetta.lastUpdated: 2021-06-24T11:15:17Z
ethz.rosetta.versionExported: true
ethz.COinS: ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.atitle=UnICORNN:%20A%20recurrent%20model%20for%20learning%20very%20long%20time%20dependencies&rft.jtitle=SAM%20Research%20Report&rft.date=2021-03&rft.volume=2021-10&rft.au=Rusch,%20T.%20Konstantin&Mishra,%20Siddhartha&rft.genre=report&
Files in this item

There are no files associated with this item.
