STCN: Stochastic Temporal Convolutional Networks
OPEN ACCESS
Date
2023-05
Publication Type
Conference Paper
ETH Bibliography
yes
Abstract
Convolutional architectures have recently been shown to be competitive on many sequence modelling tasks when compared to the de facto standard of recurrent neural networks (RNNs), while providing computational and modelling advantages due to their inherent parallelism. However, a performance gap remains to more expressive stochastic RNN variants, especially those with several layers of dependent random variables. In this work, we propose stochastic temporal convolutional networks (STCNs), a novel architecture that combines the computational advantages of temporal convolutional networks (TCNs) with the representational power and robustness of stochastic latent spaces. In particular, we propose a hierarchy of stochastic latent variables that captures temporal dependencies at different time-scales. The architecture is modular and flexible due to the decoupling of the deterministic and stochastic layers. We show that the proposed architecture achieves state-of-the-art log-likelihoods across several tasks. Finally, the model is capable of predicting high-quality synthetic samples over a long-range temporal horizon in modelling of handwritten text.
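The abstract describes stacking dilated causal convolutions (the TCN part) and attaching per-timestep Gaussian latent variables at different levels of the hierarchy, so that deeper layers cover longer time-scales. The following is a minimal NumPy sketch of that idea, not the authors' implementation: the layer widths, dilations, and fixed variances are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def causal_dilated_conv(x, w, dilation):
    """1-D causal convolution with dilation: the output at time t depends
    only on x[t], x[t - dilation], ... (the sequence is left-padded with
    zeros, so no future information leaks in)."""
    k = len(w)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])
    return np.array([sum(w[j] * xp[t + pad - j * dilation] for j in range(k))
                     for t in range(len(x))])

def sample_latent(mu, log_var):
    """Reparameterized Gaussian sample z = mu + sigma * eps."""
    return mu + np.exp(0.5 * log_var) * rng.standard_normal(mu.shape)

# Toy two-level hierarchy: deterministic dilated conv layers (dilations
# 1 and 2) each feed a per-timestep Gaussian latent; the deeper latent
# sees a wider receptive field, i.e. a slower time-scale.
x = rng.standard_normal(16)
h1 = np.tanh(causal_dilated_conv(x, w=[0.5, -0.3], dilation=1))
h2 = np.tanh(causal_dilated_conv(h1, w=[0.4, 0.2], dilation=2))
z1 = sample_latent(mu=h1, log_var=np.full_like(h1, -2.0))  # fast scale
z2 = sample_latent(mu=h2, log_var=np.full_like(h2, -2.0))  # slow scale
```

In the paper the latents are dependent across layers and trained with a variational objective; this sketch only illustrates the decoupling of deterministic (convolutional) and stochastic (latent-sampling) computation that the abstract highlights.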
Publication status
published
Book title
International Conference on Learning Representations (ICLR 2019)
Volume
3
Pages / Article No.
1636 - 1648
Publisher
Curran
Event
7th International Conference on Learning Representations (ICLR 2019)
Organisational unit
03979 - Hilliges, Otmar (former)
Notes
Conference lecture held on May 7, 2019.