STCN: Stochastic Temporal Convolutional Networks


Date

2023-05

Publication Type

Conference Paper

ETH Bibliography

yes

Abstract

Convolutional architectures have recently been shown to be competitive on many sequence modelling tasks compared to the de facto standard of recurrent neural networks (RNNs), while providing computational and modelling advantages due to their inherent parallelism. However, a performance gap remains to more expressive stochastic RNN variants, especially those with several layers of dependent random variables. In this work, we propose stochastic temporal convolutional networks (STCNs), a novel architecture that combines the computational advantages of temporal convolutional networks (TCNs) with the representational power and robustness of stochastic latent spaces. In particular, we propose a hierarchy of stochastic latent variables that captures temporal dependencies at different time-scales. The architecture is modular and flexible due to the decoupling of the deterministic and stochastic layers. We show that the proposed architecture achieves state-of-the-art log-likelihoods across several tasks. Finally, the model is capable of predicting high-quality synthetic samples over a long-range temporal horizon when modelling handwritten text.
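
As a rough illustration of the construction the abstract describes (a deterministic stack of dilated causal convolutions, with one stochastic latent variable attached to each layer), here is a minimal sketch in PyTorch. All class names, dimensions, and the simple Gaussian inference network below are assumptions made for illustration only; this is not the authors' reference implementation.

```python
# Minimal sketch of the STCN idea: a deterministic TCN stack (dilated causal
# convolutions) with a Gaussian latent variable per layer. All names and
# sizes are illustrative assumptions, not the paper's actual code.
import torch
import torch.nn as nn


class CausalConvBlock(nn.Module):
    """One deterministic TCN block: a dilated causal 1-D convolution."""
    def __init__(self, channels: int, dilation: int, kernel_size: int = 2):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation  # left-pad to stay causal
        self.conv = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time); pad only on the left so the
        # convolution never looks into the future.
        x = nn.functional.pad(x, (self.pad, 0))
        return torch.relu(self.conv(x))


class GaussianLatent(nn.Module):
    """Per-layer stochastic latent: sample z_t given the deterministic
    features d_t, using the reparameterization trick."""
    def __init__(self, channels: int, latent_dim: int):
        super().__init__()
        self.to_mu = nn.Conv1d(channels, latent_dim, 1)
        self.to_logvar = nn.Conv1d(channels, latent_dim, 1)

    def forward(self, d: torch.Tensor) -> torch.Tensor:
        mu, logvar = self.to_mu(d), self.to_logvar(d)
        eps = torch.randn_like(mu)
        return mu + torch.exp(0.5 * logvar) * eps  # reparameterized sample


class STCNSketch(nn.Module):
    """Deterministic TCN stack plus one latent per layer, kept decoupled
    as the abstract describes."""
    def __init__(self, channels: int = 32, latent_dim: int = 8, n_layers: int = 4):
        super().__init__()
        self.blocks = nn.ModuleList(
            CausalConvBlock(channels, dilation=2 ** i) for i in range(n_layers)
        )
        self.latents = nn.ModuleList(
            GaussianLatent(channels, latent_dim) for _ in range(n_layers)
        )

    def forward(self, x: torch.Tensor):
        zs = []
        for block, latent in zip(self.blocks, self.latents):
            x = block(x)          # deterministic features at this layer
            zs.append(latent(x))  # stochastic latent conditioned on them
        return zs  # hierarchy of latents, short- to long-range context
```

Because the dilation doubles at each layer, higher layers see exponentially longer histories, so the latent attached to each layer naturally covers a different time-scale, matching the hierarchy of temporal dependencies the abstract highlights.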

Publication status

published

Book title

International Conference on Learning Representations (ICLR 2019)

Volume

3

Pages / Article No.

1636–1648

Publisher

Curran

Event

7th International Conference on Learning Representations (ICLR 2019)

Organisational unit

03979 - Hilliges, Otmar (former)

Notes

Conference lecture held on May 7, 2019.

Related publications and datasets

Is new version of: https://openreview.net/forum?id=HkzSQhCcK7
Is new version of: 10.48550/arXiv.1902.06568