High-dimensional distribution generation through deep neural networks


Date

2021

Publication Type

Journal Article

ETH Bibliography

yes

Abstract

We show that every d-dimensional probability distribution of bounded support can be generated through deep ReLU networks out of a 1-dimensional uniform input distribution. What is more, this is possible without incurring a cost—in terms of approximation error measured in Wasserstein distance—relative to generating the d-dimensional target distribution from d independent random variables. This is enabled by a vast generalization of the space-filling approach discovered in Bailey and Telgarsky (in: Bengio (eds) Advances in neural information processing systems vol 31, pp 6489–6499. Curran Associates, Inc., Red Hook, 2018). The construction we propose elicits the importance of network depth in driving the Wasserstein distance between the target distribution and its neural network approximation to zero. Finally, we find that, for histogram target distributions, the number of bits needed to encode the corresponding generative network equals the fundamental limit for encoding probability distributions as dictated by quantization theory.
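As a toy sketch of the idea behind space-filling generation—not the paper's actual construction—one can illustrate how a single uniform variable carries enough randomness to produce several (approximately) independent uniform coordinates. A number u uniform on [0,1) has i.i.d. uniform binary digits; de-interleaving the odd- and even-indexed bits yields two approximately independent uniforms. Each bit-extraction step u ↦ (leading digit, fractional remainder) is a sawtooth operation of the kind realizable by a small fixed-size ReLU block, which is why depth buys resolution: k output bits per coordinate correspond to on the order of k such layers. The function name `split_uniform` and the choice k = 16 below are illustrative assumptions.

```python
import numpy as np

def split_uniform(u, k=16):
    """Illustrative bit de-interleaving: map u ~ Uniform[0,1) to two
    approximately independent uniforms on [0,1), k bits per coordinate.
    (Toy sketch only; the paper's ReLU-network construction is far more
    general and comes with Wasserstein-distance guarantees.)"""
    u = np.asarray(u, dtype=float)
    y = np.zeros((2,) + u.shape)
    frac = u.copy()
    for i in range(2 * k):
        bit = np.floor(2.0 * frac)       # extract leading binary digit
        frac = 2.0 * frac - bit          # sawtooth map: drop that digit
        # route odd-indexed digits to y[0], even-indexed to y[1]
        y[i % 2] += bit * 2.0 ** (-(i // 2) - 1)
    return y[0], y[1]
```

Feeding a fine grid of inputs through `split_uniform` produces points that spread over the unit square, mimicking how a 1-D input distribution can be "unfolded" into a higher-dimensional one; in the paper this is done exactly, with the Wasserstein error controlled by network depth.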

Publication status

published

Volume

2 (5)

Pages / Article No.

64

Publisher

Springer

Subject

Deep learning; Neural networks; Generative networks; Space-filling curves; Quantization; Approximation theory

Organisational unit

03610 - Boelcskei, Helmut
