Diffusion models for probabilistic programming
METADATA ONLY
Date
2023-12-15
Publication Type
Conference Poster
ETH Bibliography
yes
Abstract
We propose Diffusion Model Variational Inference (DMVI), a novel method for automated approximate inference in probabilistic programming languages (PPLs). DMVI uses diffusion models as variational approximations to the true posterior distribution by deriving a novel bound on the marginal likelihood objective used in Bayesian modelling. DMVI is easy to implement, allows hassle-free inference in PPLs without the drawbacks of, e.g., variational inference using normalizing flows, and does not impose any constraints on the underlying neural network model. We evaluate DMVI on a set of common Bayesian models and show that its posterior inferences are in general more accurate than those of contemporary methods used in PPLs, while having a similar computational cost and requiring less manual tuning.
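For context, the marginal-likelihood bound mentioned in the abstract is the standard evidence lower bound (ELBO) of variational inference, which DMVI tightens using a diffusion-model variational family. The details of DMVI's bound are in the paper itself; the sketch below only illustrates the generic ELBO with a plain Gaussian variational family on a conjugate toy model (prior z ~ N(0, 1), likelihood x | z ~ N(z, 1)), where the exact posterior and log evidence are known in closed form. All names here are illustrative, not from the paper.

```python
import numpy as np

def log_normal(x, mean, var):
    """Log density of a univariate Gaussian N(mean, var)."""
    return -0.5 * np.log(2 * np.pi * var) - 0.5 * (x - mean) ** 2 / var

x_obs = 1.0  # single observation

# Variational family q(z) = N(m, v); here set to the exact posterior
# N(x/2, 1/2) of this conjugate model, so the ELBO is tight.
m, v = x_obs / 2.0, 0.5

rng = np.random.default_rng(0)
z = rng.normal(m, np.sqrt(v), size=10_000)

# Monte Carlo ELBO: E_q[log p(z) + log p(x | z) - log q(z)]
elbo = np.mean(
    log_normal(z, 0.0, 1.0)        # prior term
    + log_normal(x_obs, z, 1.0)    # likelihood term
    - log_normal(z, m, v)          # entropy term
)

# Exact log evidence: marginally x ~ N(0, 2) under this model.
log_evidence = log_normal(x_obs, 0.0, 2.0)
```

Because q here equals the exact posterior, every Monte Carlo sample contributes the same value and the ELBO matches the log evidence exactly; with a mismatched q the gap is the KL divergence from q to the posterior, which is the quantity a more flexible variational family (such as a diffusion model) aims to shrink.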
Publication status
unpublished
Event
Workshop on Diffusion Models @ NeurIPS 2023
Subject
Machine Learning (cs.LG); Machine Learning (stat.ML); FOS: Computer and information sciences
Organisational unit
02286 - Swiss Data Science Center (SDSC) / Swiss Data Science Center (SDSC)
Related publications and datasets
Is new version of: doi:10.48550/arXiv.2311.00474