Conditional Poisson Stochastic Beams


Date

2021-11

Publication Type

Conference Paper

ETH Bibliography

yes

Abstract

Beam search is the default decoding strategy for many sequence generation tasks in NLP. The set of approximate K-best items returned by the algorithm is a useful summary of the distribution for many applications; however, the candidates typically exhibit high overlap and may give a highly biased estimate for expectations under our model. These problems can be addressed by instead using stochastic decoding strategies. In this work, we propose a new method for turning beam search into a stochastic process: Conditional Poisson stochastic beam search (CPSBS). Rather than taking the maximizing set at each iteration, we sample K candidates without replacement according to the conditional Poisson sampling design. We view this as a more natural alternative to Kool et al. (2019)'s stochastic beam search (SBS). Furthermore, we show how samples generated under the CPSBS design can be used to build consistent estimators and sample diverse sets from sequence models. In our experiments, we observe CPSBS produces lower variance and more efficient estimators than SBS, even showing improvements in high entropy settings.
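The conditional Poisson sampling design mentioned in the abstract draws each item independently with its own inclusion probability, conditioned on exactly K items being selected. A minimal sketch of this idea, using rejection sampling (the simplest, not necessarily the paper's, implementation; the probabilities below are hypothetical, not from the paper):

```python
import random

def cp_sample(probs, k, rng=None):
    # Conditional Poisson sampling by rejection: draw each item
    # independently with its probability, and keep the draw only
    # when exactly k items end up selected.
    rng = rng or random.Random()
    n = len(probs)
    while True:
        chosen = [i for i in range(n) if rng.random() < probs[i]]
        if len(chosen) == k:
            return chosen

# Hypothetical example: 6 candidates, sample a beam of size 3.
subset = cp_sample([0.9, 0.7, 0.5, 0.5, 0.3, 0.1], 3, random.Random(0))
```

The returned subset is a size-3 set of distinct candidate indices; higher-probability candidates appear more often across repeated draws. Rejection sampling is exact but can be slow when K successes are unlikely; exact dynamic-programming samplers avoid this, but this sketch keeps the design itself transparent.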

Publication status

published

Book title

Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing

Pages / Article No.

664 - 681

Publisher

Association for Computational Linguistics

Event

Conference on Empirical Methods in Natural Language Processing (EMNLP 2021)

Organisational unit

09682 - Cotterell, Ryan
