Metadata only
Date
2023-05
Type
- Conference Paper
ETH Bibliography
yes
Abstract
Words are not created equal. In fact, they form an aristocratic graph with a latent hierarchical structure that the next generation of unsupervised learned word embeddings should reveal. In this paper, justified by the notion of delta-hyperbolicity, or tree-likeness, of a space, we propose to embed words in a Cartesian product of hyperbolic spaces, which we theoretically connect to Gaussian word embeddings and their Fisher geometry. This connection allows us to introduce a novel, principled hypernymy score for word embeddings. Moreover, we adapt the well-known GloVe algorithm to learn unsupervised word embeddings on this type of Riemannian manifold. We further explain how to solve the analogy task using Riemannian parallel transport, which generalizes vector arithmetic to this new type of geometry. Empirically, based on extensive experiments, we show that our embeddings, trained without supervision, are the first to simultaneously outperform strong and popular baselines on the tasks of similarity, analogy and hypernymy detection. In particular, for word hypernymy, we obtain a new state of the art in fully unsupervised WBLESS classification accuracy.
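To make the geometry mentioned in the abstract concrete, the following is a minimal sketch, not the authors' implementation, of how distances are typically computed in a Cartesian product of Poincaré balls: the standard geodesic distance of the Poincaré ball per factor, combined with the usual product-manifold rule (square root of the sum of squared per-factor distances). The function names, the epsilon stabilizer, and the toy points are illustrative assumptions.

import numpy as np

def poincare_distance(x, y, eps=1e-9):
    # Geodesic distance between two points strictly inside the unit Poincare ball:
    # d(x, y) = arccosh(1 + 2 * ||x - y||^2 / ((1 - ||x||^2) * (1 - ||y||^2)))
    sq_diff = np.sum((x - y) ** 2)
    denom = (1.0 - np.sum(x ** 2)) * (1.0 - np.sum(y ** 2))
    return np.arccosh(1.0 + 2.0 * sq_diff / (denom + eps))

def product_distance(x_factors, y_factors):
    # Distance in a Cartesian product of Poincare balls: the square root of the
    # sum of squared per-factor geodesic distances (standard product metric).
    return np.sqrt(sum(poincare_distance(xf, yf) ** 2
                       for xf, yf in zip(x_factors, y_factors)))

# Illustrative example: two words, each embedded in a product of two 2D Poincare balls.
w1 = [np.array([0.1, 0.2]), np.array([-0.3, 0.05])]
w2 = [np.array([0.0, 0.4]), np.array([0.25, -0.1])]
print(product_distance(w1, w2))

Splitting the embedding across several low-dimensional hyperbolic factors, rather than using one high-dimensional ball, is the design choice the abstract refers to as a Cartesian product of hyperbolic spaces.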
Publication status
published
Book title
International Conference on Learning Representations (ICLR 2019)
Volume
Pages / Article No.
Publisher
Curran
Event
Subject
word embeddings; hyperbolic spaces; poincare ball; hypernymy; analogy; similarity; gaussian embeddings
Organisational unit
09462 - Hofmann, Thomas / Hofmann, Thomas
Related publications and datasets
Is new version of: https://openreview.net/forum?id=Ske5r3AqK7
Is new version of: https://doi.org/10.48550/arXiv.1810.06546