
Open access
Date
2019-08
Type
- Journal Article
ETH Bibliography
yes
Abstract
Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon's mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
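The claim that the Rényi-divergence-based measure reduces to Shannon's mutual information at order one can be illustrated with the standard definitions; the sketch below uses the textbook Rényi divergence and does not reproduce the paper's exact dependence measures.

```latex
% Rényi divergence of order \alpha between PMFs P and Q:
% D_\alpha(P \| Q) = \frac{1}{\alpha - 1}
%   \log \sum_x P(x)^{\alpha} Q(x)^{1-\alpha}, \qquad \alpha \in (0,1) \cup (1,\infty).
%
% As \alpha \to 1 it recovers the Kullback--Leibler divergence:
% \lim_{\alpha \to 1} D_\alpha(P \| Q) = D(P \| Q) = \sum_x P(x) \log \frac{P(x)}{Q(x)}.
%
% Since Shannon's mutual information is the KL divergence between the joint
% distribution and the product of the marginals,
% I(X;Y) = D(P_{XY} \| P_X P_Y),
% a dependence measure built from D_\alpha applied to P_{XY} and P_X P_Y
% reduces to I(X;Y) in the limit \alpha \to 1.
```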
Permanent link
https://doi.org/10.3929/ethz-b-000364984
Publication status
published
External links
Journal / series
Entropy
Volume
Pages / Article No.
Publisher
MDPI
Subject
data processing; dependence measure; relative α-entropy; Rényi divergence; Rényi entropy
Organisational unit
03529 - Lapidoth, Amos / Lapidoth, Amos