Two Measures of Dependence

Date

2019-08

Publication Type

Journal Article

ETH Bibliography

yes

Abstract

Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon’s mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
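The limiting behaviour described in the abstract can be illustrated numerically. The sketch below (an illustration only, not the paper's exact definitions — the first measure in the paper involves a minimization over product distributions, whereas here we simply evaluate the Rényi divergence between a toy joint distribution and the product of its marginals) shows that at α = 1 the Rényi divergence of the joint from the product of marginals recovers Shannon's mutual information:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P||Q) in nats; alpha = 1 gives the KL divergence."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    if np.isclose(alpha, 1.0):
        # Continuous extension at alpha = 1: relative entropy D(P||Q).
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    return float(np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0))

# Toy joint distribution P_XY (rows index X, columns index Y).
P = np.array([[0.3, 0.1],
              [0.1, 0.5]])
Px = P.sum(axis=1)          # marginal of X
Py = P.sum(axis=0)          # marginal of Y
PxPy = np.outer(Px, Py)     # product of the marginals

# D_alpha between the joint and the product of marginals; at alpha = 1
# this equals the Shannon mutual information I(X;Y).
for a in (0.5, 1.0, 2.0):
    print(f"alpha = {a}: {renyi_divergence(P.ravel(), PxPy.ravel(), a):.6f} nats")
```

Varying α trades off how heavily atypical joint outcomes are weighted, which is what gives the two families in the paper their operational roles in hypothesis testing and task encoding.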

Publication status

published

Volume

21 (8)

Pages / Article No.

778

Publisher

MDPI

Subject

data processing; dependence measure; relative α-entropy; Rényi divergence; Rényi entropy

Organisational unit

03529 - Lapidoth, Amos
