Open access
Date: 2024-03-06
Type: Conference Paper
ETH Bibliography: no
Abstract
Consider a binary statistical hypothesis testing problem, where n independent and identically distributed random variables Z^n are distributed according to either the null hypothesis P or the alternate hypothesis Q, and only P is known. A well-known test suitable for this case is the so-called Hoeffding test, which accepts P if the Kullback-Leibler (KL) divergence between the empirical distribution of Z^n and P is below some threshold. In this work, we characterize the first- and second-order terms of the type-II error probability for a fixed type-I error probability for the Hoeffding test as well as for divergence tests, in which the KL divergence is replaced by a general divergence. We demonstrate that, irrespective of the divergence, divergence tests achieve the first-order term of the Neyman-Pearson test, which is the optimal test when both P and Q are known. In contrast, the second-order term of divergence tests is strictly worse than that of the Neyman-Pearson test. We further demonstrate that divergence tests with an invariant divergence achieve the same second-order term as the Hoeffding test, but divergence tests with a non-invariant divergence may outperform the Hoeffding test for some alternate hypotheses Q.
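The Hoeffding test described in the abstract can be sketched as follows. This is an illustrative implementation only: the finite alphabet, the NumPy-based empirical-distribution computation, and the choice of threshold (which governs the type-I error) are assumptions for the sketch, not details from the paper.

```python
import numpy as np

def hoeffding_test(samples, P, threshold):
    """Accept the null hypothesis P (return True) if the KL divergence
    between the empirical distribution of the samples and P is below
    the threshold; otherwise reject (return False).

    samples: 1-D array of integer symbols in {0, ..., len(P)-1}
    P: null-hypothesis probability vector over the finite alphabet
    threshold: acceptance threshold (assumed given; tuning it to meet
               a fixed type-I error probability is outside this sketch)
    """
    counts = np.bincount(samples, minlength=len(P))
    emp = counts / counts.sum()          # empirical distribution of Z^n
    mask = emp > 0                       # 0 * log(0/p) contributes 0
    kl = np.sum(emp[mask] * np.log(emp[mask] / P[mask]))
    return bool(kl < threshold)
```

For samples that closely match P the empirical KL divergence is near zero and the test accepts; for samples concentrated away from P it grows and the test rejects.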
Permanent link: https://doi.org/10.3929/ethz-b-000664581
Publication status: published
Book title: International Zurich Seminar on Information and Communication (IZS 2024). Proceedings
Publisher: ETH Zurich
Organisational unit: 02140 - Dep. Inf.technologie und Elektrotechnik / Dep. of Inform. Technol. Electrical Eng.
Related publications and datasets: Is part of https://doi.org/10.3929/ethz-b-000664209