A Primer on Alpha-Information Theory with Application to Leakage in Secrecy Systems
Abstract
We give an informative review of the notions of Rényi's α-entropy and α-divergence, Arimoto's conditional α-entropy, and Sibson's α-information, with emphasis on the various relations between them. All of these generalize Shannon's classical information measures, which correspond to α = 1. We present results on data processing inequalities and provide some new generalizations of the classical Fano inequality for any α > 0. This enables one to use α-information as an information-theoretic metric of leakage in secrecy systems. Such a metric can bound the gain of an adversary in guessing some secret (any potentially random function of some sensitive dataset) from disclosed measurements, compared with the adversary's prior belief (without access to measurements).
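For the reader's convenience, we recall the standard definitions of the quantities named above for discrete distributions; these are the usual formulas from the literature, not quoted from the body of the paper:
\[
H_\alpha(X) = \frac{1}{1-\alpha}\log \sum_x P_X(x)^\alpha,
\qquad
D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log \sum_x P(x)^\alpha Q(x)^{1-\alpha},
\]
\[
H_\alpha(X\mid Y) = \frac{\alpha}{1-\alpha}\log \sum_y \Bigl(\sum_x P_{XY}(x,y)^\alpha\Bigr)^{1/\alpha}
\quad\text{(Arimoto)},
\qquad
I_\alpha(X;Y) = \frac{\alpha}{\alpha-1}\log \sum_y \Bigl(\sum_x P_X(x)\,P_{Y|X}(y\mid x)^\alpha\Bigr)^{1/\alpha}
\quad\text{(Sibson)}.
\]
All four reduce to the corresponding Shannon measures in the limit α → 1.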