An Operational Characterization of Mutual Information in Algorithmic Information Theory

Abstract: We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties can establish via a probabilistic protocol with interaction on a public channel, where one party has x, the other has y, and both know the complexity profile of the pair. For ℓ > 2, the longest shared secret that ℓ parties can establish from a tuple of strings (x1, …, xℓ), where each party has one component of the tuple and the complexity profile of the tuple, is equal, up to logarithmic precision, to the complexity of the tuple minus the minimum communication necessary for distributing the tuple to all parties. For protocols with public randomness, we establish the communication complexity of secret key agreement protocols that produce a secret key of maximal length. We also show that if the communication complexity drops below this threshold, then only very short secret keys can be obtained.
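For orientation, the two identities stated in the abstract can be sketched in standard algorithmic-information notation, where C(·) denotes Kolmogorov complexity, I(x:y) = C(x) + C(y) − C(x,y) is the algorithmic mutual information, n bounds the length of the strings, and S(·) and CO(·) are informal symbols (not notation taken from the record) for the length of the longest achievable shared secret key and the minimum communication needed to distribute the tuple to all parties:

\[
  S(x, y) \;=\; I(x:y) \pm O(\log n),
  \qquad \text{where } I(x:y) \;=\; C(x) + C(y) - C(x,y),
\]
\[
  S(x_1, \dots, x_\ell) \;=\; C(x_1, \dots, x_\ell) \;-\; \mathrm{CO}(x_1, \dots, x_\ell) \;\pm\; O(\log n)
  \qquad (\ell > 2).
\]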
https://hal-lirmm.ccsd.cnrs.fr/lirmm-02297056
Contributor: Andrei Romashchenko
Submitted on: Wednesday, September 25, 2019 - 4:53:22 PM
Last modification on: Wednesday, January 29, 2020 - 1:15:17 AM

Citation

Andrei Romashchenko, Marius Zimand. An Operational Characterization of Mutual Information in Algorithmic Information Theory. Journal of the ACM, 66(5), pp. 1-42, 2019. DOI: ⟨10.1145/3356867⟩. HAL: ⟨lirmm-02297056⟩.
