An Operational Characterization of Mutual Information in Algorithmic Information Theory - LIRMM - Laboratoire d’Informatique, de Robotique et de Microélectronique de Montpellier
Journal article: Journal of the ACM (JACM), 2019

An Operational Characterization of Mutual Information in Algorithmic Information Theory

Abstract

We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings x and y is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties (one having x and the complexity profile of the pair, the other having y and the complexity profile of the pair) can establish via a probabilistic protocol with interaction on a public channel. For ℓ > 2, the longest shared secret key that ℓ parties (each having one component of a tuple of strings (x1, …, xℓ) together with the complexity profile of the tuple) can establish is equal, up to logarithmic precision, to the complexity of the tuple minus the minimum communication necessary for distributing the tuple to all parties. We also determine the communication complexity of secret key agreement protocols with public randomness that produce a secret key of maximal length, and we show that if the communication drops below this threshold, then only very short secret keys can be obtained.
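The quantitative statements above can be sketched in standard Kolmogorov-complexity notation (a sketch, not the paper's exact theorem statements; here C(·) denotes Kolmogorov complexity, |S| the length of the agreed key, and CO(·) is shorthand introduced here for the minimum communication needed to distribute the tuple to all parties):

```latex
% Algorithmic mutual information of a pair (x, y):
I(x:y) = C(x) + C(y) - C(x,y)

% Two-party secret key agreement, up to logarithmic precision
% (n = |x| + |y|):
|S| = I(x:y) \pm O(\log n)

% \ell-party case (\ell > 2): the longest key equals the complexity of
% the tuple minus the minimum communication CO needed to distribute the
% tuple to all parties, again up to logarithmic precision:
|S| = C(x_1,\dots,x_\ell) - \mathrm{CO}(x_1,\dots,x_\ell) \pm O(\log n)
```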

Dates and versions

lirmm-02297056 , version 1 (25-09-2019)

Identifiers

Cite

Andrei Romashchenko, Marius Zimand. An Operational Characterization of Mutual Information in Algorithmic Information Theory. Journal of the ACM (JACM), 2019, 66 (5), pp.1-42. ⟨10.1145/3356867⟩. ⟨lirmm-02297056⟩
