

About the Mutual (Conditional) Information

Renato Renner and Ueli Maurer

In general, the mutual information between two random variables $X$ and $Y$, $I(X;Y)$, can be larger or smaller than their mutual information conditioned on some additional information $Z$, $I(X;Y|Z)$. Such additional information $Z$ can be seen as the output of a channel $C$ taking $X$ and $Y$ as input. It is thus a natural question, with applications in fields such as information-theoretic cryptography, whether conditioning on the output $Z$ of a fixed channel $C$ can potentially increase the mutual information between the inputs $X$ and $Y$.
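That conditioning can indeed strictly increase the mutual information is shown by a standard example (not spelled out in the abstract itself): the XOR channel. If $X$ and $Y$ are independent uniform bits and $Z = X \oplus Y$, then

$$I(X;Y) = 0 \quad \text{whereas} \quad I(X;Y|Z) = 1,$$

since, given $Z$, either input determines the other.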

In this paper, we give a necessary, sufficient, and easily verifiable criterion on the channel $C$, i.e., on the conditional probability distribution $P_{Z|XY}$, such that $I(X;Y) \geq I(X;Y|Z)$ holds for every distribution of the random variables $X$ and $Y$. Furthermore, the result is generalized to channels with $n$ inputs (for $n \in \mathbb{N}$), that is, to conditional probability distributions of the form $P_{Z|X_1 \cdots X_n}$.
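Since the criterion itself is not reproduced in this abstract, the comparison can at least be checked directly for a given channel. The following is a minimal numeric sketch (not from the paper; the function names and setup are illustrative, assuming only numpy) that computes $I(X;Y)$ and $I(X;Y|Z)$ for a fixed channel $P_{Z|XY}$ and a chosen input distribution, here reproducing the XOR example above.

```python
# A minimal numeric sketch (not from the paper; names and setup are
# illustrative, assuming only numpy). For a fixed channel P_{Z|XY} it
# compares I(X;Y) with I(X;Y|Z) for a given input distribution.
import numpy as np

def mutual_information(p_xy):
    """I(X;Y) in bits for a joint distribution given as a 2-D array."""
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    mask = p_xy > 0
    return float((p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])).sum())

def conditional_mutual_information(p_xyz):
    """I(X;Y|Z) in bits; p_xyz is a 3-D array indexed as [x, y, z]."""
    total = 0.0
    for z in range(p_xyz.shape[2]):
        p_z = p_xyz[:, :, z].sum()
        if p_z > 0:
            total += p_z * mutual_information(p_xyz[:, :, z] / p_z)
    return total

# Channel from the example above: Z = X xor Y, i.e. P(z|x,y) = 1 iff z = x ^ y.
channel = np.zeros((2, 2, 2))                    # indexed as [x, y, z]
for x in range(2):
    for y in range(2):
        channel[x, y, x ^ y] = 1.0

# Independent uniform inputs: I(X;Y) = 0 but I(X;Y|Z) = 1, so the
# inequality I(X;Y) >= I(X;Y|Z) fails for this channel.
p_xy = np.full((2, 2), 0.25)
p_xyz = p_xy[:, :, None] * channel               # P(x,y,z) = P(x,y) P(z|x,y)
print(mutual_information(p_xy))                  # 0.0
print(conditional_mutual_information(p_xyz))     # 1.0
```

Swapping in other channels and input distributions gives a brute-force way to probe, for small alphabets, whether a given $P_{Z|XY}$ admits distributions violating $I(X;Y) \geq I(X;Y|Z)$; the paper's criterion answers this question exactly.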