About the Mutual (Conditional) Information
Renato Renner and Ueli Maurer
In general, the mutual information between two random variables X and Y, I(X;Y), might be larger or smaller than their mutual information conditioned on some additional information Z, I(X;Y|Z). Such additional information Z can be seen as the output of a channel taking X and Y as inputs. It is thus a natural question, with applications in fields such as information-theoretic cryptography, whether conditioning on the output Z of a fixed channel can potentially increase the mutual information between the inputs X and Y.
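As a minimal illustration of how conditioning can increase mutual information (not part of the paper's criterion): take X and Y to be independent uniform bits and let the channel output be Z = X XOR Y. Then I(X;Y) = 0, but given Z each input determines the other, so I(X;Y|Z) = 1 bit. The sketch below computes both quantities directly from the joint distribution; all function and variable names are chosen here for illustration.

```python
import math
from itertools import product

def mutual_information(pxy):
    """I(X;Y) in bits from a joint distribution {(x, y): p}."""
    px, py = {}, {}
    for (x, y), p in pxy.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in pxy.items() if p > 0)

# X, Y: independent uniform bits; channel output Z = X XOR Y.
pxyz = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

# Marginal joint distribution of (X, Y): I(X;Y) = 0 by independence.
pxy = {}
for (x, y, z), p in pxyz.items():
    pxy[(x, y)] = pxy.get((x, y), 0.0) + p
print(mutual_information(pxy))  # -> 0.0

# I(X;Y|Z) = sum_z p(z) * I(X;Y | Z=z): given Z, X determines Y.
pz = {}
for (x, y, z), p in pxyz.items():
    pz[z] = pz.get(z, 0.0) + p
i_cond = 0.0
for z0, pz0 in pz.items():
    cond = {(x, y): p / pz0
            for (x, y, z), p in pxyz.items() if z == z0}
    i_cond += pz0 * mutual_information(cond)
print(i_cond)  # -> 1.0
```

Here conditioning on Z strictly increases the mutual information, which is exactly the effect the criterion in this paper rules out for certain channels.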
In this paper, we give a necessary, sufficient, and easily verifiable criterion for the channel, i.e., the conditional probability distribution P_{Z|XY}, such that I(X;Y|Z) ≤ I(X;Y) for every distribution of the random variables X and Y. Furthermore, the result is generalized to channels with n inputs (for n ≥ 2), that is, to conditional probability distributions of the form P_{Z|X_1...X_n}.