# A Property of the Intrinsic Mutual Information

## Matthias Christandl, Renato Renner, and Stefan Wolf

In the setting where two parties, knowing random variables $X$ and $Y$,
respectively, want to generate a secret key by communication
accessible to an adversary who additionally knows a finite random
variable $Z$, the so-called intrinsic information between $X$ and $Y$
given $Z$ has proved useful for determining the number of extractable
secret-key bits. Given a tripartite probability distribution $P_{XYZ}$,
this information measure is, however, hard to compute in general, since
a minimization has to be carried out over all possible discrete-output
channels the adversary could use to process her information $Z$. We
substantially simplify this task by showing that it can, without loss of
generality, be assumed that the output alphabets of these channels
equal their input alphabet; in particular, this implies that there
exists an optimal channel which achieves the minimum, since the set of
such channels is compact. The proofs of our results combine techniques
from point-set topology, measure theory, and convex geometry.
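To make the minimization concrete, the following is a small illustrative sketch (not from the paper): the intrinsic information is the minimum of the conditional mutual information $I(X;Y\,|\,\bar{Z})$ over all channels $P_{\bar{Z}|Z}$ processing the adversary's variable. The toy distribution below, with $Z = X \oplus Y$ for uniform bits $X, Y$, is a hypothetical example chosen for illustration; evaluating a single candidate channel (here, one that erases $Z$) gives an upper bound on the intrinsic information.

```python
import math

def cond_mutual_info(p_xyz):
    """I(X;Y|Z) in bits, for a joint distribution {(x, y, z): prob}."""
    p_z, p_xz, p_yz = {}, {}, {}
    for (x, y, z), p in p_xyz.items():
        p_z[z] = p_z.get(z, 0.0) + p
        p_xz[(x, z)] = p_xz.get((x, z), 0.0) + p
        p_yz[(y, z)] = p_yz.get((y, z), 0.0) + p
    total = 0.0
    for (x, y, z), p in p_xyz.items():
        if p > 0:
            # I(X;Y|Z) = sum p(x,y,z) log[ p(x,y,z) p(z) / (p(x,z) p(y,z)) ]
            total += p * math.log2(p * p_z[z] / (p_xz[(x, z)] * p_yz[(y, z)]))
    return total

def apply_channel(p_xyz, channel):
    """Replace Z by the output of a channel P_{Zbar|Z} given as {(z, zbar): prob}."""
    out = {}
    for (x, y, z), p in p_xyz.items():
        for (z_in, zbar), q in channel.items():
            if z_in == z and q > 0:
                out[(x, y, zbar)] = out.get((x, y, zbar), 0.0) + p * q
    return out

# Toy distribution (hypothetical): X, Y uniform bits, Z = X XOR Y.
p_xyz = {(0, 0, 0): 0.25, (0, 1, 1): 0.25, (1, 0, 1): 0.25, (1, 1, 0): 0.25}
i_given_z = cond_mutual_info(p_xyz)  # 1 bit: given Z, knowing X determines Y

# Candidate channel that erases Z (constant output): an upper bound of 0 bits
# on the intrinsic information, since I(X;Y) = 0 for independent X and Y.
erase = {(0, 0): 1.0, (1, 0): 1.0}
i_erased = cond_mutual_info(apply_channel(p_xyz, erase))
```

For this distribution, $I(X;Y\,|\,Z) = 1$ bit, yet the erasing channel already brings the conditional mutual information down to $0$, so the intrinsic information vanishes. The paper's result guarantees that, in general, it suffices to search over channels whose output alphabet equals the input alphabet, so an exhaustive or numerical minimization over a compact set is possible.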