Mutual information

probability

Suppose we have two random variables $X$ and $Y$ taking values in domains $A$ and $B$ respectively. Then the mutual information between them is defined to be

$$I(X, Y) = H(Y) - H(Y \vert X),$$

where $H(Y)$ is the information entropy of $Y$ and $H(Y \vert X)$ is the conditional entropy of $Y$ given $X$. Mutual information is symmetric: it equally satisfies $I(X, Y) = H(X) - H(X \vert Y)$.
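The definition above can be sketched numerically. The following is a minimal illustration (not from the source): it assumes $X$ and $Y$ are discrete and that we are given their joint distribution as a probability table `joint[x, y]`, then computes $I(X, Y) = H(Y) - H(Y \vert X)$ directly.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability vector p."""
    p = p[p > 0]  # 0 * log 0 is taken to be 0
    return -np.sum(p * np.log2(p))

def mutual_information(joint):
    """I(X, Y) = H(Y) - H(Y | X) for a joint probability table joint[x, y]."""
    p_x = joint.sum(axis=1)  # marginal distribution of X
    p_y = joint.sum(axis=0)  # marginal distribution of Y
    # H(Y | X) = sum_x p(x) * H(Y | X = x)
    h_y_given_x = sum(
        p_x[i] * entropy(joint[i] / p_x[i])
        for i in range(joint.shape[0])
        if p_x[i] > 0
    )
    return entropy(p_y) - h_y_given_x

# X and Y independent fair coins: knowing X tells us nothing about Y.
indep = np.full((2, 2), 0.25)
print(mutual_information(indep))  # → 0.0

# Y = X exactly: knowing X determines Y, so I(X, Y) = H(Y) = 1 bit.
equal = np.array([[0.5, 0.0], [0.0, 0.5]])
print(mutual_information(equal))  # → 1.0
```

The two extremes bracket the general case: $I(X, Y)$ ranges from $0$ (independence) up to $\min(H(X), H(Y))$.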